Intro Mind Notes, Weeks 5-6: Consciousness and Materialism
(HMW, Ch. 2, pp. 131-148 and Churchland, Matter and Consciousness, pp. 26-42)
A. The Issue: How is Consciousness Possible in a Natural World?
-
The fundamental position of the naturalist (materialist) is that nothing
exists other than what is found among the objects, events and features
of the natural world, that is, the world studied by the sciences.
-
Churchland outlines three different naturalist strategies for explaining
the mind, the last two of which are already familiar: Reductive Materialism,
Functionalism, and Eliminative Materialism. I will explain
these approaches in more detail and then look at some of the major objections
to them. Most of the objections are related to the intuition that consciousness
is something special, and cannot be accounted for by naturalism.
B. Reductive Materialism (a.k.a. The Identity Theory)
-
The fundamental idea of reductive materialism is that every kind of mental
state, including consciousness, just is a corresponding kind of
physical state. For example, pain just is a certain kind of neuron firing,
namely the firing of c-fibers. (Actually, pain is more complex than c-fiber
firing, but philosophers use the phrase 'c-fiber firing' as a code word
for the more complex kind of neural firing that is responsible for pain.)
-
The claim that each type of mental state is a corresponding physical state
is similar to other reductions which have been established in science.
Water just is H2O. Light just is electromagnetic radiation. Heat just is
average molecular kinetic energy. The sciences of chemistry and physics
have taught us what water, light, and heat are, and have given us immense
insight into their natures. Similarly, the science of psychology will show
how to reduce mental events to the corresponding physical events that occur
in our brain.
-
Reductive materialism has gone out of fashion as a naturalist theory of
the mind (although there has been a bit of a revival lately). One of the
main reasons has been the Multiple Realizability Objection. The
objection is that if the reductive materialist is right, then only a brain
can have mental features. But imagine that the technology of the future
allows us to replace small portions of the brain with silicon circuits
that perform exactly the same function. Imagine that patients are given
silicon replacements for parts of the brain that are lost due to stroke
or disease. Since the replacement parts do exactly the same job as the
regions of the brain that were lost, it is hard to see how the replacement
could matter to whether the person continued to remain conscious. In short,
it is not the material that matters to consciousness, but the functioning
of that material. As long as the brain's former function is preserved,
it cannot matter whether the functions are performed by neurons or silicon
circuits. Even a person whose entire brain was replaced due to a series
of replacement operations would still be conscious. Mental states can be
realized in many different physical systems, in the same way that
computers can be manufactured using very different physical principles:
vacuum tubes, transistors, etc. On the other hand, if reductive materialism
is right, mental states can only occur in a brain, so people with replacement
parts are zombies; though they may act conscious, they are not. That conclusion
seems too harsh.
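The computer analogy can be made concrete with a short sketch (illustrative only: the class names and the toy "doubling" behavior are invented for this example, not drawn from the readings). Two physically different "realizations" compute the same function, so nothing in their input-output behavior distinguishes them:

```python
# Toy illustration of multiple realizability: the same function
# (doubling a signal) is "realized" by two different mechanisms.
# All names here are invented for illustration.

class NeuronCircuit:
    """Stand-in for a biological realization."""
    def process(self, signal: int) -> int:
        return signal * 2          # one mechanism

class SiliconChip:
    """Stand-in for an artificial replacement part."""
    def process(self, signal: int) -> int:
        return signal + signal     # a different mechanism, same function

# Functionally, the two realizations are indistinguishable:
biological = NeuronCircuit()
artificial = SiliconChip()
for s in range(10):
    assert biological.process(s) == artificial.process(s)
```

On the multiple realizability view, what matters to mentality is captured at the level of what `process` does, not at the level of the material that implements it.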
C. Functionalism (a.k.a. Non-Reductive Materialism)
-
Functionalists believe that the nature of the material does not matter
to whether a system counts as a mind. So they do not identify types of
mental state with brain states. Instead they adopt a more liberal policy:
any system with the function of a brain would count as having a
conscious mind. Consciousness does not require a brain, since it is a function
of the brain that can be performed by other mechanisms.
-
As I have explained, computationalists are functionalists, for they believe
that any system no matter how constructed would count as having a mind
as long as it runs the right kind of program.
-
Functionalism explains how mental features reduce to features of the natural
world. However the functionalist account of that reduction differs somewhat
from the view preferred by reductive materialists. Functionalists agree
with reductive materialists that each instance (or token) of a
mental state is identical with a physical state. For example, in a patient
before pain fiber replacement therapy, pain is c-fiber firing. After the
replacement, pain is the firing of the replacement wiring.
-
However, functionalists deny one idea crucial to reductive materialism:
that each type of mental state is identical to a corresponding type
in the natural world. Although each token mental state (say pain)
at a given time in a given person is a corresponding physical state, the
type of physical state that pain is can vary - it may be multiply
realized in different physical systems: c-fibers, replacement wiring, silicon
chips, etc.
-
Philosophers express the difference between reductive materialists and
functionalists by saying that both believe in the token identity of
mental and physical states, but reductive materialism accepts, while
functionalism rejects, their type identity.
D. Eliminative Materialism
-
Eliminative Materialists think that many of the folk concepts we
use to explain actions of others are not scientifically acceptable. Instead
of trying to explain mental states by reducing them in some way to physical
states, they believe that talk of such mental states is simply not good
science. In a sense there are no mental states, so there is no need to
reduce them to the physical world.
-
Eliminative materialists differ from behaviorists in thinking that it is
acceptable to use theoretical terms that lack observational definitions.
However, they also believe that the present set of theoretical terms we
use to describe or characterize the mind are no better than outmoded concepts
such as the idea of celestial spheres. These ideas may even be useful on
occasion, but they do not describe what is really there. To understand
what we call the "mind", we need to craft a new set of concepts that are
evolving from the discoveries of cognitive science.
E. Consciousness: The Easy and Hard Problems
-
We all know what it feels like to be in pain. It is hard to imagine how
such feelings could arise in brain tissue - what amounts to chunks of meat.
This intuition has led some philosophers to object to naturalism, or at
least to complain that pain and other conscious feelings have not been,
or cannot ever be explained by science. This leads some of them to dualistic
conclusions.
-
Pinker suggests that 'consciousness' has three rather different meanings:
self-knowledge, access-consciousness, and sentience. In the first two
cases, consciousness is explained in terms of cognitive abilities, or mental
functions. Explaining what these functions are and how the brain manages
to perform them is what some philosophers call the easy problem of consciousness
(not that it is really all that easy). It would seem that the easy problem
is, at least in principle, something that the natural sciences would be
able to resolve.
-
On the other hand, explaining sentience (how things feel) in natural terms
seems a greater challenge. Most people find it difficult to believe that
the especially marvelous taste and feel you have when you eat coffee ice
cream could be the activity of neurons. Philosophers call such experiences
qualia (the singular is 'quale'). Explaining how qualia can be part of the
natural world is the hard problem of consciousness.
F. Access-Consciousness
-
Pinker believes that cognitive science is well on its way to understanding
access-consciousness. The idea is that an efficient information processing
system needs to limit access to information to the various modules. Each
module does its own job while sharing relatively little information with
the others.
-
However, for intelligence, a brain needs to bind together the relevant information
delivered by many modules in order to formulate plans and take action.
The information available to systems that perform this function makes up
our conscious life.
-
Pinker distinguishes four functions of this central system: restriction of
sensory information, the spotlight of attention, emotions, and the will.
The central system needs only the most highly processed information delivered
by the senses. For example, we hear words, not phonemes when listening
to a conversation. Second, the central system is able to direct attention
from one aspect to another in the sensory field. Anne Treisman's work has
revealed that attention may be involved in binding together different aspects
(color, shape) in the visual field. Consider also our ability to shift
attention from one conversation to another at a cocktail party. Third,
our emotions play a role in formulating goals and plans; emotion is crucial,
for example, to the direction of attention. Consider how hearing
the voice of a loved one is a magnet for our attention. Fourth, our will
is responsible for the final decision making that makes it possible for
us to act.
G. Qualia, Emotions and Consciousness
-
Qualia include the conscious experience of redness, the taste of almonds,
or emotional states such as feeling fear. Computational theories do not
seem to even begin to explain these conscious experiences. It seems that
these feelings lie outside the range of any computational or physiological
theory. If so, cognitive science is essentially incomplete. It will never
yield a full account of our mental lives. (These ideas are naturally aligned
with some brand of dualism.) Responses to this challenge can be divided
into three according to the three naturalist strategies outlined in B-D
above.
-
Reductive Materialism claims qualia may be accounted for via the
particular sensory vectors that are provided by the sensory equipment.
(For example, emotions might be the sensation of our own body states.)
Consciousness can be explained, for example, via the synchronization between
neural groups controlled by the thalamus. Reductive materialists will insist
that consciousness requires certain kinds of physical states to be present.
So they may tend to view the presence of a brain as a requirement for consciousness.
-
Functionalism claims that qualia, emotions and consciousness in
general just are certain functional features of the brain. The feeling
of pain, for example, just is the state with certain typical relations
to inputs (damage to the body), outputs (wincing, saying "Ow") and other
mental states (fear, distress, avoidance plans). Any state in any physical
system that sets up this set of relationships would count as pain. So Martians,
and even computers can be conscious if they are set up the right way. Feelings
in humans are the results of the functioning of the brain, but anything
that functions the way the brain does would also be conscious. For example,
consciousness can be explained as the functioning of the attentional system
- the system that binds information together to obtain a unified view of
the world. Or perhaps consciousness is also related to the contents of
short-term memory, and mid-level processing during perception. Pinker calls
this functional account access-consciousness. Functionalists will explain
that access-consciousness amounts to sentience; that is, an explanation
of the functional features that are involved in conscious states just is
an explanation of why they feel the way they do.
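The functional-role characterization of pain above can be caricatured in a toy sketch (all the names and transitions here are invented for illustration, and of course no one claims pain is this simple). The point is that the "pain" state is picked out by its relations to inputs, outputs, and other states, not by what physically realizes it:

```python
# Toy sketch of a functional-role definition of "pain": the state is
# identified by its typical causes (bodily damage), effects (wincing,
# saying "Ow"), and relations to other states (avoidance plans).
# All names are invented for illustration.

class Agent:
    def __init__(self):
        self.in_pain = False
        self.avoidance_plan = None

    def receive_input(self, stimulus: str) -> str:
        if stimulus == "tissue damage":       # typical input
            self.in_pain = True               # the "pain" state
            self.avoidance_plan = "withdraw"  # relation to another state
            return "Ow!"                      # typical output
        return ""

# On the functionalist view, any system realizing these relations -
# neurons, silicon, the Chinese nation - counts as being in pain:
a = Agent()
assert a.receive_input("tissue damage") == "Ow!"
assert a.in_pain and a.avoidance_plan == "withdraw"
```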
-
Eliminative Materialism responds to the problem of consciousness
by saying that cognitive science just isn't about the way things feel.
Feelings are a kind of myth that science should not take seriously. Although
cognitive science has as a matter of fact tended to ignore feelings, the
idea that it cannot account for them because it should not seems excessively harsh. Shouldn't
a theory of our cognitive life account for them in some way? For this reason
the eliminative materialist answer to the problem of consciousness is not
very popular, so we will lay it aside in this class.
-
Churchland discusses objections to each of these three naturalist theories
of the mind. Those objections and his replies for Reductive Materialism
and Functionalism follow. Note that Reductive Materialism and Functionalism
are alike in many ways, so a reply to an objection to the
Functionalist position may also apply to Reductive Materialism, and vice
versa.
H. Objections and Replies: Reductive Materialism
-
Introspective Evidence. Our subjective experience shows us conclusively
that sensations such as pain are radically different from anything to be
found in the physical world. It should be obvious that activities of neurons
are not at all like feelings.
Reply: Our intuitions about what is identical to what are not
to be trusted. It is also hard to believe that light is the oscillation
of electro-magnetic waves, but we have discovered that this is so. It also
seemed once that life was something radically different from what could
be explained in naturalistic terms, but now we have a nearly complete naturalistic
explanation of life.
-
The Category Mistake Objection. It is a category mistake
to say fear is a neural net activity. Neural net activations have properties
that fears do not have; for example, they are located in space, while fears
are not. Another difference is that neural net activities can be publicly
inspected, but my fears are private: only I can have my fears or know how
they feel.
Reply: It is very dangerous to argue from what we think we know
about mental states and physical states to draw conclusions about what
is identical to what. For we can be ignorant about what is identical to
what and so our ideas can lead us astray. This sort of reasoning would
lead people to deny that heat was molecular motion on the grounds that
heat is a feeling while molecular motion is not a feeling. The answer is
that science shows that (since heat is molecular motion) certain molecular
motions are feelings.
-
The Problem of Intentionality. The objector claims that there is
no way to give a naturalistic explanation for how humans can understand
or mean anything by their use of words.
Reply: Churchland responds with the same answer that Pinker
gave: a combination of the Causal Role and Inferential Role answers.
-
What Mary Didn't Know. Imagine Mary the super-scientist who
happens to know all there is to know about the physical world. Imagine
also that she has never had a certain experience. (For example, she has
never seen the color red, or any colors). When she first sees redness,
she is getting information about how things feel that is brand new. Since
she already knows everything physical, that information cannot be about
the physical world. So Mary's sensation of redness cannot be explained
as a physical event in the brain.
Reply: Churchland responds that even if Mary has a new kind
of knowledge when she sees red for the first time, the fact that she has
new knowledge does not show that feelings are not physical events in the
brain. The very same thing can be known in two or more different ways:
by direct experience (as when Mary first experiences redness) or by scientific
study (as when she learns all the physical things there are to know about
the brain). Since there may be these two different ways of knowing the
sensation of redness, it is possible for Mary to have known redness in
one of these ways and not the other. Just because there are two different ways
of knowing that sensation, it does not follow that experience of redness
is not an event in the brain.
I. Objections and Replies: Functionalism
-
The Inverted Spectrum. Imagine you and I have exactly the opposite
color experiences: white is swapped with black, red with green, etc. You
will have a very different set of sensations from mine. However, imagine
also that no one will know this by any outward sign, since my green experience
of a ripe tomato is one you call red, and associate with the color of fire
engines, and so on. In fact our two brains could both function in exactly
the same way, and still your sensations might be inverted compared to mine.
This shows that sensations cannot be explained by the functional features
of our brains.
Reply: Churchland answers that the qualitative differences between
you and me could be explained by the physical differences in our perceptual
equipment. So even if functionalists can't explain the qualitative difference,
there is still a naturalist explanation: the difference between your feelings
and mine is found in the difference in our eyes and visual processors.
Note this is not a functionalist answer to the objection. The most
popular functionalist answer to the objection is to deny that there could
be two individuals with the same functional arrangements who have their
colors reversed. The reversal would show up in at least some difference
in the relations among the mental states.
-
Absent Qualia. The simplest version of the objection is to imagine
a computer executing the program for the human mind which the functionalist
claims exists. Clearly, says the objector, the computer doesn't have any
qualia (feelings). So functionalism does not account for qualia. A more
sophisticated version of this objection is the Chinese nation objection.
Imagine the Chinese nation following exactly the commands of a computer
program functionally identical to your brain's during the time you are
experiencing a nice cool beer on a sunny patio in Houston. But the Chinese
nation could not be experiencing anything of the kind! Even if the Chinese
nation and your brain were identical in function, they would not have the
same experience, so the functional account of experience must be wrong.
Adding in sensory systems to the computational theory of qualia does not
seem to help very much. Imagine now that we have a robot (or even a brainless
cadaver) that is radio controlled by the Chinese nation. The sensory systems
could be the same as the ones that cause qualia in me. But that wouldn't
make it any more plausible that the Chinese nation plus robot has the experience
of enjoying a cool beer. Another version of the objection asks us to imagine
a zombie who has all the functional features I exhibit, but which
simply lacks any conscious experience.
Reply: Churchland replies that the Chinese nation does have
qualia, although since its qualia are embodied in a different way, they
might be different from mine. Its qualia are the states that allow it to
discriminate one taste from another and to report: 'boy that was a great
cool tasty beer!'. You might object that it is nonsense to say the Chinese
nation has my Houston-beer-sunny-patio experience. What evidence could
one use to deny that consciousness to the Chinese nation? Doesn't displaying
the right functional behavior serve as adequate evidence of consciousness
in other humans, such as our friends and family? So why deny consciousness
to this thing? Is it even coherent to deny this thing conscious states,
and to say that despite its proper behavior, there is, nevertheless, "nothing
going on inside"? Consider the zombie (or the computer). Given that it
is functionally identical to a person, what conceivable test could you
use to come to know it is a zombie (or lacks qualia)? Note: zombies will
claim they have consciousness and talk a good story about their internal
states, etc.