Steven Pinker is a New Mysterian

Is the Hard Problem of consciousness solvable by science? Will we ever come up with a meaningful explanation as to how squirts of neurotransmitter and minor jolts of electricity create subjective experience? As far as I'm concerned, this is the major philosophical question hovering over neuroscience. If the new Mysterians are right, and we will never understand how the texture of experience arises from neural computation, then neuroscience has a very profound limitation. The most important question in the field will always remain an ineffable mystery. There will always be a big void in the center of our knowledge.

Nobody can predict the future of scientific progress, but I'm doubtful that the Hard Problem will ever be solved. Steven Pinker shares my skepticism:

And then there is the theory put forward by philosopher Colin McGinn that our vertigo when pondering the Hard Problem is itself a quirk of our brains. The brain is a product of evolution, and just as animal brains have their limitations, we have ours. Our brains can't hold a hundred numbers in memory, can't visualize seven-dimensional space and perhaps can't intuitively grasp why neural information processing observed from the outside should give rise to subjective experience on the inside. This is where I place my bet, though I admit that the theory could be demolished when an unborn genius--a Darwin or Einstein of consciousness--comes up with a flabbergasting new idea that suddenly makes it all clear to us.

This, of course, is a truism of psychology: our brains are bounded by evolutionary constraints. We aren't natural epistemologists, and we don't perceive anything straight. All of our truths depend upon three pounds of fatty membrane. Given these innate limitations, I think it's eminently reasonable to believe that some questions will always lie beyond our purview. We can't expect a product of biological evolution to solve every paradox.

What do you think? Is the Hard Problem of consciousness beyond our grasp? What about the ultimate nature of the universe? I believe it's possible to be a pure materialist - to not digress into discussions of the soul or God - but still believe that some mysteries are impregnable.


It may well be right that there are things we will never be able to understand. But that does not mean we shouldn't try. Just because we can't expect a product of biological evolution to solve every paradox, it doesn't follow that we are unable to solve this one. Although our brains evolved in an environment where understanding very complex problems like mathematics and quantum mechanics does not seem to have been useful, we are nevertheless able to wrap our brains around them. Just saying 'oh, we will never understand' is giving up, just like saying 'god did it' for things we do not yet understand.


The first step is to define the problem. What exactly is meant by consciousness and subjective experience? I suspect that the question won't be answered until it is bounded by a precise, unambiguous and objective definition. Once that is done, I suspect that the question will, indeed, be answered.

The true limits of the mind are themselves beyond our comprehension: we are left with no choice but to keep posing questions and suggesting answers. The fate of the Hard Problem will depend on the collective voice of those who are concerned with it, and so will be a matter of our coming history.

Plato wrote a fair bit about this, as did Kant in response to Plato. How do we conceive of a truly alien thing? We can't, as an alien is by its nature absolutely outside our realm of experience. The term alien in that sense (not in the LGM sense) forever names something that can't be experienced. That's semiotics, though, and yakking about categories.

The great thing is that human beings seem to have a way of seeing old things in new ways. A new Newton or Darwin or Einstein of consciousness won't necessarily need to come along with a brain mutation that lets them comprehend the mechanism of consciousness; we just need to shake off the shackles of how we've seen it before and look at it in a new light.

Call me an optimist, but I think we do that all the time: everyone struggles with a problem, lets it settle overnight, and suddenly the answer comes from nowhere (you're still considering it, just not consciously). History is full of sudden leaps like this, and our lives are full of little leaps along the same lines. The nature of consciousness is just the latest problem. So is multi-dimensionality: we all used to think the world was flat. Not many of us have directly experienced the fact that it isn't, but the truth of the matter, even though it was once an alien concept, just makes logical sense.

The tricky thing is that consciousness is considering consciousness at the moment, but I'm not sure that is going to be an issue in retrospect, once someone looks at it in a new light.

While materialism has worked so far as a foundational system for scientific progress, it is hard to imagine how one could even get started on the problem of consciousness within this system. Since every thing and thought (including the idea of materialism) exists within consciousness, is the difficulty that we can't extricate ourselves from it in order to consider it?

Is the Hard Problem the most important question in the field? I don't see how progress on any scientific problem (how things work, cause and effect, prediction, improvement of the human condition, etc.) is impeded by not having the answer to this problem.

Perhaps neuroscience will eventually bioengineer an entity that will have an answer to this problem, but will we be able to understand the explanation?

By michael o'connor (not verified) on 24 Jan 2007 #permalink

If Chalmers' easy problems are solved, it remains to show that "free will" and "experience" have any objective definition and are needed to understand consciousness.

Ideally, we would have other systems "experiencing" consciousness to compare with: animals, AIs, small green men, ... Even if that doesn't solve the problem, should it remain, it would at least set it in perspective.

By Torbjörn Larsson (not verified) on 24 Jan 2007 #permalink

If the hard problem is unsolvable (and I don't know that it is, but it is possible), then it seems to me that might say a lot about whether qualia are reducible to matter à la physics. Even Kim seems to be buying into qualia of late. If qualia are irreducible, that would seem to have fairly significant implications for science and scientific epistemology. I don't think that the existence of irreducible qualia would render consciousness unsusceptible to scientific inquiry. But certainly it would affect how it is engaged with. Although I'd argue that, as a practical matter, it already is dealt with differently. As a practical matter there is a huge gap between the physical sciences and the social sciences. Neurology and some aspects of psychology cross the bridge. But by and large there has always been a gap, and if the hard problem can't be solved then it merely cements what is already the case.

"Identifying it with information processing would go too far in the other direction and grant a simple consciousness to thermostats and calculators--a leap that most people find hard to stomach"

Pinker glides over what may very well be a valid answer to the Hard Problem here. There aren't really any good refutations of the information processing perspective; it just feels wrong because it involves attributing varying levels of consciousness to all kinds of objects and living things. But that's no basis for ruling it out - we have no idea what the subjective experience of being a calculator or thermostat is like. In fact, we have no way of even determining that parts of our subconscious mind don't have their own subjective experience when they process their information, because our conscious mind only concerns itself with the outputs.

I definitely think the ingrained assumptions in our minds impede the acceptance of this or other potentially valid but counterintuitive explanations. We're highly fixated on classifying objects as animate or inanimate, which, while useful for making practical decisions about how to interact with our environment, is a spurious distinction and impairs our ability to understand phenomena where the distinction isn't clearly applicable. The facts about consciousness may never make sense to us, but that doesn't do anything to diminish whether they're true. Physics and mathematics have long since ventured into territory where the truth no longer agrees with our intuitions in the way Pinker is looking for in an answer to the Hard Problem, yet by grounding our study of them in a system where the immediate relations between phenomena are objective and comprehensible, we can build an effective understanding of systems that we cannot wrap our minds around when considered as wholes. We probably will eventually be able to do this for investigating consciousness by describing the physical relationships present in the brain, but I doubt that we will uncover anything more satisfying than what we already know.

Our brains can't hold a hundred numbers in memory, can't visualize seven-dimensional space

There are tools to do these things. We can develop a tool that flashes a light when the test subject exercises "free will." It's likely that in building this tool, we'll find a usable (if not entirely correct) definition that many, many people will find disturbing.

The brain has a lot of very specific structure to manage attention, episodic memory, sensory integration, etc. All our evidence for consciousness, qualia, etc. depends on just the kinds of things these structures handle. So I don't see any reason whatsoever to attribute consciousness to anything that doesn't have these structures (such as calculators or current computers) any more than we need to attribute visual experience like ours to anything that doesn't have at least a rudimentary visual cortex.

So the interesting question is, given all these structures and their interactions, can we reasonably hope for a scientific account of consciousness? I don't see why not. Baars has proposed a very reasonable functional role for consciousness. Continuing investigation has greatly refined our questions about attention, self-awareness, etc. and given us a lot of detailed knowledge about how these are supported in the brain (hint: they are very complicated).

Chalmers (and other dualists and/or mysterians) tend not to engage with all this knowledge, but instead raise metaphysical objections, claiming (in Chalmers' case specifically) that zombies are "logically possible" so dualism must be true. Whether or not this makes sense at all (and I think it doesn't) the "logical possibility" of zombies is no barrier whatsoever to scientific understanding of how consciousness actually works in real humans and other animals with the right structures in their brains.

Chalmers (and other dualists and/or mysterians) tend not to engage with all this knowledge, but instead raise metaphysical objections, claiming (in Chalmers' case specifically) that zombies are "logically possible" so dualism must be true.

That is a valid and good claim. On top of which, I don't think philozombies claim anything physical.

By Torbjörn Larsson (not verified) on 25 Jan 2007 #permalink

How many puzzles besides the "grand puzzles" have survived the "mysterian" categorization? Not many. The only grand puzzles that still inhabit this realm are theories of ultimate dimensionality or boundary problems such as fundamental particle limits (Planck?), the quantum, alpha/omega points, the beginning of time, prime movers, entropy, etc. I don't believe that consciousness belongs among these "hard problems" - it's a very complex problem, more a function of the astronomical number of interconnects and the extreme parallelism.

A reptilian neural system was brilliantly co-opted (as evolution usually does) by a massive memory framework adaptation called the neocortex. I sincerely believe that this illusion of consciousness (mind, etc.) is merely a function of tremendous amounts of sensory input that gets processed, categorized and recorded (if deemed special enough) in our memory machine... and that's all.

Just like a 30-frame-per-second movie provides the illusion of continuity, so does the deluge of parallel sensory information that gets processed by our brains. It tricks us into believing in this illusory, ethereal consciousness... information theory/simulation is gonna crack this nut in the next 15 years.

I must be missing something. Strictly speaking, isn't it correct to say that I have no knowledge of the consciousness (thoughts, feelings, etc.) of anybody or anything except by analogy: they are made like I am (brain, body, etc.) and/or they behave like I do?

If a nonconscious extraterrestrial learned all our science and observed all our behavior, it would seem it would have no idea of what consciousness was or that it existed except in terms of brain activity.

All the other past mysteries have been solved within the framework of materialism and mathematical theory. Every advance in the various brain sciences tells us more about how the brain works but does not take us one inch closer to understanding how this piece of meat gives us the smell of the orchid.

By michael o'connor (not verified) on 29 Jan 2007 #permalink

What if the problem doesn't need to be solved? With an answer not unlike the ether problem: "There is no consciousness." And I'm serious; deadly so. What if consciousness is just, as they say, a figment of our imagination? And free will, and...
Maybe our brain freed us from the constraints of nature (if after all we ARE free) just by inventing a skyhook: patterns in the mind that allow us to understand the world itself slightly better, but not the brain. Useful tools, I concede, coming from two walls of mirror neurons facing each other, but just...
Now, wait a minute. I just described consciousness.
Ok, forget about it 8-)

Jonah, you haven't come out and explicitly said whether or not you endorse Chalmers' definition of the "hard problem." Do you subscribe to his theory that "consciousness isn't physical"? If so, I'd like to hear your rationale. I too remain unconvinced by Chalmers' zombie scenario. Why do you find it compelling?

If you do, in fact, reject the materialist's premise that consciousness is a function of the brain's circuitry, then I can see why you would find the problem of consciousness insoluble. I'm a materialist myself--a firm believer that consciousness is physiological in nature. Consequently, I think that ferreting out the nature of consciousness is a matter of correlating anatomy with awareness. This may be difficult; it may take a long time; but I see no reason why it wouldn't be possible.

I don't endorse Chalmers' squishy notion of consciousness as information. I've never even really understood what he's talking about. Nor do I subscribe to any notion of mind or consciousness that isn't materialist. I have no doubt that we will one day "correlate anatomy with awareness."

But what will that correlation actually explain? It will be a neural map of consciousness - and that will be very cool - but it still won't explain how those electric neurons give rise to subjective experience. (We shouldn't confuse the map of a place for the place itself.) I believe that any explanation of our subjective experience or qualia or consciousness solely in terms of our neurons will never explain our experience, because we don't experience our neurons. It's really that simple for me.

So I'm a mysterian like Colin McGinn, not Chalmers, although his hard/easy formulation has certainly proved useful.

A lot depends on what you mean by 'solving' the problem. Emergent properties of complex systems rarely submit to a 'solution' the way gravity solves the problem of planetary motion. A good example that is understood but not 'solved' is the theory of liquids. Given the properties of H2O, predict the viscosity and density of water at room temperature. It can't be done in any analytic way or using any first-principles theory--but molecular dynamics simulations of samples of virtual H2O molecules get answers that agree with measurements well enough that it is hard to get funding to study this problem any more.

So in one sense I agree that the human mind can never conceptualize a 'solution' to its own consciousness--almost by definition, such a solution cannot be represented in a human mind. But if we learn enough about the human brain to create a model system whose simulation reproduces all the key properties of a human brain, then I'd be willing to call that a solution. We will never be able to 'get inside' the simulation, in the same way we can't 'get inside' someone else's brain. So there will still be people who say 'it's not conscious'. But this 'getting inside' the feeling of something is not what scientific solutions are about. Once we can reproduce the measurable properties of something, we say we understand it. We may be centuries away from doing this with the human brain, but I don't see any fundamental barriers in our way.

If we have this kind of simulation, then we probably will be able to manipulate it and detect the outward manifestations of what we experience as consciousness, and so we might develop a 'solution' that describes the kind of processes that control these experiences. However, this solution will be more like the handwaving about hydrogen bonds and cluster rearrangements that we use to describe our 'solution' of the liquid problem than the elegant equations that describe our solution of gravity.
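For readers unfamiliar with the kind of simulation this comment has in mind, here is a minimal toy sketch, not the commenter's code and far cruder than any real water model: a tiny Lennard-Jones fluid in reduced units (particle count, density, temperature and time step are all arbitrary illustrative choices). The only point is that a bulk, "emergent" quantity--here the virial pressure--falls out of the simulation even though no closed-form solution exists.

```python
# Toy molecular-dynamics sketch (illustrative only): a small Lennard-Jones
# fluid in reduced units, integrated with velocity Verlet under periodic
# boundary conditions. Requires only numpy.
import numpy as np

N = 64        # particles (a 4 x 4 x 4 lattice); arbitrary illustrative choice
rho = 0.8     # reduced number density
T0 = 1.0      # reduced initial temperature
dt = 0.005    # time step
steps = 2000

L = (N / rho) ** (1.0 / 3.0)                     # cubic box edge
n = round(N ** (1.0 / 3.0))
grid = (np.arange(n) + 0.5) * (L / n)
pos = np.array(np.meshgrid(grid, grid, grid)).reshape(3, -1).T  # start on a lattice
rng = np.random.default_rng(0)
vel = rng.normal(0.0, np.sqrt(T0), size=(N, 3))
vel -= vel.mean(axis=0)                          # remove net momentum

def forces(pos):
    """Pairwise Lennard-Jones forces and virial, minimum-image convention."""
    d = pos[:, None, :] - pos[None, :, :]
    d -= L * np.round(d / L)                     # nearest periodic image
    r2 = (d ** 2).sum(axis=-1)
    np.fill_diagonal(r2, np.inf)                 # no self-interaction
    inv6 = r2 ** -3
    w = 24.0 * (2.0 * inv6 ** 2 - inv6)          # pair virial r.F for U = 4(r^-12 - r^-6)
    f = ((w / r2)[:, :, None] * d).sum(axis=1)   # force on each particle
    return f, 0.5 * w.sum()                      # halve: each pair counted twice

f, vir = forces(pos)
for _ in range(steps):                           # velocity Verlet integration
    vel += 0.5 * dt * f
    pos = (pos + dt * vel) % L
    f, vir = forces(pos)
    vel += 0.5 * dt * f

T = (vel ** 2).sum() / (3.0 * N)                 # instantaneous temperature
P = rho * T + vir / (3.0 * L ** 3)               # virial equation of state
print(f"reduced T ~ {T:.2f}, reduced pressure ~ {P:.2f}")
```

The pressure printed at the end is not stored anywhere in the rules of the model; it emerges from following many simple pairwise interactions, which is the sense of 'solution' the comment is describing.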

Conceding the hard problem is an untidy business. What assumptions can we make about consciousness, then? By default, essentially all of us project our conscious experience onto other humans, but what about other animals and organisms? And what can be said about the potential consciousness of computers, weather systems, and galaxies?

Among these questions, I think the issue of whether animals can experience pain, pleasure, and desire is especially important due to ethical considerations.

By Jose Lopez (not verified) on 08 Jul 2010 #permalink
