
Neuroscience Tried Wholly Embracing Naturalism, But Then the Brain Got Away


In "Darwin’s ‘Horrid Doubt’: The Mind" I noted:

Ironically, while Darwin may have doubted the fully naturalized mind and felt horrid about it, most of his latter-day supporters believe and feel good. And, on its own terms, their faith cannot be disconfirmed.

That is well for them because the evidence does not support them. But the culture does.

The clearest exponent of the culture is Duke University philosopher Alex Rosenberg, who proudly advocates scientism: "Science provides all the significant truths about reality, and knowing such truths is what real understanding is all about." Elsewhere he writes:

Ultimately, science and scientism are going to give up as illusory the very thing conscious experience screams out at us loudest and longest: the notion that when we think, our thoughts are about anything at all, inside or outside of our minds.

Many neuroscience postdocs wouldn’t quite put it that way. They would just say, "The mind is what the brain does." Their claim cannot be directly assailed because it is "science," that is, it supports naturalism.

In 2009, New York Times weathervane David Brooks informed us that:

When you go to an academic conference you expect to see some geeks, gravitas and graying professors giving lectures. But the people who showed up at the Social and Affective Neuroscience Society’s conference in Lower Manhattan last weekend were so damned young, hip and attractive. The leading figures at this conference were in their 30s, and most of the work was done by people in their 20s. When you spoke with them, you felt yourself near the beginning of something long and important.

Such as his hope:

I’m free to speculate that this work will someday give us new categories, which will replace misleading categories like "emotion" and "reason."

Why are those categories misleading? Because they rely on human experience, not brain claims.

In the field of "long and important" neuroscience, new pop books have flooded the market, simplifying complex human behaviour by the magic of brain scanning: Nudge (getting one’s way) and Blink (thinking without thinking) give some sense of it.

Both the United States and the European Union are throwing billions of dollars at new projects to map the human brain. Yet many neuroscientists worry that more is promised than can be performed. For one thing, fMRI (brain imaging) shows which brain areas have high oxygen levels when a person is thinking something. It simply cannot tell us what people are thinking, because many brain centers are active at once, and any given center may be active for many reasons. And because each brain is unique, data from studies must be averaged. But thoughts are not averages; they belong to the individual.

Two hundred and fifty scientists are protesting the European Human Brain Project on the grounds that its proposed computer simulation is not a realistic route to understanding brain function. Indeed, the main practical effect of more and better neuroscience has been not to cement but to blow up conventional neuroscience assumptions and pop legends:

  • The brain cannot regenerate. In reality the brain is constantly fine-tuned by its uses throughout life.
  • The "reptilian brain" explains objectionable human behavior. The idea originated in the 1970s, based on the fact that we share a brain area near the stem with reptiles. So some attributed "reptilian" behavior to that area. But in reality, socially objectionable behavior is mediated by many areas of the brain.
  • Humans can be divided into left- and right-brained thinkers. Such claims have no solid basis in science. As Stephen M. Kosslyn and G. Wayne Miller put it in the Wall Street Journal recently:

The brain doesn’t work one part at a time, but rather as a single interactive system, with all parts contributing in concert, as neuroscientists have long known. The left brain/right brain story may be the mother of all urban legends: It sounds good and seems to make sense — but just isn’t true.

  • Mirror neurons are crucial to how primates (including humans) understand the actions of others. This 1990s idea resulted in many papers, but later:

Journals published shoddy studies, and speculation about the ability of mirror neurons to inform the primate brain’s "action understanding" ran amok. Since then, several neuroscientists, Hickok among them, have reevaluated the roles played by these neurons.

  • Dyslexia is strongly associated with high intelligence. Verdict: "Mr. Gladwell enjoys a reputation for translating social science into actionable insights. But the data behind the surprising dyslexia claim is awfully slim."
  • Then there was the neuroscientist who discovered that, according to his interpretation of brain scans, he himself was a psychopath, which led him to rethink his approach.

A surprising turn of events is that pushback ("neuroskepticism") is increasing within the discipline. The Scientist admitted that there are issues, in a respectful review of a book addressing some of them:

"Brains are hot," Sally Satel and Scott O. Lilienfeld acknowledge in Brainwashed, their "exposé of mindless neuroscience" (mostly practiced not by neuroscientists, they stress, but by "neuropundits," among others). The "mediagenic" technology of fMRI imaging has made the brain, aglow with metabolic hotspots, into a rainbow emblem of the faith that science will soon empower us to explain, control, expose, exploit, or excuse every wayward human behavior from buying to lying, from craving to crime.

"Neuropundits"? Terminology like that, and "neuroessentialism" ("using the terms of neuroscience as evidence for claims made in psychological or sociological frameworks"), now abounds, at least when neuroscientists are being polite.

When they are not so inclined, one may also hear "neurohype," "neuro-nonsense," "neurotrash" (at New Humanist), and, from over the Pond, of course, "neurobollocks." To say nothing of "cargo cult science" and "pseudo-profundity."

Actually, it is somewhat unusual during the reign of naturalism for so much dissent to come from within the discipline, as it has in neuroscience. Most cosmologists accommodate untestable multiverse cosmologies, origin-of-life researchers guard naturalism zealously despite its fruitlessness, and human paleontologists are more numerous than important fossils, with the usual results (much ado about nothing). Why might human brain research be different?

First, there are not-easily-reducible anomalies. For example, there is the man rendered significantly brain-absent by an accident who nonetheless achieved a remarkable recovery. And the normal 88-year-old man whose brain had never had a connection between its two halves (a corpus callosum). Other patients with conditions as anomalous as his may function normally and simply never present themselves for a brain scan. These cases warn us that a brain is not just a material object like a car, and a mind is not just an illusion it creates.

There is also the fact that most people think of neuroscience as part of medicine. That tends to ground its clinical practice in the real world.

So it is no longer unusual to assert that the mind is not merely the brain. Even David Brooks, who wrote an "evolutionary psychology" novel a few years ago, is backing off a bit.

Does neuroscience’s stubbornness in the face of materialism portend big changes? Hard to say. If all we want from an explanation is that it be naturalist, all other values can be fudged or ignored. Next we must address the conundrum of consciousness itself.

Editor’s note: Here is the "Science Fictions" series (the human mind) to date at your fingertips.

Image source: Wikipedia.

Denyse O'Leary

Denyse O'Leary is a freelance journalist based in Victoria, Canada. Specializing in faith and science issues, she is co-author, with neuroscientist Mario Beauregard, of The Spiritual Brain: A Neuroscientist's Case for the Existence of the Soul; and with neurosurgeon Michael Egnor of the forthcoming The Human Soul: What Neuroscience Shows Us about the Brain, the Mind, and the Difference Between the Two (Worthy, 2025). She received her degree in honors English language and literature.
