This article was first published by Culture11 in September 2008. Reprinted with permission on TheNewAtlantis.com.
In The Wizard of Oz, the scarecrow famously expresses his desire to become human by pining for a brain, suggesting that one of the characteristic marks of humanity is this particular biological endowment. Little did he know how prescient his longing was, prefiguring the recent obsession to find in neuroscience the key to unlock every obstinate human mystery, to finally lay bare the nebulous inner workings of human nature, to bring light to the many dim corridors of human consciousness. In fact, the path from straw man to human being could have been accomplished with even greater facility had the scarecrow settled for the latest software that mimics the operation of the brain; neuroscientist Michael Gazzaniga, in his newest book, Human: The Science Behind What Makes Us Unique, helpfully queries “Who needs flesh?” Either way, whether it’s the brain or the computational processes brain matter supports that ultimately make us human, neuroscience has become the latest and most promising scientific candidate to replace centuries of philosophical debate with demonstrative certainty and a collective sigh of relief.
The enthusiasm surrounding the explanatory powers of neuroscience is attested to by its numerous progeny — neuroeconomics, neuropharmacology, neuropolitics, neurocriminology, neuropolicy — and if the popularity of a prefix is any barometer of a science’s success, then neuroscience is undoubtedly in the ascendant. However, what is most interesting about neuroscientific research today is not its many impressive medical and technological advances but its increasingly confident claims to architectonic status among the other sciences; the truly exciting promise of neuroscience is not its capacity to diagnose this or that medical malady but to render transparent with reassuring finality the complex nature of human thought and behavior. Neuroscience today is animated by a kind of philosophical eschatology: one gets the distinct impression that we’re standing on the precipice of a comprehensive account of man, that a neuroscientific version of the End of History is finally upon us. Whether it’s the ultimate root of religious belief, or the causes and character of love, brain research can now substitute hard, empirical fact for what Gazzaniga derisively calls “long windbag discussions about art.”
However, the claims neuroscience makes to theoretical comprehensiveness exact a predictable toll on its sensitivity to the complexity of human experience; time and again its scientific rigor is overtaken by a hyper-exuberant scientism that reduces the language of ordinary moral life to the remarkably counterintuitive vernacular of neurophysiologic research. In a recent book devoted to the lessons neuroscience can impart regarding the nature of political judgment, The Heart of Judgment: Political Wisdom, Neuroscience, and Narrative, Leslie Paul Thiele consistently resorts to depictions of the political and moral dimensions of human life that can only be described as perplexing distortions: his most memorable definition of prudence is that which “occurs when the frontal lobes marshal other brain regions into service, utilizing diverse capacities and orchestrating their integrated effort.” While he reasonably counsels that accumulated experience is indispensable for the proper cultivation of good judgment, he also redefines that experience as the collection of “brain maps” that “constitute a neural inventory of an individual life.” Even his discussion of the influence of our ancestry is reduced to the “genetic inheritance” that “has congealed in the form of inherited brain circuits or strong propensities for their formation.” Political wisdom, once the rarified reserve of a small few, is now available to anyone who can properly “educate” the amygdala and engender the appropriate set of habits and skills understood as the “behavioral expressions of neural remappings.” This is a far cry from Aristotle’s prescriptions (or just about anyone else’s) regarding moral education.
In fact, the tendency of neuroscience to unrestrained reductionism eventually requires it to enlist postmodern narrative as its unlikely partner in philosophical presumptuousness; its mechanistic descriptions of socio-political behavior are so transparently unsatisfactory that it eventually requires some sort of explanatory bridge that reconnects it to the intuitive experiences we have of ourselves. If our dispositions and actions are largely the sum result of neural hardwiring and synaptic fireworks, the experience we have of ourselves as actual selves, or conscious and free-willing agents capable of moral deliberation and responsibility, turns out to be chimerical. From the perspective of neuroscience, this inner experience of personhood can be debunked as the “user illusion,” or the salutary myth that we are the transcendent authors of our own lives, or as Thiele puts it, that there is an “enduring teller behind the neurological tale.” According to Gazzaniga, the self is merely a narrative fabrication, a gossamer construction that “liberates us from the sense of being tied to the demands of the environment and produces the wonderful sensation that our self is in charge of our destiny.”
Ultimately, the ambition of neuroscience to provide a synoptic anthropology creates an intractable dualism between the stories we unconsciously weave about ourselves and the reality that our life is a fiction written by our brains in which we merely appear as protagonists. Apparently, Nietzsche was right when he remarked that science itself is always going to be long on the “how” but thrifty with the “why” — the nihilistic truth of our neurophysiologic slumber can be tolerated as long as we sing to ourselves the right therapeutic lullaby. For all their pretensions to a kind of hard-nosed realism, the advocates of a complete neuroscientific account of human life refuse to acknowledge the preponderance of evidence right before their very eyes that no neural Darwinism could ever capture the full mystery of human consciousness. Further, to blithely begin with such an assumption in one’s pocket is already an unpardonable departure from a defensible realism that takes seriously the world as we experience it. The shockingly narrow purview of such an approach suggests that neuroscience itself is two parts “reflective mythologizing,” to borrow another phrase from Thiele, and one part science.
Part of what makes the current crop of neuroscientific propaganda so disappointing is the attention such hyperbole draws away from the genuinely impressive advancements it has made medically and technologically. Unfortunately, like so much of science today when it ventures beyond the laboratory and engages the great philosophical debates, neuroscience celebrates the ultimate transparency of man at the price of his considerable diminution. It now seems that ignorance of this trade-off can no longer be presented as an excuse, given that the bearers of unbridled optimism in the edification neuroscience promises have turned to postmodern poetry to satisfy those who have the appetite for more recognizably human sustenance. And if science relies upon poetry to account for those phenomena that are most fundamental to human life, does this mean that poetry is finally more empirical than science? Has science deceived itself about which discipline precisely is the great manufacturer of myth? At least for now, one can suggest that it would benefit neuroscience to understand that the scarecrow’s yearning to become human, even before he had possession of a brain, was some evidence that he had a soul.