You, Me, and The Bell Curve — Part Four: Eschatological Hopes and Obdurate Realities

Editor’s note: this is the final installment in my four-part series on the Bell Curve controversy. Here are links to the preceding posts: Part One; Part Two; Part Three

______________________________________________________

At this point we appear to have left gruff old Naureckas and comrades to their Manichean fantasyland where racist pseudo-scientists, abetted by those unwitting shills in the mainstream media, churn out reams of dubious research in furtherance of their nefarious social agenda.  It’s been good nostalgic fun, but a lot has happened since 1995.

Time to survey the damage.

Remember Arthur Jensen? It was his seminal article, “How Much Can We Boost IQ and Scholastic Achievement?” that ignited the first wave of public furor over these matters way back in 1969, right around the time when Charles Manson was Helter Skeltering the Decade of Love down the shitter. Because his analysis broke with academic decorum in considering the possibility that genes contribute to black-white differences in educational achievement, Jensen was pilloried by public intellectuals and campus radicals. His article became a lightning rod, attracting public denunciations far worse than the fulmination later visited upon The Bell Curve.

But rather than abscond into less distasteful academic territory, the Berkeley psychologist weathered the calumnies and death threats and quietly pressed on with his meticulous research. His name remains a byword for controversy in public discourse, but Arthur Jensen now ranks among the most prolific – and respected – scholars in the behavioral sciences. Just type his name into Google Scholar and follow the cites. You’ll see.

In 1998, just as the Bell Curve backlash was beginning to ebb, Jensen published his magnum opus, The g Factor. Although mainstream critics took little notice of the technically dense, graph-laden treatise, scholars knew it was a bombshell. In addition to providing a comprehensive account of the accumulated evidence for the resilience of racial differences in cognitive ability, Jensen’s book brought to bear the weight of a century of research establishing that intelligence tests really do signal the empirical reality of a “general factor” that transcends the statistical bounds of standardized psychometric models.

On this pivotal point, Jensen’s précis for the American Psychological Association is worth quoting at length:

The g factor arises from the empirical fact that scores on a large variety of independently designed tests of extremely diverse cognitive abilities all turn out to be positively correlated with one another. The g factor appears to be a biological property of the brain, highly correlated with measures of information-processing efficiency, such as working memory capacity, choice and discrimination reaction times, and perceptual speed. It is highly heritable and has many biological correlates, including brain size, evoked potentials, nerve conduction velocity, and cerebral glucose metabolic rate during cognitive activity. It remains to investigate and explain its neurobiological basis.

Throughout this article, I have played fast and loose with “IQ” and “intelligence.” I offer no apology, as such terms have the merit of providing a fairly reliable shorthand. Nevertheless, standardized IQ scores must ultimately be understood as but a useful proxy, and “intelligence” as merely a powerful heuristic cue. Jensen’s g is the genuine article.  The mathematical models may be daunting, but there is converging agreement among scholars that g makes intelligence more intelligible.

Once cognitive measures are understood in terms of their relation to a general factor, it becomes possible to escape the limits of statistical constructs and get closer to the man behind the curtain. Tests with higher “g loadings” are better at predicting academic and vocational performance. Tests of g also turn out to reveal the strongest and most resilient racial differences. g provides an empirical means of testing fashionable theories of “multiple intelligences,” “practical intelligence,” and “emotional intelligence.” And only by examining the independent weight of g have psychometricians been able to establish that men and women really are endowed with roughly equal – if differently flavored – average brainpower. There’s still some controversy, but Jensen’s research in particular suggests that rather than being an artifact of standardization, sexual equality, to paraphrase Gould, really is a "contingent fact of history." Which is good news, to be sure. Finally, while there continues to be fascinating debate as to the nature of the mysterious and much publicized "secular increase" in IQ scores over the past century (usually referred to as the "Flynn effect"), the observed gains appear to diminish once the general factor is brought into the picture.
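
For the statistically curious, here is a minimal sketch in Python (my own toy example, not Jensen’s actual method) of how a general factor shows up in practice: simulate a battery of tests that all tap one common ability, note that every pairwise correlation comes out positive, and read off each test’s “g loading” from the first principal component of the correlation matrix. Real psychometric work uses more careful factor-analytic models; this is only meant to make the idea concrete.

    # Toy illustration of a "general factor" emerging from positively
    # correlated tests. All data here are synthetic.
    import numpy as np

    rng = np.random.default_rng(0)
    n_people, n_tests = 1000, 6

    g = rng.normal(size=n_people)                 # latent common ability
    noise = rng.normal(size=(n_people, n_tests))  # test-specific noise
    weights = np.linspace(0.5, 0.9, n_tests)      # how strongly each test taps g
    scores = g[:, None] * weights + noise         # observed test battery

    corr = np.corrcoef(scores, rowvar=False)      # the "positive manifold"
    eigvals, eigvecs = np.linalg.eigh(corr)       # eigenvalues in ascending order
    first = eigvecs[:, -1]                        # first principal component ~ g
    loadings = np.sign(first.sum()) * first * np.sqrt(eigvals[-1])

    print(np.round(corr, 2))      # every off-diagonal entry positive
    print(np.round(loadings, 2))  # each test's "g loading"
    print(f"variance carried by the first factor: {eigvals[-1] / n_tests:.0%}")

On data like these, the loadings come out uniformly positive, which is the statistical signature of the “positive manifold” Jensen describes above.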

With the cultural clutter and statistical artifice pushed away, g is what remains. And g is what explains. 

Whether out of nescience or negligence, some scholars from competing disciplines continue to misapprehend the importance of Jensen’s work. I think this will change as the biological correlates keep on filing in. Just as lithium did more to alleviate the suffering of the psychotically afflicted than all of Freud’s horses, pharmaceutical or even genetic therapies may one day offer a practical means of assisting the cognitively disadvantaged. I’m not banking on a panacea. The point is that only by understanding the biogenetic infrastructure of intelligence can we hope to prod at the possibilities. If hope is to spring, we will be indebted to the dispassionate investigations pioneered by men like Arthur Jensen. And we will be indebted, as ever, to science. The sociological gymnastics and intellectually couched denials have gotten us nowhere.

Could be there was just too much psychic baggage attached to his name, or perhaps his arguments were too technically abstruse for public consumption. But for whatever reason, Jensen’s groundbreaking book never really registered on the public radar. “The recent efflorescence of the sciences of mind, brain, genes, and evolution” would find a less divisive messenger in Harvard’s shaggily coiffed, PR-savvy neuropsychologist cum polymath, Steven Pinker.

With envy-inducing erudition and surefooted authority, Pinker’s magisterial synthesis, The Blank Slate, brought the emerging Darwinian zeitgeist – and our collective apprehension – into sober perspective. With reams of evidence in tow, Pinker’s book argued that much of the received wisdom about human nature is founded upon ideological premises that are demonstrably false. The titular “blank slate” refers to the belief that the human mind is shaped primarily by experience and nurture – a belief that is demolished by behavior genetics alone.

Although The Blank Slate carefully sidestepped explicit discussion of the racial implications of The Bell Curve, Pinker did stick up for the science underlying Murray and Herrnstein’s central arguments. Pinker points out that rather than providing fodder for conservative notions of class and meritocracy, Murray and Herrnstein’s claims might just as easily serve to revitalize Rawlsian theories of social justice. “If social justice consists of seeing to the well-being of the worst off,” Pinker notes, “then recognizing genetic differences calls for an active redistribution of wealth.”

Interesting as it may be, Pinker’s speculation about the prospect of “Bell Curve liberals” is of tangential relevance. The point I want to emphasize is that despite his tacit endorsement of hereditarian ideas in general and Murray and Herrnstein in particular, Pinker’s book was widely received as an important, even seminal, contribution to the popular understanding of human nature. Rather than being stigmatized and marginalized according to the approved script, Pinker was praised for his courage and erudition.

The times, they were a-changing.

The lines had to be redrawn because science wasn’t waiting around. When they cracked the human genome years ahead of schedule on the cusp of the over-hyped millennium, the know-nothing rhetoric was already wearing thin. Blank slate parlor chat might have gone down in simpler times, but shifting paradigms have a way of reshuffling our sensibilities. To paraphrase biologist Gregory Cochran, believing in the blank slate in the post-genomic era is like believing in a flat earth after seeing pictures from outer space. 

The view remains foggy from down here among the algebraically-challenged rabble, but a few points of show-stopping significance scream out. First, the genome turned out to be much smaller than expected. While scientists had long estimated our genetic architecture to consist of around 100,000 genes, the initial ego-deflating HGP verdict put the figure at no more than 30,000, and the number has since been lowered, making our genetic constitution only marginally more complex than that of a fucking flatworm. What’s worse, as science correspondent Nicholas Wade noted in the New York Times, a substantial number of our genes may have been “acquired directly from bacteria.”

And as if that weren’t enough to raise the usual anthropocentric hackles, no sooner had the human genome been sequenced and published than researchers racing to identify disease-causing genes began running into the problem of race. Er, I mean “population structure” – or is it “self-reported population ancestry”? Whatever. The point is that however politely it might be euphemized, the “specter of difference” loomed over the hopes of science. In sniffing out the genetic roots of disease, researchers found that sickle cell anemia and lactose intolerance refused to evolve in deference to our ideological sensibilities, suggesting that ethnic variables might be ignored only at the expense of biomedical progress.

A spate of scientific articles began sniffing around the edges of the racial-genetic architecture. Then, in late 2002, the journal Science published a major study that looked at DNA markers from 52 human groups throughout the world. As summarized in one popular account, the researchers reported that “people belong to five principal groups corresponding to the major geographical regions of the world: Africa, Europe, Asia, Melanesia and the Americas.”

This may have spelled bad news for the “race is a social construct” crowd, but it was good news for people suffering from “racist” diseases. “Self-reported population ancestry,” the Science article concluded, “likely provides a suitable proxy for genetic ancestry.”

Similar concerns provided the impetus for the development of a companion genetic investigation known as the Human Haplotype Map, which was completed in October of 2005. The “HapMap” was based on the observation that much of the human genome is passed on in "unshuffled" blocks known as haplotypes. By analyzing how such blocks vary across different ethnic groups, researchers have been able to accelerate the search for the genetic variants underlying disease.
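
To make the “unshuffled blocks” idea concrete, here is a toy sketch (my own illustration, not the consortium’s analysis, using made-up data): SNPs that ride together on the same haplotype are nearly perfectly correlated, so genotyping a single “tag” SNP per block recovers most of the information in that block.

    # Toy illustration of haplotype blocks and tag SNPs, with synthetic data.
    import numpy as np

    rng = np.random.default_rng(1)
    n_people = 500

    # Two hypothetical blocks: within a block, every SNP is inherited together.
    block_a = rng.integers(0, 2, size=n_people)   # 0/1 allele for block A
    block_b = rng.integers(0, 2, size=n_people)   # independent block B
    snps = np.column_stack([block_a, block_a, block_a,   # three SNPs in block A
                            block_b, block_b])           # two SNPs in block B

    r2 = np.corrcoef(snps, rowvar=False) ** 2     # pairwise linkage (r-squared)
    print(np.round(r2, 2))
    # Entries near 1 mark SNPs in the same block; genotyping one tag SNP per
    # block captures nearly all the information, which is what lets HapMap-style
    # data speed up the hunt for disease-associated variants.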

But the HapMap also provides a unique opportunity to chart the dynamics of recent human evolution, which will inevitably raise the questions at the core of our discomfiture. Aware of such potential, the HapMap consortium framed their research with an oddly cautionary note: “We urge conservatism and restraint in the public dissemination and interpretation of such studies,” they advised, “especially if nonmedical phenotypes are explored.”

Nonmedical phenotypes? What do you suppose they mean by that?

With the genome and HapMap providing the necessary backdrop, the shibboleth-shattering discoveries have continued apace. A few recent snapshots will suffice:

  • The FDA approves BiDil, a drug shown to be dramatically effective in the treatment of heart failure for “self-identified black” patients while being of little benefit to other races. Meanwhile, scientists in Iceland detect genes that more than double the risk of heart attack for African-Americans. The working theory is that the same gene in white and Asian carriers is less of a risk factor because other genes have had time to evolve to keep it in check. Clinical trials are underway to measure the efficacy of a new drug designed to inhibit the gene’s deleterious effects for blacks.

  • The Journal of Biosocial Science publishes “The Natural History of Ashkenazi Intelligence,” a cross-disciplinary examination of the theory that markedly high average IQ among Jews of Ashkenazi descent resulted from recent natural selection favoring cognitive ability for European Jews who gravitated to mentally demanding occupations after being excluded from traditional agrarian trades. The researchers argue that the disproportionately high incidence of a cluster of genetic diseases like Tay-Sachs within the Jewish population may be understood as having resulted from a fitness trade-off for the adaptive enhancements conferred by high intelligence.

  • In December of 2005, researchers at the University of California unveil a detailed analysis of Human Haplotype data showing that more than 1,800 genes have been under strong selective pressure over the past 50,000 years. The paper, “Global landscape of recent inferred Darwinian selection for Homo sapiens,” concludes by stating that “such recently selected alleles may provide useful ‘markers’ for investigating the evolutionary migrations of our species, as an adjunct to studies using neutral markers. We also propose that many of these alleles, because of their high prevalence and recent selection, should be considered likely ‘functional candidates’ for association with human variability and the common disorders afflicting humankind.”

What’s the common theme here? Anyone?

It’s evolution, baby. And not some safely warmed-over textbook version where the action plodded along at a glacial pace over millions of years before conveniently stagnating into innocuous drift with the emergence of human culture. No, we’re talking real-time Darwinian razzmatazz – running right through the biosocial matrix of modern times, with brains and genes and germs adapting under intense selective pressure in crazy dynamic confluence with ever-fluctuating cultural and ecological forces. This is the universal acid nipping at the hems of our epoch-centric identity complex.

And you don’t need a calculator to do the math. Just keep in mind that our neural machinery is still evolving. Add to that the fact that these big partially-inbred families we call races have been doing their thing in relative isolation for the better part of eons. Then drop by the dog park and reflect on that Border Collie – the one sniffing that Great Dane’s ass.

Woof, woof. It’s no surprise the egalitarians are getting nervous.

For his part, Steven Pinker seems to have thrown caution to the wind, lending his endorsement to the Ashkenazi study, then throwing down the gauntlet in a recent Edge symposium, where he spells out the challenge in no uncertain terms:

Advances in genetics and genomics will soon provide the ability to test hypotheses about group differences rigorously. Perhaps geneticists will forbear performing these tests, but one shouldn’t count on it. The tests could very well emerge as by-products of research in biomedicine, genealogy, and deep history which no one wants to stop.

The human genomic revolution has spawned an enormous amount of commentary about the possible perils of cloning and human genetic enhancement. I suspect that these are red herrings. When people realize that cloning is just forgoing a genetically mixed child for a twin of one parent, and is not the resurrection of the soul or a source of replacement organs, no one will want to do it. Likewise, when they realize that most genes have costs as well as benefits (they may raise a child’s IQ but also predispose him to genetic disease), "designer babies" will lose whatever appeal they have. But the prospect of genetic tests of group differences in psychological traits is both more likely and more incendiary, and is one that the current intellectual community is ill-equipped to deal with.

Fasten your seatbelts, because the fun is just getting started.

In responding to the initial Bell Curve backlash — a voluminous literature to which poor Naureckas, now fading in our rearview mirror, contributed but a feisty footnote — surviving author Charles Murray made a bold prediction. In his 1995 essay “The Bell Curve and Its Critics,” he outlined a three-stage process. “In the first stage,” he wrote, “a critic approaches The Bell Curve absolutely certain that it is wrong. He feels no need to be judicious or to explore our evidence in good faith. He seizes upon the arguments that come to hand to make his point and publishes them, with the invective and dismissiveness that seem to be obligatory for a Bell Curve critic.”

But Murray foresaw unintended consequences:

In the second stage, the attack draws other scholars to look at the issue. Many of them share the critic’s initial assumption that The Bell Curve is wrong. But they nonetheless start to look at evidence they would not have looked at otherwise. They discover that the data are interesting. Some of them back off nervously, but others are curious. They look farther. And it turns out that there is much more out there than Herrnstein and I try to claim.

In stage three, these scholars start to produce new material on the topics that had come under attack in the first place. I doubt that many will choose to defend The Bell Curve, but they will build on its foundation and ultimately do far more damage to the critics’ "eschatological hope" than The Bell Curve itself did.

Q.E.D.?  Yes, I’m afraid so.  Are you looking forward to the third act?

Listen. 

More than almost anything, I wish I were smarter. I don’t know my IQ.  Don’t want to. But I’m not being humble or self-effacing when I tell you mine isn’t the brightest bulb in the marquee. I may put on a decent bluff, but the truth is a source of constant doubt and anxiety.  I’m always missing the point, botching the directions, getting the measurements wrong. I barely got through high school algebra, and loath as I am to admit it, I’m easily confused by a task as simple as counting change.  Crossword puzzles puzzle me.  And when things get technical, well, I tend to get sleepy.

No sir, I don’t expect I’ll be sticking around for Final Jeopardy, if you know what I mean.

I mention my own lingering unease only because I believe the misapprehension and discomfiture many of us experience on a personal level has a great deal to do with the penumbral agitations surrounding the IQ debate in the broader scheme of human affairs. I’ve always been acutely aware of my cognitive limitations, and for the longest time I was an easy mark for just about any palliative pop-psych fad that seemed suited to quell the self-doubt.  I remember experiencing a quick rush of excitement when I first read about "multiple intelligences" in some 101 text I’ve since forgotten. Made me feel all warm and fuzzy and hopeful.

Here is a tip: be very, very wary of "facts" that evoke feelings of warmth and fuzziness.

Do you believe Thomas Edison struggled with differential equations?  Did you hear the one about that Nobel Laureate who "flunked" an IQ test? Do you find that intellectuals lack common sense?

Do you believe oatmeal can cure rabies?

While they may serve to allay your troubled mind, the nostrums and myths and pop-palliatives will never obviate the unchanging realities that we are stuck with. Namely, that intelligence is real; that it is largely fixed; and that Mother Nature is a fascist cunt who doesn’t give a fuck what you or I think.

Despite populist and humanist cant to the contrary, all societies place a premium on brain power for the same reason I covet it — because it is a scarce commodity of immeasurable value.  Virtually every amenity, every modern luxury, and every trivial new-fangled device or gadget or software application we take for granted in our all-too-comfortable existence can be traced in origin to the ever-churning mechanics of superior human brainwork.

Next time you flip on your TV or queue up another porn download, take a moment to reflect on the caliber of gray matter capable of conceiving (not merely comprehending) the electro-mechanical complexities that sustain your hedonistic complacency.  Choose any prop of civilization — fuel combustion, pasteurization, vaccination, refrigeration, constitutional republicanism; whatever — and I defy you, lowly beneficiary that you are, not to marvel over the synaptic superiority to which its actuality is owed.  Think about the individuals who draw paradigm-shifting conclusions; who connect the cosmic dots and make possible the changes that change the world. The great architects and chemists and philosophers and neurosurgeons, and that small pantheon of true artists.

Think about them.

Then think about that retarded kid who sat in the back of class.

Sorry to disappoint you, Virginia, but there is no Santa Claus.  And all things being equal, people aren’t even close.  May as well get used to it.  And enjoy those parting gifts, because they’re all you’re gonna get.

The temptation is probably hard-wired, this all-too-human predisposition to turn the grand narrative inward; to believe we are the center of creation. The apple of God’s eye.  It provides a sense of hope, of purpose and essential meaning. For most people, the notion that God might not share in our collective conceit, or worse, that the very idea of God may add up to nothing more than a grandiose psychological projection adapted to quell our innate fear of death and take the edge off a hard day’s work, is about as easy to take as a sandpaper catheter.

Take it away, my horse-hugging hero:

Are we not perpetually falling? Backward, sideward, forward, in all directions? Is there any up or down left? Are we not straying as through an infinite nothing? Do we not feel the breath of empty space? Has it not become colder? Is it not more and more night coming on all the time? Must not lanterns be lit in the morning? Do we not hear anything yet of the noise of the gravediggers who are burying God? Do we not smell anything yet of God’s decomposition? Gods too decompose. God is dead. God remains dead.

Yet the disappointments — the dethronements, as John Derbyshire aptly puts it — keep filing in.  And every time the ethereal promise of transcendence is undermined by the cold, disinterested eye of science, people tend to get squeamish.  And, as ever, the bearers of unwelcome news bear the burden of our unease.  Galileo is persecuted for questioning our place in the universe.  And Darwin is abominated for questioning our place in nature. These are tough breaks for a species with a fragile ego; being evicted from the center of the cosmos and pronounced the kin of common beasts. Gotta blame someone.

Of course, most thinking people have resigned themselves to swallowing the bitter pill of heliocentrism, and Galileo has been excused from the Papal shitlist. And while there is continuing debate over the limits of Darwinian theory, evolution is no longer a subject of serious controversy.

Yet we still nurse and cherish our conceits.  And we continue to punish heretics.  The names may have changed, but the narrative is all-too-familiar. For centuries, the tension has been most acute where Judeo-Christian dogmas are confronted with the light of reason. Now it is the secular egalitarian dogmatist whose ox is being gored.  You will excuse me if I admit to being amused by the plot twist.

The idea of human biodiversity, of empirically grounded racial taxonomy, of innate — or intractable — inequality between individuals and groups, reverberates with the same ominous force that once weighed upon guardians of monotheistic dogma.  Liberal humanists are the new creationists, only they’re too smug to see the connection.

It’s only as bad as you want it to be, kids.  Once you shake off the pacifying buzz of those moralistic fallacies and noble lies, you may look up to find that the world hasn’t changed at all.  And by confronting the hobgoblin of human difference, you may find that the unspeakable can be spoken quite easily. Perhaps you will be liberated to think more clearly, and with the requisite imagination to make sense of the problems that remain.  Even the end of the world can be reduced to a manageable curiosity.

Nature has had her say, but there’s no reason she should get the final word. Got a cigarette?  I’ll pick up the tab for the next round.  We can work this out.