The Coming War over Intelligence

When I was a child—aged seven or eight—I was diagnosed with dyslexia, something known in the trade as a “specific learning disorder.” My problems were identified in the usual way for dyslexics—I was good at maths but couldn’t seem to learn to read. And, as is obvious from my appearance in Law & Liberty and my successful legal and literary careers, they were easily fixed. My parents hired a tutor who taught reading using phonics rather than the then-fashionable “look-say” method, and I moved from the bottom to the top of the class with fair rapidity.

One of the side-effects of a dyslexia diagnosis, at least in the 70s and 80s, was regular IQ testing. Once or twice a year I’d traipse up to the administration block to be asked a series of questions by people I later learned were educational psychologists and, occasionally, psychiatrists. The first few tests were administered orally and involved looking at pictures. Later, they progressed to the more familiar pencil-and-paper sort. By the end of primary school—when I was 11 or so—they were inevitably followed by anxious conferences between the principal, the testing psychologist, my classroom teacher, and my parents. I did wonder what was going on, but I was bribed to sit still and wait with Freddo Frogs and only afterwards learned the source of everyone’s disquiet.

My IQ had stabilised at 148, which was (and is) considered freakishly high. The last test, the WAIS-III (taken before I went up to Oxford), produced the same figure. I still have the report sitting around the house somewhere. I say this not to boast, because I have no problem admitting that I inherited excessive cleverness in the same way other people inherit a stock portfolio or a country estate: from my mum and dad.

Of course, various unearned advantages of social class went with the IQ. My parents could afford a phonics tutor, for example. They impressed on me that, because I had been given so much, my country was within its rights to make significant demands on me. “Otherwise,” in Mum’s pithy formulation, “it’s like landing on ‘Free Parking’ in Monopoly.” My father sat me down and said this explicitly, something he also did with my three siblings. I don’t know their IQs—none of them are dyslexic, so I suspect they were never tested—but they all enjoy lucrative professional careers. But Dad was particularly concerned about me. “I don’t want my child falling off the nerd cliff,” he said in his distinctive Aberdeenshire accent. “And I don’t want her thinking cleverness buys her the right to tell other people what to do.”

What my parents were describing was, I suppose, the idea of “intellect plus character,” and the purpose of the throat-clearing introduction above is to foreground the book I think makes the best case for it: Richard Herrnstein and Charles Murray’s The Bell Curve.

I did not plan to write about a book my partner and I have come—over the last month—to call “the bad book” or “the naughty book,” as though it were a bodice-ripper to be wrapped in brown packing paper before one can safely read it on the tube. The Bell Curve came to my attention because it forms the basis of one section in another book I reviewed for the wonkish British magazine CapX: British commentator David Goodhart’s Head Hand Heart: Why Intelligence Is Over-Rewarded, Manual Workers Matter, and Caregivers Deserve More Respect.

Goodhart contends that much of the developed world requires a major change in the way we measure and reward social status. Part of this involves stripping cognitive elites of both wealth and power. “We have reached Peak Head,” Goodhart argues. “All too often, cognitive ability and meritocratic achievement is confused with moral worth.” He is upfront about the fact that no great ethical tradition going back to antiquity considers high intellect a per se good.

I expected Goodhart to disagree with the arguments set out in The Bell Curve, to make claims for long-debunked ideas like “multiple intelligences” or “emotional intelligence,” but he doesn’t. He accepts the core of the earlier book. What he does do is demand a change of educational emphasis. Like my parents (and, as I discovered, like Herrnstein and Murray), he argues that because so much of an individual’s IQ amounts to unearned merit, the intellectually gifted “owe one” to everyone else. We should not be in the business of rewarding people materially or socially simply because they are clever. That—to pinch one of Adam Smith’s insights—is like holding people in high esteem simply because they’re rich.

The Bell Curve has become irreducibly associated with average IQ differences between racial groups (more accurately, populations), and the extent to which intellectual differences are heritable for everyone, regardless of race. This is unfortunate for two reasons. First, ordinary members of the public often think it’s been debunked (it hasn’t). Second, people who are clever and who find cognitive activities remunerative often assume they’re automatically “worth it,” deserving of wealth and accolades simply because of their intelligence (they aren’t). It’s as though one could hop in the nearest TARDIS, go back in time, and choose one’s parents: a lot of smart people truly believe they did it all by themselves.

The latter phenomenon has become pervasive on the political left, and fuels contemporary policies aimed at producing “equity” (equality of outcomes) rather than equality of opportunity. Many otherwise bright people focus on systemic disadvantage to such an extent that they’re blind to their personal, inherited advantages, as well as to the benefits they enjoy from the cognitive class stratification both Head Hand Heart and The Bell Curve identify. I do sometimes wonder if their commitment to equality of outcomes is also born of the realisation that genuine equality of opportunity means any remaining variation in intellectual attainment can only be explained by genetic variation and heritability. Remove or attenuate poverty and ensure all children have a good diet (the latter is very important), and many of the environmental differences between people that bear on IQ disappear. This process does not, however, produce equality of outcomes, and it’s naïve to think it would.

A number of unusually stable and prosperous countries—Norway and Australia come to mind—have come quite close to achieving equality of opportunity for the great bulk of their populations. And if you got a representative sample of Australians and Norwegians to sit an IQ test, you would get a bell curve with a spread akin to the one found in more unequal countries like the US or UK. This holds even though Australia has probably shifted its curve to the right thanks to a system of immigration that favours the educated and middle class (both are proxies for IQ, although IQ is more predictive of outcomes than either educational attainment or social class). Even with equality of opportunity and points-based immigration, it’s not possible to turn entire countries into Lake Wobegon, where all the children are above average.

It always struck me as odd that people accepted without qualm obvious differences in sporting ability while noting the importance of qualities like discipline (for training) or character (for pushing through the pain barrier). Regular folk understood that no amount of effort was going to turn them into Usain Bolt or Serena Williams, all the while acknowledging that if Usain and Serena sat on the sofa all day eating takeaway pizza, neither would be a champion athlete. These days, however, even sport is under assault, and in much the same way as IQ came to be in 1994, when The Bell Curve was published. Think, for example, of the claim that women can compete—particularly in events requiring speed and power—with biological males.

A number of recent books and a great deal of commentary blame weird academic fashions and shoddy scholarship—both products of a higher education sector that’s grown like kudzu in the last 40 years—for absurd claims like, say, differences in educational and sporting achievement being solely the consequence of racism or sexism.

This argument is true as far as it goes—the universities are loaded to the gunwales with pseudoscientific nonsense—but it isn’t the whole story. Governments in developed countries all over the world have spent trillions improving equality of opportunity, often naïvely assuming it would produce “equity” or something close to it. To my mind, the academic pseudoscience one sees all around us is as much a product of bitter disappointment at the failure to achieve a greatly desired policy goal as it is a cause in its own right. It’s the intellectual equivalent of hiding under the bedcovers, sticking fingers in one’s ears, and shouting “lalalalala.”

In addition, The Bell Curve reminded me that the failure to generate equality of outcomes on the back of equality of opportunity hasn’t just damaged the political left: it’s also knocked some right-leaning traditions into a cocked hat. It turns out discipline and personal responsibility aren’t enough, which is a hard thing for conservatives to hear. Deontological libertarianism, meanwhile (never popular outside the US, to be fair), also struggles in the face of the reality of human inequality.

“Many contemporary libertarians who draw their inspiration from Locke,” Herrnstein and Murray note, “are hostile to the possibility of genetic differences in intelligence because of their conviction that equal rights only apply if in fact people at birth are tabulae rasae.” It doesn’t matter that this isn’t quite what Locke said (although he was talking out of his alternative orifice when it comes to tabula rasa). The point is that entire intellectual traditions on both sides of the aisle have evolved over centuries on the basis that certain things are true, when they’re not. For me, this helped explain why—while some libertarians have dived into QAnon conspiracies—others have become worryingly woke.

This has all been brought to a head by the realisation that there are scientists out there (albeit not in liberal democracies) who are undoubtedly figuring out how to manipulate human genetics in order to make people smarter or faster or able to see in the dark. On that point, one could do worse than read Stuart Ritchie’s Intelligence: All That Matters. Published in 2015—21 years after The Bell Curve—it benefits from the simple fact that science marches on.

Among other things, it’s frank about the extent to which many of the most able dislike even the idea of IQ. “Mention it in polite company,” Ritchie notes, “and you’ll be informed (sometimes rather sternly) that IQ tests don’t measure anything real, and reflect only how good you are at doing IQ tests.” This, I suspect, is a legacy of The Bell Curve and its reception, especially given that Herrnstein died shortly before the book was published. Murray had to bear public opprobrium alone.

Not only is Ritchie’s book small enough to hide in the palm of your hand (as opposed to the wrist-spraining 600-plus pages of The Bell Curve, printed on Bible paper), his section on genetics is certain where Herrnstein and Murray are tentative. And where Ritchie is tentative, he is alarming. Scientists have known for decades that genes contribute to differences in intelligence. The Bell Curve discusses this issue in detail, and Ritchie adds only a little to the earlier book. However, progress is now happening in a related but scientifically distinct area: molecular genetics, which is concerned with identifying which combinations of genes cause intelligence differences. Identifying those combinations is what would make selecting for intelligence practicable, and therein lies the alarm. As Richard Dawkins once commented, the problem with eugenics isn’t that it doesn’t work, but that it does.

I used to be one of those people opposed to researching the genetic basis of human inequality, whether it concerned intelligence or sporting ability. Like Herrnstein, Murray, and Ritchie, I was well aware of its dreadful history: as much as the Holocaust, Nazi Germany’s eugenics program is what has made Adolf Hitler a sort of modern folk-devil for the non-religious. However, I’ve changed my mind, and not just because communist regimes—with their blank-slate idealism and concomitant failed attempts at social engineering—killed more people than Hitler did. I’ve changed my mind because, if this isn’t tackled head on—with honesty and rigour and humanity—the authoritarian states will get there first, and they have far fewer scruples. “Given the rapid advance of GWAS [Genome-Wide Association Study],” Ritchie observes, “we need a measured, informed debate over the ethics and legality of prenatal selection for intelligence, and we need it soon.”

We have already seen what China can do in terms of social order and pandemic control with artificial intelligence and its “social credit” system. Part of me suspects that country’s regime is using GATTACA as an instruction manual rather than a warning. I wrote two novels about what such a society would look like (also warnings, not instruction manuals). This reality is closer now, no longer confined to science fiction.

Careful with that test-tube, Eugene.