The Not-So-Hidden Cause Behind the A.D.H.D. Epidemic
New York Times Magazine
By MAGGIE KOERTH-BAKER
October 15, 2013
Between the fall of 2011 and the spring of 2012, people across the United States suddenly found themselves unable to get their hands on A.D.H.D. medication. Low-dose generics were in particularly short supply. Several factors contributed to the shortage, but the main one was that demand had suddenly outpaced supply.
The number of diagnoses of Attention Deficit Hyperactivity Disorder has ballooned over the past few decades. Before the early 1990s, fewer than 5 percent of school-age kids were thought to have A.D.H.D. Earlier this year, data from the Centers for Disease Control and Prevention showed that 11 percent of children ages 4 to 17 had at some point received the diagnosis — and that doesn’t even include first-time diagnoses in adults. (Full disclosure: I’m one of them.)
That amounts to millions of extra people receiving regular doses of stimulant drugs to keep neurological symptoms in check. For a lot of us, the diagnosis and subsequent treatments — both behavioral and pharmaceutical — have proved helpful. But still: Where did we all come from? Were that many Americans always pathologically hyperactive and unable to focus, and are only now getting the treatment they need?
Probably not. Of the 6.4 million kids who have been given diagnoses of A.D.H.D., a large percentage are unlikely to have any kind of physiological difference that would make them more distractible than the average non-A.D.H.D. kid. It’s also doubtful that biological or environmental changes are making physiological differences more prevalent. Instead, the rapid increase in people with A.D.H.D. probably has more to do with sociological factors — changes in the way we school our children, in the way we interact with doctors and in what we expect from our kids.
Which is not to say that A.D.H.D. is a made-up disorder. In fact, there’s compelling evidence that it has a strong genetic basis. Scientists often study twins to examine whether certain behaviors and traits are inborn. They do this by comparing identical twins (who share almost 100 percent of the same genes) with fraternal twins (who share about half their genes). If a disorder has a genetic basis, then identical twins will be more likely to share it than fraternal twins are. In 2010, researchers at Michigan State University analyzed 22 different studies of twins and found that the traits of hyperactivity and inattentiveness were highly heritable. Numerous brain-imaging studies have also shown distinct differences between the brains of people given diagnoses of A.D.H.D. and those of people without the diagnosis — including evidence that some people with A.D.H.D. may have fewer receptors in certain brain regions for the chemical messenger dopamine, which would impair the brain’s ability to function in top form.
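(A rough sense of the arithmetic behind such twin comparisons: one standard tool, Falconer’s formula, estimates heritability as twice the gap between the two kinds of twins, h² = 2 × (r_identical − r_fraternal). If identical twins’ scores on an inattentiveness measure correlated at 0.8 and fraternal twins’ at 0.4, the estimated heritability would be 2 × (0.8 − 0.4) = 0.8, meaning genes would account for roughly 80 percent of the variation in that trait. That formula isn’t cited in the studies above, and those correlation figures are purely hypothetical.)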
None of that research yet translates into an objective diagnostic approach, however. Before I received my diagnosis, I spent multiple sessions with a psychiatrist who interviewed me and my husband, took a health history from my doctor and administered several intelligence tests. That’s not the norm, though, and not only because I was given my diagnosis as an adult. Most children are given the diagnosis on the basis of a short visit with their pediatrician. In fact, the process can be as cursory as prescribing Ritalin to a child and telling the parents to see whether it improves the child’s performance at school.
This lack of rigor leaves room for plenty of diagnoses that are based on something other than biology. Case in point: The beginning of A.D.H.D. as an “epidemic” corresponds with a couple of important policy changes that incentivized diagnosis. The incorporation of A.D.H.D. under the Individuals With Disabilities Education Act in 1991 — and a subsequent overhaul of Food and Drug Administration rules in 1997 that made it easier for drug companies to market directly to the public — were hugely influential, according to Adam Rafalovich, a sociologist at Pacific University in Oregon. For the first time, the diagnosis came with an upside — access to tutors, for instance, and time allowances on standardized tests. By the late 1990s, as more parents and teachers became aware that A.D.H.D. existed, and that there were drugs to treat it, the diagnosis became increasingly normalized, until it was viewed by many as just another part of the experience of childhood.
Stephen Hinshaw, a professor of psychology at the University of California, Berkeley, has found another telling correlation. Hinshaw was struck by the disorder’s uneven geographical distribution. In 2007, 15.6 percent of kids between the ages of 4 and 17 in North Carolina had at some point received an A.D.H.D. diagnosis. In California, that number was 6.2 percent. The disparity between the two states reflects broader differences in diagnosis rates between the South and the West. Even after Hinshaw’s team accounted for differences like race and income, kids in North Carolina were still nearly twice as likely to be given diagnoses of A.D.H.D. as those in California.
Hinshaw, as well as sociologists like Rafalovich and Peter Conrad of Brandeis University, argues that such numbers are evidence of sociological influences on the rise in A.D.H.D. diagnoses. In trying to narrow down what those influences might be, Hinshaw evaluated differences between diagnostic tools, types of health insurance, cultural values and public perceptions of mental illness. Nothing seemed to explain the difference — until he looked at educational policies.
The No Child Left Behind Act, signed into law by President George W. Bush in 2002, was the first federal effort to link school financing to standardized-test performance. But various states had been slowly rolling out similar policies over the previous three decades. North Carolina was one of the first to adopt such a program; California was one of the last. When Hinshaw compared the rollout of these school policies with the incidence of A.D.H.D. diagnoses, he found that when a state passed laws punishing or rewarding schools for their standardized-test scores, A.D.H.D. diagnoses in that state would increase not long afterward. The correlations held on a regional scale as well.
Nationwide, the rates of A.D.H.D. diagnosis increased by 22 percent in the first four years after No Child Left Behind was implemented.
To be clear: Those are correlations, not causal links. But A.D.H.D., education policies, disability protections and advertising freedoms all appear to wink suggestively at one another. From parents’ and teachers’ perspectives, the diagnosis is considered a success if the medication improves kids’ ability to perform on tests and calms them down enough so that they’re not a distraction to others. (In some school districts, an A.D.H.D. diagnosis also results in that child’s test score being removed from the school’s official average.) Writ large, Hinshaw says, these incentives conspire to boost the diagnosis of the disorder, regardless of its biological prevalence.
Rates of A.D.H.D. diagnosis also vary widely from country to country. In 2003, when nearly 8 percent of American kids had been given a diagnosis of A.D.H.D., only about 2 percent of children in Britain had. According to the British National Health Service, estimates of the share of British kids affected by A.D.H.D. now run as high as 5 percent. Why would Britain have such a comparatively low incidence of the disorder? And why is that incidence now on the rise?
Conrad says both questions are linked to the different ways our societies define disorders. In the United States, we base those definitions on the Diagnostic and Statistical Manual of Mental Disorders (D.S.M.), while Europeans have historically used the International Classification of Diseases (I.C.D.). “The I.C.D. has much stricter guidelines for diagnosis,” Conrad says. “But, for a variety of reasons, the D.S.M. has become more widely used in more places.” Conrad, who is currently researching the spread of A.D.H.D. diagnoses across countries, believes that America is essentially exporting the D.S.M. definition and the medicalized response to it. A result, he says, is that “now we see higher and higher prevalence rates outside the United States.”
According to Joel Nigg, professor of psychiatry at Oregon Health and Science University, this is part of a broader trend in America: the medicalization of traits that previous generations might have dealt with in other ways. Schools used to punish kids who wouldn’t sit still. Today we tend to see those kids as needing therapy and medicine. When people don’t fit in, we react by giving their behavior a label, either medicalizing it, criminalizing it or moralizing it, Nigg says.
For some kids, getting medicine might be a better outcome than being labeled a troublemaker. But of course there are also downsides, especially when there are so many incentives encouraging overdiagnosis. Medicalization can hurt people just as much as moralizing can. Not so long ago, homosexuality was officially considered a mental illness. And in the 19th century, in a remarkable bit of societal blindness, the diagnosis of drapetomania was used to pathologize enslaved people’s desire to escape to freedom.
Today many sociologists and neuroscientists believe that regardless of A.D.H.D.’s biological basis, the explosion in rates of diagnosis is caused by sociological factors — especially ones related to education and the changing expectations we have for kids. During the same 30 years when A.D.H.D. diagnoses increased, American childhood drastically changed. Even at the grade-school level, kids now have more homework, less recess and a lot less unstructured free time to relax and play.
It’s easy to look at that situation and speculate how “A.D.H.D.” might have become a convenient societal catchall for what happens when kids are expected to be miniature adults. High-stakes standardized testing, intensifying competition for slots at top colleges, an economy less and less accommodating to those who don’t attend college and can no longer count on blue-collar jobs — all of these pressures are expressed through policy changes and cultural expectations, but they may also manifest themselves in a more troubling way: in the rising number of kids whose behavior has been pathologized.