Tuesday, September 30, 2014

Most People With Addiction Simply Grow Out of It: Why Is This Widely Denied?



There are many paths to recovery—and if we want to help people get there, we need to explore all of them.
 
 
When I stopped shooting coke and heroin, I was 23. I had no life outside of my addiction. I was facing serious drug charges and I weighed 85 pounds, after months of injecting, often dozens of times a day.

But although I got treatment, I quit at around the age when, according to large epidemiological studies, most people who have diagnosable addiction problems do so—without treatment. The early to mid-20s is also the period when the prefrontal cortex—the part of the brain responsible for good judgment and self-restraint—finally reaches maturity.

According to the American Society of Addiction Medicine, addiction is “a primary, chronic disease of brain reward, motivation, memory and related circuitry.” However, that’s not what the epidemiology of the disorder suggests. By age 35, half of all people who qualified for active alcoholism or addiction diagnoses during their teens and 20s no longer do, according to a study of over 42,000 Americans in a sample designed to represent the adult population.

The average cocaine addiction lasts four years, the average marijuana addiction lasts six years, and the average alcohol addiction is resolved within 15 years. Heroin addictions tend to last as long as alcoholism, but prescription opioid problems, on average, last five years. In these large samples, which are drawn from the general population, only a quarter of people who recover have ever sought assistance in doing so (including via 12-step programs). This actually makes addictions the psychiatric disorder with the highest odds of recovery.

While some addictions clearly do take a chronic course, these data, which replicate earlier research, suggest that many do not. And this remains true even for people like me, who have used drugs in such high, frequent doses and in such a compulsive fashion that it is hard to argue that we “weren’t really addicted.” I don’t know many non-addicts who shoot up 40 times a day, get suspended from college for dealing and spend several months in a methadone program.

Moreover, if addiction were truly a progressive disease, the data should show that the odds of quitting get worse over time. In fact, they remain the same on an annual basis, which means that as people get older, a higher and higher percentage wind up in recovery. If your addiction really is “doing push-ups” while you sit in AA meetings, it should get harder, not easier, to quit over time. (This is not an argument in favor of relapsing; it simply means that your odds of recovery actually get better with age!)
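
To see why constant annual odds still produce rising recovery with age, here is a minimal sketch in Python; the 10% annual recovery rate is a purely hypothetical figure chosen for illustration, not a number reported in these studies:

    # Hypothetical constant annual recovery rate, chosen only to show
    # the compounding effect; the studies report constant odds, not
    # this particular figure.
    annual_recovery_rate = 0.10

    still_addicted = 1.0  # fraction of the original cohort still addicted
    for year in range(1, 16):
        still_addicted *= 1 - annual_recovery_rate
        print(f"Year {year:2d}: {1 - still_addicted:.0%} in recovery")

Even though the per-year chance never improves, about 65% of this hypothetical cohort is in recovery by year 10 and nearly 80% by year 15, which is exactly the pattern of recovery rising with age that the data show.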

So why do so many people still see addiction as hopeless? One reason is a phenomenon known as “the clinician’s error,” which could also be known as the “journalist’s error” because it is so frequently replicated in reporting on drugs. That is, journalists and rehabs tend to see the extremes: Given the expensive and often harsh nature of treatment, if you can quit on your own you probably will. And it will be hard for journalists or treatment providers to find you.

Similarly, if your only knowledge of alcohol came from working in an ER on Saturday nights, you might start thinking that prohibition is a good idea. All you would see would be overdoses, DTs, and the victims of alcohol-related car crashes, rapes, and assaults. You wouldn’t be aware of the patients whose alcohol use wasn’t causing problems. And so, although the overwhelming majority of alcohol users drink responsibly, your “clinical” picture of what the drug does would be distorted by the source of your sample of drinkers.

Treatment providers get a similarly skewed view of addicts: The people who keep coming back aren’t typical—they’re simply the ones who need the most help. Basing your concept of addiction only on people who chronically relapse creates an overly pessimistic picture.

This is one of many reasons why I prefer to see addiction as a learning or developmental disorder, rather than taking the classical disease view. If addiction really were a primary, chronic, progressive disease, natural recovery rates would not be so high and addiction wouldn’t have such a pronounced peak prevalence in young people.

But if addiction is seen as a disorder of development, its association with age makes a great deal more sense. The most common years for full onset of addiction are 19 and 20, coinciding with late adolescence, before cortical development is complete. In early adolescence, when the drug taking that leads to addiction by the 20s typically begins, the emotional systems involved in love and sex are coming online, before the cognitive systems that rein in risk-taking are fully active.

Taking drugs excessively at this time probably interferes with both biological and psychological development. The biological part is due to the impact of the drugs on the developing circuitry itself—but the psychological part is probably at least as important. If as a teen you don’t learn non-drug ways of soothing yourself through the inevitable ups and downs of relationships, you miss out on a critical period for doing so. Alternatively, if you do hone these skills in adolescence, even heavy use later may not be as hard to kick because you already know how to use other options for coping.

The data supports this idea: If you start drinking or taking drugs with peers before age 18, you have a 25% chance of becoming addicted, but if your use starts later, the odds drop to 4%. Very few people without a prior history of addiction get hooked later in life, even if they are exposed to drugs like opioid painkillers.

If we see addiction as a developmental disorder, all of this makes much more sense. Many kids “age out” of classical developmental disorders like attention deficit/hyperactivity disorder (ADHD) as their brains catch up to those of their peers or they develop workarounds for coping with their different wiring. One study, for example, which followed 367 children with ADHD into adulthood, found that 70% no longer had significant symptoms.

That didn’t mean, of course, that the significant minority who still had symptoms didn’t need help, or that ADHD isn’t “real.” Like addiction (and actually strongly linked with risk for it), ADHD is a difference in brain wiring, and adolescence is a key period for building the relevant circuitry. In both cases, maturity can help correct the problem, but doesn’t always do so automatically.

To better understand recovery and how to teach it, then, we need to look to the strengths and tactics of people who quit without treatment—and not merely focus on clinical samples. Common threads in stories of recovery without treatment include finding a new passion (whether in work, hobbies, religion or a person), moving from a less structured environment like college into a more constraining one like 9-to-5 employment, and realizing that heavy use stands in the way of achieving important life goals. People who recover without treatment also tend not to see themselves as addicts, according to the research in this area.

While treatment can often support the principles of natural recovery, too often it does the opposite. For example, many programs interfere with healthy family and romantic relationships by isolating patients. Some threaten employment and education, suggesting or even requiring that people quit jobs or school to “focus on recovery,” when doing so might do more harm than good. Others put too much emphasis on getting people to take on an addict identity—rather than on addressing the harm related to drug use—when, in fact, looking at other facets of the self may be more helpful.

There are many paths to recovery—and if we want to help people get there, we need to explore all of them. That means recognizing that natural recovery exists—and not dismissing data we don’t like.

Source
