June 2, 2010

IQ, ADD, Depression

It's amusing that IQ testing is always being accused of "reification" and general pseudoscienceness when intelligence is one of the least fuzzy, most consistently measurable of all concepts in psychology. Compare IQ to, for example, more popular categories like Attention Deficit Disorder or depression.

ADD (or ADHD -- note the vagueness of terminology) is only now beginning to be measured objectively:
Last fall the National Institutes of Health awarded Dr. Teicher a $1 million grant from the federal stimulus package to delve further into the quest for a definitive test or biomarker for the disorder. He plans to focus his research on three detective strategies: his Quotient system, magnetic resonance imaging to compare blood flows in different brain regions, and the ActiGraph, an activity monitor widely used by medical researchers.

James M. Swanson, a developmental psychologist and attention researcher at the University of California, Irvine, praised Dr. Teicher’s research, echoing his concerns about the need for a more objective test to detect the disorder. But he questioned whether the Quotient system produces more reliable diagnoses than a doctor’s dogged questioning of a child’s parents and teachers, and also whether it is an appropriate way to figure out the right dose of medication.

“It’s essentially a dull, boring task,” he said of the Quotient system, “so do you want to medicate your child to pay attention to dull, boring tasks?”

I bet Elena Kagan always paid attention to her dull, boring tasks.

This is interesting:
The key to his system, he said, is what he suspects will eventually be confirmed as a valid biological marker for A.D.H.D.: an unstable control of head movements and posture, particularly while paying attention to a boring task. 

I wouldn't have guessed that.

In another NYT article, Benedict Carey discusses how the medical profession has tried to stamp out the term "nervous breakdown" in favor of "depression," in part because they can make more money off depression.
Decades ago modern medicine all but stamped out the nervous breakdown, hitting it with a combination of new diagnoses, new psychiatric drugs and a strong dose of professional scorn. The phrase was overused and near meaningless, a self-serving term from an era unwilling to talk about mental distress openly.

But like a stubborn virus, the phrase has mutated.

In recent years, psychiatrists in Europe have been diagnosing what they call “burnout syndrome,” the signs of which include “vital exhaustion.” A paper published last year defined three types: “frenetic,” “underchallenged,” and “worn out” (“exasperated” and “bitter” did not make the cut).

This is the latest umbrella term for the kind of emotional collapses that have plagued humanity for ages, stemming at times from severe mental difficulties and more often from mild ones. There have been plenty of others. In the early decades of the 20th century, many people simply referred to a crackup, including “The Crack-Up,” F. Scott Fitzgerald’s 1936 collection of essays describing his own. And before that there was neurasthenia, a widely diagnosed and undefined nerve affliction causing just about any symptom people cared to add.

Yet medical historians say that, for versatility and descriptive power, it may be hard to improve upon the “nervous breakdown.” Coined around 1900, the phrase peaked in usage during the middle of the 20th century and echoes still....
Never a proper psychiatric diagnosis, the phrase always struck most doctors as inexact, pseudoscientific and often misleading. But those were precisely the qualities that gave it such a lasting place in the popular culture, some scholars say. “It had just enough medical sanction to be useful, but did not depend on medical sanction to be used,” said Peter N. Stearns, a historian at George Mason University near Washington, D.C. ...

The vagueness of the phrase made it impossible to survey the prevalence of any specific mental problem: It could mean anything from depression to mania or drunkenness; it might be the cause of a bitter divorce or the result of a split. And glossing over those details left people who suffered from what are now well-known afflictions, like postpartum depression, entirely in the dark, wondering if they were alone in their misery.

But that same imprecision allowed the speaker, not medical professionals, to control its meaning. People might be on the verge of, or close to, a nervous breakdown; and it was common enough to have had “something like” a nervous breakdown, or a mild one. The phrase allowed a person to disclose as much, or as little, detail about a “crackup” as he or she saw fit. Vagueness preserves privacy....

“People accepted the notion of nervous breakdown often because it was construed as a category that could be handled without professional help,” concluded a 2000 analysis by Dr. Stearns, Megan Barke and Rebecca Fribush. The popularity of the phrase, they wrote, revealed “a longstanding need to keep some distance from purely professional diagnoses and treatments.”

Many did just that, and returned to work and family. Others did not. They needed a more specific diagnosis, and targeted treatment. By the 1970s, more psychiatric drugs were available, and doctors directly attacked the idea that people could effectively manage breakdowns on their own.
Psychiatrists proceeded to slice problems like depression and anxiety into dozens of categories, and public perceptions shifted, too.

But that doesn't mean that the term "depression" is all that much more scientific. The basic term fails, for example, to distinguish between two important types of depression: the kind with obvious causes and the terrible kind without. 

For example, I've been depressed several times in my life, but it was always for really obvious reasons: I had cancer and might die, the company I was working for was obviously going to go under and I would lose my job, and so forth. Those things were depressing, but there wasn't a whole lot the field of psychiatry could do about it. Solve the underlying problems and the depression would go away. 

In contrast, other people have been hit by depression out of the blue with no obvious cause, and that is a far more appropriate problem for the mental health field.

Compared to ADD/ADHD and nervous breakdown / depression, IQ testing is like dropping a heavy ball and a light ball off the Leaning Tower of Pisa and seeing which hits the ground first.

24 comments:

dearieme said...

Answer: it depends on their diameters.

John Seiler said...

As to ADHD, or whatever it's called, why didn't this exist until recently? On depression, better termed melancholy, that one's obviously been around for a long time, as in Burton's "Anatomy of Melancholy."

On IQ, here's a quick test: Ask people if they know the definition of "reification" without looking it up. (Rule out anyone younger than 22.) Would that itself be reification? No.

Max said...

People have known about AD(H)D for a very long time, they just called it by a different name. Ancient Greek medical books described the disorder in children (I'm not kidding). More recently it was called a "nerve disorder." If you watch the director's cut of The Exorcist, a shrink diagnoses Regan with a nerve disorder and prescribes her Ritalin.

Anyway, here's an interesting anecdote for HBD aficionados to chew on. I'm a gay man with ADD - the inattentive type, without the hyperactive component. This is the type more commonly diagnosed in girls and women. I know a few other gay men with the inattentive type and a few lesbians with the hyperactive type, but no gays with the type you'd expect them to have based on their sex. Neat, huh?

Anyway, it's also worth noting that people with the hyperactive type are much more likely to outgrow the disorder than are people with the inattentive type (though I have a sister who outgrew her inattentive ADD).

B322 said...

Answer: it depends on their diameters.

You mean, assuming they have the same density?

______


A lot of depression is functionally identical to learned helplessness.

Bullying works, not just because bullies ignore the rules against violence, but also because the victims obey them. The victims learn two sets of rules - the formal ones, prohibiting violence, and the ... other set of rules - and cannot reconcile them, because they understand the abstract reasons violence is bad. Bullies don't, and have an easier time of it.

Thus, people with a better ability to understand abstractions learn to be helpless in institutional schools. Prussian education is simply the instruction manual for taught helplessness.

Moving up in elite institutions requires abstract reasoning ability.

The elites (not counting the bad guys) seem helpless to do anything about Invade the World, Invite the World, In Hock to the World. In reality, learned helplessness can also be considered a character flaw, or even a form of evil.

For who, nowadays, can draw a strict moral distinction between the violent actions of the bad people and the silence and indifference of the good?

l said...

Not everybody can be a genius. Axis I disorders are equal opportunity.

Anonymous said...

John Seiler,
I'm skeptical of most ADHD claims, but I don't believe that the recent vintage of the diagnosis itself is really all that persuasive. The last few years have been a time of unprecedented change on earth, and it is possible that some new problems have come to the forefront.

Contemporary brains are bombarded with stimuli at a level that the ancients (to say nothing of prehistoric man) were not. The human brain did not evolve to deal with the flashing lights and constant sound undulations of (say) Pokemon.

Also, there is the fact that we're synthesizing all sorts of chemicals and putting them into food, water, and air without fully understanding what the long-term effects are in all cases, specifically on the least well-understood of all the body's organs, the brain.

Add in the effects that magnetic fields from cell-phones are alleged to have on the brain, and you have a LOT of potential variables other than bad parenting and medical-profession cupidity.

nooffensebut said...

But that doesn't mean that the term "depression" is all that much more scientific. The basic term fails, for example, to distinguish between two important types of depression: the kind with obvious causes and the terrible kind without.

The Diagnostic and Statistical Manual of Mental Disorders (DSM) has a variety of categories for disorders of depressed mood. The broad category of adjustment disorder is manifested by depressed mood as a response to life difficulties. So is bereavement. Major depressive disorder (MDD) is the classic form of depression, in which the patient has had depressive episodes, which can have a number of symptoms, and depressed mood is just one such symptom.

I think the real test of the MDD diagnosis is the study by Fournier et al that found that only patients with high Hamilton Depression Rating Scale scores benefited from SSRIs more than placebo. Most people who take SSRIs do not have such high scores.

DSM-V is on its way, but the emphasis of subjective categorization is frustrating. I really believe that behavioral genetics could transform psychiatry. It is such a shame that more people are not plugged into the data on the genetics of violence. I think impulsive violence deserves a DSM category and new treatment research.

Anonymous said...

First paragraph of Moby Dick

Whenever I find myself growing grim about the mouth; whenever it is a damp, drizzly November in my soul; whenever I find myself involuntarily pausing before coffin warehouses, and bringing up the rear of every funeral I meet; and especially whenever my hypos get such an upper hand of me, that it requires a strong moral principle to prevent me from deliberately stepping into the street, and methodically knocking people's hats off--then, I account it high time to get to sea as soon as I can.

Anonymous said...

Instead of getting to sea after my first nervous breadvan, I changed my diet and swerved away from the gruesome food pyramid and its array of addictive carbs towards more protein, good fats and veges. No sugar.
Late autumn gloominess warned me of the need for Vit D and to seek light, and so I became a farmer.

The change in diet and environment helps in spotting when one's defences are low and to take early action.

Charlie said...

You're right, Steve, but you could probably expand this to include - oh, about every diagnosed "psychological disorder".

For instance, if there's one indisputably "real" psychological disorder it would be schizophrenia. But aside from auditory hallucinations (which most schizophrenics don't have) the definition of this "disease" is more vague than most people think, and shades off into "schizoaffective" or "schizophreniform" disorders - or into manic-depression, another "well-defined" illness. I figured out a while ago that I myself have one of these things, but I have no idea which, and after a lot of research I suspect psychiatrists would not really reach a consensus either. It hardly seems worth the money.

Or sociopathy, another disorder that we all think we know about - except that some sociopaths are not habitual liars, some of them do abide by moral principles, and many otherwise elude the common idea of a sociopath.

Interestingly, a recent study in Sweden showed that schizophrenia and manic depression are strongly correlated in hereditary terms. IMO this casts doubt on the entire edifice of "diagnosis" of these diseases; possibly Eysenck (and some psychologists before him) was right to lump all of these disorders (Eysenck even included anti-social personality if I remember right) under a general scale of "psychoticism".

This is why different eras have wildly different theories of deviant psychology - they see a bunch of crazy people, and try to fit the craziness onto some Procrustean bed. Do you remember those old British novels where all the crazy people think they are a poached egg, or something of that kind? It's a delusion I've never heard of in real life, but those Victorians seemed convinced that it was a grave psychological menace.

OTOH it's maybe inaccurate to say that "g theory" is that much more solid. "g-loaded" test results are stable over lifetimes, correlated with certain measurable life outcomes and somewhat heritable, but that isn't to say we really know what they measure. I know that you, and HBD enthusiasts in general, like to think of it as measuring overall brain function. But this is far from proven, and the usual argument that it makes more sense than "multiple intelligence" theories is a straw man. One real competing theory is that IQ largely measures working memory, or a person's capacity to hold a lot of things in his head at the same time, which doesn't imply any "multiple intelligence" theory so much as that linear "intelligence" is not a very real property - except inasmuch as it refers to working memory.

Steve Sailer said...

Charlie:

Thanks.

The working memory idea is interesting. Working memory certainly has a lot to do with trainability.

As working memory declines with age, ability to learn new techniques declines. On the other hand, there remain differences in ability to work with techniques you already know.

Charlie said...

Yes, I started to have trouble learning new techniques...some time in high school.

One last thought about g is that it certainly does have a "common sense" appeal, unlike a lot of psychological categories. But I wonder if that's a bit like our naive, common-sense idea of a car engine - like "this engine is about to die". If I take my "dying" car to a mechanic he will reduce this linear scale of value to something concrete, and probably related to just a few parts of the engine. IQ could be the same thing...or not.

CJ said...

I personally believe that I have to cope with elements of what is referred to as ADHD. However, it's often constructive to engage strong contrary arguments, and the best-expressed denial that ADHD is real is that of Peter Hitchens, Christopher's smarter brother.

The ADHD Fantasy - posted once more

Dutch Boy said...

ADHD and ADD diagnoses have risen in tandem with diagnoses of autism and the disorders likely have a common source (toxic exposures in utero and during infancy).

stephen said...

The two balls are going to fall at the same rate as long as air resistance is negligible - both accelerate at approximately 9.8 m/s^2. Other than that, yeah I agree.
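The back-and-forth about the Pisa thought experiment can actually be settled numerically. Here is a minimal sketch - all masses, sizes, and drag numbers are illustrative assumptions, not anything from the post - showing that with quadratic air drag, two balls of the same diameter but different density do not land together: the heavier one hits first.

```python
import math

def fall_time(mass_kg, diameter_m, height_m=55.0, dt=1e-4):
    """Seconds to fall height_m under gravity plus quadratic air drag (Euler steps)."""
    g, rho_air, c_d = 9.81, 1.225, 0.47  # gravity, air density, sphere drag coefficient
    area = math.pi * (diameter_m / 2.0) ** 2
    y = v = t = 0.0
    while y < height_m:
        drag = 0.5 * rho_air * c_d * area * v * v  # drag force in newtons
        v += (g - drag / mass_kg) * dt             # same drag force slows the light ball more
        y += v * dt
        t += dt
    return t

# Same 10 cm diameter, very different masses: roughly an iron ball vs. a wooden one.
t_iron = fall_time(4.1, 0.10)   # close to the vacuum value of about 3.35 s
t_wood = fall_time(0.37, 0.10)  # noticeably longer: drag matters more for the light ball
```

This is why dearieme's and B322's quibbles are both right: for a given diameter the drag force is the same, so what decides the race is drag per unit mass, i.e., diameter and density together.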

R. J. Stove said...

Whatever happened to the diagnosis of "dyslexia"? In the 1970s it was everywhere, just as ADHD is now.

Funny thing is, it was never medicalized with anything like Ritalin, either because in the 1970s people had less spending money for popping pills, or because there were simply far fewer pills to pop, or because it was generally accepted that some kids would have reading difficulties anyway, stuff happens, so Get Over It.

Pace Steve Sailer's remark "The basic term fails, for example, to distinguish between two important types of depression: the kind with obvious causes and the terrible kind without", I think - on the basis of some experience - that the former can be just as terrible as the latter.

Somewhere, I remember, it has been alleged (a Google search hasn't pointed it to me yet) that various doctors interviewed (1980 or so?) a particular subset of World War II victims: persons who (a) had survived Third Reich death camps and (b) had subsequently, in peacetime, lost a loved one to suicide. If I recollect rightly, a large majority of those interviewed said that they actually suffered still worse misery with (b) than with (a). The greatest horror, for them, occurred when the loved one who'd died by his own hand was a child.

Well, no-one has ever put me in a death camp; unlike Steve Sailer I've never suffered a potentially lethal physical illness; and I've never had a child, let alone a child who took his own life. But my father, about whom Steve Sailer has written, did hang himself in 1994.

For about 18 months afterward I alternated between volcanic despair and complete zombiehood, against which medicines were - perhaps fortunately - useless. (William Styron's Darkness Visible describes the condition better than I could ever do.)

In that period I only twice shed a single tear. Once was when I re-heard Pablo Casals's recording of the Bach Cello Suites (the resemblance between Casals's tone and my father's speaking voice was uncanny). The other time was the news of the Oklahoma City bombing. But simultaneously my outbursts of verbal temper (most people who meet me think me very shy and mild-mannered) were pitiless and sarcastic. I really don't think that I could have either felt any worse myself, or made others feel any worse, if my state had been endogenously caused.

B Lode said...

A lot of depression is functionally identical to learned helplessness.


Or inculcated helplessness. The mental health biz/medicine works hard to make people feel that they're unable to help themselves. It used to be that if someone was feeling blue they were told "quit feeling sorry for yourself." Now they're told "you need to see someone."

I think people are more miserable than ever, thanks to the 'mental health' profession. Therapy encourages people to be obsessive about their negative feelings.

Gene Berman said...

John Seiler:

Life (I'd suspect especially civilized, educated life) is chock-full of reification--no way to do without it; there's no other way to get a "handle" on what it is we're talking about. In a sense, its form and purpose are similar to what we call "figure of speech," but merely less obviously figurative.

I comment on this now because I'm aware that you and I are "on the same page" in the area of economics and that subject is enormously complicated by the necessary fact that men actually require reification in order to rationalize their actions, policies, adoption of goals, etc. Many of the problems in economics arise simply from the fact that there are better and worse reifications, not because reification is an inherently erroneous process or even one with which--seeking to be "more realistic"--we might dispense.

wild chicken said...

Got another one for you. My grandmother died the year I was born and all my life I heard that she had been "sick" a lot in the 1930s and 40s. Finally I tried to find out what she had, and was told "emotional nerves." I.e., she was a neurotic, I guess.

But I agree, the shrinks don't know what to do most of the time. The antidepressants they push act as a placebo as often as not. It's all voodoo, still.

Paul Mendez said...

Personally, I am very envious of people who lived back when having a "nervous breakdown" was an acceptable response to stress or disappointment.

Who wouldn't want to be able to stay in bed a couple of days straight, or go on a week-long bender, or throw a major hissy fit in public and then -- once you got whatever it was out of your system -- simply go back to your life and explain your behavior away as a "nervous breakdown"?

Try the same stunt today, and your friends, family, co-workers will demand you go into therapy or rehab to solve your "problem" before they let you resume your life. Furthermore, once they've made you acknowledge your "problem" you'll never be able to repeat it.

Anonymous said...

This topic is depressing.

O'Brien said...

Re: Working Memory

In at least one theory of IQ, intelligence is broken down into four categories, Math, Verbal, Spatial, and Working Memory.

The SATs, ACTs, GREs, and similar tests measure mainly math and verbal.

I did very well on the GRE, 800 Math and 680 Verbal. This jibes with the stratospheric math and verbal scores I received on an IQ test in 5th grade, I think around 155 and 145 respectively. This goes well with the 151 estimate one web site gave based on my GRE scores.

On the other hand, my spatial is relatively mediocre (low 120s) and my working memory is barely above the median (mid 100s). This certainly cannot be ascribed to old age; I was only 11 at the time of the test. On the other hand, people often called me "a little old man."

I wonder if many people with attention problems actually have working memory problems, or at least a much lower working memory than one's overall IQ score would predict.

I would guess that East Asians tend to score high on working memory, and thus tend to have a high tolerance for "boring" material. OTOH I would guess that blacks and Hispanics score lower on WM than their overall IQs would predict.

Unfortunately for me, increased specialization in the job market probably means that WM is of increasing importance in general.

holmegm said...

Steve, you might want to check out Peter Kramer's book "Against Depression".

He makes a pretty solid case that it is a disease - and he does so in order to de-romanticize it, not to romanticize it. Fascinating read.