Hooray, They Did it Right!

Jan 22, 2013


I ran across this article recently. It's not really mental health related, but it is notable for something that's seen all too rarely in health and medical reporting. I'll show you the first sentence; see if you can figure out what is noteworthy:

Regular aspirin use was associated with an elevated risk for neovascular age-related macular degeneration, an Australian study suggested, but actual causality remains uncertain. 

What caught my eye was the very last bit: "actual causality remains uncertain." In this case, they come right out and say it, but in much reporting - even in scholarly articles - that caveat is missing or buried deep in a paragraph, long after the "pay-off" is described.

The article actually goes on to say how they controlled statistically for all kinds of potential biases, but they still don't assume there is a causal link. That is the right way to report correlational results. The next step would be a true experiment, with one group getting aspirin and the other not.

I've beaten this drum before, but it's worth repeating: correlation does not imply causality. When you read about research - even in textbooks - it's important to be an informed consumer. Another example: class size is related to academic achievement. But class size is also related to community economic status, which is related to all kinds of other factors. And then there are all the media announcements around diet, vitamins, etc.: "People who take supplement X are less likely to have disease Y!" Usually, the researchers haven't randomly assigned people to two groups and said: group A, you take it; group B, you don't. More likely, they did a survey. What if people who take supplements generally take better care of their health? What if they are more affluent (and we know poverty is associated with poorer health)? Et cetera.
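If you like to see the arithmetic, here is a small Python sketch - entirely invented numbers, not data from any real study - in which a supplement does nothing at all, yet survey-style data make it look protective, because a third factor ("health consciousness") drives both supplement use and disease risk:

```python
# A minimal simulation (hypothetical numbers) of how a confounder can make a
# useless supplement look protective in survey data.
import random

random.seed(1)

N = 100_000
takes_supplement = []
has_disease = []

for _ in range(N):
    # "Health consciousness" is the lurking variable: it drives both
    # supplement use and (through diet, exercise, income, etc.) disease risk.
    health_conscious = random.random() < 0.5

    takes = random.random() < (0.7 if health_conscious else 0.2)

    # The supplement itself has NO effect on disease in this simulation;
    # only health consciousness changes the risk.
    risk = 0.05 if health_conscious else 0.15
    sick = random.random() < risk

    takes_supplement.append(takes)
    has_disease.append(sick)

def disease_rate(takes_flag):
    group = [d for t, d in zip(takes_supplement, has_disease) if t == takes_flag]
    return sum(group) / len(group)

print(f"Disease rate among supplement takers: {disease_rate(True):.3f}")
print(f"Disease rate among non-takers:        {disease_rate(False):.3f}")
# Takers look healthier even though the supplement does nothing -
# the association comes entirely from the confounder.
```

Run it and the takers come out healthier every time, even though the supplement is, by construction, useless. That is all a survey can show you.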

The gold standard is an actual experiment - in its purest form, people are randomly assigned to a treatment group and to one or more "control" groups that don't get the treatment. If you see research that says no more than "people who got X got better," you have not seen evidence of causation. Mental illness and substance use disorders tend to go up and down, and people tend to seek services when things are worst. Many studies show that around a third of the "placebo" group, who got no active treatment, are better at the end than when they started.
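To make that last point concrete, here is another little sketch with invented parameters (again, not real data): symptoms fluctuate around each person's baseline, people "enroll" only when they're at their worst, and a large share of them score better at follow-up even though nobody got any treatment:

```python
# A minimal sketch (hypothetical parameters) of regression to the mean:
# symptoms fluctuate, people enroll when things are worst, and many of them
# score better at follow-up with no treatment at all.
import random

random.seed(2)

def severity(baseline):
    # Observed severity = stable personal baseline + day-to-day fluctuation.
    return baseline + random.gauss(0, 10)

enrolled = 0
improved = 0

for _ in range(100_000):
    baseline = random.gauss(50, 10)      # this person's typical level
    at_intake = severity(baseline)

    # People tend to seek services when things are worst.
    if at_intake < 65:
        continue
    enrolled += 1

    at_followup = severity(baseline)     # no treatment, just another day
    if at_followup < at_intake:
        improved += 1

print(f"Enrolled (untreated): {enrolled}")
print(f"Better at follow-up with no treatment: {improved / enrolled:.0%}")
# The exact percentage depends on these invented parameters, but it is large:
# selecting people at a bad moment guarantees many will look better later.
```

That is why the comparison group matters: "people who got X got better" tells you nothing unless you know how many would have gotten better anyway.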

So, as a consumer of research - both in the popular media and in the academic literature - look for clear evidence of causation, not just correlation.




