Bad Science: Quacks, Hacks, and Big Pharma Flacks
I have just read this fascinating book, Bad Science: Quacks, Hacks, and Big Pharma Flacks, by the British medical doctor Ben Goldacre. This book is chock-full of examples of why Greenberg’s Law Of The Media is true.
I wrote the law based on my reading of the media and my six years of post-high-school mathematics training. I have never claimed to be particularly good with the statistical end of mathematics, yet I could recognize a lot of the flaws in news stories. Because Ben Goldacre does seem to be a statistics expert, he is able to point out flaws in the media far beyond what I would have suspected. His explanations will make it easy to recognize these errors when I see them in the future.
In addition, he points out that flawed interpretations of statistics and probabilities extend far beyond the media and have consequences far more severe than a misinformed public.
It is not only that people (big pharma flacks) lie with statistics; it is also that people just don’t understand how to apply statistics or how to interpret them. So, even with the best intentions in the world, if you don’t know what you are doing with statistics, what seem like perfectly reasonable conclusions to you are just not borne out by the numbers when they are understood correctly.
There are many experts who do understand all this and know how to figure out what is statistically significant and what is not. Their interpretations might surprise you, until the reasoning is explained.
In Chapter 11, titled Bad Data, Goldacre pulls together quite a few of the ways that people get fooled.
He writes in a very entertaining way, and I can hardly do justice to his ideas here, but I will try to give you a hint at some of what you should know.
- Using relative risk instead of natural frequencies
Let’s say the risk of having a heart attack in your fifties is 50 percent higher if you have high cholesterol. That sounds pretty bad. Let’s say the extra risk of having a heart attack if you have high cholesterol is only 2 percent. That sounds OK to me. But they’re the same (hypothetical) figures. Let’s try this. Out of a hundred men in their fifties with normal cholesterol, four will be expected to have a heart attack, whereas out of a hundred men with high cholesterol, six will be expected to have a heart attack. That’s two extra heart attacks per hundred. Those are called natural frequencies.
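The hypothetical heart-attack figures above can be worked through in a few lines of Python (a sketch of the arithmetic, not anything from the book):

```python
# Goldacre's hypothetical heart-attack figures, expressed both ways.
baseline = 4 / 100      # heart attacks per 100 men with normal cholesterol
elevated = 6 / 100      # heart attacks per 100 men with high cholesterol

relative_increase = (elevated - baseline) / baseline   # the scary "50 percent higher"
absolute_increase = elevated - baseline                # the modest "2 percent extra risk"

print(f"Relative risk increase: {relative_increase:.0%}")
print(f"Absolute risk increase: {absolute_increase:.0%}")
print(f"Extra heart attacks per 100 men: {absolute_increase * 100:.0f}")
```

Same data, two very different-sounding headlines; the natural-frequency version ("two extra heart attacks per hundred") is the one that matches how people actually experience risk.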
- Choosing your figures
He quotes from an article in the UK’s Independent explaining a change of heart in its policy on cannabis:
In 1997, this newspaper launched a campaign to decriminalise the drug. If only we had known then what we can reveal today . . . Record numbers of teenagers are requiring drug treatment as a result of smoking skunk, the highly potent cannabis strain that is 25 times stronger than resin sold a decade ago.
By the time he has finished, he shows that the government report the paper uses as its authority for the information says no such thing. He shows that you might conclude from the report that the number has doubled, not risen 25 times, and even the doubling is a misinterpretation of the data. Moreover, this scare about the multiplying strength had been used years before by Ronald Reagan. Had you multiplied together the increase noted by Reagan in the 80s with the increase mentioned by the paper in the 90s, it would require more THC to be present in the plant than the total volume of space taken up by the plant itself. It would require matter to be condensed into superdense quark-gluon plasma cannabis. For God’s sake don’t tell the newspapers such a thing is possible.
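The arithmetic of stacking potency claims is easy to sketch. The figures below are hypothetical stand-ins, not the numbers from the book; the point is only that fold-increase claims multiply:

```python
# Hypothetical illustration: media potency claims, taken together, compound
# into an impossibility. These figures are stand-ins, not the book's numbers.
baseline_thc_pct = 1.0     # assumed THC content of the original resin, in percent
earlier_era_claim = 10     # a hypothetical "10 times stronger" claim from the 80s
newspaper_claim = 25       # the paper's "25 times stronger" claim from the 90s

implied_thc_pct = baseline_thc_pct * earlier_era_claim * newspaper_claim
print(f"Implied THC content: {implied_thc_pct:.0f}% of the plant")
```

With even a modest assumed baseline, the compounded claims imply the plant is more than 100 percent THC, which is the absurdity Goldacre is pointing at.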
- Misunderstanding statistical significance
For his example he uses an article in The Times (London) from March 2006 headed: COCAINE FLOODS THE PLAYGROUND.
Use of the addictive drug by children doubles in a year,
said the subheading. Was this true? If you read the press release for the government survey on which the story is based, it reports
almost no change in patterns of drug use, drinking or smoking since 2000.
He goes through the process of getting from the newspaper’s initial numbers to the quite correct analysis in the government report, showing that there was really almost no change.
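A doubling from a small base can easily sit inside ordinary sampling noise. Here is a minimal sketch with made-up survey counts (not the actual survey’s figures), using a standard two-proportion z-test:

```python
import math

# Hypothetical survey counts: did use really "double"?
# 10 of 1,000 children one year, 20 of 1,000 the next -- assumed figures,
# chosen only to illustrate the statistics, not taken from the survey.
n1, x1 = 1000, 10
n2, x2 = 1000, 20
p1, p2 = x1 / n1, x2 / n2

# Two-proportion z-test using the pooled proportion.
pooled = (x1 + x2) / (n1 + n2)
se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
z = (p2 - p1) / se

print(f"Use 'doubled' from {p1:.1%} to {p2:.1%}, z = {z:.2f}")
```

With these numbers z comes out around 1.84, below the roughly 1.96 needed for significance at the conventional 5 percent level: a headline-grabbing “doubling” that is statistically consistent with “almost no change.”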
- Poorly chosen questions in a survey where the respondents choose whether or not to respond to the survey
This one is so obvious, I’ll let you go to the book to read his instructive example.
- Misunderstanding the math of predicting very rare events
The examples are very revealing, but too hard to summarize here. One of the examples he uses shows the futility of a psychiatrist’s trying to predict which of their patients is likely to commit a murder.
- The prosecutor’s fallacy
This is related to the above, but in this case the prosecutor uses statistics to show that an innocent explanation for the crime is unlikely without telling you that the criminal explanation is even more unlikely. He uses an actual case to demonstrate this problem.
- Using the occurrence of an unlikely event to prove that something weird has happened
In the introduction to this discussion of an actual criminal case, he uses a quotation from renowned physicist Richard Feynman to start you thinking about the absurdity he is about to describe.
You know, the most amazing thing happened to me tonight. I was coming here, on the way to the lecture, and I came in through the parking lot. And you won’t believe what happened. I saw a car with the license plate ARW 357. Can you imagine? Of all the millions of license plates in the state, what was the chance that I would see that particular one tonight? Amazing . . .
If you don’t catch the absurdity of the point Feynman was making, then this comment by Goldacre might help:
There is also an important lesson here from which we could all benefit: unlikely things do happen. Somebody wins the lottery every week; children are struck by lightning. It’s only weird and startling when something very, very specific and unlikely happens if you have specifically predicted it beforehand.
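Goldacre’s lottery point is just arithmetic: give a one-in-a-million event enough chances and it becomes near certain to hit someone. A quick sketch with hypothetical numbers:

```python
# "Unlikely things do happen": the chance that a one-in-a-million event
# strikes at least one person when millions of people are exposed to it.
# Both figures below are hypothetical.
p_single = 1e-6          # probability for any one person
population = 10_000_000  # number of people "playing"

p_nobody = (1 - p_single) ** population
p_at_least_one = 1 - p_nobody

print(f"P(it happens to a named person): {p_single}")
print(f"P(it happens to somebody):       {p_at_least_one:.4f}")
```

The event is vanishingly unlikely for any person you name in advance, yet all but guaranteed for somebody; which question you ask makes all the difference.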
Later he explains what is wrong with the court case he uses as an example.
First he makes an analogy about blindly firing thousands of bullets from a machine gun at a barn and then finding and circling three bullet holes close together to prove that you are an excellent shot. He ties the analogy to the prosecution he is describing:
You would, I think, disagree with both my methods and my conclusions for that deduction. But this is exactly what has happened in Lucia’s case: the prosecutors found seven deaths on one nurse’s shifts, in one hospital, in one city, in one country, in the world and then drew a target around them.
He generalizes the problem with what the prosecutor did.
This breaks a cardinal rule of any research involving statistics: you cannot find your hypothesis in your results. Before you go to your data with your statistical tool, you have to have a specific hypothesis to test. If your hypothesis comes from analyzing the data, then there is no sense in analyzing the same data again to confirm it.
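The “target drawn after the fact” problem can be simulated in a few lines. This is a sketch with entirely hypothetical figures, not data from the case: give many nurses purely random shift records, then single out the unluckiest one.

```python
import random

random.seed(0)  # fixed seed so the simulation is reproducible

# Hypothetical simulation: many nurses, deaths falling on shifts at random.
n_nurses = 1000
shifts_per_nurse = 200
p_death_on_shift = 0.01   # assumed chance a death falls on any given shift

death_counts = [
    sum(random.random() < p_death_on_shift for _ in range(shifts_per_nurse))
    for _ in range(n_nurses)
]

expected = shifts_per_nurse * p_death_on_shift
worst = max(death_counts)
print(f"Expected deaths per nurse:  {expected:.0f}")
print(f"Unluckiest nurse's count:   {worst}")
```

Even though every record here is pure chance, picking out the maximum of a thousand random records always produces a nurse who looks alarming. That is why the hypothesis has to be fixed before looking at the data, not drawn around whatever cluster the data happens to contain.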