Thursday, March 15, 2012

Science of Human Health - It Is Not Rocket Science

The BLUF:  When prospective/epidemiological studies are used to establish causative relationships, they are no longer scientific; they become editorial.  The fact that most of the science of human health is bad science leads many to accept as evidence science that should be considered interesting at best.

Zoe does a brilliant job of showing why this is true, using the data in this study.

Here's a choice cut from her analysis:
- The two studies combined, therefore, covered 2,958,416 person years and there were 23,926 deaths in total: 5,910 attributed to CVD and 9,464 to cancer.
The first point to make, therefore, is that the overall death rate was very small:
- In the HPFS, in 758,524 person years the overall death rate was 1.18% and the CVD death rate was 0.36% and the cancer death rate was 0.41%. Over a 22 year period, just over one in a hundred members of the study died.
- In the NHS, in 2,199,892 person years the overall death rate was 0.68% and the CVD death rate was 0.15% and the cancer death rate was 0.29%. Over a 28 year period, approximately one out of 150 members of the study died.
- In the two studies combined, in 2,958,416 person years the overall death rate was 0.81% and the CVD death rate was 0.2% and the cancer death rate was 0.32%. In the combined studies, fewer than one person in one hundred died in a 28 year period.
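You can check the combined figures yourself from the raw counts quoted above. A quick sketch (using only the person-year and death totals given in the excerpt):

```python
# Recompute the combined death rates from the person-year and death
# counts quoted in the excerpt (HPFS + NHS combined).
person_years = 2_958_416
deaths_total = 23_926
deaths_cvd = 5_910
deaths_cancer = 9_464

rate_total = deaths_total / person_years * 100    # percent
rate_cvd = deaths_cvd / person_years * 100
rate_cancer = deaths_cancer / person_years * 100

print(f"overall: {rate_total:.2f}%  CVD: {rate_cvd:.2f}%  cancer: {rate_cancer:.2f}%")
# → overall: 0.81%  CVD: 0.20%  cancer: 0.32%
```

The numbers reproduce exactly: an overall rate of 0.81%, with 0.20% for CVD and 0.32% for cancer.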

Does this scare you as much as the headline would have?  Me neither.  Lies, damned lies, and statistics.  I encourage you to read the entire article to put your mind at ease about the supposed risks to you and your loved ones.

Here's another of Zoe's great points:
This is what led to the big news story: “adding an extra portion of unprocessed red meat to someone’s daily diet would increase the risk of death by 13%. The figures for processed meat were higher, 20% for overall mortality…”
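To see why a 13% relative increase sounds scarier than it is, apply it to the small absolute rate from the combined studies. The baseline below is the 0.81% overall death rate computed from the excerpt's counts; treating it as the base for the headline's 13% is my illustration, not the authors' calculation:

```python
# Illustrative only: apply the headline's 13% *relative* increase to the
# small *absolute* death rate from the combined studies.
baseline_pct = 23_926 / 2_958_416 * 100   # ~0.81% overall death rate
relative_increase = 0.13                  # "increase the risk of death by 13%"

elevated_pct = baseline_pct * (1 + relative_increase)
absolute_change_pp = elevated_pct - baseline_pct

print(f"baseline: {baseline_pct:.2f}%  with +13% relative: {elevated_pct:.2f}%")
print(f"absolute change: {absolute_change_pp:.3f} percentage points")
# → baseline: 0.81%  with +13% relative: 0.91%
# → absolute change: 0.105 percentage points
```

In other words, on these numbers the headline's 13% works out to roughly one additional death per thousand person-years.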

But what Zoe highlights well in her analysis is that the authors are doing a bunch of fancy math and then guessing.  They guess correction factors for other assumed risk factors.  Are their assessments of those other "risk factors" correct?  Yes, and if you don't believe them, well, just ask the authors.

I'll admit I don't understand this kind of statistical analysis, but I know this: they are assuming and guessing.  That is what you do to explore a conjecture, and to determine what might be done to refine or test it.  Guessing is not the means by which to establish causality.  Therefore, all that language of "risk" you see in these types of studies (prospective observational studies) is a code word that means "our mathematical games, if correct, predict an outcome that we don't know to be true."  Frankly, if the answer were known, these types of studies would be even "more useless" than they are.

I have a confession.  I don't really care if you choose not to eat meat.  If you make that choice, to some degree it may lower the price that I pay for meat - the exception being that if you choose to eat grass-fed animals, the economies of scale may at some point tip the economics toward better pricing for same.  But I would be sad if, even though you love red meat, you chose not to eat it because you believed these near-ludicrous guesses about mortality.

When you see words like "linked", "associated with" and "risk factors", you are reading the language of statistical gaming.  Make some guesses, re-run the numbers, and "wow, that looks interesting."  If I were an epidemiologist, and I could get a paying gig running numbers like this, I would love it.  Hopefully, though, I would not pretend that the numbers mean what these folks take the numbers to mean.

Human science is nearly impossible to do well; it is much, much harder than rocket science.  That does not mean we should pretend that bad science isn't.
