Headlines such as "Soda kills 180,000 people a year" from last month's annual meeting of the American Heart Association are screaming at consumers.
Science is a complex process, and reporting on studies presented at professional meetings or published in peer-reviewed journals is difficult. But the media have failed consumers by racing to report findings first without examining the facts.
This study throws a harsh light on the worst flaws in reporting breaking science news. Technically, what was presented was an "Abstract" from an observational epidemiological study done by Harvard's School of Public Health. Great brand name, lousy science, and a prime example of health data distortion.
The Abstract was nothing more than an outline of a study designed to make news. The full data set was unavailable, and the conclusions were strictly the authors' own. Most importantly, it did not show that consuming sugary drinks kills anybody. The study demonstrates a statistical association, not cause and effect, which are quite different things, and it is no reason to change your diet or lifestyle.
Science showed that cigarettes cause cancer. No such evidence exists here. It is as if a tombstone read, "His stillness is associated with death, but there is no cause-and-effect relationship. Your results may be different."
The current system of marketing science is incestuous. Scientists must publish to gain peer respect. Academics must publish to get grants and gain promotions. The media has hours of airtime and pages of white space to fill. Keep in mind the person writing the headline has little knowledge about the subject matter. Moreover, you will not see Brian Williams of NBC News or The New York Times say, "Sorry folks, nothing new today. We have nothing but a dark screen and white space." The competition to be first, to drive ratings, is too intense.
As a consumer searching for health news for yourself and your family, beware of stories that cite studies full of "mays," "mights," and two of my favorites, "these conclusions need to be replicated" and "additional research is needed." Most observational studies never make it into peer-reviewed journals.
Causal studies are very expensive; to be credible, they must follow many participants over a long time and account for many variables. Observational studies are a simpler matter of crunching the numbers until one gets the results one expects. No one commissions research anticipating a negative result.
In any event, health decisions should be based not on any one study but on the totality of the science. That is especially difficult when the media distort health data and academics with little or no practical experience publish reams of paper to support their view of how the world should work.
In short, check the facts with several sources and ask your healthcare provider for advice about any supplement, ingredient, or product that implies it can prevent, mitigate, treat, or cure illness or disease.
Jeff Nedelman has more than 30 years of experience in various industries, including a stint as a Chief of Staff to a U.S. Senator and chief lobbyist for the nation's largest food trade association. In all those years, Jeff has learned that the shortest distance between two political points is not a straight line.