Assertive headline mischaracterizes the reality of a medical research study

A press release on EurekAlert caught my eye because it looked suspicious. The headline: “Children living near toxic waste sites experience higher blood lead levels resulting in lower IQ.” That assertive headline implies a rigorous study that tested the blood lead levels of many children, but, as we will see, the assertion rests on computer modeling, not on testing.

We often see ominous headlines like this one in the mainstream media, and they can cause great concern. But it pays to look at the details. The headline is qualified in the first sentence of the press release with the phrase “may experience higher blood lead levels,” but many media reports ran with the unqualified headline. The press release came from the Mount Sinai School of Medicine (see the entire press release here). So, what methods did the researchers use to justify even the modified claim?

The operative paragraph in the press release is this one:

“Researchers measured lead levels in soil and drinking water at 200 toxic waste sites in 31 countries then estimated the blood lead levels in 779,989 children who were potentially exposed to lead from these sites in 2010. The blood lead levels ranged from 1.5 to 104 µg/dL, with an average of 21 µg/dL in children ages four years and younger. According to Dr. Chatham-Stephens, first author of the study, these higher blood lead levels could result in an estimated loss of five to eight IQ points per child and an incidence of mild mental retardation in 6 out of every 1,000 children.”

There are many precise numbers here, implying rigorous research. But the phrases “estimated the blood lead levels,” “potentially exposed,” and “could result” should raise a red flag.

The whole thesis of this research is based on guesswork and assumption. The researchers did not measure lead levels in children’s blood, did not test IQ levels, and did not interact with 779,989 children. Further investigation reveals that all the numbers, including the reported blood lead levels, are extrapolations from computer modeling.

The paper is titled “The Pediatric Burden of Disease from Lead Exposure at Toxic Waste Sites in Low and Middle Income Countries.” I could not find the full published paper, but I did find the abstract here. From the abstract we learn that the numbers reported in the press release are indeed products of computer modeling, not of actual measurement. The paper appears to be a subset of a larger study by the same authors, “Burden of disease from toxic waste sites in India, Indonesia, and the Philippines in 2010,” which is available online. The methodology described there confirms that no blood tests were performed. So all those impressive-looking, precise numbers in the press release are merely artifacts of the assumptions used in a computer model.
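To make that concrete, here is a minimal sketch in Python of how this kind of extrapolation works. The coefficients and site values are entirely hypothetical and are not the authors’ model; the point is only to show how measured soil lead, multiplied through a chain of assumed conversion factors, yields precise-looking blood lead and IQ numbers without a single child being tested.

# Hypothetical sketch of model-based extrapolation -- NOT the authors' actual
# model. Measured soil lead plus assumed coefficients produces precise-looking
# blood lead and IQ-loss estimates with no blood tests at all.

def estimated_blood_lead(soil_lead_ppm, uptake_slope=0.007):
    """Estimate blood lead (ug/dL) from soil lead (ppm) using an assumed
    linear uptake slope. The slope value is illustrative only."""
    return soil_lead_ppm * uptake_slope

def estimated_iq_loss(blood_lead_ugdl, iq_points_per_ugdl=0.25):
    """Estimate IQ decrement from blood lead using an assumed linear
    dose-response coefficient (illustrative value only)."""
    return blood_lead_ugdl * iq_points_per_ugdl

# Hypothetical site measurements (ppm of lead in soil):
sites = {"Site A": 800.0, "Site B": 3000.0, "Site C": 12000.0}

for name, soil in sites.items():
    bll = estimated_blood_lead(soil)
    iq = estimated_iq_loss(bll)
    print(f"{name}: soil {soil:,.0f} ppm -> "
          f"estimated blood lead {bll:.1f} ug/dL -> "
          f"estimated IQ loss {iq:.1f} points")

# Every output above is a deterministic artifact of the two assumed
# coefficients; change them and every "result" changes. That is why modeled
# numbers need validation against actual blood tests before they mean much.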

The point here is that while the study’s contention might be correct, the reality is that we don’t know, because the researchers, as far as I can tell, never validated the modeling against ground truth. We know no more now than we did before the research was conducted and the relationship postulated.

I think stories like this reflect poor practice in both journalism and science. I also noted that the study was done in conjunction with the Blacksmith Institute, an advocacy group, so there may be some promotional incentive for the press release headline.

The principal danger from this kind of study, besides worrying the public, is that policy makers may read only the headlines and propose inappropriate solutions to problems that may not exist.

A version of this article first appeared in the Arizona Daily Independent.

See also:

Be wary of statistical traps
