Last week Nature borrowed the tried-and-true women’s magazine come-on, offering 20 tips. Hey, if it’s worked for years selling articles on flat abs and mind-blowing sex, why not public understanding of science?
The writers of these tips include conservation biologist William Sutherland, mathematician David Spiegelhalter, and Mark Burgman, whose affiliation includes a school of botany and a center for excellence for biosecurity risk analysis.
In a preamble, the authors say the tips are aimed at politicians, civil servants, policy advisors and journalists.
“We suggest that the immediate priority is to improve policy makers’ understanding of the imperfect nature of science. The essential skills are to be able to intelligently interrogate experts and advisers and understand the quality, limitations and biases of evidence.”
The tips all seem geared toward evaluating individual medical and social-science claims. Some point to ideas that are not obvious to the general public but would be to people trained in science. The first tip is that differences and chance cause variation. That’s an important one, except that their statement “the real world varies unpredictably” is odd. Are the gravitational constant, the speed of light, and the mass of the electron not part of the real world? I think I understand what they meant, but there are better ways to express the role of statistical noise, especially since high school science deals largely with the relatively predictable laws of Newton and Kepler.
Also useful was the tip that no measurement is ever exact. “Results should be presented with a precision that is appropriate for the associated error.” Yes! If my scale varies by a pound or so each time I weigh the same object, then it’s misleading to say my cat weighed 12.3562 pounds on that scale. This tip is good common sense, and it’s a mystery to me that scientists don’t always get it.
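The idea is easy to sketch in a few lines of code. The `report` helper and the numbers below are hypothetical, just to illustrate matching the reported digits to the instrument’s error:

```python
import math

# Report a measurement with precision matched to the instrument's error.
# Hypothetical example: a scale that repeats to about +/- 0.5 lb.
def report(value, error):
    """Round value so its last digit matches the scale of the error."""
    if error <= 0:
        return str(value)
    # Decimal position of the first significant digit of the error.
    digits = -int(math.floor(math.log10(error)))
    return f"{round(value, digits)} +/- {round(error, digits)}"

print(report(12.3562, 0.5))   # reports "12.4 +/- 0.5", not "12.3562"
```

With a better scale (say, ±0.05 lb) the same helper would report “12.36 +/- 0.05” — the reported precision tracks the measurement error rather than the calculator’s output.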
Another useful tip is that effect size matters. That goes back to Langmuir, whose definition of pathological science includes signals that are barely above the noise. The tip that data can be dredged and cherry-picked is also useful, but it needs more examples. This is one that should be aimed at scientists too, though perhaps reframed not as a tip but as a “don’t.”
A couple of examples involved animal populations, but most came from the social and health sciences: connecting speed cameras to the number of accidents, comparing the educational achievement of children whose parents adopt a health program with that of children whose parents do not, a school intervention, the effect of yogurt consumption during pregnancy on asthma in the offspring, and the effectiveness of schools.
Unfortunately, some of the tips could backfire in promoting public understanding of science, not because they’re wrong but because they’re presented as universally applicable when they only apply to certain fields. None seem particularly geared to fields with a strong theory-experiment synergy. The examples are not from chemistry or astrophysics but from social science, health and ecology. In those fields there’s great value in the tips: bigger is usually better for sample size, regression to the mean can mislead, controls are important, randomization avoids bias, and extreme measurements may mislead.
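Regression to the mean, in particular, is easy to demonstrate with a toy simulation (the parameters here are arbitrary, chosen only for illustration): select the worst scorers on a noisy test, retest them with no intervention at all, and their average “improves” by chance alone.

```python
import random

random.seed(0)

# 1000 people with some stable underlying ability...
true_skill = [random.gauss(100, 10) for _ in range(1000)]
# ...measured twice by a noisy test (independent noise each time).
test1 = [s + random.gauss(0, 15) for s in true_skill]
test2 = [s + random.gauss(0, 15) for s in true_skill]

# Select the bottom 10% on the first test, as a remedial program might.
worst = sorted(range(1000), key=lambda i: test1[i])[:100]
before = sum(test1[i] for i in worst) / len(worst)
after = sum(test2[i] for i in worst) / len(worst)

# The group scores higher on retest with no treatment whatsoever,
# because bad luck on test 1 helped put them in the "worst" group.
print(f"before: {before:.1f}, after: {after:.1f}")
```

Any intervention applied to that bottom group would look effective even if it did nothing, which is exactly why the tips stress controls and randomization.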
But how would one evaluate Darwin against these? How would one evaluate the connection between chlorofluorocarbons and thinning ozone? The link between carbon dioxide and global warming? It would be nice if we had a large sample of Earths to experiment on. But we just have one.
And did they really need to tell us that scientists are human? The authors seem to intend something slightly less obvious than what’s written, but telling us that scientists may have financial conflicts of interest is nothing new, and the way it’s phrased hands fodder to creationists, AIDS deniers and other conspiracy-theory-prone nuts.
And out of 20 tips, there’s nothing on the power of theory in science. The tips apply mostly to areas of science that lack a theoretical backbone. But theory matters. When people talk about extraordinary claims requiring extraordinary evidence, they often use theory to decide what counts as extraordinary. Electrons behaving like waves: not extraordinary. Neutrinos moving faster than the speed of light: extraordinary.
Policymakers in the U.S. may deal with claims that “intelligent design” or other forms of creation science should be taught in schools. A tip regarding the scientific understanding of theory, or on methodological naturalism could have helped there.
The tips were re-written on a website called RealClearScience. I don’t know much about this site. Is it political? I can see that some of the authors have criticized Richard Dawkins and the Union of Concerned Scientists.
The re-write gives a somewhat different spin, as can be seen in the way they rephrased Nature’s tip regarding risk. Here’s Nature:
Feelings influence risk perception. Broadly, risk can be thought of as the likelihood of an event occurring in some time frame, multiplied by the consequences should the event occur. People’s risk perception is influenced disproportionately by many things, including the rarity of the event, how much control they believe they have, the adverseness of the outcomes, and whether the risk is voluntary or not. For example, people in the United States underestimate the risks associated with having a handgun at home by 100-fold, and overestimate the risks of living close to a nuclear reactor by 10-fold.
Here’s the re-write:
17. People are terrible at risk perception. Many people are fearful of GMOs and nuclear power plants, yet those are both far, far safer than automobiles — which kill more than 30,000 Americans every year.
The Nature authors are referring to surveys that compared perceived versus actual risks. The RealClearScience re-write instead directly compares very different kinds of risk, and it ignores the benefit side of the equation, which for many individual consumers is likely to be negligible for GMOs and quite high for cars. If someone invented a perfectly safe non-car vehicle that was as fast, got comparable mileage and had heated seats, people would buy it. If some GMO food came along that tasted better and was less fattening than its non-GMO counterpart, people would buy it.
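Nature’s working definition, likelihood times consequence, is just an expected-value calculation. A minimal sketch with made-up numbers shows part of why perception and the formula diverge: a rare, dramatic event and a common, mundane one can carry exactly the same computed risk.

```python
# Risk as Nature defines it: probability of the event in some time frame
# multiplied by the consequence should it occur.  All numbers below are
# made up purely for illustration; they are not real risk estimates.
def risk(probability_per_year, consequence):
    return probability_per_year * consequence

rare_dramatic = risk(0.0001, 10_000)  # rare event, severe consequence
common_mundane = risk(0.1, 10)        # common event, mild consequence

# The computed risks match, yet the rare dramatic event typically
# *feels* far riskier -- the perception gap the original tip is about.
print(rare_dramatic, common_mundane)
```

The tip’s handgun-versus-reactor example is exactly this gap between the product on paper and what people feel.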
I’d stick with the original Nature version of the 20 tips. The authors do a great service in helping people take a more skeptical approach to studies, and that’s even more important now that the distinction between journalistic stories and press releases has almost disappeared.
But there’s a flipside: People can also misunderstand the solidity of scientific ideas backed by multiple lines of evidence – especially Darwinian evolution and the connection between carbon dioxide and global climate change.
Those 20 tips are a good start, but there’s more to the picture: