This week, as an exercise for my undergraduate science writing course, I gave my students this story: Want Your Daughter To Be a Science Whiz? Soccer Might Help. The piece, which comes from NPR’s “Shots”, used a recent study to tell readers that playing soccer might make their daughters into science whizzes. But what the study showed was simply a correlation between teens’ overall daily movement (as measured by an accelerometer) and performance in standardized tests.
This correlation was apparently strongest for girls and science tests. This seemed like just the kind of result that’s been the target of a number of recent complaints by statisticians and science critics. Slice and dice your data enough different ways and you increase the odds of getting something that looks like a real effect.
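The "slice and dice" worry is easy to make concrete. Here is a minimal simulation (my own illustration, not from the study) of why testing many subgroups inflates the chance of a spurious hit: even when every comparison is pure noise, the probability that at least one of them clears the usual p < 0.05 bar grows quickly with the number of slices. The subgroup count of 20 is a hypothetical stand-in for combinations like sex × test subject × age band.

```python
import random

random.seed(0)

ALPHA = 0.05        # conventional significance threshold
N_SUBGROUPS = 20    # hypothetical number of data slices examined
N_TRIALS = 10_000   # simulated "studies", all with no real effect

# Under the null hypothesis, each individual test comes up
# "significant" with probability ALPHA by definition. Count how
# often at least one of N_SUBGROUPS null tests looks significant.
hits = 0
for _ in range(N_TRIALS):
    if any(random.random() < ALPHA for _ in range(N_SUBGROUPS)):
        hits += 1

family_rate = hits / N_TRIALS
expected = 1 - (1 - ALPHA) ** N_SUBGROUPS  # closed-form answer
print(f"chance of >=1 'significant' subgroup: "
      f"{family_rate:.2f} (theory {expected:.2f})")
```

With 20 slices, the theoretical chance of at least one false positive is about 64 percent, which is why a correlation that shows up only in one subgroup (girls, science tests) deserves extra skepticism.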
And even if the correlation is based on some real connection, you don’t have to be a soccer-playing math whiz yourself to think of alternative explanations for the result. My students came up with a good number. How many can you think of?
The story did offer a couple of qualifiers, such as the statement:
This study doesn't prove that the increased exercise was what improved the children's test scores, but parents aren't off base in thinking that it could help.
Well, you can’t “prove” anything in science, and the fact that you’re telling readers about the idea at all suggests it’s at least a good possibility, when in fact the notion that soccer makes girls into science whizzes seems an unlikely explanation for the data.
The story caught my attention partly because it related to a conversation I had with linguistics professor Mark Liberman over a recent study allegedly connecting reading literary fiction with improved social skills. See the KSJ critique of that here. Liberman wanted to know why certain types of studies get so much publicity despite weak results and dubious cause-and-effect relationships. Sometimes it seems the counterintuitive aspect of a study will propel it into the worldwide media. But other findings make a big splash by reinforcing prejudices.
Here I think both are happening at the same time. The notion that playing sports could improve science performance is surprising, and yet it reinforces a common prejudice that sports are good for girls, just as the reading study reinforced reporters’ prejudice that reading “literary” fiction was good for you.
Now what if the study had found a different correlation? Suppose it had found that girls who had sex at an earlier age performed better on math or science tests. What if the study had connected use of illegal drugs with better science test performance? Would you be a little more critical then?
Should reporters allow scientists to broadcast made-up cause-and-effect relationships when we think they are advocating something good for us?