Why don’t negative results get published?

On a recent AskMe thread discussing a Science article on gender and collective intelligence, someone commented:

I read an article not too long ago about how studies that find fewer/no gender differences are significantly less likely to be published, and are often actively discouraged from publication. I thought I’d saved it, but I didn’t. Anyone know what I’m talking about?

Well, I don’t know the specific article, but there’s little doubt that this is true throughout science. Publishing negative results just doesn’t happen very often. Historically, I suppose there were reasons for this. As I’m banging my head against a problem, I may try 10 different approaches before finding one that works well. If each of those failures was a paper or even a paragraph, it would have made old paper journal subscriptions rather unwieldy and much less useful.

Now that everything is online, though, a handful of scientists are starting to stand up and say “hey, we should be announcing our failures as well, so others aren’t doomed to make the same mistakes”. In my opinion, these people have an excellent point.

So there are two major ways that this can come about. The first is to encourage more openness when publishing papers. In the methods, or at least the supplement, authors could include a decent description of what techniques turned out not to be helpful and why they might have failed. This isn’t common practice now, mostly for reasons of communication and reputation. Journal articles are always written as though the experiments were a nice, linear process: we did A, then B, then got result C. This isn’t a very accurate description of the process, and everyone knows it, but it makes those involved look smart. (I suppose if you’re clawing your way towards tenure or angling to land a good post-doc position, you don’t necessarily want to broadcast your failures). The more valid claim is that writing articles this way makes for a nice, easy-to-communicate story. Still, there’s no reason why more comprehensive supplements shouldn’t be added.

The second way to better announce negative results is to practice open-notebook science, where methods and raw data are published to the web in real time (or after a short delay). What’s holding this back is that scientists worry that by revealing too much, their competitors will get a leg up and publish the next big paper before they can. In this era of crushingly low paylines, where less than 20% of NIH grant applications get funded, things have gotten pretty cutthroat. Stories of being “scooped” abound, and although some people feel that these claims are exaggerated, it can happen, sometimes with career-tanking results.

So to make a long story short, no, negative results aren’t often published, even though doing so would probably be a boon to the scientific enterprise as a whole. The good news is that there’s a pretty strong movement underway that is slowly making science more open, transparent, and reproducible.

Science Communication

Over at BioStar, someone asked the question: How do you explain what you do to the guy on the street or your mum?

Poor science communication is a pet peeve of mine, so I wrote a rather long answer. First of all, I think that science communication should be a required course in every PhD program, and that you should have to practice explaining your work until you can do it in your sleep. Scientists can be their own best advocates, but they need to work at it.

As a scientist, you need to be able to explain your work at several different depths. The most important part of this process is accurately gauging the interest and experience of your audience, so you can choose the appropriate spiel. Here are a few of the explanations that you need to have ready:

For non-scientists:

  • The layperson’s 15-second elevator pitch: For the cocktail party or the new acquaintance who asks what you do. They should walk away understanding that you do science and that your work is trying to make the world a better place. (“better cancer treatments”, “new malaria drugs”)
  • The follow-up 2-minute overview, if they ask for more details. Still very high level, abstract, focused on where you’re trying to get with your research. (“understanding XYZ part of disease ABC by looking at things through a microscope”, “figuring out how the brain stores memories by sticking people in cool scanners”)
  • The full explanation. For the non-scientists who really want to wrap their head around what you do. Keep in mind that these people may not have taken a science course since high school. Avoid jargon and acronyms, and make sure that at the end, you leave them with the big picture idea of what you’re trying to accomplish and how that will advance humanity.

For scientists:

  • The scientist’s elevator pitch. For people who you’ll meet around your campus or at conferences. You may even need two or three of these, for use in different venues. (At a focused conference, you’ll be more specific and jargon-y than at a departmental retreat.)
  • The two-minute casual conversation. This one is tricky, because you need to read the person you’re talking to in a very short time. Do they know what RMA is? What about ERBB2? What’s their background, and how can you look at your problem from their angle, so as to best couch your answer in terms they’ll understand?
  • The 5, 15, and 30 minute presentations, often with slides or a poster. You should get drilled in these during your graduate school career, and if you didn’t, there’s no time like the present to start practicing.

Once you get these down, practice the final element, which is being enthusiastic about your work. After all, you probably think that what you’re researching is one of the coolest and most important things ever. If that comes across to your audience, they’ll be engaged and interested too.