The PRNP gene: One reason not all of us make good cannibals

The story starts in the eastern highlands of New Guinea, back in the 1940s-1950s. A fatal disease called kuru blazed through many villages, and women and children were especially affected. It started with a few months of head and body aches. This was followed by trouble standing and walking. Eventually tremors began, and sufferers lost the ability to get around entirely. Finally, euphoria set in (kuru is sometimes called the “laughing death”). By the late 1950s, more than 200 new cases a year were being reported, and, all told, over 3,000 deaths resulted (in a population that numbered only about 12,000). In some places, few young women were left.

Figuring out what was responsible for kuru was not an easy task. According to the Fore, the group hardest hit by the disease, it had first appeared in the 1920s and spread rapidly. Researchers noticed that multiple people in a family were often affected, so they started building kuru pedigrees. Complicated genetic causes were proposed. But eventually the strange distribution of the disease (it primarily affected women and young children) shed light on the true culprit: cannibalism. When someone died (say, of kuru), the women and children in the family would prepare the body for the funeral. Part of this preparation involved eating their loved one’s body, in a feast that signified respect for the deceased. The brain, the most infectious body part, was eaten by women and children. And the infectious entity wasn’t a bacterium or a virus. Instead, it was something entirely different: a prion, a protein folded in an abnormal way. Once prions enter the body, they cause properly folded copies of the protein to misfold in turn, until there are enough faulty proteins to cause real problems. The gene that encodes the prion protein is called PRNP, and changes in this gene have since been linked to other prion diseases like fatal familial insomnia, Gerstmann-Sträussler-Scheinker disease, and variant Creutzfeldt-Jakob disease (the human counterpart of mad cow disease). Kuru and similar diseases are called transmissible spongiform encephalopathies, because of the sponge-like appearance they give the brain (shown in the picture below).

While nobody knows for sure how the epidemic started, the current theory is that someone among the Fore happened to develop a sporadic case of Creutzfeldt-Jakob disease early in the 20th century. When his or her loved ones performed the mortuary feast, they ingested the prion and developed kuru. When they died, the chain continued, and the disease spread across the eastern highlands. In the 1950s, the Australian government instituted strict new laws that outlawed cannibalism. After this, the disease started to lose steam. A trickle of new cases continued to appear, however, well into the 2000s. Although the average time from participation in a fateful mortuary feast to the onset of kuru is 12 years, it turns out that the incubation time for kuru can be anywhere from 4 years to over 50! What accounts for that variation? And why did some people who participated in mortuary feasts get kuru, while others did not?

It turns out that the story is not so simple! Those early attempts to link kuru to genes were not so far off after all. At amino acid residue 129 of the prion protein, people have either a methionine (M) or a valine (V). Since we all carry two copies of the PRNP gene (one from mom, and one from dad), a person can be an MM, an MV, or a VV. It turns out that being an MM makes you especially susceptible to kuru. MVs and VVs are both less likely to develop kuru, AND if they do develop it, the incubation time is longer. But being an MV is the absolute best. After the kuru epidemic ended, there was a deficit of MM individuals among the Fore, because so many of them had died of kuru. And the finding that MM individuals are susceptible to kuru is relevant beyond New Guinea: patients who develop variant Creutzfeldt-Jakob disease have also invariably turned out to be MMs!

In 2003, an attention-grabbing paper on the PRNP gene came out in Science. Scientists had already established that, among the Fore, MVs were the most likely to survive the kuru epidemic. When the best genotype to have is one that combines two different alleles (a situation known as heterozygote advantage), natural selection will often end up keeping both alleles in a population; the success of individuals with both versions results in a “balance” being struck between the two, which is why this is called balancing selection. In the Science paper, a group of researchers reported that they had also found evidence of balancing selection in the PRNP gene worldwide. What could this mean? The authors speculated that maybe a long history of exposure to prion diseases acquired from animals (like mad cow disease) could account for selection that favored MVs. Or MAYBE it was a long history of human cannibalism! This study has been a controversial one; not everyone agrees with the way the analysis was performed, for example. But it’s pretty interesting.
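The logic of balancing selection is easy to see in a toy model. The sketch below is not from the Science paper; the fitness values are invented purely for illustration. It iterates the standard one-locus, two-allele selection recursion with the MV heterozygote fittest:

```python
def next_freq(p, w_mm=0.7, w_mv=1.0, w_vv=0.9):
    """One generation of selection at a single locus with two alleles.

    p is the frequency of the M allele; the fitness values are made up,
    with the MV heterozygote the fittest genotype.
    """
    q = 1.0 - p
    w_bar = p * p * w_mm + 2 * p * q * w_mv + q * q * w_vv  # mean fitness
    return p * (p * w_mm + q * w_mv) / w_bar

p = 0.9  # start with M very common
for _ in range(500):
    p = next_freq(p)

# Neither allele is lost: p settles at the interior equilibrium
# (w_mv - w_vv) / (2*w_mv - w_mm - w_vv) = 0.25 for these numbers.
print(round(p, 2))  # 0.25
```

If you make one of the homozygotes the fittest instead, the same recursion drives one allele to fixation, which is the contrast that makes heterozygote advantage special.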

In short, MMs don’t make good cannibals–or maybe even meat eaters, since they are also more susceptible to mad cow disease. As it happens, you can figure out your genotype using 23andme! If you’ve been genotyped, you can go to the browse raw data page and enter rs1799990. It will tell you your DNA sequence at the relevant part of the gene: an “A” corresponds to the M version of the protein, and a “G” corresponds to the V version. I found out I have two As, which means I’m an MM. I’m not cut out to be a cannibal, and maybe I should consider becoming a vegetarian too! That’s the breaks, I guess.
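If you’d rather script the lookup than eyeball it, the translation from raw genotype to codon-129 type is tiny. This is just a hypothetical helper based on the A→M, G→V correspondence described above (raw-data formats and strand conventions can vary, so treat it as a sketch):

```python
def prnp_codon129(genotype):
    """Translate an rs1799990 genotype string (e.g. 'AG') into the
    codon-129 amino acids: A -> methionine (M), G -> valine (V)."""
    allele_to_aa = {"A": "M", "G": "V"}
    return "".join(sorted(allele_to_aa[a] for a in genotype.upper()))

print(prnp_codon129("AA"))  # MM -- the kuru-susceptible genotype
print(prnp_codon129("ag"))  # MV
```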

Bacteriophage therapy: An idea whose time has come?

Bacteriophages are viruses that attack bacteria–their name comes from the Greek for “bacteria eater.” Here, you can see a bunch of bacteriophages on the surface of a bacterial cell. They look sort of like balloons tethered to the surface of the moon. They were discovered near the dawn of the twentieth century, and at first they enjoyed some spectacular therapeutic successes. Felix d’Herelle, one of the men who discovered bacteriophages, treated four dysentery patients with them in 1919. Soon after, he used bacteriophages to treat outbreaks of cholera in India and plague in Egypt. In the U.S., phage trials were performed at Baylor, and the researchers involved were impressed by phage therapy. It didn’t always work so well in practice, though. Phage therapy can be tricky. The organomercury compounds used to preserve phage cocktails often destroyed the phages instead. Preparations could also be contaminated with exotoxins produced by the bacteria used to grow the phages. Finally, there have been problems in the past with inconsistency: the composition of a phage treatment may vary from batch to batch, with predictable effects on efficacy. Many physicians were less than happy with the results they got with phages. And when antibiotics were discovered, there was really no reason to continue to struggle with this form of therapy.

Things are different today, though. There are many reasons why bacteriophage therapy makes sense. First, antibiotic resistance is an increasingly serious problem. The utility of our antibiotics is dwindling, and the discovery of new antibiotics isn’t keeping pace. Antibiotics are static, but bacteriophages can evolve along with their bacterial prey. Instead of struggling to identify new antibiotics, perhaps we could use phages and let natural selection do our work for us. If we’re lucky, maybe bacteriophages could replace some of the antibiotics we’re losing. In addition, we’re beginning to appreciate how important our microflora are. Wiping out our microbial ecosystem wholesale with antibiotics isn’t desirable, and it can increase susceptibility to pathogens like C. difficile. Bacteriophages, because they target only specific types of bacteria, might get around this problem–killing the problematic bacteria but leaving everything else intact. Mixtures of phages could potentially be tailored to a person’s specific infection.

One of the reasons that phage therapy has been slow to catch on in the U.S. is that it has been primarily practiced in Eastern Europe and the Soviet Union. Today, phage therapy is routinely being used in Russia and the Republic of Georgia. But over the years, much of the work done was not published in English, so scientists in the West were unaware of it. In addition, many studies didn’t meet the standards that western scientists require. And, of course, getting approval to use a new therapy in the clinic is tough. As a result, companies have been wary of trying out phage therapy in humans and have been using it in different settings instead. Phages targeted to Listeria, a food-borne pathogen, have been approved by the FDA to help sterilize processed foods. Other phage mixtures have been approved to protect crops against pathogens. And more phage treatments are in the works.

Baby steps toward human treatments are being made, though. In the U.S., several safety studies in humans have shown promising results. And the first randomized controlled trial in the West was recently performed by Biocontrol Limited. It targeted adult patients with chronic ear infections caused by antibiotic-resistant Pseudomonas aeruginosa; a bacteriophage solution was swabbed onto the infected ears. The researchers involved reported outcomes that were better than those achieved with a placebo, which is promising! As we have begun to appreciate the ecosystem in our guts, some researchers have proposed that bacteriophage therapy could help us perform more subtle manipulations than the ones to which we’re accustomed–by introducing certain phages, perhaps we could promote the biosynthesis of nutrients or the breakdown of parts of our diet.

Phage therapy faces some of the same major challenges that antibiotics do. Bacteria can develop resistance to bacteriophages. And a given phage can only target a relatively narrow range of bacteria. For these reasons, a mixture of phages is often administered to a patient. This can make obtaining regulatory approval tricky, however. Although treating people with multiple phages does prolong the time until resistant bacteria arise, resistance may be inevitable. Therefore, researchers have been throwing around ideas like combining phage and antibiotic treatments (which seems to work especially well), cycling different phage mixtures, or engineering phages that can circumvent mechanisms of resistance.

Delivering the phages to where they need to go can also be tough. They have to spread from the site of application, and they also have to avoid being cleared from the bloodstream. For that reason, researchers have been focusing on infections that are localized (like ear infections and wounds). Over time, it may be possible to engineer delivery systems that will enable phage treatments to be used for more systemic infections. Some researchers have even shown that specially engineered bacteriophages have the potential to break down biofilms, the gooey protective layers that bacteria can hide behind.

Bacteriophage therapy may not be ready for primetime yet, but great strides have been made in the past few years. I’m really looking forward to seeing where this alternative type of treatment is going to go!

Of dogs and men

When we cozy up to a dog, or cuddle a cat, or watch a docile cow grazing in a field, it’s easy to forget that the wild ancestors of these animals were not nearly so friendly. Pets and farm animals are so familiar that it’s hard to imagine life without them, but the process of domestication didn’t start until relatively recently in human history. One of the neat things about genetic studies of modern animals is that we are sometimes able to reconstruct their past. In particular, there has been a lot of excitement about potentially revealing the genetic basis for animal domestication. Can a couple of mutations turn a wolf into an affectionate dog? Or does it take a whole slew of genetic changes? Do all domestic animals harbor similar genetic alterations, linked to things like tameness and color? Or does every species become domesticated in its own way? Were most animals domesticated only once? Or does it happen over and over again, in different places, once the idea catches on? Lots of questions about domestication, and the answers are still trickling in.

Recently, a flurry of studies on dog domestication has come out. Over the years, dogs especially have gotten a lot of attention from scientists interested in domestication, probably because scientists love dogs as much as the rest of us do. Although lots of studies have been done, the findings have been a little confusing. And when you consider how tough domestication is to investigate, the ambiguous results start to make sense. Basically, most studies work like this: you get a bunch of dogs and sequence some genes (or genomes, if you are lucky). If dogs from a certain region exhibit a lot of genetic variation, you start to think that maybe this is where they were domesticated. (More variation in one geographic location usually means an animal has a longer history in that spot.) But as it happens, when you do this kind of study, it really matters how you pick and choose your dogs. The more animals you pick from one location, the more likely you are to find a lot of variation in that region. So if you study a lot of dogs from Asia, you find a lot of variation in Asia, and it seems like dogs must have been domesticated there. If you study a lot of African dogs, then THAT seems like an equally likely site for domestication. On top of that, genetic signatures get really murky because domestic dogs sometimes interbreed with wolves. When wolf genes enter a dog population, they introduce new genetic variants–and the greater amount of variation that results can make a dog population look “older” than it really is. I think the latest consensus is that it’s just not clear where dogs were domesticated. It may have happened first in the Middle East or Europe. Or maybe China. There is also a lot of confusion about when dogs were domesticated. Based on studies of nucleotide substitution rates, scientists have hazarded guesses of anywhere from 100,000 YBP (probably way too early) to 15,000 YBP. All of the estimates so far seem to predate agriculture, which began around 10,000 YBP, and both genetic and paleozoological evidence suggests that the dog was probably the first animal domesticated.
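The sampling-bias problem described above is easy to demonstrate with a toy simulation (nothing here comes from any actual study; the numbers are arbitrary). Two “regions” with identical underlying diversity can look very different if one is sampled more heavily:

```python
import random

random.seed(1)
variant_pool = list(range(200))  # the variants a population harbors

def observed_variants(n_dogs, variants_per_dog=5):
    """Count the distinct variants seen across a sample of dogs,
    each 'carrying' a random handful drawn from the same shared pool."""
    seen = set()
    for _ in range(n_dogs):
        seen.update(random.sample(variant_pool, variants_per_dog))
    return len(seen)

lightly_sampled = observed_variants(10)   # e.g. a few dogs from one region
heavily_sampled = observed_variants(100)  # e.g. many dogs from another

# Same true diversity, but the heavily sampled "region" shows more of it:
print(lightly_sampled < heavily_sampled)  # True
```

The heavily sampled group looks far more diverse, and therefore “older,” even though both draw on exactly the same variant pool.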

OK, so maybe we haven’t had a ton of luck figuring out where or when dogs were domesticated. But that doesn’t mean we can’t learn more about HOW they were domesticated–genetically, that is. Recently, Erik Axelsson, a scientist at Uppsala University in Sweden, and colleagues sequenced the entire genomes of 12 wolves and 60 dogs of various breeds. By comparing the two groups of genomes, they identified 36 genomic regions that appeared to have undergone natural selection in dogs. In other words, under the selective pressures of domestication, these parts of the dog genome came to look different from the corresponding parts of the wolf genome. Genes involved in nervous system development were disproportionately represented among these altered regions. Since dramatic behavioral changes are some of the first things we think about when comparing wolves and dogs, finding these changes wasn’t terribly surprising. In fact, another study on dog domestication that just came out in Molecular Biology and Evolution concluded that genes expressed in the prefrontal cortex, a region of the brain responsible for complex cognitive behaviors like cooperating with humans during a hunt, evolved rapidly very early in the domestication process. Clearly, our best friend’s brain underwent major changes as we started to spend a lot of time together.

Axelsson’s study also found that genes involved in digestion and food metabolism, including starch digestion, were prominent on the list of dog genes affected by domestication. Since the ability of dogs to thrive in or near human settlements must have involved a big change in their diet (less meat, more starch), this makes sense too. What’s more, another recent whole-genome study on dogs and wolves, carried out by an independent group, identified the same trend: changes in genes involved in starch digestion appear to have been very important for domestication. After finding all these diet-related genetic changes, Axelsson and colleagues suggested that the development of agriculture may have catalyzed the domestication of dogs. This is interesting, since it would put the timing of dog domestication thousands of years later than other genetic studies have estimated. Since post-agricultural domestication doesn’t seem compatible with a lot of the other findings, even recent ones, I guess it’s best to take a wait-and-see approach toward this interpretation.

One of the most exciting things about these recent findings is that they demonstrate that humans and dogs have undergone parallel changes over the course of our shared history. Similar genetic changes to the ones described in dogs have been found in human populations with high-starch diets, for example. And researchers who compiled a relatively comprehensive list of human and dog genes shaped by natural selection identified a substantial amount of overlap. Shared genes mostly fell into two categories: those involved in digestion and those involved in neurological processes. I guess it makes sense. Some people, like anthropologist Peter Wilson, have argued that since the advent of agriculture, we humans have been domesticating ourselves, settling down to live in large communities, eating new things, behaving in new ways. Dogs and people eat a lot of the same things, and we share the same environment. It turns out our genomes reflect our intertwined lives.

The 2013 take on the hobbits of Flores

When the first “hobbit” or Homo floresiensis skeleton was found in 2003 in a cave on the island of Flores, it made headlines around the world. But it didn’t take long for the arguing to begin. Did this small skeleton represent a whole new kind of hominin? A petite species that was still around as recently as 12,000 YBP, long after the Neandertals had disappeared? Or was it just the remains of some poor soul with a severe pathology? Scientists tossed around all sorts of ideas about which disorders could result in a person growing to only about a meter tall, with a tiny skull and a curious resemblance to Homo erectus. An Indonesian paleoanthropologist named Teuku Jacob was one of the first scientists to suggest the skeleton could belong to someone suffering from microcephaly. Soon after the remains were discovered, he “borrowed” them, taking them from the center where they were kept and bringing them to his own laboratory. This caused an uproar. Eventually, he returned the hobbit remains to the researchers who found them, but they had been severely damaged. Among other things, the pelvis was smashed and several important bones were missing. As if that wasn’t bad enough, in 2005, Indonesia forbade researchers access to the cave where the hobbit was found. It wasn’t until Jacob’s death a couple of years later that research there was allowed to resume. And the colorful history of the hobbit finds doesn’t end there. Maciej Henneberg, Robert Eckhardt, and John Schofield self-published a book called The Hobbit Trap, in which they called into question the status of the hobbit as a new species. One of their objections was that the teeth showed signs of modern dental work, a claim Peter Brown (one of the hobbit’s discoverers) understandably called “complete lunacy.” Nevertheless, this claim enjoyed a lot of attention from the media. Some people have even speculated that species like H. floresiensis may still be hidden away in remote corners of the world–apparently, rumors of tiny people abound in Indonesia, and in particular, on Flores. I wish we lived in a world where finding another hominin tucked away somewhere seemed like a real possibility!

When the Flores remains were found, the hypothesis that they could have resulted from microcephaly or cretinism was reasonable. After all, when the first Neandertal remains were found, people thought maybe they belonged to a Cossack soldier with rickets. In the case of the hobbit, as in the case of that first Neandertal, there was just the one skeleton–and it’s hard to be sure about a new species designation from a single set of bones. But in-depth study of the skull recovered from the cave demonstrated that its features resembled those of archaic humans. And comparisons to skulls of people suffering from the proposed disorders showed that there wasn’t a good match. Eventually nine tiny sets of remains spanning 3,000 years were discovered in the cave, providing pretty strong evidence that the original find didn’t belong to an isolated individual suffering from a disease. And, early this year, a team of researchers showed that two different Homo floresiensis specimens had wrist bones distinct enough from ours to warrant a separate species designation.

What else do we know about the hobbits? Researchers think their short stature may have resulted from “island dwarfism,” a tendency of species to shrink over many generations once they have arrived on an island. A study published this April in Proceedings of the Royal Society by Yousuke Kaifu and colleagues suggests that this idea is reasonable and that the process could have resulted in the hobbit’s body plan. We don’t know much about what life was like for the hobbits, but it may not have been so different from what Homo sapiens were doing around the same time. Possible evidence of stone tools and cooking has been found in their cave. Some scientists believe that a volcanic eruption on Flores may have wiped out both the hobbits and the island’s Stegodon, a dwarf, elephant-like creature that the hobbits liked to hunt. Although Svante Paabo has worked magic in the past, teasing the Neandertal and Denisova genomes out of ancient remains, no one has been able to coax usable DNA from the hobbit remains yet. I am hopeful that the future will reveal additional hobbit specimens, though, and that one of them may yield DNA suitable for sequencing. Maybe we will find that, like the Denisovans and the Neandertals, hobbit genes live on in us. Wouldn’t THAT be exciting?

The Neandertal in Our Genes

Spit in a tube, stick it in the mail, and several weeks and $99 later, 23andme can tell you just how Neandertal you are. For the average client of European ancestry, an estimated 2.6% of the genome can be traced back to Neandertal ancestors. If you are one of the people carrying around this vague signature of a Neandertal past, you may wonder what it all means. Does a Neandertal ancestor account for your pronounced brow? Or your red hair? Until recently, nobody really had any answers. Things are starting to change, however. In a recent article in Molecular Biology and Evolution, scientist Fernando Mendez and colleagues revealed a specific gene that has been influenced by Neandertal forebears.

The variants are present in a gene cluster called OAS, which plays a role in immunity. Mendez noticed that a variant found in modern humans closely resembled the one present in the Neandertal genome, which has been published. And while Neandertals and humans parted evolutionary ways around 300,000 YBP, this genetic variant in the OAS cluster was found to have diverged from Neandertal sequences only 125,000 YBP–far too recently to be explained by shared ancestry alone, which points instead to later interbreeding. Finally, this variant is found in people from Eurasia and North Africa, but not sub-Saharan Africa, consistent with the range where Neandertals were once found. Pretty neat, huh?

And this is not the first gene that appears to have come from Neandertals! Last year, Mendez and colleagues identified Neandertal variants in another gene called STAT2. This gene, too, is involved in immunity. Although no one is sure what the functional importance of the Neandertal versions of the OAS and STAT2 genes is, one interesting finding is that alternate versions of both of these genes also appear to have entered our genomes through interbreeding with Denisovans, another type of archaic human known only from a finger bone and a few teeth recovered from a cave in Siberia. This has led researchers to question whether some alleles, like those involved in immunity, may be especially likely to get passed along after a romantic episode with a mysterious stranger belonging to another species. It seems likely that in the future, additional genes that have been passed down from Neandertal ancestors will be identified.

There have been some fantastic articles about the research supporting human-Neandertal interbreeding recently. One was Sleeping With the Enemy, an article by Elizabeth Kolbert that appeared in the New Yorker in 2011–it focuses on Svante Paabo’s work. Another article appeared in Scientific American last month. It was by Michael Hammer, one of the scientists involved in the work I discussed here. It’s called Sex with Other Human Species Might Have Been Secret of Homo Sapiens’s Success, and it’s definitely worth a read.

Margie Profet, Lost then Found

When I started my PhD program in Evolutionary Biology, I read all of Margie Profet’s articles. Basically, she looked at some of the things that make being female unpleasant (menstruating, morning sickness, etc.) and asked if they might have some adaptive function. She hypothesized that menstruating could be a female’s way of shedding sperm-borne pathogens and that morning sickness could be a warning system to keep pregnant women away from foods that might be dangerous to the babies they are carrying. If you have spent any time in evolutionary biology, you know that proposing adaptive functions for medical issues/non-ideal biological states is like holding a lightning rod up in a storm. Other scientists will start quoting The Spandrels of San Marco, and everyone will get excited about poking holes in your theories. That’s especially true if your ideas get a ton of media attention, as Profet’s did, and if you give dramatic advice that flouts convention (like telling women to avoid vegetables in early pregnancy). Picking apart hypotheses is just the nature of who we are and what we do. And oftentimes, Panglossian theories about adaptive silver linings do turn out to be wrong. So it’s no surprise that Profet’s hypotheses were controversial or that fellow researchers began to refute some of her arguments with their own data.

Following these articles and the spirited responses they evoked was fun. It wasn’t until years later that I found out Profet was pretty fascinating in her own right. I hadn’t heard anything about her for a while, so I wondered what she was up to and started googling. First, I found out she was awarded a MacArthur genius grant for her work in evolutionary biology. That’s a pretty big achievement in and of itself. What’s even more amazing, though, is that she published all these articles and won this award without an advanced degree in biology. Instead, her ideas started incubating when she was working in Bruce Ames’s lab. I remember reading that post-MacArthur award, she had moved to the University of Washington and was studying math (after becoming frustrated with the reception she got in evolutionary biology, was my guess). And after that it was like she fell off the map. I couldn’t find out anything about her current work.

Well, it turns out she really did disappear. This summer I remembered the Margie Profet mystery and tried googling her again. And I came across an article by Mike Martin in Psychology Today: The Mysterious Case of the Vanishing Genius. She had last been seen taking math courses at Harvard. The last electronic traces of Profet stopped in 2002, when she cut ties with her family. The last sightings of her were in 2004–2005. After that, radio silence. It was a disheartening article–the quotes from her friends and family were very sad. But maybe getting the story out into the world was important, because in May of 2012, Martin published an update on his website. After learning that her loved ones were seeking her, Profet reached out to her family and they reunited. It sounds like those missing years were very difficult ones for everyone involved.

I haven’t really kept up on the status of her major hypotheses, but Martin says that her ideas about allergies as a defense against cancer have gotten support from some recent studies by other investigators. I hope that now she is back, she’s gratified to hear about these new findings. More than that, of course, I hope she finds love, support, and comfort among the people who love her. Maybe all of her work in science seems like another lifetime. Welcome back, Margie Profet. You were missed.

Retraction Ruckus at Rutgers

If you have ever studied evolutionary biology, you probably know who Robert Trivers is. His groundbreaking articles on reciprocal altruism, the evolution of sex ratios, and parent-offspring conflict are staples in every Evolutionary Biology course. They make great reading, and his theories pop up all the time in all sorts of disciplines. I’m sure his work has inspired many budding scientists.

Currently, Trivers is a professor at Rutgers. In 2005, he co-authored an article on body symmetry and dance that made the cover of Nature. A couple of years later, he and his colleagues started to suspect that the first author on the paper may have fabricated data. In 2008, they contacted Nature about their suspicions and tried to retract the article. According to Trivers, because at least one co-author anonymously disagreed with the decision to publish a retraction, Nature didn’t publish one, although this month they did publish a great piece by Eugenie Samuel Reich about the saga in which Trivers has become embroiled. Frustrated that he couldn’t get the word out about the faked data, in 2009 he and two other scientists self-published a short book called The Anatomy of a Fraud, detailing evidence that the data in the paper had been forged. After the book came out, Rutgers was forced to investigate the matter, and in 2012 they issued their conclusions. In the Research Advisory Board’s report, which Trivers posted on his website, they agreed that it was pretty clear that the first author of the paper had manipulated the data in order to make a good story. The fallout from this episode has been major. A glance at his website shows how aggrieved Trivers is about the failure of so many institutions (Nature, Rutgers) to take the accusations of fraud seriously. And in 2012, Trivers was banned from campus for months after an unpleasant exchange with a collaborator who is also on the faculty at Rutgers.

While there has been a lot of chatter about the aftermath of these accusations (he said, she said type stuff about the way the co-authors have interacted with one another), it seems like the real issue at stake is how hard it is to blow the whistle on scientific fraud. No one looks forward to admitting that they co-authored a doctored study–especially when that study appeared in a high-profile journal like Nature. It takes integrity to investigate whether fraud was committed in your own lab and, after finding it was, to promptly try to retract the affected work. Since scientific progress is cumulative, with current studies building on past findings, this kind of self-policing is exactly the type of behavior all of us want to encourage. Trivers and colleagues published their book outlining the problems with the symmetry and dance study in 2009, but that didn’t stop the citations. According to Google Scholar, the article has been cited 128 times, many of the citations occurring as late as 2013. It’s well known that articles continue to be cited even after they are retracted, but certainly a retraction in Nature would have helped in this example–the odds that someone doing a literature search would link this article to a little-known, self-published book are extremely low. While the problem of retractions (their growing frequency, their inability to effectively remove false findings from the literature once the original articles have been published) is attracting increasing attention, especially with help from blogs like Retraction Watch, retracting a fatally flawed article is certainly better than not retracting it. Unfortunately, retractions are an unpleasant business for all of the important players involved: the journal, the academic institution, and the co-authors. If a very prominent scientist has this much trouble trying to reveal a case of academic fraud in his own research group, what hope is there for everyone else?
Hopefully, all of the attention this case has received will put pressure on Nature to rectify the situation, but I’m afraid events like this may dissuade junior scientists from coming forward with unpleasant but important information.