Syphilis: Then and Now (Or What I’ve Been Doing For the Last 10 Years)

An article about our work called Syphilis: Then and Now appeared in this month’s edition of The Scientist.

In it, Molly Zuckerman (U. Mississippi), George Armelagos (Emory U.), and I describe the work we’ve done together on the origins of syphilis. This was a great opportunity to look back at the last ten years, weaving together many different strands of research to figure out what exactly we have learned.

We talk about all the different approaches we’ve employed to try and learn more about the past of T. pallidum, the bacterium that causes syphilis, as well as the lesser-known non-sexually transmitted diseases, yaws and bejel. Looking at old bones in dusty basements? Building a phylogeny with T. pallidum samples collected from all over the world, including remote Amazonian villages? Getting to the bottom of a gruesome disease that causes wild baboons’ genitals to drop off? We’ve done it all! (With a lot of help from other people, of course.)

I will always feel incredibly lucky that I got to carry out my dream dissertation project. Every research project has its highs and lows, but throughout my PhD research I marveled that somebody was paying me to do what I would have gladly done for free. I will also be forever grateful that I had the privilege to work with so many amazing scientists. Thinking back on all this work was a really pleasant endeavor.

Writing this article also forced us to think about the future of this line of research. As we make clear in the article, although our work has shed some new light on the centuries-old debate about syphilis’s origins, there are plenty of questions left. As our ability to obtain whole genome sequences from even poor-quality samples improves, I’m really looking forward to seeing what we learn. The history of this bacterium is just as fascinating to me now as it was when I began my work.

Anyway, writing this article was a lot of fun–and if you are interested in the history of infectious diseases, I hope you will check it out!

And thanks to the folks at The Scientist, especially Jef Akst, for the chance to share our work. It was a pleasure to work with them on this.

Ancient RNA: A Whole New World of Possibilities for Paleopathology?

I just wrote a piece for BiteSize Bio called Ancient RNA: Does Next Generation Sequencing Offer a New Window into the Past?

In it, I describe an article by Fordyce et al. that came out earlier this year in PLoS ONE: Deep Sequencing of RNA from Ancient Maize Kernels. This group of researchers was able to obtain RNA sequences from 700-year-old corn kernels. Neat, right? I was really surprised that this paper didn’t get more attention when it came out. I think it’s basically been taken for granted that studying ancient RNA just isn’t possible, due to RNA’s fragility–so a paper showing that aRNA studies may actually be feasible was pretty exciting in my opinion.

Tom Gilbert, the senior author of the article, told me they have encountered some skepticism regarding the results, because the idea that RNA can’t survive over long periods of time has become so ingrained. Maybe that’s why there hasn’t been more chatter. I thought the paper was pretty thorough, though, in ruling out issues such as contamination. I have to say–I’m a believer so far!

Which leads me to wonder… if you can amplify ancient RNA from ancient corn kernels, is it also possible to do so using other types of samples? Tooth pulp, for example, which should also provide a relatively protected environment? If so, perhaps we could look at infection with RNA viruses (e.g., coronaviruses, influenza, hep C) in times past. Or maybe even gene expression under various conditions. The possibilities seem endless! I’m really curious to see where this line of research goes. Is this just the beginning?

Reconstructing the ancestral microbiome: the American Gut approach

I came across the Human Food Project website today. This site is run by the American Gut project folks. You may have heard of their work before: you send them $99 and they send you a home sampling kit. You swab yourself, and they give you a list of the microbes living in your gut. Neat, right? And besides yielding fun information for you, it provides a rich data source for them to analyze (with the goal being to learn more about what Americans are carrying around in their intestines).

I’m waiting for my results to come back from uBiome, a similar service.

Anyway, on the website, I noticed a section entitled Ancestral Microbiome. Exciting! I’ve actually been thinking a lot lately about how one could go about reconstructing the ancestral microbiome. Recently, I published a paper entitled Genomics, the origins of agriculture, and our changing microbe-scape in the Yearbook of Physical Anthropology with George Armelagos. One thing we looked at is attempts to learn more about the pre-agricultural microbiome. This is a really tough problem, for several reasons.

One approach would be to use aDNA to characterize microbiomes from ancient remains. The aDNA approach has worked out for some ancient post-agricultural remains (check out Insights from characterizing extinct human gut microbiomes by Tito et al.), but so far nobody has gotten it to work on remains dating prior to the advent of agriculture. Bummer.

Another approach is to study contemporary hunter-gatherer groups. This is pretty problematic too, though. First, the very few hunter-gatherer groups that are still around have been able to protect their way of life primarily by fending off agriculturalist intruders. This means that they are probably not amenable to being studied by swab-bearing scientists. Second, learning about the hunter-gatherer groups around today isn’t necessarily going to tell you all that much about the typical hunter-gatherer group living tens of thousands of years ago. The very fact that a hunter-gatherer society is still around in a very agriculturalist world suggests that it may be unique in some way. Third, even though some hunter-gatherer groups have been able to maintain their traditional subsistence strategies, more or less, it’s possible that they have still been exposed to agriculturalist microbes, which might have altered their microflora.

So how are the Human Food Project researchers going about reconstructing the ancestral microbiome? They are studying the Hadza of Tanzania. About 1,000 Hadza live around Lake Eyasi in the northern part of the country, and roughly one-quarter of them live as hunter-gatherers (no crops, no livestock). Not surprisingly, the Hadza are not completely cut off from the world outside their homeland. Contact with agriculturalists stretches back for at least a century. Most Hadza now speak Swahili fluently. Alcohol has become a problem for some, and microbes like TB have been introduced. In the mid-1990s, anthropologist Frank Marlowe found that about 10% of the calories coming into Hadza camps came from non-foraged food delivered by missionaries or obtained through trade with agriculturalist neighbors. It’s possible that an increasing stream of money from tourism may result in more calories being obtained from purchased crops nowadays.

It will definitely be interesting to see how the microbiomes of the Hadza differ from those found in other groups in the area. I think this is a project worth doing. But my guess is that there has been a lot of “microbe-creep” between the Hadza and these neighboring groups (and perhaps even foreign tourists). This, along with the other problems I pointed out, would compromise our ability to draw conclusions about the “ancestral” microbiome. I’m eager to see how this will play out–and curious to hear what other people think.

How the media interprets studies of home vs hospital births: Do mothers matter?

I thought the news coverage of a recent study on planned home births vs hospital births was really interesting. The article, Selected perinatal outcomes associated with planned home births in the United States, appeared in the October issue of AJOG.

I’ll come clean here–I have two kids, and they were both born in hospitals. My goal both times was to minimize intervention, but I never seriously considered a home birth. I know I don’t have a very high pain tolerance, and though I tried to get through labor without meds both times, I wanted to know that they would be an option. In the end, I got an epidural for each birth (and was extremely grateful for it). But the climbing C-section rates, the prison-like environment of the hospital… I understand the appeal of the home birth for many women. So I’ve been following home birth vs hospital birth safety studies with a lot of interest.

I’ll summarize the study’s findings and then review some of the media coverage, which I found a little surprising.

What the researchers did

Basically, the authors looked at two different types of outcomes: (1) neonatal outcomes, such as the 5-minute Apgar score for babies and whether babies had seizures and (2) maternal outcomes, such as operative vaginal delivery (i.e. forceps or vacuum used) and labor induction/augmentation. They analyzed over 2 million singleton births that occurred in 27 states in 2008, roughly 12,000 of which were home births. I think it’s fantastic they were able to look at only singleton births and that they were able to identify planned home births. That way, they aren’t counting emergency situations where women can’t make it to the hospital as “home births,” and riskier multiple births don’t enter the analysis, complicating things. In case you’re interested, here are some other birth types that were excluded from the dataset: breech deliveries, deliveries that were < 37 or > 43 weeks, and births at freestanding birthing centers. In other words, they were trying to focus on relatively low risk births.

Women who plan home births are a unique subset of mothers

The authors found that by virtually any measure, women who plan home births are different from those who do not. They are more likely to have given birth before. They are older. They are way more likely to be white. And married. They are more educated. They initiate prenatal care later. And their babies are born at a later gestational age. Obviously, when you are comparing an outcome in two groups of people that differ in so many ways, epidemiological studies are very, very tough to interpret. The authors tried to control for these differences as best they could by using multivariate models that adjusted for parity, maternal age, race/ethnicity, educational level, marital status, gestational age at delivery, smoking during pregnancy, prenatal visits, and medical conditions such as gestational diabetes and preeclampsia. However, residual confounding is always a problem. Sometimes a big one.

Planned home births are more likely to result in babies with an Apgar score of < 4 and babies who had seizures

Planned home births were roughly twice as likely to result in a baby with an Apgar score of < 4. A score of < 4 is pretty serious stuff–it is a very good indicator of neonatal death. However, the absolute number of such births was small in both home and hospital groups (0.37% for home births, 0.24% for hospital births). And babies from planned home births were roughly three times more likely to have seizures (although again absolute numbers were small: 0.06% vs 0.02%).
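To make those percentages concrete, here’s a quick sketch of how an odds ratio falls out of the raw proportions reported above. Note that this computes a crude (unadjusted) ratio, which won’t exactly match the paper’s adjusted estimates; the function names are my own, not from the study.

```python
def odds(p):
    """Convert a proportion to odds."""
    return p / (1.0 - p)

def crude_odds_ratio(p_exposed, p_unexposed):
    """Crude (unadjusted) odds ratio comparing two outcome proportions."""
    return odds(p_exposed) / odds(p_unexposed)

# Apgar < 4: 0.37% of planned home births vs 0.24% of hospital births
print(round(crude_odds_ratio(0.0037, 0.0024), 2))  # 1.54 (crude; the adjusted OR was ~2)

# Seizures: 0.06% vs 0.02%
print(round(crude_odds_ratio(0.0006, 0.0002), 2))  # 3.0
```

The gap between the crude Apgar ratio (~1.5) and the reported “roughly twice” illustrates why the adjustment for confounders matters here.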

Babies born in planned home births are less likely to end up in the NICU

The authors found that babies born in the hospital were roughly five times more likely to end up in the NICU (remember that these are adjusted odds ratios, so gestational age and maternal complications are included in these calculations).

Planned home births are associated with significantly lower levels of intervention

This is probably no surprise to most people. Women who gave birth at home in a planned delivery had about a tenth the odds of operative vaginal delivery, a fifth the odds of labor induction, a third the odds of labor augmentation, and less than half the odds of antibiotic use.

So what’s the take home?

I think any epidemiologist will tell you that studies like this are really, really tough to interpret. There will probably never be a randomized controlled trial of home vs hospital births. And the women who choose home births are different in many ways from women who do not, so residual confounding is always a problem. Plus, we don’t have a way at present to identify women who wanted to give birth at home but ended up in the hospital with complications… so this may actually result in underestimation of the risks associated with planned home birth.

This study had a lot of strengths, though. Relatively large sample size, relatively low-risk births being compared, adjustment for many potential confounders. Given the findings, what’s a pregnant woman to do? How do you weigh the greater (but still very unlikely) odds of neonatal complications against the lower (but much more common) odds of maternal interventions, which carry their own risks? There are some tough tradeoffs here. I was curious how the media would interpret this study.

The media spin: Ignore the mothers

Here’s how different news organizations interpreted the study.

The headline in Science News was Home births more risky than hospital deliveries. The reporter here decided to focus on neonatal outcomes without mentioning the other half of the paper: maternal outcomes. He did point out the low absolute occurrence rate of neonatal problems, though, which is great. I find that often absolute risk isn’t discussed in news articles, and it’s important information for readers trying to interpret health studies.

Same thing in coverage of the story on the New York Times blog, which was entitled Home births pose special risks. No discussion of the maternal outcome findings, and here they didn’t even mention the low absolute occurrence of neonatal complications.

The coverage in Medical Daily was even more alarmist: Home births linked to increased neonatal complications; Mothers should plan for emergency hospitalization. While planning for possible transport to the hospital certainly seems wise, including “plan for emergency hospitalization” in the title didn’t really seem to follow from this study’s findings.

So why did the media ignore the maternal half of the paper? Is it just that the lower level of intervention in home births is old news? Or do maternal risks not matter very much to the general public when babies are involved? I’d be curious to hear what other people think about this!

Leprosy: holding steady for 400 years

In a new Science article, Genome-wide comparison of medieval and modern Mycobacterium leprae, Verena J. Schuenemann and colleagues managed to amplify ancient pathogen DNA from individuals suffering from leprosy hundreds of years ago. Leprosy is one of my many dorky fascinations. This winter, I dragged my sister to visit a former leper colony on the gorgeous Hawaiian island of Molokai. Getting there involved riding a mule down a sheer cliff face–it was a pretty exhilarating experience for someone with a life as tame as mine. And the palpable sense of history there, where a few patients with Hansen’s disease still reside, was really moving. Anyway, you can imagine that I was pretty stoked to see this study on leprosy in the headlines.

These scientists gathered the bones and teeth of 22 medieval skeletons from Denmark, Sweden, and the UK, hopeful that at least a few would yield quality DNA. And they were in luck! Dealing with ancient DNA is always a tricky business, and the researchers were ready with a special capture technique to enrich for M. leprae DNA, making it easier to sequence by getting rid of contaminating DNA from other species. One tooth from Sweden, however, provided M. leprae DNA that was in such great condition that they didn’t even need to use the capture method (you can see the skull from which the tooth came here). It yielded a whole genome sequence on its own–and the tooth even contained more M. leprae DNA than human DNA. Pretty amazing!

In fact, the superb preservation of the M. leprae DNA was a recurring theme throughout the article. It actually sort of threw a wrench in the works, since one quality control measure that researchers use when dealing with ancient DNA is looking at patterns of nucleotide misincorporation, to make sure they are consistent with an ancient source. What does that mean? As DNA ages, more and more DNA bases are replaced by faulty copies, or nucleotide misincorporations. Therefore, ancient DNA should have a lot of nucleotide misincorporations. Usually, if an “ancient” sequence looks brand new, in terms of nucleotide misincorporations, you know you’re in trouble. You may be inadvertently studying modern DNA that has somehow contaminated your samples or laboratory. This case seems to be the exception, however. The other quality control measures in this study–like blank controls, independent replication by other groups, and identification of mycolic acids consistent with M. leprae–all looked good. It’s probable that the great preservation of the M. leprae DNA was due to the waxy cell wall that surrounds the bacteria. Apparently, it protects the DNA inside from degradation. The same thing goes for M. tuberculosis, the cause of tuberculosis and a close relative of the leprosy bacterium; researchers have had pretty good luck finding ancient M. tuberculosis DNA that is in good shape. If you’re interested, you can find a couple of neat examples of recent ancient tuberculosis studies here and here.
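To give a flavor of what that damage-pattern check looks like in practice, here’s a toy sketch (my own, not the authors’ pipeline): a common signature of authentic ancient DNA is an elevated rate of C→T mismatches at the ends of sequencing reads, so one can tally those mismatches against a reference.

```python
def ct_mismatch_rate(alignments, position=0):
    """Fraction of reads carrying a C->T mismatch at `position`,
    among reads whose reference base there is C.
    `alignments` is a list of (reference, read) string pairs."""
    ct = total = 0
    for ref, read in alignments:
        if ref[position] == "C":
            total += 1
            if read[position] == "T":
                ct += 1
    return ct / total if total else 0.0

# Toy data: two reads begin opposite a reference C; one shows damage (C->T)
pairs = [("CGTA", "TGTA"), ("CGTA", "CGTA"), ("AGTA", "AGTA")]
print(ct_mismatch_rate(pairs))  # 0.5
```

A genuinely ancient sample should show a much higher rate at read termini than in the interior; a flat, near-zero profile is what raises the contamination worry described above.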

Schuenemann et al. were able to obtain whole genome sequences from five of their ancient samples, representing each of the three countries (Sweden, the U.K., and Denmark) and dating from the 10th-14th centuries. They compared these whole genome sequences to 11 obtained from modern strains, which were collected in places like India, Thailand, the US, Brazil, Mali, the Antilles, and New Caledonia. And they found that there were very few genetic differences between all of the strains. In fact, one of the major conclusions that emerged from this article is that leprosy strains have changed very little since Medieval times. After building a phylogeny (i.e. a family tree for the bacteria), Schuenemann and colleagues found that the ancient European strains were most closely related to modern strains from Turkey and Iran. This may indicate that Medieval European strains originated in the Middle East. One popular hypothesis about why the number of leprosy cases in northern Europe shot up around the 11th century is that knights returning home from the Crusades ignited epidemics–this paper adds a little more evidence to support that theory.

The rather sudden disappearance of leprosy from Europe has been an enduring mystery in the annals of historical epidemiology. It has been estimated that there were almost 20,000 leprosaria (or leprosy hospitals/colonies) in Medieval Europe. Today, there is only one remaining colony for patients with Hansen’s disease in Europe. Obviously, antibiotics helped drive down the prevalence of the disease. But hundreds of years before we discovered a cure, leprosy was vanishing from Europe. Why? A change in the bacteria that made it less transmissible? Or a change in Europeans that made them less susceptible? Nobody knows! One of the authors’ conclusions that I found particularly interesting was that because there were so few genetic differences between ancient strains and modern strains, it’s unlikely that changes in the pathogen can explain why leprosy is no longer such a scourge. They hypothesized that other factors (like co-infections, social factors, or host immunity) were probably responsible for the susceptibility of Medieval Europeans to leprosy. These findings and this conclusion are similar to those that emerged from a comparison of ancient Y. pestis strains obtained from victims of the Black Plague and modern Y. pestis strains–there were no unique genetic differences in the ancient strain, so the authors (many of whom also worked on this leprosy article) concluded that genetic characteristics of the pathogen were unlikely to explain why the Black Plague was so deadly.

I think the authors may well be correct about host factors accounting for the decline of leprosy in Europe. There is no question in my mind that things like nutrition and hygiene play a very important role in susceptibility to infection. But I also think ruling out important pathogenic changes because there are few genetic differences between strains is risky. When you see that a particularly virulent strain of bacteria has recently acquired a big chunk of DNA, especially one that contains genes linked to virulence, it’s easy to pinpoint the basis for that microbe’s nastiness. But failing to find big genetic differences doesn’t necessarily mean that important changes aren’t present. As we know from studies of viruses (like influenza and the λ bacteriophage) and bacteria (like Y. pseudotuberculosis), one or two tiny mutations can have a large effect on things like transmission and virulence. It’s really hard to look at a smattering of DNA substitutions and know what they mean! I’ll be curious to see what we learn about some of the genetic changes identified in the study in the future.

Congratulations to Schuenemann and the other scientists involved in this work for such an exciting study. I really envy the people who study Mycobacteria. I worked on the bacterium that causes syphilis, T. pallidum, for a long time. Being able to sequence a few ancient strains of this microbe could go a long way toward solving the mystery of this infection’s origins (you can find some of my relevant articles here and here and here). In particular, did Columbus introduce this disease into Renaissance Europe after returning from the New World? Unfortunately, T. pallidum DNA seems to be very sensitive to degradation. So there doesn’t seem to be much hope that ancient DNA is going to come to the rescue in this case! With the constantly improving sensitivity of sequencing techniques, though, who knows what the future holds?! The technological advances that have emerged since I began graduate school, ten years ago, have transformed the face of science. It’s an exciting time.

Voodoo Death and Public Health

After being condemned by a medicine man, a terrified man dies of fear. A woman commits a taboo action, and, convinced that punishment will be swift and lethal, she perishes. This type of voodoo death was a subject of great interest in the first half of the 20th century. U.S. physicians stationed in Australia, South America, the Democratic Republic of Congo, or other far-flung places occasionally got the opportunity to examine the victims of voodoo curses. Though these doctors usually found nothing wrong with the patients via standard workups, they reported it was clear that their charges felt very ill. Often the cursed would pass away, although the physicians couldn’t figure out the physiological cause of death. Other times, the victim would get a lucky break–he or she might receive a countercharm or assurance from a sorcerer that the curse had just been a joke–and in these cases, a rapid recovery could occur.

In the 1940s, Walter Cannon wrote a great account of this type of Voodoo Death for American Anthropologist. He relayed some anecdotes about the phenomenon and raised the possibility that maybe there was something to it. Perhaps strong emotions, like fear, can actually do us in. Cannon hypothesized that the cause of Voodoo Death was a hyperreactive sympathetic nervous system brought on by emotional stress. Excess nervous system activity could result in a fall in blood pressure, eventually leading to death. He compared reports of Voodoo Death to cases of shock that had been described during war: subjected to a terrible stress (like a grenade going off nearby), some soldiers would quickly die, even though no gross injury was apparent. And everyone has heard stories about someone dropping dead or having a heart attack following shocking news. Same concept.

Being cursed in a culture that believes in sorcery certainly sounds like a stressful event. One researcher working in northern Australia pointed out the strong social dimensions of a voodoo curse. Once someone is cursed, they are excluded from social life. The cursed individual is treated as though he or she is already dead. The only social interaction they can expect after being cursed is being present for the commencement of their funeral rites. Surrounded by people more or less pressuring them to die, victims cooperate, refusing food or drink and accepting their fate. In this view, voodoo works because people believe in it. It’s like the well known phenomenon in which being given a placebo is followed by improved health–except the opposite happens. In fact, this expectation of sickness has been given its own name: the “nocebo” effect.

Not everyone agrees with the voodoo-death-caused-by-severe-stress-and-fearful-expectations hypothesis, of course. For example, some people think that the typical voodoo curse victim may be poisoned or simply denied food and water until they die. But in the 1970s, anthropologists like Barbara Lex voiced support for the theory that manipulating the autonomic nervous system via fear could be sufficient to result in death. Although we might consider Voodoo Death to be death by suggestion, the suggestion leads to a real physiological chain of events. A number of anthropologists who have championed this view have pointed out that their hypotheses can be tested. For example, if you examine a victim of a voodoo death curse, they should bear the telltale signs of parasympathetic activation: constricted pupils, pallid skin, etc. Unfortunately, or maybe fortunately, there haven’t been many cases of Voodoo Death easily accessible to researchers who want to gather this kind of data.

A doctor named Harry Eastwell, who provided psychiatric services to local communities in Northern Australia, described a scenario in which both psychological and physical deprivation were at work. The third most common “psychiatric” syndrome he treated in the region was a gross fear state, in which people (almost all males) were terrified that they were going to die from sorcery. Already in a sorry psychological state, these people may be prime candidates for “voodoo” deaths. Only two of his 39 patients suffering from this fear state died, though. And in both cases, mundane causes of death could be identified (although their state of fear may certainly have contributed to these proximal causes). Eastwell also reported that voodoo death could be averted by removing the victim from a situation in which everyone around them thought that death was a foregone conclusion and by treating any conditions that ailed them (like dehydration). After seeing a pattern in which water was either denied to a curse victim or they would not drink it themselves, he thought some of the mystique of the voodoo death had disappeared. While psychological forces were certainly at work, Eastwell believed that denial of fluids was an important cause of death in both victims of sorcery and those who suffered from other illnesses believed to be fatal. It should be noted that some researchers vehemently disagree that people in Northern Australia withhold food and fluids from the ill. So, like any interesting topic, the role of dehydration in hurrying along the cursed is controversial!

Although most voodoo deaths have been reported from the remote locations where cultural anthropologists used to do their fieldwork, similar anecdotes have emerged here in the U.S. For example, in 1960s Oklahoma, a healthy, successful man decided to sell a business that he operated with his demanding mother. Unhappy, she predicted that if he made the sale, something dire would happen to him. Two days later, despite having no previous history of breathing problems, he suffered his first asthma attack. Soon, he was in and out of the hospital for asthma that was out of control. One night, he called his mother and told her of his plans to reinvest the money from the sale of his business into a new venture. He also expressed optimism about his health prospects. She told him that no matter what he or his doctors thought about his chances of recovery, he should get ready for the worst. An hour later, he was dead. And this story isn’t entirely unique. I’m sure you can recall any number of stories about people who died of fright, or sadness, or because they had given up the will to live.

All of this makes you take a second look at your own culture. What proportion of illnesses and deaths are due to nocebo-type beliefs? Researchers have questioned whether certain surgical patients have a “predilection to death.” That is, these patients are convinced that they are going to die, and they may even view death as desirable. Not surprisingly, they are more likely to die. Although Voodoo Death sounds exotic, some researchers see Walter Cannon’s seminal paper on this subject as the beginning of a long and fruitful research agenda focusing on the link between emotions (like fear) and health outcomes. There is still a lot of confusion about whether and how Voodoo Death occurs in other places, but it seems we have been able to use this unusual topic to shine a light on an important cause of illness in our own society.

The PRNP gene: One reason not all of us make good cannibals

The story starts in the eastern highlands of New Guinea, back in the 1940s-1950s. A fatal disease called kuru blazed through many villages, and women and children were especially affected. It started with a few months of head and body aches. This was followed by trouble standing and walking. Eventually tremors began, and sufferers lost the ability to get around entirely. Finally, euphoria set in (kuru is sometimes called the “laughing death”). By the late 1950s, more than 200 new cases a year were being reported, and, all told, over 3,000 deaths resulted (in a population that numbered only about 12,000). In some places, few young women were left.

Figuring out what was responsible for kuru was not an easy task. According to the Fore, the group hardest hit by the disease, it had first appeared in the 1920s and spread rapidly. Researchers noticed that often multiple people in a family were affected, so they started building kuru pedigrees. Complicated genetic causes were proposed. But eventually the strange distribution of the disease (it primarily affected women and young children) shed light on the true culprit: cannibalism. When someone died (say, of kuru) the women and children in the family would prepare their body for the funeral. Part of this preparation involved eating their loved one’s body, in a feast that signified respect for the deceased. The brain, which was the most infectious body part, was eaten by women and children. And the infectious entity wasn’t a bacterium or virus. Instead, it was something entirely different: a prion, or a protein folded in a strange way. Once prions enter the body, they cause other proteins to misfold, until there are enough faulty proteins to cause real problems. The gene that encodes the prion protein is called PRNP, and changes in this gene have since been linked to other prion diseases like variant Creutzfeldt-Jakob disease (the human disease linked to “mad cow disease”), fatal familial insomnia, and Gerstmann-Sträussler-Scheinker disease. Kuru and similar diseases are called transmissible spongiform encephalopathies, because of the sponge-like effect they have on the brain (shown in the picture below).

While nobody knows for sure how the epidemic started, the current theory is that someone among the Fore happened to develop sporadic Creutzfeldt-Jakob disease early in the 20th century. When his or her loved ones performed the mortuary feast, they ingested the prion and developed kuru. When they died, the chain continued, and the disease spread across the Eastern highlands. In the 1950s, the Australian government instituted strict new laws that outlawed cannibalism. After this, the disease started to lose steam. A trickle of new cases continued to appear, however, well into the 2000s. Although the average time from participation in a fateful mortuary feast to the onset of kuru is 12 years, it turns out that the incubation time for kuru can be anywhere from 4 years to over 50! What accounts for that variation? And why did some people who participated in mortuary feasts get kuru, while others did not?

It turns out that the story is not so simple! Those early attempts to link kuru to genes were not so far off after all. At amino acid residue 129 of the prion protein, people either have a methionine (M) or a valine (V). Since we all have two copies of the PRNP gene (one from mom, and one from dad), a person can either be an MM, an MV, or a VV. It turns out that being an MM makes you especially susceptible to kuru. MVs and VVs are both less likely to develop kuru AND if they do develop it, the incubation time is longer. But being an MV is the absolute best. After the kuru epidemic ended, there was a deficiency of MM individuals among the Fore, because so many of them had died of kuru. And it turns out that the finding that MM individuals are susceptible to kuru is relevant beyond New Guinea. Patients who develop variant Creutzfeldt-Jakob disease have also invariably turned out to be MMs!

In 2003, an attention-grabbing paper on the PRNP gene came out in Science. Scientists had already established that, among the Fore, MVs were the most likely to survive the kuru epidemic. When the best genotype to have is one that combines two different alleles, the resulting selection is called balancing selection. Natural selection will often end up keeping both types of alleles in a population; the success of individuals with both versions will result in a “balance” being struck between the two. In the Science paper, a group of researchers said they had also found evidence of balancing selection in the PRNP gene worldwide. What could this mean? The authors speculated that maybe a long history of exposure to prion diseases spread from animals (like mad cow disease) could account for selection that favored MVs. Or MAYBE it was a long history of human cannibalism! This study has been a controversial one. Not everyone agrees with the way the analysis was performed, for example. But it’s pretty interesting.
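The logic of heterozygote advantage can be sketched with the standard one-locus selection model. The fitness values below are purely illustrative (not estimates from the Fore data): if MM and VV homozygotes pay fitness costs s and t relative to MV, the M allele frequency settles at t/(s+t), so neither allele is ever eliminated.

```python
def next_freq(p, s, t):
    """One generation of selection with heterozygote advantage.
    p = frequency of the M allele; fitnesses: MM = 1-s, MV = 1, VV = 1-t."""
    q = 1.0 - p
    w_mm, w_mv, w_vv = 1.0 - s, 1.0, 1.0 - t
    w_bar = p * p * w_mm + 2 * p * q * w_mv + q * q * w_vv  # mean fitness
    return p * (p * w_mm + q * w_mv) / w_bar

p, s, t = 0.9, 0.2, 0.1   # start with M common; MM pays a bigger cost than VV
for _ in range(2000):
    p = next_freq(p, s, t)
print(round(p, 3))  # 0.333 -- the equilibrium t/(s+t); both alleles persist
```

Even starting from a very high M frequency, the deterministic model settles at the internal equilibrium, which is the “balance” the paper’s authors were looking for signatures of in worldwide PRNP data.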

In short, MMs don’t make good cannibals–or maybe even meat eaters, since they are also more susceptible to mad cow disease. As it happens, you can figure out your genotype using 23andme! If you’ve been genotyped, you can go to the browse raw data page and enter rs1799990. It will tell you your DNA sequence at the relevant part of the gene: An “A” corresponds to the M version of the protein, and a “G” corresponds to the V version. I found out I have two As, which means I’m an MM. I’m not cut out to be a cannibal, and maybe I should consider becoming a vegetarian too! That’s the breaks, I guess.
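If you want to script the lookup, here’s a trivial helper (my own sketch, using the A→M, G→V mapping described above) that converts a two-letter genotype at rs1799990 into the M/V notation:

```python
# rs1799990: A encodes methionine (M), G encodes valine (V)
ALLELE_TO_RESIDUE = {"A": "M", "G": "V"}

def prnp_codon129(genotype):
    """Translate a two-letter genotype string (e.g. 'AG') into M/V notation."""
    residues = sorted(ALLELE_TO_RESIDUE[allele] for allele in genotype.upper())
    return "".join(residues)

print(prnp_codon129("AA"))  # MM
print(prnp_codon129("AG"))  # MV
print(prnp_codon129("GG"))  # VV
```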