Do people get yaws from monkeys and apes? A potential roadblock for eradication.

Recently, a letter I co-authored called Treponemal infection in nonhuman primates as possible reservoir for human yaws was published in Emerging Infectious Diseases. It’s free if you want to check it out!

Most people I know have never heard of yaws, but at one time it was very, very common in tropical regions across Africa, Asia, and the Americas. It’s a chronic, debilitating infection that is usually contracted during childhood, and it is caused by a bacterium closely related to the one responsible for syphilis. Luckily, it’s easily treated. You can cure it in its early stages with a single shot of penicillin, and recently we have learned that a single course of oral antibiotics appears to work just as well. In short, there is really no reason for anybody to have to suffer from this horrible disease.

Many other people feel the same way. In fact, a huge yaws eradication campaign took place in the mid-20th century. After World War II, this was one of the first big public health campaigns planned by a brand-new World Health Organization. More than 40 million people were treated, and the number of new cases fell by as much as 95%. Not bad! The campaign ultimately failed, though, in the sense that it never achieved its goal: wiping this disease from the face of the earth.

There are multiple reasons why the first campaign failed. One big reason is that it simply didn’t have the resources to keep on top of things. After a while, the WHO turned over the responsibility for yaws surveillance and treatment to local governments. Unfortunately, the whole reason the campaign was necessary in the first place was that local governments weren’t capable of carrying out these kinds of tasks without support. Not surprisingly, yaws resurged in a number of countries and is still around today.

There is another important reason that the eradication campaign may have run into trouble: a potential animal reservoir. One of the most important criteria for an eradicable disease is that there is no animal reservoir. Otherwise, you can totally eliminate the infection from the human population, only to have it re-enter via an infected animal. A single newly infected person can then spread it throughout a now-susceptible population, and all of your hard work is for naught. In this situation, eradication is not an attainable goal–though control certainly is. In our EID article, we outline all the evidence that supports the hypothesis (around since the 1960s) that (1) African monkeys and apes are infected with yaws and (2) they may be capable of spreading the infection to humans. Infection via animals could help explain the mysterious cases encountered during the first campaign, when infected individuals would turn up in a previously treated population, having had no contact with any infected people as far as anyone could tell.

The WHO announced a second yaws eradication campaign recently, but it doesn’t seem as though much thought has been given to the problem of an animal reservoir. People involved in the first eradication campaign were calling for further research into the potential problem of simian yaws as early as the 1960s, but this history seems to have been largely forgotten. That’s unfortunate. Eradication campaigns are incredibly expensive. In the final stages, the cost of finding and treating cases skyrockets, because it entails going to remote and dangerous places to treat the very last hidden cases of an infection on earth. The polio eradication campaign has been going on for years longer than was originally planned, and we have spent much, much more than was originally budgeted because of these difficulties. Eradication campaigns also put a tremendous financial burden on the countries involved, as well as on sponsor organizations such as the WHO. Money spent on yaws eradication (vs. simple yaws control) is money that low-income countries cannot spend on other important health problems, like HIV, tuberculosis, and the childhood infections that represent huge sources of mortality. There is a huge opportunity cost involved. (Side note: a great book on the drawbacks of the polio eradication campaign, relevant to eradication campaigns in general, is William Muraskin’s Polio Eradication and its Discontents.) If we decide to launch a new eradication campaign, we need to make sure that we can actually carry it out, so that the resources we expend aren’t wasted.

Our argument in this letter in a nutshell: before throwing a massive amount of resources behind another eradication campaign, it makes sense to do our due diligence and make sure that an animal reservoir is not going to torpedo yaws eradication for a second time.

Does the flu vaccine cause Guillain-Barré syndrome or not?

Guillain-Barré syndrome (GBS) is a pretty scary condition. It starts with weakness and tingling in the extremities and can eventually lead to paralysis. Although most people recover in time, death can occur. Luckily, it’s a rare disease. It’s thought to result from an autoimmune process in which peripheral nerves are demyelinated and destroyed.

What causes GBS? Infections seem to be a major trigger. In about two-thirds of cases, the syndrome is preceded by either a gastrointestinal or respiratory infection. Campylobacter enteritis seems to be the most common trigger, but influenza, cytomegalovirus, Epstein-Barr virus, and HIV have all been implicated too. It appears that, in rare cases, these pathogens trigger the autoimmune cascade that leads to the disease.

The 1976 H1N1 vaccine and GBS

Way back in 1976, researchers noticed something scary. 1976 was the year of the big swine flu epidemic scare. In February of that year, two army recruits at Fort Dix, in New Jersey, tested positive for swine flu. Researchers believed the strain they were infected with was similar to the one that had caused the 1918 flu pandemic that killed millions. When they looked a little harder, they found that hundreds of other recruits at the base had been infected as well. Because these were not folks who had contact with pigs, the virus had to be spreading from person to person. Naturally, people were nervous. The government decided it would produce a vaccine against this strain and vaccinate as many people as possible, in order to head off what it feared might be a terrible pandemic.

Something strange happened, though. Cases of GBS in people who had received the flu vaccine started cropping up. Hundreds of them. In 2009, The New York Times ran the story of Janet Kinny, a woman who developed GBS after receiving the shot in 1976. GBS put this young mother in the hospital for a month, paralyzed from the neck down. She recovered, but not everyone was so lucky. More than 30 of the people who developed GBS after getting a flu shot that year died. Epidemiologists spent a while debating whether or not this cluster was just a coincidence. In the end, most agreed that the shot really was associated with an increased risk of GBS. Researcher Lawrence Schonberger estimated that people who received the 1976 flu shot were roughly 7 times more likely to develop GBS than people who did not. For every 100,000 people vaccinated, approximately one extra case of GBS occurred. In December 1976, after having immunized more than 40 million people and failing to see evidence that the H1N1 pandemic was actually going to materialize, government officials called off the vaccination campaign due to the GBS risk.

Not surprisingly, people became wary of flu vaccines. Nobody wants to get GBS. A lot of work has been done over the years to try to clarify the risk that flu shots pose, but GBS is such a rare condition that it has been hard to put together studies large enough to shed light on this problem. Well, this year, three important studies on the flu vaccine and GBS came out. These were huge studies that each looked at millions of people, and they’ve provided a lot of insight into the relationship between vaccines and GBS.

2013: The year of gigantic flu shot/GBS studies

The first study appeared in Clinical Infectious Diseases. The authors mined data collected over 13 years (1994–2006) by Kaiser Permanente. Those of you from the West Coast know that Kaiser Permanente is a big insurer/hospital system with tons of clients. Using records for 3 million patients, they were able to identify 415 confirmed cases of GBS. Sure enough, exactly two-thirds of these patients had suffered a respiratory and/or gastrointestinal illness in the 90 days preceding the onset of their GBS. But only 25 had received a vaccine of any kind in the 6 weeks prior to onset. In this study, GBS was NOT associated with getting a prior flu shot. However, the authors pointed out that they could not rule out a very small increased risk of GBS; it’s always possible that with a larger sample size (i.e., more cases) they would have had the power to identify a small association.

Another, even larger study appeared in The Lancet Infectious Diseases, and it, too, focused on seasonal flu vaccines. Carried out using universal health care system records in Ontario, Canada, it was able to identify 2,831 incident cases of GBS between 1993 and 2011. The authors of this study found that the risk of developing GBS was roughly 50% higher in the six weeks following a seasonal flu shot than in a later control period (9–42 weeks after vaccination). Thus, there really did seem to be a small increased risk of GBS associated with seasonal flu shots. However, the authors found that actually GETTING the flu was a much bigger risk factor for GBS. In the six weeks after seeking medical help for the flu, the risk of GBS was roughly 16 times greater (vs. 1.5 times for the shot) than in the control period. To put these findings in context: for every million people vaccinated with the flu shot, about 1 extra case of GBS would occur, and for every million people who got the flu, about 17 extra cases would occur.
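To make those per-million numbers concrete, here’s a back-of-envelope sketch (mine, not the study’s) of how a relative risk turns into “extra cases.” The background GBS rate below is an assumption on my part, based on the commonly cited incidence of roughly 1–2 cases per 100,000 person-years:

```python
# Attributable risk = baseline risk x (relative risk - 1).
# Assumed background: ~1.7 GBS cases per 100,000 person-years,
# scaled to a six-week risk window. This is NOT a figure from the study.

PER_MILLION = 1_000_000
background_per_year = 17 / PER_MILLION               # ~1.7 per 100,000 per year
background_six_weeks = background_per_year * 6 / 52  # ~2 cases per million

def extra_cases_per_million(relative_risk: float) -> float:
    """Cases per million attributable to an exposure with this relative risk."""
    return background_six_weeks * (relative_risk - 1) * PER_MILLION

print(f"flu shot (RR ~1.5):   ~{extra_cases_per_million(1.5):.0f} extra case per million")
print(f"flu illness (RR ~16): ~{extra_cases_per_million(16):.0f} extra cases per million")
# The second number overshoots the study's ~17 per million, a reminder that
# the baseline risk presumably differs among people sick enough to seek care
# for flu. The order-of-magnitude gap between shot and illness is the point.
```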

The third study appeared in The Lancet, and it focused on the flu vaccine in one special year: 2009. You may remember that 2009 was the year of another H1N1 swine flu scare (and thus another H1N1 swine flu vaccine). So if any flu shot was linked to a greater risk of GBS, as in 1976, it seems like it would be this one. When focusing on this single year, researchers found an increase in GBS cases associated with the vaccine. Of the 23 million people who received the H1N1 vaccine and were included in the study, 54 developed GBS within 6 weeks of the shot. This works out to be about 1.6 extra GBS cases for every million people vaccinated. Thus, there WAS a slightly higher risk of GBS linked to the shot–but it was tiny. So tiny, the authors point out, that most studies of seasonal flu vaccines simply wouldn’t be large enough to detect the association. The sample size issue may well explain why the first study (based on the Kaiser Permanente data) did not identify an association between flu vaccines and GBS.
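Taking the post’s figures at face value, here’s how a raw case count relates to “1.6 extra cases per million.” This is just a sketch of the logic, not the study’s actual statistical method:

```python
vaccinated = 23_000_000
observed_cases = 54         # GBS within six weeks of the 2009 H1N1 shot
excess_per_million = 1.6    # the study's attributable-risk estimate

excess_cases = excess_per_million * vaccinated / 1_000_000  # ~37 cases
background_cases = observed_cases - excess_cases            # ~17 cases

print(f"~{excess_cases:.0f} of the {observed_cases} cases attributable to the vaccine")
print(f"~{background_cases:.0f} cases would likely have occurred anyway (background GBS)")
```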

The take-home message: the extremely low risk of vaccine-related GBS is outweighed by the much higher risk of flu complications

So it appears that there is a small risk of GBS associated with flu vaccines. But, as flu researchers have been quick to point out, the risk associated with actually GETTING the flu is much higher. Poland and colleagues have posed a thought experiment in The Lancet on this subject. They point out that if everyone in the US had gotten the 2009 H1N1 vaccine, it’s estimated that 22 vaccine-related deaths would have occurred. But if everyone had gotten the H1N1 flu, 12,470 deaths would have occurred. Although the side effects of a vaccine loom large in our minds, it’s important to put these risks in perspective: most vaccines prevent dangerous diseases, so forgoing a vaccine poses its own (often much greater) risks.
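Putting rough per-person numbers on that thought experiment (the death counts are the ones quoted above; the US population figure is my assumption):

```python
US_POP_2009 = 307_000_000  # assumed; roughly the 2009 US population

vaccine_deaths = 22    # estimated deaths if everyone were vaccinated
flu_deaths = 12_470    # estimated deaths if everyone caught H1N1 instead

print(f"flu is ~{flu_deaths / vaccine_deaths:.0f}x deadlier than the vaccine")
print(f"vaccine: ~1 death per {US_POP_2009 // vaccine_deaths:,} people")
print(f"flu:     ~1 death per {US_POP_2009 // flu_deaths:,} people")
```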

These modern studies still don’t explain exactly what happened in 1976. Why did that particular vaccine cause the syndrome at such a high rate (1 per 100,000 vs. 1 per 1,000,000 for modern vaccines)? Nobody knows. One explanation for the increased GBS risk is that the vaccine was contaminated with a bacterial trigger like Campylobacter. Another explanation, which seems more plausible, is molecular mimicry: something in the vaccine resembled a component of nerve cells, so that when a recipient’s body mounted an attack against the vaccine, the attack might have hurt nerve cells as well. It would be comforting if, eventually, we could identify the problem.

Ancient RNA: A Whole New World of Possibilities for Paleopathology?

I just wrote a piece for BiteSize Bio called Ancient RNA: Does Next Generation Sequencing Offer a New Window into the Past?

In it, I describe an article by Fordyce et al. that came out earlier this year in PLoS ONE: Deep Sequencing of RNA from Ancient Maize Kernels. This group of researchers was able to obtain RNA sequences from 700-year-old corn kernels. Neat, right? I was really surprised that this paper didn’t get more attention when it came out. I think it’s basically been taken for granted that studying ancient RNA just isn’t possible, due to RNA’s fragility–so a paper showing that aRNA studies may actually be feasible was pretty exciting in my opinion.

Tom Gilbert, the senior author of the article, told me they have encountered some skepticism regarding the results, because the idea that RNA can’t survive over long periods of time has become so ingrained. Maybe that’s why there hasn’t been more chatter. I thought the paper was pretty thorough, though, in ruling out issues such as contamination. I have to say–I’m a believer so far!

Which leads me to wonder… if you can amplify ancient RNA from ancient corn kernels, is it also possible to do so using other types of samples? Tooth pulp, for example, which should also provide a relatively protected environment? If so, perhaps we could look at infection with RNA viruses (e.g., coronaviruses, influenza, hep C) in times past. Or maybe even gene expression under various conditions. The possibilities seem endless! I’m really curious to see where this line of research goes. Is this just the beginning?

My Geeky Pleasure: Retraction Watch

An entire website devoted to the retraction of scientific articles. I know it doesn’t sound very exciting–but this site is actually pretty fascinating!

Let’s just take some recent examples. This is where I learned about the Czech scientist who broke into a lab that was trying to replicate his (falsified) findings, in an attempt to gum up the works. And about a principal investigator’s wife who apparently got her own PhD by borrowing liberally from a student in the lab.

Or what about this? Remember those lurid stories about a drug called krokodil that were all over the news a little while back? It supposedly turned up in St. Louis–by way of Russia. It’s way cheaper than heroin, but it has the major drawback of making its users’ skin turn green and fall off. Gross, right? But it turns out there are some serious problems with the scientific article that those recent news stories were based on. Where did I find out about this? Retraction Watch! (In case you’re interested, here’s a Slate piece by Justin Peters that takes on the stories about krokodil in the US.)

All that stuff is pretty fascinating in a geeky-version-of-Judge-Judy-for-scientists way. But Retraction Watch is also where I have learned about problems that affect my own work. For example, I’ve done a lot of research on ways to get DNA out of old samples. One of the papers I looked at while doing this research turns out to be based on fabricated data. Good to know!

Finally, I think the website is important because it provides a space to look at the big picture. Is the growing number of retractions in the literature a good thing or a bad thing? Are we (journals and scientists) handling retractions in the right way? Could we be doing better?

This site is run by two guys–Adam Marcus, the managing editor of Anesthesiology News and a freelance writer, and Ivan Oransky, vice president and global editorial director of MedPage Today. I’m sure these are two busy people, and I can’t even imagine how much time it takes to run this site. Plus, it sounds like the authors involved in the retraction stories threaten them on a fairly regular basis, which must be stressful. Anyway, it’s a great public service to the science community. A big thanks to them both!

Why are parents refusing the Vitamin K shot for their babies?

Between February and August of this year, 4 babies in Nashville developed brain hemorrhages or gastrointestinal tract bleeding; none of them had received the vitamin K shot at birth. Luckily, all of them survived. Not all babies have been so lucky: in Australia in 2011, a baby whose parents refused the vitamin K shot died.

Nashville-area physicians report that an increasing number of parents are refusing vitamin K shots for their babies. Although the percentage of parents refusing the shot is only about 3% at local hospitals, almost 30% of parents refused at birthing centers. And this isn’t just a Nashville thing. Over 20% of parents at a St. Louis-area birthing center refused the shot as well, and I’m sure the stats for hospitals/birthing centers in other places are similar.

Why would parents decline the vitamin K shot? Maybe because of misinformation like that present on Joseph Mercola’s website. Mercola warns of three risks.

1. Inflicting pain on the newborn (in the form of a shot). He warns that the momentary prick of the shot may have long-term effects on the baby’s wellbeing and may jeopardize the success of breastfeeding. I’ll let you judge for yourself whether you think this sounds reasonable. I don’t, and there is certainly no good evidence to support it.

2. The amount of vitamin K injected is 20,000 times the needed dose and contains toxic preservatives. Wow, 20,000 times the necessary dose? Toxic preservatives? What is his source for this dramatic claim? A peer-reviewed journal article? Nope, I’m afraid not. It’s a website called Giving Birth Naturally. This website, in turn, gives no sources at all. Solid stuff, Dr. Mercola!

3. Babies run the risk of acquiring an infection at the injection site. This is true of any injection, but the chances of infection are so, so small. Even a hypochondriac like me thinks this is a pretty minimal risk. Infinitesimally small–I can’t even find reliable numbers on how often it happens, it’s so rare. For what it’s worth, I haven’t been able to find a single reported case of a baby developing an infection at the site of a vitamin K injection.

Now even Mercola acknowledges that the vitamin K shot doesn’t cause cancer. Unfortunately, not everybody has gotten that memo. Check out this website: the Healthy Home Economist. Although the author DOES eventually point out that the vitamin K-leukemia link has been debunked, she buries this acknowledgement in the comments, where no one will read it. Nice. The same uber-outdated information is also found in Mothering Magazine’s Natural Family Living Guide to Parenting. If you’d like to take a look at some of the articles debunking this association, you can check out this one in the New England Journal of Medicine (from way back in 1993!) or this more recent one, from the British Journal of Cancer.

Many of these anti-vitamin K shot websites give suggestions for what parents can do in lieu of the shot. Unfortunately, they are not well thought out.

1. Why not just request an oral dose of vitamin K for your baby? Because it doesn’t prevent hemorrhaging, that’s why. While it sounds totally reasonable, a single oral dose just doesn’t do the trick. Comparisons of “failure rates” (i.e., rates of hemorrhaging) across countries that use different methods to administer vitamin K demonstrate that a limited number of big oral doses just don’t work as well as the shot. Daily low doses may be as effective as the shot–but to the best of my knowledge, those aren’t an option in the US.

2. Eat a lot of vitamin K-rich foods and breastfeed your baby. Again, not a great strategy. Very little vitamin K makes it into breastmilk, even when a mother eats a lot of it. Very little can cross the placenta beforehand either, even if the mom has a great diet. That’s why the shot is necessary.

You would never realize it from the scare-mongering articles out there on the internet, but in reality the risks associated with the vitamin K shot are negligible compared to its potential benefits. It’s true that the chances of any one baby developing vitamin K deficiency-related bleeding are small–but when such a great way to avoid this risk exists, why not use it? A vitamin K shot may not be natural (meaning it didn’t exist tens of thousands of years ago). But neither are vaccines. Or car seats. And these inventions save lives. For any given child, the risk of dying from a hemorrhage or measles or a car accident may be small. But at the population level, these easy fixes make a difference–they save lives.

Reconstructing the ancestral microbiome: the American Gut approach

I came across the Human Food Project website today. This site is run by the American Gut project folks. You may have heard of their work before: you send them $99 and they send you a home sampling kit. You swab yourself, mail the swab back, and they give you a list of the microbes living in your gut. Neat, right? And besides yielding fun information for you, it provides a rich data source for them to analyze (with the goal being to learn more about what Americans are carrying around in their intestines).

I’m waiting for my results to come back from uBiome, a similar service.

Anyway, on the website, I noticed a section entitled Ancestral Microbiome. Exciting! I’ve actually been thinking a lot lately about how one could go about reconstructing the ancestral microbiome. Recently, I published a paper entitled Genomics, the origins of agriculture, and our changing microbe-scape in the Yearbook of Physical Anthropology with George Armelagos. One thing we looked at was attempts to learn more about the pre-agricultural microbiome. This is a really tough problem, for several reasons.

One approach would be to use aDNA to characterize microbiomes from ancient remains. The aDNA approach has worked out for some ancient post-agricultural remains (check out Insights from characterizing extinct human gut microbiomes by Tito et al.), but so far nobody has gotten it to work on remains dating prior to the advent of agriculture. Bummer.

Another approach is to study contemporary hunter-gatherer groups. This is pretty problematic too, though. First, the very few hunter-gatherer groups that are still around have been able to protect their way of life primarily by fending off agriculturalist intruders. This means that they are probably not amenable to being studied by swab-bearing scientists. Second, learning about the hunter-gatherer groups around today isn’t necessarily going to tell you all that much about the typical hunter-gatherer group living tens of thousands of years ago. The very fact that a hunter-gatherer society is still around in a very agriculturalist world suggests that it may be unique in some way. Third, even though some hunter-gatherer groups have been able to maintain their traditional subsistence strategies, more or less, it’s possible that they have still been exposed to agriculturalist microbes, which might have altered their microflora.

So how are the Human Food Project researchers going about reconstructing the ancestral microbiome? They are studying the Hadza of Tanzania. About 1,000 Hadza live around Lake Eyasi in the northern part of the country, and roughly one-quarter of them live as hunter-gatherers (no crops, no livestock). Not surprisingly, the Hadza are not completely cut off from the world outside their homeland. Contact with agriculturalists stretches back at least a century. Most Hadza now speak Swahili fluently. Alcohol has become a problem for some, and microbes like the TB bacterium have been introduced. In the mid-1990s, anthropologist Frank Marlowe found that about 10% of the calories coming into Hadza camps were from non-foraged food delivered by missionaries or obtained through trade with agriculturalist neighbors. It’s possible that an increasing stream of money from tourism may result in more calories being obtained from purchased crops nowadays.

It will definitely be interesting to see how the microbiomes of the Hadza differ from those found in other groups in the area. I think this is a project worth doing. But my guess is that there has been a lot of “microbe-creep” between the Hadza and these neighboring groups (and perhaps even foreign tourists). This, along with the other problems I pointed out, would compromise our ability to draw conclusions about the “ancestral” microbiome. I’m eager to see how this will play out–and curious to hear what other people think.

Does the polio vaccine cause polio?

One claim that I’ve heard a lot from vaccine opponents is that the polio vaccine is actually causing polio epidemics instead of preventing them. Scary, right? On websites like “GreenMedInfo” you see headlines like Polio vaccines now the #1 cause of polio paralysis. And Joseph Mercola maintains that because of problems like this, “the polio vaccine is not the ultimate solution to prevent” polio.

Is any of this true? As it turns out, there IS a grain of truth to this claim. But the truth is way more complicated than vaccine skeptics seem to understand, and, as it happens, the solution is more vaccination, not less.

Here’s the deal. There are two types of polio vaccine, one that involves the injection of dead virus and another that involves oral drops filled with live (but non-disease-causing) virus. Here in the U.S., we use the first type of vaccine. But there are many benefits associated with the second type: it’s easier to administer (drops vs. shots), it provides stronger protection, and here’s the biggie: because the vaccine virus can spread after being administered, it effectively inoculates other people in the community. This is known as contact immunity, and it’s definitely a good thing! It means the protection of the vaccine extends beyond just the people who get vaccinated. The virus can remain in a child’s feces for up to six weeks, during which time she has the potential to inoculate the people around her.

There is a dark side to this contact immunity, though. Give the vaccine-derived virus enough time to spread, and it can begin to mutate. Tick tick tick. Enough mutations, and it may regain its ability to paralyze. When does this happen? When the proportion of vaccinated kids is low. With plenty of susceptible kids around, a vaccine-derived virus is much more likely to circulate long enough to gain these mutations.
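Here’s a toy simulation of that logic (an illustration, not a real polio model; the R0 of 5 and every other number are assumptions I’ve made up). Each case exposes a handful of contacts, only unvaccinated contacts can be infected, and we count how many transmission generations a chain survives before dying out:

```python
import numpy as np

rng = np.random.default_rng(1)

def chain_generations(r0: int, coverage: float, max_gens: int = 50) -> int:
    """Number of generations one transmission chain survives (capped)."""
    cases = 1
    for gen in range(max_gens):
        if cases == 0:
            return gen
        # Each case exposes r0 contacts; only unvaccinated contacts
        # (probability 1 - coverage) become new cases.
        cases = min(rng.binomial(cases * r0, 1 - coverage), 1_000)
    return max_gens

for coverage in (0.95, 0.80, 0.50):
    runs = [chain_generations(5, coverage) for _ in range(500)]
    print(f"coverage {coverage:.0%}: chains last ~{np.mean(runs):.1f} generations")
# With high coverage, chains sputter out almost immediately. With low
# coverage, they circulate for many generations, which is exactly the
# time a vaccine-derived strain needs to accumulate mutations.
```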

Unfortunately, this situation sometimes occurs. There have been a number of vaccine-derived outbreaks of polio. In total, scientists think that, since the year 2000, there have been 20 outbreaks of vaccine-derived polio in 20 countries, resulting in 655 vaccine-derived polio cases. The biggest outbreak, involving hundreds of kids, happened in Nigeria. This isn’t surprising, because vaccination stopped in some regions of the country for over a year in the early 2000s.

The counterintuitive solution to fighting vaccine-derived outbreaks is to increase vaccination rates. That way, children are inoculated with the harmless version of the virus and can’t be infected with any “bad” versions floating around. Vaccine-derived strains are nipped in the bud before they have time to accumulate dangerous mutations.

One thing to ask people critical of the polio vaccine because of vaccine-derived outbreaks is what they recommend as a substitute. Joseph Mercola’s big idea is to cut down on sugar, apparently. Sadly, sugar consumption isn’t a big issue in most of the places where polio is still a problem. So I think we can rule that solution out. Sayer Ji, founder of GreenMedInfo, recommends improved sanitation. While improved sanitation certainly cuts down on opportunities for infection (polio is spread through the fecal-oral route), it can paradoxically create more opportunities for paralysis. Why? The older you are when you become infected with polio (and the age of infection goes up when opportunities for infection go down), the greater the odds of paralysis. So I think we can rule that solution out too (although improved sanitation has many other benefits, and it is definitely a worthwhile goal).
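That sanitation paradox follows from a classic result in infectious disease epidemiology: for an endemic infection, the mean age at first infection is roughly A = L / (R0 − 1), where L is life expectancy and R0 is the basic reproduction number. Better sanitation lowers R0, which raises the age at infection. A minimal sketch with made-up illustrative numbers (these are not polio-specific estimates):

```python
LIFE_EXPECTANCY = 60.0  # years; an assumed value of L

def mean_age_at_infection(r0: float, life_expectancy: float = LIFE_EXPECTANCY) -> float:
    """Approximate mean age at first infection for an endemic pathogen."""
    return life_expectancy / (r0 - 1)

# Falling R0 (e.g., via improved sanitation) pushes infection later in
# life, when the odds of paralysis are higher.
for r0 in (12, 6, 3):
    print(f"R0 = {r0:>2}: mean age at infection ~{mean_age_at_infection(r0):.0f} years")
```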

So while vaccine-derived polio outbreaks are a real thing, they are not a reason to abandon the polio eradication campaign, which has reduced the number of new polio cases by 99% since it began in the late 1980s.