Category Archives: Research Blogging

Pseudomonas: Even More Bad-Assed Than We Thought

The first time I met the genus Pseudomonas, I was a brand-new graduate student doing a rotation in David Figurski’s lab at Columbia University. Dave works on “promiscuous” plasmids that can move to many different species of bacteria. These plasmids pay their rent by providing multiple antibiotic resistance genes and other goodies to the bugs that host them. In that lab, we commonly cultured new strains with three or four different antibiotics in order to select for particular plasmids. Novice that I was, I assumed such cultures would be pretty foolproof in terms of contamination – I mean, how many “wild” bacteria could live in such a drug-laden soup?

Pseudomonas aeruginosa, for one. It wasn’t long before I managed to contaminate a plate with something that grew bright green colonies and had a weird grape-like odor. A senior grad student in the lab diagnosed it immediately. “But how can it survive on there?” I asked. “It’s Pseudomonas,” he laughed.


The last thing you see: P. aeruginosa (red) secretes toxins into another bacterium (green/blue), causing its cell wall to fall apart. Image by James Easter.

While many bacteria pick up drug resistance genes here and there through mobile genetic elements such as plasmids and transposons, P. aeruginosa has resistances baked right into its genome. It carries multiple drug-specific resistance genes as well as general-purpose pumps that throw antibiotics out of the cell before they can even act.

One Pseudomonas cousin, now known as Burkholderia cepacia, takes this tough-guy act a step further; it can actually use penicillin as a food source. Imagine hornets passing around a can of Raid so they can take long pulls from it while commenting on its pleasant flavor, and you’ve got the general idea. These are some seriously bad-assed bugs.

Given all that, perhaps it shouldn’t come as a surprise that P. aeruginosa is also covered in venomous spines. That was the conclusion of a paper published last summer by researchers at the University of Washington, and discussed in an accompanying News & Views article in Nature.

The authors start off the paper with an excellent description of the microbial world:

Competition for niches among bacteria is widespread, fierce and deliberate. These organisms produce factors ranging in complexity from small diffusible molecules to multicomponent machines, in order to inhibit the proliferation of rival cells.

It’s a jungle down there at the micrometer scale, and P. aeruginosa is the honey badger.

This is a heavily armed conflict on all sides. While we often think of antibiotics as brilliant products of medical research, microbes invented them. Penicillin is just a fungus’s way to get ahead of its bacterial competitors; penicillin resistance is the bacterial side’s countermeasure. We’re naive newcomers to a battle that’s been raging for billions of years.

One of the more sophisticated siege machines in play is called the Type VI secretion system, a set of proteins found in many Gram-negative bacteria. The Type VI system injects proteins from one cell directly into another when they touch, in a manner similar to the way some bacteriophages transfer their genetic material into their hosts. That similarity could be a case of convergent evolution, in which form followed function, or the bacteria might have ripped off the mechanism directly from an ancient phage infection that went awry. I favor the latter explanation, largely because it fits so well with the general Pseudomonas attitude. Scrapping an infecting pathogen to build a weapon just sounds like something it would do.

In P. aeruginosa, the Type VI system exports at least three proteins, called Tse1, Tse2, and Tse3. All three, the new paper explains, are poisonous to other Gram-negative bacteria. Tse1 and Tse3 attack the peptidoglycan structure that lies in the periplasmic space between the outer and inner membranes of Gram-negative cells, causing the cell to fall apart or lyse. Earlier work had shown that Tse2 is toxic when it enters the cytoplasm, but it’s not clear exactly how it kills. When P. aeruginosa contacts another Gram-negative bacterium, it can use its Type VI secretion system to inject these toxins into the periplasmic space, killing off its competition so it can colonize a new food source.

That’s all well and good, but there’s a problem: the Tse proteins can also kill P. aeruginosa. To solve that, the bacteria carry antidotes, called Tsi1, Tsi2, and Tsi3. Each poison/antidote pair is expressed from a bicistronic operon (two genes transcribed from a single promoter), so whenever a cell manufactures the poison, it simultaneously produces the antidote for itself.

If Pseudomonas is producing both poison and antidote, though, why not just secrete the poison into the environment, antibiotic-style, rather than inject it through direct contact? It could be that these particular toxins don’t work well when diluted in the surrounding fluid, or that the whole toxin-plus-injection system evolved as a complete setup. Evolution doesn’t analyze a problem and calculate the most efficient way of doing things; it just tries out solutions until something works, and this works. It’s also possible that Pseudomonas is playing a more sophisticated game that we still haven’t fully uncovered. Transferring a toxin through direct contact could allow the bacterium to perform some kind of recognition step so that it won’t inject its brethren. Bayonets are less likely to cause collateral damage than bombs.

Whatever Pseudomonas is up to, it’s certainly a good system to keep studying. Besides being a pathogen in its own right, especially in patients with cystic fibrosis or immune deficiencies, this genus provides a nice sampling of the kinds of adaptations bacteria have evolved to survive in ferociously competitive environments. If it shows up as a contaminant in your lab, though, just throw those plates into the autoclave. That’s one thing even Pseudomonas can’t survive – yet.

1. Russell, A., Hood, R., Bui, N., LeRoux, M., Vollmer, W., & Mougous, J. (2011). Type VI secretion delivers bacteriolytic effectors to target cells Nature DOI: 10.1038/nature10244

Exploring the Sourdoughome

I love it when my interests intersect, so this new report from researchers in Italy and Belgium, on the microbiota of sourdough breads, definitely caught my attention. As the authors explain:

This study aimed at the identification of the [lactic acid bacteria] (LAB) and yeast microbiotas of 19 Italian sourdoughs used for the manufacture of traditional/typical Italian breads. The dominating LAB and yeasts were monitored by culture-dependent methods. Multivariate statistical analyses were performed in order to find the correlation between ingredients and the composition of the sourdough microbiotas, as well as the effects of the latter on the biochemical characteristics of sourdoughs.


My own applied microbiology project.

It seems that Italian sourdough is a particularly good subject for this, as the country is home to about 200 different types of bread, many of them leavened with regionally unique sourdough starters. These distinct bread recipes also use different types of flour and employ different procedures for “back-slopping,” or propagating the mixed bacterial-yeast culture. If there’s anything Italians love more than bread, it’s disagreeing about how to do things.

To see what the different starters look like microbially and chemically, the team took samples from 19 different sourdoughs, then cultured and identified their bacterial and fungal constituents to the level of species and strains. They also analyzed such parameters as pH, lactic acid concentration, and levels of free amino acids (FAA), gamma amino butyric acid (GABA), and other byproducts of fermentation.

Sure enough, the diverse baking techniques have led to diverse microbial and biochemical traits in the sourdough starters. There are a few dominant species of lactic acid bacteria and yeasts – Lactobacillus sanfranciscensis is the top bacterial species and the venerable Saccharomyces cerevisiae dominates the yeast communities – but they’re joined by a large supporting cast of related microbes, and each region seems to have its own specific combination of sub-strains. The resulting breads are equally varied, from Pane di Altamura (pH 4.03, 82 mM lactic acid) to Pane Casareccio di Genzano (pH 4.14, 63.7 mM lactic acid) to Pagnotta del Dittaino (pH 3.70, 83 mM lactic acid).

The differences might affect more than just the flavor of the bread. As the researchers comment in the paper:

In addition, FAA and GABA produced by LAB may increase the nutritional value of the breads. For instance, the amount of GABA in 150 g of Pane di Matera PGI represents the minimum effective daily dose to get positive effects in humans.

The “positive effects” they’re talking about include lowering blood pressure in people with mild hypertension. Perhaps that’s another way the famous “Mediterranean diet” offsets the effects of that region’s delicious meats and cheeses.

One potential limitation of the study was that it relied on culturing the sourdough microbes in order to identify them. As metagenomic studies have recently revealed, the culturable part of the microbial world is just the tip of the iceberg. There’s a whole universe of bacterial, fungal, protozoan, and viral life out there that just can’t survive in any of the relatively small number of culture media available in the lab. That said, sourdough starters have been selected for a certain type of culturability, so there probably aren’t too many unculturable organisms in these samples. If it grows in a deliberately maintained culture in the kitchen, it will probably do so in the lab, too.

I hope the authors are planning to do follow-up studies on other sourdoughs, and perhaps on some beers. Belgian beers should be particularly interesting, as that country’s brewers have pursued their art in as many ways as Italian bakers have theirs.

Meanwhile, I’ll keep doing my own work in this field, which consists of baking a loaf of sourdough every few weeks. My starter allegedly originated in 1847 along the Oregon Trail, though it’s been passaged by many people in different parts of the US since then. The really charming thing about it is that a dedicated group of volunteers still distributes this starter to anyone who asks, for the cost of a self-addressed stamped envelope. I don’t know whether it produces enough GABA to lower anyone’s blood pressure, but it sure does taste good.

Minervini, F., Di Cagno, R., Lattanzi, A., De Angelis, M., Antonielli, L., Cardinali, G., Cappelle, S., & Gobbetti, M. (2011). Lactic Acid Bacterium and Yeast Microbiotas of 19 Sourdoughs Used for Traditional/Typical Italian Breads: Interactions between Ingredients and Microbial Species Diversity Applied and Environmental Microbiology, 78 (4), 1251-1264 DOI: 10.1128/AEM.07721-11

The Other Superbugs: Pesticide Resistant Insects

In 1955, the World Health Organization launched an ambitious campaign to eradicate malaria. The effort relied on new, synthetic antimalarial drugs such as chloroquine and a miraculous insecticide called DDT. Initially, it went pretty well: several countries’ malaria rates plummeted. Then it fell apart. The malaria parasites became resistant to the new drugs, and the mosquitoes that transmit the disease became resistant to DDT. After two decades of work and a massive expenditure of money and effort, the WHO gave up. Once again, Plasmodium and Anopheles had kicked Homo’s butt.


Quick Henry, The Flit!

By the 1990s, a new generation of public health officials was ready to take another run at the problem. Armed with a wider spectrum of antimalarial compounds, campaigns such as Roll Back Malaria seemed well prepared to deal with the problem of drug resistance by the parasite. Insecticide resistance was a different story. As before, the WHO-sponsored effort would rely heavily on a single chemical class to combat mosquitoes. This time, the effort favored pyrethroids.

The WHO’s enthusiasm for pyrethroids was understandable. As insecticides go, they’re pretty spectacular. These compounds are either mixtures or derivatives of a natural plant product called pyrethrum. Pyrethrum is only modestly effective by itself, but in the late 1940s chemists discovered that mixing it with another compound, piperonyl butoxide, boosts its killing power dramatically. Since then, researchers have synthesized several variants of pyrethrum, such as permethrin and deltamethrin, that are even more effective. What really makes these pesticides blockbusters, though, is that they’re highly specific for arthropods; their human and environmental toxicities are extremely low.

Because they’re so effective and nontoxic, pyrethroids are now the dominant over-the-counter insecticides worldwide. If you walk into the hardware store to buy some bug spray, you’ll see what appears to be a huge variety of products, but a close reading of the ingredient lists reveals that they’re almost all the same. Raid Ant Killer, Ortho Garden Insecticide, generic wasp killer, and most of the other colorful containers are just different package designs. What you’re really looking at is shelf upon shelf of pyrethroids.

We should know better. Bacteria, viruses, fungi, and protozoans have repeatedly taught us the same fundamental lesson about adaptation: if you keep throwing one chemical at a class of organisms long enough, they’ll eventually get used to it. Inevitably, the same has now happened with insects and pyrethroids.

In the tropics, particularly Africa, pyrethroid resistance has become a major public health problem. Because malaria control in poor, hot countries relies so heavily on pyrethroid-treated bed nets, resistant mosquitoes can now bypass the only real barrier between them and their victims.

Using these compounds willy-nilly has also spawned other problems. The treated bed nets, plus indoor spraying, have placed heavy selective pressure on all of the other insects that live in close association with people. Bedbugs, for example.

Indeed, bedbug populations have become highly resistant to pyrethroids, which is why homeowners’ DIY efforts to control them seldom work out. There’s been some debate about where that resistance came from, but recent results on US bedbug populations suggest that this resurgent pest is an import. It’s possible – even likely – that widespread pyrethroid use to combat mosquito-borne diseases in developing countries has spawned these new populations of superbugs.

Switching to other pesticides may help, at least sometimes with some insects. A study in Benin found that bendiocarb-treated bed nets were very effective against pyrethroid-resistant mosquitoes. Unfortunately, bendiocarb is highly toxic to birds and fish, and acutely toxic to humans in high doses. Treating a bed net with it is probably okay, but it’s not the kind of thing that should be sprayed around the house by amateur exterminators.

Nor is chemical-switching a panacea. Turning back to bedbugs, it appears their pyrethroid-resistance mechanisms are many and varied. Deep sequencing analysis revealed that the pesticide-resistant strains in a US infestation carry multiple changes in multiple genes, including increased expression of general detoxifying enzymes that could be useful against a broad spectrum of chemicals.

The solution, if there is one, will have to be twofold. First, we need a sustained research effort to understand the basic mechanisms of insecticide resistance and find new compounds that can overcome it. Second, both pesticide makers and public health officials need to take more responsibility for how these products are actually being used in the field, with a special focus on the problem of resistance. Our approach to distributing these powerful and important chemicals needs a thorough debugging.

Akogbeto, M., Padonou, G., Bankole, H., Gazard, D., & Gbedjissi, G. (2011). Dramatic Decrease in Malaria Transmission after Large-Scale Indoor Residual Spraying with Bendiocarb in Benin, an Area of High Resistance of Anopheles gambiae to Pyrethroids American Journal of Tropical Medicine and Hygiene, 85 (4), 586-593 DOI: 10.4269/ajtmh.2011.10-0668

Adelman, Z., Kilcullen, K., Koganemaru, R., Anderson, M., Anderson, T., & Miller, D. (2011). Deep Sequencing of Pyrethroid-Resistant Bed Bugs Reveals Multiple Mechanisms of Resistance within a Single Population PLoS ONE, 6 (10) DOI: 10.1371/journal.pone.0026228

Sewage Treatment, Coral Disease, and Koch’s Postulates

Coral reefs are in a tight spot these days. Increasing CO2 levels and rising ocean temperatures aren’t doing them much good, but their biggest problems are more direct. Overfishing is wiping out important predators, the aquarium trade picks off whatever looks pretty, agricultural and other runoff is clogging the filter-feeders, and some folks are even blowing them apart with dynamite.

Places with strict environmental regulations and protected marine preserves are generally doing a better job protecting their reefs, but even there we may be doing damage without realizing it. For example, what if coral reefs are catching human diseases?

That seems to be exactly what’s happening in Caribbean elkhorn coral (Acropora palmata), an iconic and structurally crucial reef species that’s been dying from a mysterious condition called white pox, or acroporid serratiosis. The disease has been so deadly that A. palmata was listed as a threatened species under the US Endangered Species Act in 2006. Now, scientists have fingered human sewage as the source of the pathogen causing white pox.


A diver swims past a healthy elkhorn coral colony on Molasses Reef, near Key Largo, FL. Many other elkhorn colonies are dying from an infection that may be caused by a human pathogen. Image courtesy James W. Porter, University of Georgia.

Researchers at Rollins College and the University of Georgia described the finding yesterday in PLoS ONE:

Here we hypothesize that [Serratia marcescens] strain PDR60 isolated from two distinct environments, one terrestrial (human wastewater) and one marine (APS-affected A. palmata, apparently healthy Siderastrea siderea and Coralliophila abbreviata) causes APS [acroporid serratiosis] in A. palmata. To examine this hypothesis we conducted challenge experiments by inoculating eight isolates of Serratia marcescens representing three strains onto A. palmata fragments maintained in closed seawater aquaria. Our results confirm strain PDR60 as a coral pathogen through fulfillment of Koch’s postulates. These results are also consistent with the hypothesis that non-host corals and predatory snails may function as interepizootic reservoirs or vectors of the APS pathogen. Furthermore, we show that S. marcescens isolated from human wastewater causes APS in as little as four days, unequivocally verifying humans as a source of a marine invertebrate disease.

S. marcescens is a ubiquitous Gram-negative bacterium. It’s in dirt, sewage, and probably your shower. If you haven’t noticed it, it’s because you have a working immune system. People who aren’t so fortunate – especially in hospitals – can get serious opportunistic S. marcescens infections. When this bug first turned up as a possible culprit in white pox, I figured it was probably an opportunist in the elkhorn corals as well. Perhaps the coral somehow got the anthozoan equivalent of a suppressed immune system, and Serratia took advantage of the situation.

The new work suggests otherwise, though I’m not sure it quite seals the case. The investigators experimentally infected elkhorn corals growing in tanks of purified saltwater, and found that a single inoculation with a pure S. marcescens strain cultured from sewage effluent was enough to give the corals a virulent case of white pox. This is called fulfilling Koch’s postulates, and it’s the Holy Grail of epidemiology. We can now say for sure that S. marcescens from human sewage causes white pox.

Or can we? I’m certainly convinced that the bacterial strain in sewage is capable of causing the distinctive pathogenesis of this disease, consisting of bleached white zones that spread across the coral colony. But we need to read the fine print.

The coral colonies these researchers used were harvested from “healthy” wild corals. That’s the only practical way to get experimentally useful amounts of this slow-growing creature. Unlike mice or guinea pigs, we can’t just breed up a bunch of stock from a well-characterized lab strain. However, picking apparently healthy corals from the wild doesn’t prove that they really are healthy. They’re presumably exposed to the same stresses and insults that afflict their white pox-infected neighbors, and could very well be on the brink of contracting the disease themselves. Maybe they’re already infected with the real underlying cause of the disease, and are just one wound or stressor away from getting the S. marcescens component that will finish them off. Because it’s currently impossible to characterize all aspects of the immunological and infectious status of a coral sample, we can’t know whether the bacterium alone is enough to cause disease.

Worse, we can state with certainty that the corals in these experiments were exposed to unusual stress. The scientists chipped off a piece of the colony (traumatic injury), sampled its mucoid coating (open wound), then carried it by boat to a laboratory tank (physiological stress).

That’s not to say I don’t believe the conclusions or the authors’ recommendations. Indeed, the measures they suggest include improved sewage treatment plants throughout the Caribbean, a step that’s clearly a good idea for a long list of reasons, whether or not it will save the elkhorn coral. Humans already get well-documented cases of sewage-borne diseases, and many Caribbean towns use inadequate treatment systems that raise the risk of these infections. That’s why Florida is already in the process of upgrading the treatment plants throughout the Keys. Of course, someone should also take a long, hard look at the waste treatment (or lack thereof) on cruise ships in international waters.

In the meantime, I hope researchers will continue studying white pox, with an eye toward preventing and perhaps treating it. As one of the authors points out in an accompanying press release, the stakes are high, even if the metaphors are a bit mixed:

“These bacteria do not come from the ocean, they come from us,” said Porter. Water-related activities in the Florida Keys generate more than $3 billion a year for Florida and the local economy. “We are killing the goose that lays the golden egg, and we’ve got the smoking gun to prove it,” [University of Georgia Ecology Professor James] Porter said.

1. Sutherland, K., Shaban, S., Joyner, J., Porter, J., & Lipp, E. (2011). Human Pathogen Shown to Cause Disease in the Threatened Elkhorn Coral Acropora palmata PLoS ONE, 6 (8) DOI: 10.1371/journal.pone.0023468

Leapfrogging Microfluidics?

“Microfluidics” is one of the hottest buzzwords in biotechnology and diagnostics research these days, with good reason: these lab-on-a-chip devices are about the coolest technology to come along since monoclonal antibodies. The designs vary widely, but the basic principle is to take traditional lab assays and miniaturize them onto silicone or plastic chips, often using manufacturing techniques developed by the semiconductor industry. I’ve blogged about these nifty devices once or twice (okay, maybe three times) before.

While I’ve found the technology fascinating to watch, it’s remained a bit of a laboratory curiosity. Everyone seems to agree that the “killer app” for microfluidics will be field-portable devices that will let minimally-trained people diagnose diseases or detect specific compounds in the environment, especially in poor countries. That’s because chip-based labs can incorporate all of their equipment and reagents onto a disposable device no larger than a credit card. In principle, a technician could place a drop of fluid, such as blood, onto one end of the chip, and micrometer-size channels would siphon it around, mixing it and moving it to different chambers to perform assays that would normally require a fully-equipped lab. The volumes are so small that the reactions tend to occur very quickly, often shortening multi-hour tests to a few minutes. But that’s where the good news ends.

Because the reactions produce subtle chemical changes inside minuscule containers, detecting the result usually requires sophisticated analytical equipment, at which point we’re right back to building a full-size laboratory. No matter how cheap or portable the chips get, the assay readout has generally remained huge and pricey.

Until now. Two recent papers highlight what I think is a new stage in the development of microfluidics, where researchers are finally addressing the readout problem. In one effort, scientists at Columbia University report on a microfluidic clinical testing system that incorporates a whole slew of new ideas. More importantly, it actually seems to work in the field. As senior investigator Samuel Sia says in an accompanying press release:

“We have engineered a disposable credit card-sized device that can produce blood-based diagnostic results in minutes,” said Sia. “The idea is to make a large class of diagnostic tests accessible to patients in any setting in the world, rather than forcing them to go to a clinic to draw blood and then wait days for their results.”

Sia’s lab at Columbia Engineering has developed the mChip devices in collaboration with Claros Diagnostics Inc., a venture capital-backed startup that Sia co-founded in 2004. The microchip inside the device is formed through injection molding and holds miniature forms of test tubes and chemicals; the cost of the chip is about $1 and the entire instrument about $100.

The injection-molding process is a departure from most microfluidic construction methods. Instead of engraving the device onto a silicon chip, the researchers made a mold and cast duplicates in a mass-production system. Injection molding gives us plastic cups and soda bottles, so it’s clearly a mature technology that can be scaled way, way up. That’s what drives the per-chip cost down so low.

The investigators used this cheap chip to build a miniaturized ELISA, or enzyme-linked immunosorbent assay, platform. If you’ve ever been tested for any infectious disease, you’ve probably had an ELISA; it’s one of the most common and important assays in clinical diagnosis. To perform it, one needs to incubate a sample with a reagent that will bind some analyte – let’s say an antigen that will bind antibodies against HIV in a patient’s blood. Once the analyte binds, a series of washes and secondary reagents clears up background reactions and causes some kind of easily-detected chemical change. It typically takes a skilled lab technician a few hours to perform an ELISA, and it requires careful attention to detail through the various washing and incubation steps.

On the new chips, a simple channel meanders through the plastic. The binding reagent is stuck to one section of the channel, and Sia and his colleagues feed the blood sample, wash solutions, and other reagents through the tube sequentially. To separate the reagents, they simply add tiny bubbles between them, like you might see in a very small straw that’s reached the bottom of the glass. A common medical syringe provides the vacuum force to draw the whole train of reagents through the system.

Completing this tour de force of clever ideas, the team used a nanoparticle-based detection system that deposits visible quantities of silver in the channel if there’s been a reaction. A cheap absorbance meter quantifies the amount of silver, and determines whether a test is positive or negative. The researchers walk through the system’s advantages in a nicely-produced video interview Nature Medicine released to accompany the piece:

As you’ll see in the video, the researchers also put their system to the ultimate test, hauling it to Rwanda and testing actual patient samples in an underfunded, overworked clinic. The results were impressive: the new assay is about as accurate as traditional ELISA tests for detecting HIV and syphilis, but much faster and cheaper.

This makes me wonder whether we’re about to see another example of leapfrogging in poor countries. The most popular (and really the only) current example of this phenomenon is cellular phones. There are virtually no landline connections in most poor countries, but nearly everyone has a phone. By missing the first telecommunication revolution, these countries have “leapfrogged” to the second, gaining all of the advantages of instant communication without going through the intermediate stages of rural electrification, Ma Bell, party lines, and rotary dials. If microfluidic devices can bring modern medical tests to the bedside in Rwanda, will we see it and other poor countries catapulting into 21st century medicine without having to establish 20th (or even 19th) century medical infrastructure first? Maybe.

What makes me optimistic about this is that Sia and his colleagues aren’t the only ones working on this problem. Indeed, around the time their paper came out, a less-noticed but equally interesting bit of work appeared in the journal Analytical Chemistry. In that paper, Aydogan Ozcan and his colleagues at UCLA and elsewhere describe a system for performing flow cytometry on a microfluidic device. Flow cytometry, or cell sorting, is a sort of ELISA on speed. Rather than incubate the bulk sample with the reagent, cell sorters separate individual cells into a stream of droplets, like one would get by shaking a running garden hose. The droplets pass through a detector that measures specific parameters of the cell, such as its light diffraction characteristics or whether it bound a fluorescently labeled antibody. Researchers can then quantify exactly how many cells of each type were in a sample.

It’s a tremendously powerful technique for immunological research, and can also be used to perform a variety of blood-counting assays, but it requires even more skill and money than an ELISA. Research-grade cell sorters are massive machines that usually occupy a small room of their own and employ a dedicated technician.

Ozcan’s team decided to use a cell phone instead. With about $5 worth of parts, they cobbled together an adapter that connects an inexpensive microfluidic cell sorter to the camera on a Sony-Ericsson phone. As they explain in an accompanying press release:

The microfluidic assembly is placed just above a separate, inexpensive lens that is put in contact with the cell phone’s existing camera unit. This way, the entire cross-section of the microfluidic device can be mapped onto the phone’s CMOS sensor-chip. The sample fluid is delivered continuously through a disposable microfluidic channel via a syringe pump.

The device is illuminated from the side by the LEDs using a simple butt-coupling technique. The excitation light is then guided within the cross-section of the device, uniformly exciting the specimens in the imaging fluid. The optofluidic pumping scheme also allows for the use of an inexpensive plastic absorption filter to create the dark-field background needed for fluorescent imaging. In addition, video post-processing and contour-detection and tracking algorithms are used to count and label the cells or particles passing through the microfluidic chip.
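The counting step is easy to picture in code. Below is a minimal sketch of the kind of contour detection and counting the press release describes, assuming Python with OpenCV; the threshold, size limits, and file name are illustrative guesses on my part, not values from the paper.

```python
# Sketch of contour-based cell counting on one dark-field fluorescence frame.
# Illustrative only: the threshold, size range, and file name are hypothetical,
# not taken from the Ozcan group's actual pipeline.
import cv2

def count_cells(frame_path, min_area=10, max_area=500):
    """Count bright blobs (candidate cells) in a single video frame."""
    frame = cv2.imread(frame_path, cv2.IMREAD_GRAYSCALE)
    # Smooth out sensor noise, then separate bright cells from the
    # dark-field background with a fixed intensity threshold.
    blurred = cv2.GaussianBlur(frame, (5, 5), 0)
    _, mask = cv2.threshold(blurred, 40, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    # Keep only blobs whose area is plausible for a single cell.
    cells = [c for c in contours if min_area < cv2.contourArea(c) < max_area]
    return len(cells)

print(count_cells("frame0001.png"))  # hypothetical frame from the phone video
```

A real pipeline would also track each blob across successive frames, as the release notes, so that one cell drifting through the channel isn’t counted twice.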

While they haven’t taken it into a poor country’s clinics yet, the investigators did put the system through its paces in the lab. So far, they’ve demonstrated that it can measure white blood cell density as a cell sorter, and also operate as a mid-power fluorescent microscope. The former capability could provide tests for leukemia and AIDS progression, while the latter could be useful for a variety of analyses, including detecting pathogens in drinking water.

It’s going to take more than a couple of new testing systems to fix the health problems of poor countries, but papers like these – and I suspect others will follow shortly – show at least part of the solution. Perhaps these technologies will even make their way back to the developed world, as we seem to have some of our own issues with medical costs these days.

References:


Chin, C., Laksanasopin, T., Cheung, Y., Steinmiller, D., Linder, V., Parsa, H., Wang, J., Moore, H., Rouse, R., Umviligihozo, G., Karita, E., Mwambarangwe, L., Braunstein, S., van de Wijgert, J., Sahabo, R., Justman, J., El-Sadr, W., & Sia, S. (2011). Microfluidics-based diagnostics of infectious diseases in the developing world Nature Medicine DOI: 10.1038/nm.2408

Seo, S., Isikman, S., Sencan, I., Mudanyali, O., Su, T., Bishara, W., Erlinger, A., & Ozcan, A. (2010). High-Throughput Lens-Free Blood Analysis on a Chip Analytical Chemistry, 82 (11), 4621-4627 DOI: 10.1021/ac1007915

Lionfish Derbies vs. Groupers

I love both diving and fishing, so the continuing saga of Pacific lionfish invading the Caribbean has definitely caught my attention. The backstory is that Pterois volitans and its cousin Pterois miles probably escaped from home aquarists’ tanks in Florida several years ago. A hurricane may have flooded someone’s house and washed the fish out, or (more likely) some hobbyists discovered that these big, venomous fish were more than they could handle, and “returned” them to the sea. However they got out, these critters quickly adapted to their new environment and started following the standard invasive species script: without the predators and pathogens that keep them in check in their home seas, they’ve bred like crazy.


Lionfish, with spines removed. Image courtesy Serge Melki.

Fisheries biologists are concerned, but not quite panicking yet. Just being prickly and venomous isn’t anything special in the Caribbean, and top predators such as reef sharks can eat lionfish, at least occasionally. Humans have also been chowing down, which is a particularly good strategy; we’ve proven repeatedly that we can overfish just about any species to the brink of extinction, so why not use that power for good?

Unfortunately, as a recent paper in PLoS ONE shows, even human predation may not do the job. The finding is based on mathematical modeling, so it comes with the usual caveat that simulations are not reality, but it provides some testable predictions that field scientists can now check.

Model results suggested that a high level of sustained removal would be required to reduce lionfish population sizes below the SPR threshold of recruitment overfishing. Scaling the annual exploitation rate to a lionfish per hectare removal figure based upon published data on lionfish density [7], [15], suggests a yearly removal of 157–293 lionfish per hectare would be required to cause recruitment overfishing for a population based on M and CR values of 0.5 and 15. Thus, the control of lionfish populations through targeted removal efforts will be costly, and eradication through removal efforts is highly unlikely.

A hectare is 10,000 square meters, or about 2.4 acres. One large dive boat could probably drop enough divers into the water to spear 200 lionfish over the course of a two-dive trip, but they’d all have to be serious underwater hunters to pull it off. And that would only take care of one hectare’s worth of fishing for one year. Even if that’s multiplied by hundreds of lionfishing boat trips per season in a popular diving destination, the ocean is way too big for us to take care of the whole job ourselves. The fishermen can’t pick up the slack, either:

Furthermore, such a lionfish fishery would be limited to shallow water (<30 m) spearfishing and handnetting as lionfish have a low vulnerability to capture by hook and line [7]. This gear and depth limitation provides potential refugia from fishing, potentially making removal efforts less effective. Lionfish are being captured regularly as bycatch in reef fish trap fisheries [7], but feasibility of a lionfish specific trap capable of removing high densities of lionfish without high bycatch of native species is questionable.
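To make the scale of the problem concrete, here’s a quick back-of-the-envelope calculation in Python. The per-hectare removal range comes from the model quoted above; the reef area and per-trip catch are my own rough assumptions.

```python
# Back-of-the-envelope: how many dive-boat trips would it take to hold
# lionfish below the recruitment-overfishing threshold? The removal rates
# are from Barbour et al.; the reef area and per-trip catch are guesses.
removal_per_ha = (157, 293)   # required lionfish removed per hectare per year
catch_per_trip = 200          # optimistic haul for one boatload of spearfishers
reef_area_ha = 10_000         # hypothetical management area (100 km^2 of reef)

for rate in removal_per_ha:
    trips = rate * reef_area_ha / catch_per_trip
    print(f"{rate}/ha/yr -> {trips:,.0f} boat trips per year")
# 157/ha/yr -> 7,850 boat trips per year
# 293/ha/yr -> 14,650 boat trips per year
```

Even with an optimistic 200 fish per trip, a modest 100 km² of reef would demand thousands of dedicated trips every year, indefinitely.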

There is one thing that could help: groupers. These diverse fish (several species in the subfamily Epinephelinae) are large predators that have traditionally been common throughout the Caribbean. They can grow to the size of small sharks, and they aren’t fussy eaters: if it swims and it’s smaller than the grouper, it’s potential grouper chow. Unsurprisingly, researchers have found lionfish in grouper stomachs. But how much lionfish does a grouper eat?

In another recent PLoS ONE paper, researchers took a crack at that question using a natural experiment: the Exuma Cays Land and Sea Park (ECLSP). Two decades ago, Bahamian officials declared this area of small islands and reefs a no-fishing zone. Since then, groupers, normally some of the most heavily fished species in the world, have become abundant inside the park. Comparing the populations of multiple fish species in the ECLSP and in nearby fishable waters, the scientists saw a striking trend:

The biomass of lionfish was significantly negatively correlated with the biomass of grouper, with predator biomass explaining 56% of the variance of prey biomass (linear regression p = 0.005, Fig. 2, Table 1). Unlike large-bodied groupers (mean total length 55 cm, range 30–110 cm), other smaller predatory fishes such as Cephalopholis spp., lutjanids, carangids and aulostomids had no significant bearing on lionfish biomass (p = 0.17, Table 1), which might imply that large-bodied fish are the primary predators of lionfish. The relationship of grouper on lionfish was strongly non-linear such that an 18-fold variation in predator biomass among sites (~170–3000 g 100 m⁻²) was related to a tenfold difference in lionfish density (~0.3–0.03 fish 100 m⁻²) and 7-fold difference in lionfish biomass (Fig. 2). A 50% reduction in lionfish biomass was achieved with a grouper biomass of 800 g 100 m⁻². Reducing lionfish density to 30% its highest value required a further doubling of grouper biomass to approximately 1516 g 100 m⁻² (Fig. 2). The mean body length of lionfish was 24.5 cm (SD 4.1, range 15–34 cm).

There are limitations to the study, of course. In particular, it doesn’t directly measure grouper predation on lionfish. All it really shows is that having lots of big groupers around correlates with having fewer lionfish, and that the relationship is nonlinear, i.e. you need a whole lot of groupers before you see a serious dent in the lionfish population. In any case, it strongly suggests that we should try to boost grouper populations elsewhere if we’re serious about getting rid of the lionfish.

That’s going to be tough, though. As I mentioned, grouper is heavily fished, for the good and simple reason that it’s delicious. Indeed, the data clearly show – and my own experience confirms – that big groupers are now uncommon outside protected marine reserves. The appropriate policy might be to protect more reefs from fishing, but the authors conclude with a blunt assessment of that strategy:

However, if the historical trend of poor management continues [25] then direct capture and eradication may be the only practicable form of lionfish control for much of the Caribbean.

And that brings us back to spearing them.

1. Mumby, P., Harborne, A., & Brumbaugh, D. (2011). Grouper as a Natural Biocontrol of Invasive Lionfish PLoS ONE, 6 (6) DOI: 10.1371/journal.pone.0021510

2. Barbour, A., Allen, M., Frazer, T., & Sherman, K. (2011). Evaluating the Potential Efficacy of Invasive Lionfish (Pterois volitans) Removals PLoS ONE, 6 (5) DOI: 10.1371/journal.pone.0019666

The Epidemic That Still Isn’t: Autism Rates and Case Definitions

A paper that came out yesterday in the American Journal of Psychiatry has generated a lot of press coverage of the “autism epidemic,” as it purportedly shows that one in every 38 South Korean children is autistic. That’s more than double the prevalence previous epidemiological studies have found. I have no doubt that this result, completely devoid of its larger context, will now be picked up by all manner of woo-woo peddlers as indisputable proof of whatever nonsense they’re selling. What the paper really shows, though, is how slightly different interpretations of case definitions can produce radically different results.

First, let’s dispense with the most obvious flaw in much of the news coverage of autism: there is no “epidemic.” As numerous epidemiologists have found every time they’ve looked at this issue, the apparent rise in the rate of autism spectrum disorder (ASD) tracks perfectly with changes in the diagnostic and special education criteria for the disease. It’s not rising, just being identified more often.

One particularly telling point is that the supposed increase in ASD doesn’t age well. Autism appears in childhood, but many folks seem to forget that children grow up, and the disease doesn’t go away when they do. That means that if ASD rates are genuinely increasing, there should be more autistic kids than autistic adults. But as a paper last week in Archives of General Psychiatry showed, there aren’t.

A Reuters story on that paper explained the key finding nicely:

Researchers found nearly one percent of Britons older than 16 years have autism, a rate that is similar to that seen in children. Younger people were no more likely to be affected than older ones, however, which would have been expected if the condition were truly on the increase.

“It was surprising to all of us,” said Dr. Traolach Brugha, a psychiatrist at the University of Leicester, who worked on the study. “If this study is correct, it does put a big question mark over the autism epidemic.”

Not so much a question mark as a stake through the heart. Brugha’s team found that the adults with ASD were less likely to have been diagnosed previously than kids with it, pretty much proving that the rising rates are due to increased diagnosis and awareness. Nor is Brugha’s report the first in this genre. For example, Orac at Respectful Insolence has an excellent post about a 2006 study that essentially proved the same point using a different method.

So case rates aren’t rising. But are they really only 1%? That’s what Kim et al. wanted to determine in the new South Korean study. I encourage everyone to read the whole paper, which Am J Psych has made available for free. It’s an impressive piece of work.

First, the researchers identified a South Korean community that was demographically diverse, and set out to sample all 7- to 12-year-old kids in that region. There were 55,266 of them. Before I start my criticism of this study, I have to congratulate the authors on the dedication, organization, and plain hard work that they put in to tackle such a huge job. As I said, it’s an impressive paper.

Kim’s team broke their sample into two groups: those who received special education or other psychiatric help (the “high-probability group”), and those who were in the general school population. That ensured that their sample wouldn’t be skewed by accidentally pulling too many kids from special ed. Between the two groups, they found a whopping 2.64% prevalence for ASD, or about 1 in 38 kids. That’s the conclusion that’s been plastered all over the headlines, and the one that I predict will launch a thousand quacks.

As often happens, though, the Abstract giveth while the Results taketh away. Let’s start with the autism rates in the high-probability group, which were, unsurprisingly, very high:

In the high-probability group (those in the Disability Register, those in special education schools, and those in regular schools who had psychiatric or psychological service use) 97 of 114 children were confirmed to have autistic disorder (N=74) or other ASDs (N=23). The high-probability group contributes 0.18% for any ASD to the total population prevalence (autistic disorder=0.13% and other ASDs=0.05%; the ratio of autistic disorder to other ASDs was 2.6:1).

I’ll parse that. In ASD, the middle letter is critical: autism occurs on a spectrum. Contrary to popular belief, not everyone with ASD is Rain Man. In fact, the cases depicted in movies are generally the extreme end of the spectrum, people with the biggest problems. Milder cases extend from there all the way through (according to many researchers) Asperger Syndrome. There are no bright lines dividing these folks. Indeed, there’s no reason to believe that there’s even a clear boundary between “sick” and “well” here. Extremely autistic individuals clearly need help, but what about high-functioning folks with Asperger’s? At what point does one separate someone with very mild ASD from someone who’s just socially awkward? Like most psychiatric case definitions, it’s murky at the edge.

What Kim et al. found in the special ed group was a lot of ASD, with the majority of diagnoses in the “autistic” category rather than in milder categories. These are the sickest kids. Presumably that’s why they’ve been singled out for special help. So far, so good.

The general population data get more problematic:

For 104 children with ASDs in the general-population sample, among the 172 assessed, the crude prevalence for any ASD was similar to that in the high-probability group (0.19%). However, the ratio of autistic disorder to other ASDs was reversed, with prevalences of 0.05% and 0.14%, respectively (ratio, 1:2.6) (Table 2). Other differences between the high-probability and general-population groups included the ratio of boys to girls, which was 5.1:1 in the high-probability group and 2.5:1 in the general-population sample (p=0.037) (Table 2). Mean performance IQ for individuals with any ASD was 75 (SD=28) in the high-probability group and 98 (SD=19) in the general-population sample (p<0.001)

Now we’re seeing a majority who are on the mild end of the spectrum. How mild? In some cases, quite. The surveys the researchers used to assess ASD, while standard in the field, suffer from the usual drawbacks of any psychiatric survey tool. Does your child have only a few friends? Prefer playing alone? Is he or she easily overwhelmed by extraneous stimuli? There’s lots of wiggle room on questions like these, and the closer you look the more likely you are to start pathologizing mere eccentricity.

Finally, there’s an extrapolation problem. As the numbers in the paragraphs above indicate, the researchers didn’t manage to get detailed diagnostic information on all 55,266 kids in the district. A lot of parents responded to the initial questionnaire, but many didn’t. Of the ones who did, many declined subsequent follow-ups. By the time we get to the most detailed tests, the investigators are down to hundreds rather than thousands of data points. Nonetheless, they extrapolate from this self-selected group to the entire population. If we assume that parents who had some concerns about their children were more likely to follow up with the study – a reasonable assumption – then the extrapolation fails.
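A toy calculation shows how badly self-selection can skew an extrapolated prevalence. Every number below is invented for illustration; only the logic parallels the study design.

```python
# Toy model of self-selection bias in a prevalence extrapolation.
# All numbers here are invented for illustration.
population = 50_000
true_cases = 500                  # true prevalence: 1%

# Suppose parents of affected kids are far more likely to complete
# the detailed follow-up assessments than parents of unaffected kids.
followup_rate_cases = 0.30        # 30% of affected kids get assessed
followup_rate_controls = 0.02     # 2% of unaffected kids get assessed

assessed_cases = true_cases * followup_rate_cases
assessed_total = assessed_cases + (population - true_cases) * followup_rate_controls

# Naive extrapolation treats the assessed group as a random sample.
apparent = assessed_cases / assessed_total
print(f"true prevalence: 1.0%, apparent prevalence: {apparent:.1%}")
# true prevalence: 1.0%, apparent prevalence: 13.2%
```

The real study’s corrections are far more careful than this caricature, but the direction of the bias is the same.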

I’m not saying the results are completely bogus. The researchers are aware of their tools’ limitations, and they take efforts to control for some of them. Still, we have to ask whether a child who’s enrolled in regular school programs, hasn’t been identified as sick by any of his or her teachers, and seems to be progressing just fine in life needs to be given a diagnosis. Maybe there are a lot of children who aren’t truly “normal,” but who are acting the part well enough to pass. On some level, doesn’t that describe us all?

Brugha, T., McManus, S., Bankart, J., Scott, F., Purdon, S., Smith, J., Bebbington, P., Jenkins, R., & Meltzer, H. (2011). Epidemiology of Autism Spectrum Disorders in Adults in the Community in England Archives of General Psychiatry, 68 (5), 459-465 DOI: 10.1001/archgenpsychiatry.2011.38

Kim, Y., Leventhal, B., Koh, Y., Fombonne, E., Laska, E., Lim, E., Cheon, K., Kim, S., Kim, Y., Lee, H., Song, D., & Grinker, R. (2011). Prevalence of Autism Spectrum Disorders in a Total Population Sample American Journal of Psychiatry DOI: 10.1176/appi.ajp.2011.10101532

Anti-Vaxxers Tax Pediatricians

In recent years, there’s been a steady flow of bullshit from a small but vocal group of people who oppose vaccination. This movement rests on a foundation of quackery, misinformation, and outright fraud, and it’s done real damage to real kids. There’s no question that vaccine refusal has become a public health catastrophe of the first order.

What wasn’t clear was what effect this antiscientific noise has on overworked, underpaid family physicians and pediatricians. These folks, who are the cornerstone of children’s health, operate mostly in small practices with even smaller profit margins. After paying their office staff, malpractice insurance, rent, and other expenses, many of them are barely staying afloat. Vaccination is a big part of their business, but it’s one of the lowest-paying services they offer. Some pediatricians actually lose money on every vaccine dose they deliver.

Now it seems that anti-vaccination rhetoric is making this bad situation even worse. According to a new study, parents who refuse vaccination for their children, or demand altered immunization schedules based on pseudoscience, cause the doctor to waste 10–20 minutes explaining why they’re wrong. That’s about half a patient visit’s worth of lost time, each time this happens.

Apparently it’s not a rare occurrence, either:

Overall, 8% of physicians reported that ≥10% of parents refused a vaccine and 20% reported that ≥10% of parents requested to spread out vaccines in a typical month.

So in a typical month, at least one-fifth of the family practitioners and pediatricians in the country are forced to waste time and money rebutting blatantly false claims that someone has piped into the heads of their patients. It’s bad enough that anti-vaccination nonsense endangers kids, but on top of that it drives up healthcare costs and decreases access for everyone.

The study has some hopeful data as well, though. More than half of the pediatricians surveyed require parents to sign a form if they refuse vaccination, which could encourage at least some of them to change their minds. I don’t know what the exact wording is on those forms, but I certainly hope they contain the phrase “parental malpractice” somewhere.

In addition, the authors found some of the strategies that work best in combating the misinformation. Personal statements about what the doctor would do (or did do) for his or her own children seem to be most effective in persuading vaccination-shy parents to do the right thing. In that spirit, I’ll reiterate what I’ve said elsewhere: my child is and will always be fully vaccinated. If yours isn’t, take seven minutes to watch this all the way through:

The video is disease-specific, but the concept is identical for any vaccine-preventable infection. These pathogens kill and maim, and you or your kid could be next.

Kempe, A., Daley, M., McCauley, M., Crane, L., Suh, C., Kennedy, A., Basket, M., Stokley, S., Dong, F., Babbel, C., Seewald, L., & Dickinson, L. (2011). Prevalence of Parental Concerns About Childhood Vaccines American Journal of Preventive Medicine, 40 (5), 548-555 DOI: 10.1016/j.amepre.2010.12.025

Have You Double-Blinded Your Dog Today?

It’s hard not to like dogs. Even if you don’t like them, you’d better say that you do. After all, what kind of person are you if you don’t like an animal that is so clearly and completely into humans? Canis lupus familiaris has been with us so long that we can’t help but stick up for it. As a result, I’m a bit hesitant to cast aspersions at the recent spate of feel-good stories about doggie superpowers, but perhaps I’ll get a pass if I frame it in terms of human error. I’ll also start with the good news.

As research news outlets mentioned a few weeks ago, two studies have underscored dogs’ apparent ability to detect human cancer by sniffing clinical samples. In one, researchers tested the ability of a trained black Labrador retriever to differentiate between the breath and stool samples of healthy controls and those of patients with colon cancer. The dog’s accuracy ranged from 95% for breath samples to 98% for stool.


Meanwhile, another group trained a Belgian Malinois shepherd to recognize urine samples from patients with prostate cancer. In that case, the canine sniff test was 91% accurate. Interestingly, the dog even managed to identify one case that the researchers weren’t aware of. After the animal indicated a positive result on one of the biopsy-negative samples, the team re-biopsied that patient, and discovered that he actually did have prostate cancer.

Given the invasiveness and risks of the standard diagnostic techniques for colon and prostate cancer, these “dog scans” could be a huge improvement. Colonoscopy requires a nasty bowel preparation procedure, anesthesia, and a long tube inserted into one’s nether regions. Prostate biopsy carries its own set of rare but potentially serious risks. Both procedures are expensive. Having a dog sniff samples of stool and urine could save money, time, and lives, right?

Unfortunately, it’s not that simple. As Sonoda et al. explain in the colon cancer paper:

It may be difficult to introduce canine scent judgement into clinical practice owing to the expense and time required for the dog trainer and for dog education. Scent ability and concentration vary between different dogs and also within the same dog on different days. Moreover, each dog can only conduct tests for a maximum of 10 years.

There’s another problem, too: the “Clever Hans” effect. A persistent bugbear of animal behavior research, the effect is named after a German horse that was initially thought to have amazing intellectual powers. As investigators later discovered, the horse was really just responding to subtle, unconscious cues from the people asking it questions. Hans wasn’t clever, just observant.

Animal behaviorists control for that by performing their experiments in double-blind fashion, coding their samples so that the experimenters themselves don’t know which is which until after the test, and therefore cannot bias the results. Both of the recent cancer studies used proper blinding and carefully controlled laboratory settings to address this, so we can trust their results. However, none of those controls are likely to be done in the real-world settings where service dogs usually work.
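The blinding itself is cheap to implement. Here’s a minimal sketch of one way samples might be coded so that whoever presents them to the dog can’t know their status; the scheme and labels are generic, not taken from either cancer study.

```python
# Minimal double-blind sample coding: the handler sees only random codes,
# and the key linking codes to disease status stays with a third party
# until after the dog has been scored. Generic scheme, not from the papers.
import random

samples = [("patient_01", "cancer"), ("patient_02", "control"),
           ("patient_03", "control"), ("patient_04", "cancer")]

random.shuffle(samples)  # randomize presentation order
key = {}                 # held by a third party, sealed during testing
blinded_order = []
for i, (sample_id, status) in enumerate(samples):
    code = f"S{i:03d}"
    key[code] = (sample_id, status)
    blinded_order.append(code)

print("present to handler:", blinded_order)  # handler and dog see codes only
# Only after every sample is scored does the third party reveal `key`.
```

The point is simply that the person in the room with the dog never holds the key.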

As Lit et al. recently demonstrated, service dogs are highly susceptible to the Clever Hans effect. In this study, the authors concealed various scented items in a series of rooms, then brought in 14 experienced teams of explosive-sniffing dogs and their handlers. In rooms where the researchers had placed a subtle visual cue indicating experimenter activity (a piece of construction paper taped to a cabinet), the dogs were much more likely to detect explosives, even when none were present. The most likely explanation is that the handlers noticed the paper and subconsciously cued the dogs that something was different.

The implications for drug- and bomb-sniffing dog teams are obvious. In the real-world settings in which those teams operate, the dogs are more likely to “alert” when doing so would confirm the biases of their handlers. If the handlers have preconceptions about the criminality of different ethnic, national, gender, or age groups, the dogs might engage in profiling by proxy.

Something similar could easily happen in high-volume clinical labs. Imagine a stressed-out medical laboratory scientist holding a sample cup for a dog to sniff. Based on the patient’s information in the computer, the technician may already have a bias about the likely outcome, and the dog, now sniffing its umpteenth sample in a long shift, would like nothing better than to please the human. The technician gets a coffee break, the dog gets a chew-toy, and a few days later Mr. Jones gets an unnecessary biopsy.

While doggie diagnosticians aren’t likely to work well, the new studies still provide a clue to what would. In both of the cancer studies, the dogs clearly noticed something different about the smells of patient samples compared to controls, so there must be volatile organic compounds (VOCs) in those samples that are unique to cancer. The next step is to identify those VOCs, and develop sensors that can detect them. The dogs may not be qualified to diagnose diseases on their own, but they did throw us a bone.

Sonoda, H., Kohnoe, S., Yamazato, T., Satoh, Y., Morizono, G., Shikata, K., Morita, M., Watanabe, A., Morita, M., Kakeji, Y., Inoue, F., & Maehara, Y. (2011). Colorectal cancer screening with odour material by canine scent detection Gut DOI: 10.1136/gut.2010.218305

Cornu, J., Cancel-Tassin, G., Ondet, V., Girardet, C., & Cussenot, O. (2011). Olfactory Detection of Prostate Cancer by Dogs Sniffing Urine: A Step Forward in Early Diagnosis European Urology, 59 (2), 197-201 DOI: 10.1016/j.eururo.2010.10.006

Lit, L., Schweitzer, J., & Oberbauer, A. (2011). Handler beliefs affect scent detection dog outcomes Animal Cognition DOI: 10.1007/s10071-010-0373-2