#GMOFAQ: How Bt corn and Roundup Ready soy work, and why they should not scare you

Background

Last week I wrote about the anti-science campaign being waged by opponents of the use of genetically modified organisms in agriculture. In that post, I promised to address a series of questions/fears about GMOs that seem to underlie people’s objections to the technology. I’m not going to try to make this a comprehensive reference site about GMOs and the literature on their use and safety. I’m compiling some general resources here, and a list of all FAQs here.

Question 2) Maybe GMOs aren’t automatically bad, but isn’t it obvious that it’s dangerous to consume crops that produce their own pesticides and can tolerate high doses of herbicides?

Approximately 90% of the soybeans, maize, cotton and sugar beets grown in the US have been genetically modified to produce a protein that kills common insect pests, to make them highly tolerant of an herbicide used to control weeds, or, in some cases, both. To make a rational judgment about whether these specific GMOs are good or bad, it’s important to understand exactly what they are and how they work.

Bt soy and corn

The insect-resistant plants have been engineered to produce a protein isolated from the bacterium Bacillus thuringiensis (known generally as Bt). Each strain of Bt produces a different version of the protein, known as Cry, each highly specific to a limited number of related insect species. Bt has evolved these proteins as a key part of a reproductive strategy in which they kill insects that ingest them and then eat nutrients released by the dying host. The Cry protein found in Bt spores must be activated by a protein-cleaving enzyme found in the host gut and then bind to a specific protein on the surface of cells in the digestive system, which Cry then destroys. Insects, which are not huge fans of this strategy, eventually evolve resistance by modifying one or both of these proteins. Bt strains that rely on this insect adapt in turn, creating highly specific strain-insect relationships.

The irony of Cry becoming a major bugaboo of the anti-GMO movement is that, until the gene that produces it was inserted into corn, it was the poster child of a “natural” insecticide, preferred over chemical pesticides because of its potential for extreme host specificity and complete biodegradability. Bt spores were sprayed on crops for decades, and are still widely used to control pests by organic farmers. But the effectiveness of Bt as an insecticide is limited because it degrades in a matter of days – more rapidly when it rains. This led agricultural biotechnology companies to insert Cry genes directly into the plants, and there are now many varieties on the market, each targeting pests that are a particular problem for a given crop (some varieties of Bt corn, for example, target the European corn borer).

Given what we know about Cry proteins, there is very little reason to be concerned about the safety of eating them. These are proteins that have evolved to kill insects – and not just insects in general, but very specific subsets of insects. And humans are not insects. Regulatory agencies in the US and Europe have consistently rejected claims that plants that produce their own Cry cause problems in either humans or farm animals.

Nonetheless, anti-GMO activists continually raise the spectre of “plants that make their own pesticide” as if this alone were sufficient reason not only to avoid them, but to ban them. Here is a banner running on the website of one of the organizations pushing the CA GMO-labeling initiative:

If you don’t know a lot about plants, I can see how this would seem threatening. But this picture and the anti-GMO campaign it accompanies are based on the flawed premise that “normal” plants are pesticide free. This could not be farther from the truth. Almost since they first appeared on Earth, plants have had to reckon with a diverse array of animals determined to eat them. And this is a battle that continues today, as anyone who has tried to garden, or wandered through a forest, can attest. To fight off these pests, plants have evolved a dizzying array of defense mechanisms, including the production of a diverse arsenal of chemicals targeted at the insects and other pests that afflict them.

As far as I know, natural pesticides have been found in every plant in which they have been sought, including all conventionally grown crops. Wheat makes a family of proteins lethal to Hessian flies, peas contain the insecticidal protein PA1b, tomatoes make tomatine, and so on. And even if the corn in that picture was not genetically modified, that cute little girl is about to get a mouthful of the insecticide maysin. Indeed, almost any mouthful of unprocessed plants from any source will likely contain some kind of natural pesticide that is inert in humans. There is nothing at all unusual, or particularly worrisome, about eating plants that contain the Bt Cry protein, as we’ve been eating insecticides for eons.

I’m sure some people will say that we may have been eating insecticides all along, but we haven’t been eating the Bt Cry protein and, under the “you never know” principle, should just avoid it. This would all be well and good if there weren’t strong evidence supporting the value of Bt corn and soy in reducing pesticide use on farms and limiting collateral damage to insects that are in the vicinity of, but not eating, the relevant crop. As a panel of the US National Academies of Science reported in a 2010 study of GMOs:

The evidence shows that the planting of GE crops has largely resulted in less adverse or equivalent effects on the farm environment compared with the conventional non-GE systems that GE crops replaced. A key improvement has been the change to pesticide regimens that apply less pesticide or that use pesticides with lower toxicity to the environment but that have more consistent efficacy than conventional pesticide regimens used on non-GE versions of the crops.

To me, the demonization of Bt in anti-GMO rhetoric is emblematic of everything that is wrong with the GMO debates. The producers of Bt crops have done a horrible job of explaining why plants expressing a single insecticidal protein should not – and do not – harm humans. And the anti-GMO advocates either have not bothered to understand the science behind their activism, or (worse) are cynically exploiting people’s fears of pesticides to promote their cause.

Glyphosate tolerant crops

The second major class of GMOs (mostly soy) has been engineered to tolerate the herbicide glyphosate (Roundup). Glyphosate is a small molecule that inhibits an enzyme, 5-enolpyruvylshikimate-3-phosphate synthase (EPSPS), which catalyzes an essential step in the biosynthesis of the amino acids phenylalanine, tyrosine and tryptophan. By denying rapidly growing plants these amino acids, it rapidly inhibits the growth of plants onto which it has been sprayed. Glyphosate is generally considered to be inert in humans, who get these amino acids from their food, and do not have an EPSPS.

The obvious problem with using glyphosate to control weeds is that, under normal circumstances, it will also kill crop plants. However, plants have been engineered to express an alternative form of EPSPS that functions normally even in the presence of glyphosate. These plants are thus “Roundup Ready”, and will survive doses of glyphosate used to kill weeds in the field.

Although the EPSPS gene used in Roundup Ready plants comes from a bacterium, the necessary changes could now easily be made to the plant’s own copy of EPSPS. Thus Roundup Ready crops, which produce no proteins that were not found in the plant prior to genetic manipulation, shouldn’t really be placed in the same class of GMOs as Bt-expressing plants, which produce a new protein. And there is absolutely no reason to expect that there are any health risks associated with eating the altered form of EPSPS found in glyphosate-resistant transgenic plants.

Concern about Roundup Ready plants focuses instead on the adverse effects of glyphosate on people and the environment. There are some suggestions that high doses of glyphosate are bad for humans, though these studies are hotly contested (note this was a Monsanto-funded study and must be assessed with that in mind). But the more important question is whether the use of glyphosate in conjunction with Roundup Ready crops is better for humans and the environment than the alternatives. Here, the aforementioned NRC report concluded that:

GE soybeans, corn, and cotton are designed to be resistant to the herbicide glyphosate, which has fewer adverse environmental effects compared with most other herbicides used to control weeds.

This does not argue that glyphosate is safe. However, it suggests that the net effect of the GMO (Roundup Ready) has been positive. There is a bigger discussion to be had about the role of herbicides in farming – but this is really orthogonal to issues of genetic modification.

NEXT: Question 3) Why should I trust the big companies that develop these crops? Didn’t it take years to realize PCBs, DDT, etc. were bad for us?

About me

I am a molecular biologist with a background in infectious diseases, cancer genomics, developmental biology, classical genetics, evolution and ecology. I am not a plant biologist, but I understand the underlying technology and relevant areas of biology. I would put myself firmly in the “pro GMO” camp, but I have absolutely nothing material to gain from this position. My lab is supported by the Howard Hughes Medical Institute, the National Institutes of Health and the National Science Foundation. I do not currently receive, have never in the past received, and do not plan in the future to receive, any personal or laboratory support from any company that makes or otherwise has a vested interest in GMOs. My vested interest here is science, and what I write here, I write to defend it.


#GMOFAQ: Transferring genes from one species to another is neither unnatural nor dangerous

Last week I wrote about the anti-science campaign being waged by opponents of the use of genetically modified organisms in agriculture. In that post, I promised to address a series of questions/fears about GMOs that seem to underlie people’s objections to the technology. I’m not going to try to make this a comprehensive reference site about GMOs and the literature on their use and safety (I’m compiling some good general resources here.)

I want to say a few things about myself too. I am a molecular biologist with a background in infectious diseases, cancer genomics, developmental biology, classical genetics, evolution and ecology. I am not a plant biologist, but I understand the underlying technology and relevant areas of biology. I would put myself firmly in the “pro GMO” camp, but I have absolutely nothing material to gain from this position. My lab is supported by the Howard Hughes Medical Institute, the National Institutes of Health and the National Science Foundation. I do not currently receive, have never in the past received, and do not plan in the future to receive, any personal or laboratory support from any company that makes or otherwise has a vested interest in GMOs. My vested interest here is science, and what I write here, I write to defend it.

So, without further ado:

Question 1) Isn’t transferring genes from one species to another unnatural and intrinsically dangerous?

The most striking thing about the GMO debate is the extent to which it contrasts “unnatural” GMOs against “natural” traditional agriculture, and the way that anti-GMO campaigners equate “natural” with “safe and good”. I’ll deal with these in turn.

The problem with the unnatural/natural contrast is not that it’s a mischaracterization of GMOs – they are unnatural in the strict sense of not occurring in Nature – but rather that it reflects a frighteningly naive view of traditional agriculture.

Far from being natural, the transformation of wild plants and animals into the foods we eat today is – by far – the single most dramatic experiment in genetic engineering the human species has undertaken. Few of the species we eat today look anything like their wild counterparts, the result of thousands of years of largely willful selective breeding to optimize these organisms for agriculture and human consumption. And, in the past few years, as we have begun to characterize the genetic makeup of crops and farm animals, we are getting a clear picture of the extent to which traditional agricultural practices have transformed their DNA.

Let’s take a few examples. This is a Mexican grass known as teosinte and its seed.

Thousands of years of selection transformed this relatively nondescript plant into one of the mainstays of modern agriculture – corn. The picture below – which shows the seeds of teosinte on the left, and an ear of modern corn on the right – gives a pretty good sense of the scope of change involved in the domestication and improvement for agriculture of teosinte.

Thanks to the pioneering work of geneticist John Doebley, and more recently of an international consortium that has sequenced the genome of maize and characterized genetic variation in teosinte and maize, we now have a good picture of just what happened to the DNA of teosinte to accomplish the changes in the structure of the plant and its seed: a recent paper that characterized the DNA of 75 teosinte and maize lines identified hundreds of variants that appear to have been selected during the process of domestication. And maize is not weird in this regard – virtually all agriculturally important plants have a similar story of transformation from wild ancestors, as generations of farmers adapted them to be easier to grow, safer to eat, more nutritious, resistant to pests and other stresses, and tastier.

For most of history this crop domestication and improvement has been a largely blind process, with breeders crossing individuals with desired traits and selecting the offspring who have inherited them until they breed true – unaware of the molecular changes underlying these traits and of other changes to the plants that may have accompanied them.

Modern genetics has fundamentally altered this reality. It has increased the power breeders have to select for desirable traits using traditional methods, and makes it far easier to ensure that undesirable traits have not come along for the ride. It also gives us the ability to engineer these changes directly by transferring just the DNA that confers a trait from one individual in a species to another. There are many ways to accomplish this – the most common involves extracting the DNA you want to transfer from the donor, placing it into a bacterium whose natural life cycle involves inserting its DNA into that of its host, and then infecting the target individual with this bacterium. But recently developed technologies make it possible to effectively edit the genome in a computer and then make the desired changes in the living organism.

When applied to transfer genetic information from one individual in a species to another, this is an intrinsically conservative form of crop improvement, since it all but eliminates the random genetic events that accompany even the most controlled breeding experiment.

The only difference between this and the generation of GMOs is that the transferred DNA comes not from a member of the same species, but from somewhere else on the tree of life. I understand why some people see this as a big difference, but modern molecular biology has shown us that all living things share a remarkably similar molecular toolkit, with the distinct properties of each species coming more from how these pieces are wired together than from the pieces themselves.

Transferring a gene from a fish into a plant does not make the plant swim any more than stealing the radio from someone’s Maserati and putting it into my Honda Civic would turn it into a high-performance sports car. Indeed, scientists routinely use genes from mice, fungi, plants and even bacteria to substitute for their human counterparts, and vice-versa – which they often do perfectly.

And this doesn’t just happen in the lab. There are countless examples of genes moving naturally between species. Microorganisms swap DNA all the time – this is how antibiotic resistance spreads so quickly between species. Our own genome contains genes that got their start in bacteria and were subsequently taken up by one of our ancestors.

The relatively low rate of such “horizontal gene transfer” in multicellular organisms like plants and animals compared to bacteria is more a reflection of reproductive barriers and the defenses they have evolved to prevent viruses from hitchhiking in their DNA, than from a fundamental molecular incompatibility between species.

This is why I do not find the process of making GMOs unnatural or dangerous – certainly no more so than traditional breeding. And why I find the obsession with, and fearmongering about, GMOs to be so bizarre and irrational.

Of course the fact that making GMOs is not inherently dangerous does not mean that every GMO is automatically safe. I can think of dozens of ways that inserting a single gene into, say, soybeans could make them lethal to eat. But it would be because of what was inserted into them, not how it was done.

For what it’s worth, it would also be relatively easy to make crop plants dangerous to eat by strictly non-GM techniques. Essentially all plants make molecules that help them fight off insects and other pests. In the foods we eat regularly, these molecules are present at sufficiently low levels that they no longer constitute a threat to the humans eating them. But it is likely that the production of these molecules could be ramped up by crossing crop varieties with wild stocks, or by introducing new mutations, and selecting for toxicity, much as one would do for any other trait. Indeed, there have been reports of potatoes that produce toxic levels of solanines and celery that produces unhealthy amounts of psoralens, both chemicals present at low levels in the crops. Which segues nicely into the next topic.

NEXT: Question 2) Maybe GMOs aren’t automatically bad, but isn’t it obvious that it’s dangerous to consume crops that produce their own pesticides and can tolerate high doses of herbicides?

 


How President Obama could really lead on open access

[The Washington Post ran a nice op-ed today from two student leaders linked to the recent public access petition campaign. I had submitted one that urges the administration to take a more aggressive stance, which I am posting here.]

Last weekend, a “We the People” petition calling on the Obama administration to provide free access to the results of taxpayer-sponsored research passed the 25,000 signature threshold that earns a White House response.

The petition advocates an extension to all government agencies of a policy already in place at the National Institutes of Health that requires papers arising from research they fund to be made available within a year of publication. But truly unleashing the potential of information technology to convey new discoveries to the public and accelerate the rate at which they are made will require bolder action from the President.

People new to this issue will rightly ask why, in this era of Internet-enabled open government, the results of federally funded research are not automatically made available to the public already. The answer is that most research journals still charge the same hefty subscription fees for electronic access that they did for printed copies, which effectively limits access to those affiliated with large universities and other institutions that can afford them.

The NIH public access policy was carefully crafted to avoid challenging this status quo. Rather than requiring that newly published works be made available immediately, it permits a delay of up to one year, giving journals a period of exclusivity during which would-be readers must still pay for access. The public, and many researchers, are thus cut off from the most recent – and important – discoveries.

Rather than take the more cautious step suggested by the petition, President Obama should address this critical flaw by crafting a public access policy for all federal agencies with these features:

First, all federal research grants should come with an unambiguous requirement that any papers arising from the funded work be made immediately and freely available. Grants already come with a dizzying array of requirements for grantees – this would be a painless new one.

Second, federal agencies should stop channeling billions of dollars to libraries to cover subscription costs, and use the money to foster alternative business models. Many American businesses would salivate at the prospect of receiving billions of dollars to publish the few hundred thousand papers a year arising from government funded research.

Finally, the National Library of Medicine should be expanded into a National Library of Science, Medicine and Technology, and charged not only with making the works of government funded research available to the public, but also with helping researchers and entrepreneurs develop new tools for navigating and mining the entire scientific literature in ways that will speed the research process itself.

Few would object to these steps. Many in the public would rejoice at having unfettered access to the research their tax dollars paid to create. Scientists would welcome the opportunity for their work to be seen as widely and as quickly as possible. Publishers would value the clarity of knowing that there was a stable source of revenue for their businesses. And entrepreneurs would salivate at the prospect of having a digital repository of scientific knowledge as the foundation for a new wave of innovation.

A similar proposal was put forward in 1999 by former NIH Director Harold Varmus. Unfortunately, neither the scientific community nor the business community was ready for what, at that time, seemed like a radical transformation. But a lot has changed in the ensuing years, and, as the petition demonstrates, keeping publicly funded research hidden from public view is now the radical act.

Scientific and medical research in the United States is one of the great public endeavors of all time. President Obama has the opportunity to complete the movement to make its output one of the great public resources.

Michael B. Eisen, Ph.D. is an associate professor of molecular and cell biology at the University of California, Berkeley and co-founder of the Public Library of Science, a San Francisco based non-profit publisher of open access scientific journals.


Blog’s back


The anti-GMO campaign’s dangerous war on science

This November, Californians will vote on an initiative that would require any food containing ingredients derived from genetically modified crops to be labeled as such.

Backers of the “California Right To Know Genetically Engineered Food Act” are pitching it as a matter of providing information to consumers, who, they argue, “have a right to know what’s in the food we buy and eat and feed our children, just as we have the right to know how many calories are in our food, or whether food comes from other countries like Mexico or China.”

I have no concerns about the safety of GMOs. But I support the right of people to make choices about what they eat, and think we should provide them with the information they need to do so.

I understand where some of the nervousness about GMOs comes from. I worry about the uncontrolled chemical experiments our species is doing on our bodies, and am a big consumer of organic foods. I am also skeptical when industries assert that their products are safe, because so often these claims have turned out to be false.

But I also appreciate the challenges of feeding our growing population, and believe in the power of biotechnology to not just make agriculture more efficient, but to make it better for people and the planet. And as a molecular biologist very familiar with the technology of genetic modification and the research into its safety, I do not find it in the least bit frightening.

What I do find frightening, however, is the way backers of this initiative have turned a campaign for consumer choice into a crusade against GMOs. They don’t want the “genetically engineered” label to merely provide information. They want it to be a warning – the equivalent for GM food of the cancer warning on cigarette boxes.

The problem is there is no justification for a warning. There is no compelling evidence of any harm arising from eating GMOs, and a diverse and convincing body of research demonstrating that GMOs are safe. But rather than reckon with this reality, anti-GMO campaigners have joined their climate-change-denying brethren and launched an aggressive war on science.

Opponents of GMOs are so sure that GMOs are dangerous that any study suggesting they are safe must have been funded by Monsanto, and any scientist pointing out the holes in their arguments must be an industry shill. In the anti-GMO universe, it often seems that the best evidence that something must be true is the existence of multiple experiments showing it is false.

The language of the initiative itself contains clear misstatements of scientific consensus. For example, one of the “Findings and Declarations” states:

Government scientists have stated that the artificial insertion of DNA into plants, a technique unique to genetic engineering, can cause a variety of significant problems with plant foods. Such genetic engineering can increase the levels of known toxicants in foods and introduce new toxicants and health concerns.

While I’m sure they have a reference that justifies their making this assertion, the reality is that US and EU government scientists have repeatedly and consistently demonstrated that GMOs are safe. For the backers of the initiative to claim otherwise as a finding of fact is an outright lie, and an outlandish attack on science.

If this initiative passes it will reify the war on science, and deal another body blow to the idea, already reeling from the climate change debate, that public policy should be based on good data and solid reasoning. It MUST be stopped.

The question of course is how. I am so infuriated with rhetoric from backers of the labeling campaign that I’m tempted to just sit here ridiculing the egregious misinformation, bad science, pseudoscience and non-science that they traffic in. This would make me less angry. But it wouldn’t be productive.

I suspect the most zealous opponents of GMOs are not open to being convinced. But, polls show, the bulk of the electorate doesn’t know a lot about this issue and probably comes into the debate inclined both to support labeling and to harbor vague fears about GMOs. So I am going to suppress my fury, be constructive, and address these fears with the only tool at my disposal – science.

Following her article in the NYT on the labeling debate, Amy Harmon posted a series of “GMO FAQs” on Twitter, distilled from the ~500 comments posted following her story. These seem to capture a good chunk of the fears people have about genetic modification and GMO foods. And so, over the next several days, I am going to answer each of those questions, as well as a few of my own. Check back here (I’ll add the answers below as I get to them), or follow me on Twitter. And please participate in the discussion, and feel free to pose any more questions!

GMO FAQ (v2.0)

Question 1) Isn’t transferring genes from one species to another unnatural and intrinsically dangerous

The most striking thing about the GMO debate is the extent to which it contrasts “unnatural” GMOs against “natural” traditional agriculture, and the way that anti-GMO campaigners equate “natural” with “safe and good”. I’ll deal with these in turn.

The problem with the unnatural/natural contrast is not that it’s a mischaracterization of GMOs – they are unnatural in the strict sense of not occurring in Nature – rather that it is a frighteningly naive view of traditional agriculture.

Far from being natural, the transformation of wild plants and animals into the foods we eat today is – by far – the single most dramatic experiment in genetic engineering the human species has undertaken. Few of the species we eat today look anything like their wild counterparts, the result of thousands of years of largely willful selective breeding to optimize these organisms for agriculture and human consumption. And, in the past few years, as we have begun to characterize the genetic makeup of crops and farm animals, we are getting a clear picture of the extent to which traditional agricultural practices have transformed their DNA.

Let’s take a few examples. This is a Mexican grass known as teosinte and its seed.

Thousands of years of selection transformed this relatively nondescript plant into one of the mainstays of modern agriculture – corn. The picture below – which shows the seeds of teosinte on the left, and an ear of modern corn on the right – gives a pretty good sense of the scope of change involved in the domestication and improvement for agriculture of teosinte.

Thanks to the pioneering work of geneticist John Doebley, and more recently an international consortium who have sequenced the genome of maize and characterized genetic variation in teosinte and maize, we now have a good picture of just what happened to the DNA of teosinte to accomplish the changes in the structure of the plant and its seed: a recent paper that characterized the DNA of 75 teosinte and maize lines identified hundreds of variants that appear to have been selected during the process of domestication. And maize is not weird in this regard – virtually all agriculturally important plants have a similar story of transformation from wild ancestors as generations of farmers adapted them to be easier to grow, safer to eat, more nutritious, resistant to pests and other stresses, and tastier.

For most of history this crop domestication and improvement has been a largely blind process, with breeders selecting crossing individuals with desired traits and selecting the offspring who have inherited them until they breed true – unaware of the molecular changes underlying these traits and other changes to the plants that may have accompanied them.

Modern genetics has fundamentally altered this reality. It has increased the power breeders have to select for desirable traits using traditional methods, and makes it far easier ensure that undesirable have not come along for the ride. And it also gives us the ability to engineer these changes directly by transferring just the DNA that confers a trait from one individual in a species to another. There are many ways to accomplish this – the most common involves extracting the DNA you want to transfer from the donor, placing it into a bacterium whose natural life-cycle involves inserting its DNA into that of its host, and then infecting the target individual with this bacterium. But recently developed technologies make it possible to effectively edit the genome in a computer and then make the desired changes in the living organism.

When applied to transfer genetic information from one individual in a species to another, this is an intrinsically conservative form of  crop improvement around since is all but eliminates the random genetic events that accompany even the most controlled breeding experiment.

The only difference between this and the generation of GMOs is that the transfered DNA comes not from a member of the same species, but from somewhere else on the tree of life. I understand why some people see this is a big difference, but modern molecular biology has shown us that all living things share a remarkably similar molecular toolkit, with the distinct properties of each species coming more from how these pieces are wired together than which ones are where.

Transferring a gene from a fish into a plant does not make the plant swim any more than stealing the radio from someone’s Maserati and putting it into my Honda Civic would turn it into a high-performance sports car. Indeed, scientists routinely use genes from mice, fungi, plants and even bacteria to substitute for their human counterparts, and vice versa – and these substitutes often work perfectly.

And this doesn’t just happen in the lab. There are countless examples of genes moving naturally between species. Microorganisms swap DNA all the time – this is how antibiotic resistance spreads so quickly between species. Our own genome contains genes that got their start in bacteria and were subsequently taken up by one of our ancestors.

The relatively low rate of such “horizontal gene transfer” in multicellular organisms like plants and animals compared to bacteria is more a reflection of reproductive barriers and the defenses they have evolved to prevent viruses from hitchhiking in their DNA, than from a fundamental molecular incompatibility between species.

This is why I do not find the process of making GMOs unnatural or dangerous – certainly no more so than traditional breeding. And why I find the obsession with, and fearmongering about, GMOs to be so bizarre and irrational.

Of course the fact that making GMOs is not inherently dangerous does not mean that every GMO is automatically safe. I can think of dozens of ways that inserting a single gene into, say, soybeans could make them lethal to eat. But it would be because of what was inserted into them, not how it was done.

For what it’s worth, it would also be relatively easy to make crop plants dangerous to eat by strictly non-GM techniques. Essentially all plants make molecules that help them fight off insects and other pests. In the foods we eat regularly, these molecules are present at sufficiently low levels that they no longer constitute a threat to the humans eating them. But it is likely that the production of these molecules could be ramped up by crossing crop varieties with wild stocks, or by introducing new mutations, and selecting for toxicity, much as one would do for any other trait. Indeed, there have been reports of potatoes that produce toxic levels of solanines and celery that produces unhealthy amounts of psoralens, both chemicals present at low levels in the normal crops. Which segues nicely into the next topic.

The irony of Cry becoming a major bugaboo of the anti-GMO movement is that, until the gene that produces it was inserted into corn, it was the poster child for a “natural” insecticide, preferred over chemical pesticides because of its potential for extreme host specificity and complete biodegradability. Bt spores were sprayed on crops for decades, and are still widely used to control pests by organic farmers. But the effectiveness of Bt as an insecticide is limited because it degrades in a matter of days – more rapidly when it rains. This led agricultural biotechnology companies to insert Cry genes directly into the plants, and there are now many varieties on the market, each targeting pests that are a particular problem for a given crop (some varieties of Bt corn, for example, target the European corn borer).
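To put rough numbers on that, here is a back-of-the-envelope sketch of first-order decay. The two-day half-life is an assumed, illustrative figure (field half-lives of a few days are commonly cited for sprayed Bt), so the exact percentages shouldn’t be taken literally – the point is how fast exponential decay erodes protection between sprayings.

```python
def remaining_fraction(days, half_life_days=2.0):
    """Fraction of sprayed Cry protein still active after `days`,
    assuming simple first-order (exponential) decay.
    The 2-day half-life is an illustrative assumption, not a measured value."""
    return 0.5 ** (days / half_life_days)

for d in [0, 2, 4, 7, 14]:
    print(f"day {d:2d}: {remaining_fraction(d):6.1%} of the spray still active")
```

Under these assumptions less than one percent of a spraying survives two weeks – which is why a plant that continuously makes its own Cry was such an attractive alternative to repeated applications.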

Given what we know about Cry proteins, there is very little reason to be concerned about the safety of eating them. These are proteins that have evolved to kill insects – and not just insects in general, but very specific subsets of insects. And humans are not insects. Regulatory agencies in the US and Europe have consistently rejected claims that plants that produce their own Cry cause problems in either humans or farm animals.

Nonetheless, anti-GMO activists continually raise the spectre of “plants that make their own pesticide” as if this alone were sufficient reason not only to avoid them, but to ban them. Here is a banner running on the website of one of the organizations pushing the CA GMO-labeling initiative:

If you don’t know a lot about plants, I can see how this would seem threatening. But this picture and the anti-GMO campaign it accompanies are based on the flawed premise that “normal” plants are pesticide free. This could not be farther from the truth. Almost since they first appeared on Earth, plants have had to reckon with a diverse array of animals determined to eat them. And this is a battle that continues today, as anyone who has tried to garden, or wandered through a forest, can attest. To fight off these pests, plants have evolved a dizzying array of defense mechanisms, including the production of a diverse arsenal of chemicals targeted at the insects and other pests that afflict them.

As far as I know, natural pesticides have been found in every plant in which they have been sought, including all conventionally grown crops. Wheat makes a family of proteins lethal to hessian flies, peas contain the insecticidal protein PA1b, tomatoes tomatine, and so on. And even if the corn in that picture was not genetically modified, that cute little girl is about to get a mouthful of the insecticide maysin. Indeed almost any mouthful of unprocessed plants from any source will likely contain some kind of natural pesticide that is inert in humans. There is nothing at all unusual, or particularly worrisome, about eating plants that contain the Bt Cry protein as we’ve been eating insecticides for eons.

I’m sure some people will say that we may have been eating insecticides all along, but we haven’t been eating Bt Cry protein and, under the “you never know” principle, should just avoid it. This would all be fine and good if there weren’t strong evidence supporting the value of Bt corn and soy in reducing pesticide use on farms and limiting collateral damage to insects that are in the vicinity of, but not eating, the relevant crop. As a panel of the US National Academies of Science reported in a 2010 study of GMOs:

The evidence shows that the planting of GE crops has largely resulted in less adverse or equivalent effects on the farm environment compared with the conventional non-GE systems that GE crops replaced. A key  improvement has been the change to pesticide regimens that apply less pesticide or that use pesticides with lower toxicity to the environment but that have more consistent efficacy than conventional pesticide regimens used on non-GE versions of the crops.

To me, the demonization of Bt in anti-GMO rhetoric is emblematic of everything that is wrong with the GMO debates. The producers of Bt crops have done a horrible job of explaining why plants expressing a single insecticidal protein should not – and do not – harm humans. And the anti-GMO advocates either have not bothered to understand the science behind their activity, or (worse) are cynically exploiting peoples’ fears of pesticides to promote their cause.

Glyphosate tolerant crops

The second major class of GMOs (mostly soy) have been engineered to be tolerant of the herbicide glyphosate (Roundup). Glyphosate is a small molecule that inhibits an enzyme, 5-enolpyruvylshikimate-3-phosphate synthase (EPSPS), which catalyzes an essential step in the biosynthesis of the amino acids phenylalanine, tyrosine and tryptophan. By denying rapidly growing plants these amino acids, it rapidly inhibits the growth of plants onto which it has been sprayed. Glyphosate is generally considered to be inert in humans, who get these amino acids from their food, and do not have an EPSPS.
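The kinetics behind this can be sketched with the textbook Michaelis–Menten equation for a competitive inhibitor. This is a simplification of glyphosate’s actual mechanism, and the Vmax, Km and Ki values below are arbitrary illustrative constants, not measured EPSPS parameters – the sketch only shows the qualitative point that raising inhibitor concentration throttles the enzyme.

```python
def reaction_rate(s, inhibitor=0.0, vmax=1.0, km=1.0, ki=0.5):
    """Michaelis-Menten rate with a competitive inhibitor:
        v = Vmax * S / (Km * (1 + I/Ki) + S)
    All constants here are illustrative, not measured EPSPS values."""
    return vmax * s / (km * (1.0 + inhibitor / ki) + s)

substrate = 1.0  # fixed substrate concentration, arbitrary units
for glyphosate in [0.0, 0.5, 2.0, 10.0]:
    v = reaction_rate(substrate, glyphosate)
    print(f"[inhibitor]={glyphosate:4.1f} -> relative EPSPS rate {v:.3f}")
```

Because the plant cannot route around the blocked step, shutting down EPSPS starves it of three amino acids at once. Roughly speaking, a glyphosate-insensitive version of the enzyme behaves as if it had a much larger Ki, so the same dose barely moves its rate – which is the trick behind the tolerant crops described next.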

The obvious problem with using glyphosate to control weeds is that it will, under normal circumstances, also kill crop plants. However, plants can be engineered to express an alternative form of EPSPS that functions normally even in the presence of glyphosate. These plants are thus “Roundup Ready”, and will survive doses of glyphosate used to kill weeds in the field.

Although the EPSPS gene used in Roundup Ready plants comes from a bacterium, the necessary changes could now easily be made to the plant’s own copy of EPSPS. Thus Roundup Ready crops, which produce no proteins that were not found prior to genetic manipulation, shouldn’t really be placed in the same class of GMOs as Bt-expressing plants, which produce a new protein. And there is absolutely no reason to expect that there are any health risks associated with eating the altered form of EPSPS found in glyphosate-resistant transgenic plants.

Concern about Roundup Ready plants focuses instead on the adverse effect of glyphosate on people and the environment. There are some suggestions that high doses of glyphosate are bad for humans, though these studies are hotly contested (note this was a Monsanto-funded study and must be assessed with that in mind). But the more important question is whether the use of glyphosate in conjunction with Roundup Ready crops is better for humans and the environment than the alternatives. Here, the aforementioned NRC report concluded that:

GE soybeans, corn, and cotton are designed to be resistant to the herbicide glyphosate, which has fewer adverse environmental effects compared with most other herbicides used to control weeds.

This does not argue that glyphosate is safe. However, it suggests that the net effect of the GMO – Roundup Ready crops – has been positive. There is a bigger discussion to be had about the role of herbicides in farming – but this is really orthogonal to issues of genetic modification.

3) Why should I believe GM food is safe? Why should I trust the big companies that develop these crops? Didn’t it take years to realize PCBs, DDT, ‘good’ cholesterol, etc. were bad for us?

4) What about studies that show GM foods cause allergies, destroy organs and make mice sterile? 

5) Won’t GM crops escape and contaminate non-GMO crops (and maybe the planet)?

6) GM crops initially reduced spraying. But now we have resistant weeds and insects. Aren’t we on a ‘pesticide treadmill’?

7) Don’t GMOs destroy biodiversity?

8) Don’t GMOs undermine local agriculture in the developing world?

9) Aren’t Monsanto’s business practices enough to want to boycott GMOs?

About me

I am a molecular biologist with a background in infectious diseases, cancer genomics, developmental biology, classical genetics, evolution and ecology. I am not a plant biologist, but I understand the underlying technology and relevant areas of biology. I would put myself firmly in the “pro GMO” camp, but I have absolutely nothing material to gain from this position. My lab is supported by the Howard Hughes Medical Institute, the National Institutes of Health and the National Science Foundation. I do not currently receive, have never in the past received, and do not plan in the future to receive, any personal or laboratory support from any company that makes or otherwise has a vested interest in GMOs. My vested interest here is science, and what I write here, I write to defend it.

 

Posted in GMO, science and politics, war on science | Comments closed

The triumph of fake open access

It’s been a heady day for “open access”. A petition urging the Obama administration to extend the NIH’s public access policy to other government agencies blew past the halfway point in its goal to gather 25,000 signatures. And the faculty senate at UCSF voted to approve an “open access” policy that would “require” its faculty to make all of their papers freely available.

Both of these are important steps in the long-running push for open access. But amidst the giddy triumphalism surrounding these events on blogs and twitter, an important point is being ignored: neither of these is really an “open access” policy, and treating them as if they are trivializes their shortcomings in critical areas and risks people declaring premature victory when there is still much more left to be done.

I’ll start with the White House petition. I love that this is happening, and that it is gathering signatures rapidly. But the best outcome we can hope for is what the petition calls for – implementation of public access policies like the NIH’s at other federal agencies. This would obviously be a good thing. And having the administration come out in favor of public access establishes an important principle – that the public has a stake in how the results of the research it funds are disseminated.

But the NIH policy is very, very far from true open access. First, it is not immediate – authors can (and to publish in many journals must) delay free access to their articles for up to a year. And second, it provides access in only the narrowest of senses – the ability to read an article. Most of the articles made available under the NIH policy cannot be redistributed, and, more crucially, their availability to the community for use in text mining or other forms of reuse is unclear, and probably limited. And if you don’t think this matters, read this great article in the Guardian today about the negative consequences of the current roadblocks to data mining the contents of the scientific literature.

So what the petition is really about is pushing for an expansion of delayed free access to the scientific literature. A strategic victory for sure, and maybe an important one. But not open access.

The newly approved UCSF policy suffers from a different problem. As I’ve written about before, the policy (which is being considered by academic senates at all ten UC campuses) contains an opt out clause:

The University of California will waive application of the license for a particular article or delay access for a specified period of time upon express direction by a Faculty member.

The UC faculty who drafted the policy included this clause because, they claimed (I assume correctly), the policy would not pass without it. But if a majority of faculty would have voted against this policy without the opt out, then one has to assume that most of them intend to continue publishing in journals that will not allow them to make their work available through a UC archive or the equivalent. So the UCSF policy isn’t an open access mandate. It’s an open access option – an option UCSF faculty already have and of which they largely do not avail themselves. Will a faculty senate resolution change the choices people make about where to publish? Maybe at the margins, but it’s hard to imagine it would have a profound effect.

Again, I’m not saying the UCSF vote is a bad thing – it’s great, and I will vote for a similar policy at UCB (although I hope to amend it to strike the opt out clause). The policy does have some things I really like – most notably a focus on data mining. But it’s a largely symbolic policy – one that is unlikely to significantly increase the number of freely available articles, at least in the near term.

So, by all means celebrate the important achievements of the day. But try to refrain from calling it the “open access petition” or the “UCSF open access policy”. And make it clear to anyone who’s listening that, as much as you support these two acts, they are both compromises whose limitations must ultimately give way to true open access.

 

 

Posted in open access, science | Comments closed

What the UC “open access” policy should say

The joint faculty senate of the ten campuses of the University of California has floated a trial balloon “open access” policy. I, of course, laud the effort to move the ball forward on open access, but the proposed policy falls short in two key ways.

1) The rights reserved by the University are too limited. Rather than granting UC the right to redistribute the article, the policy should place all scholarly works produced by UC faculty under a Creative Commons Attribution License.

2) There should be no “opt-out” provision.

Here is my edited version of the proposal (my additions are in green):

The Faculty of The University of California is committed to disseminating its research and scholarship as widely as possible. In keeping with that commitment, the Faculty adopts the following policy:

Each Faculty member grants to The University of California permission to make available his or her scholarly articles and to exercise the copyright in those articles.

More specifically, each Faculty member grants to The California Digital Library, acting on behalf of the Regents of the University of California, a nonexclusive,  irrevocable, worldwide license to exercise any and all rights under copyright relating to each of his or her scholarly articles, in any medium, provided that the articles are not sold for a profit, and to authorize others to do the same.

Each Faculty member will provide a copy of each of their scholarly articles, in any medium, to  The California Digital Library, acting on behalf of the Regents of the University of California, who will make these articles available under a Creative Commons Attribution License that grants any users the right to redistribute and reuse the work provided that proper credit for its authorship is retained.

The policy applies to all scholarly articles authored or co-authored while the person is a member of the Faculty except for any articles completed before the adoption of this policy and any articles for which the Faculty member entered into an incompatible licensing or assignment agreement before the adoption of this policy.

Following adoption of this policy, faculty will not enter into any agreement with any third party that limits their ability to comply with this policy.

The University of California will waive application of the license for a particular article or delay access for a specified period of time upon express direction by a Faculty member.

Each Faculty member will provide an electronic copy of the author’s final version of each article no later than the date of its publication at no charge to the California Digital Library, or its successors, in an appropriate format. The California Digital Library, or its successors, will make the article available in an open access repository immediately upon, or, with author consent, prior to, its formal publication.

The California Digital Library, or its successors, will be responsible for interpreting this policy, resolving disputes concerning its interpretation and application, and recommending changes to the Faculty from time to time.

The policy will be reviewed after three years and a report presented to the Faculty.

Of course getting the faculty to agree to this would be great, but there is a greater need to get the administration to engage in the issue of how the work of its faculty should be communicated – something on which they, like virtually all other universities, have a very poor track record.

Posted in open access, PLoS, publishing | Comments closed

20 years of cowardice: the pathetic response of American universities to the crisis in scholarly publishing

When Harvard University says it can not afford something, people notice. So it was last month when a faculty committee examining the future of the university’s libraries declared that the continued growth of journal subscription fees was unsustainable, even for them. The accompanying calls for faculty action are being hailed as a major challenge to the traditional publishers of scholarly journals.

Would that it were so. Rather than being a watershed event in the movement to reform scholarly publishing, the tepidness of the committee’s recommendations, and the silence of the university’s administration, are just the latest manifestation of the toothless response of American universities to the “serials crisis” that has plagued libraries for decades.

Had the leaders of major research universities attacked this issue head on when the deep economic flaws in the system became apparent, or if they’d shown even an ounce of spine in the ensuing twenty or so years, the subscription-based model that is the root of the problem would have long ago been eliminated. The solutions have always been clear. Universities should have stopped paying for subscriptions, forcing publishers to adopt alternative economic models. And they should have started to reshape the criteria for hiring, promotion and tenure, so that current and aspiring faculty did not feel compelled to publish in journals that were bankrupting the system. But they did neither, choosing instead to let the problem fester. And even as cries from the library community intensify, our universities continue to shovel billions of dollars a year to publishers while they repeatedly fail to take the simple steps that could fix the problem overnight.

The roots of the serials crisis 

Virtually all of the problems in scholarly publishing stem from the simple act, repeated millions of times a year, of a scholar signing over copyright in their work to the journal in which their work is to appear. When they do this they hand publishers a weapon that enables them to extract almost unlimited amounts of money from libraries at the same research institutions that produced the work in the first place.

The problem arises because research libraries are charged with obtaining for scholars at their institution access to the entire scholarly output of their colleagues. Not just the most important stuff. Not just the most interesting stuff. Not just the most affordable stuff. ALL OF IT. And publishers know this. So they raise prices on their existing journals. And they launch new titles. And then they raise their prices.

What can libraries do? They have to subscribe to these journals. Their clientele wants them – indeed, they need them to do their work. They can’t cancel their subscription to Journal X in favor of the cheaper Journal Y, because the contents of X are only available in X. Every publisher is a monopoly selling an essential commodity. No wonder things have gotten out of control.

And out of control they are. Expenditures on scholarly journals at American research libraries quadrupled from 1986 to 2005, increasing at over three times the rate of inflation. This despite a massive reduction in costs due to a major shift towards electronic dissemination. These rates of growth continue nearly unabated, even in a terrible economy. (For those interested in more details, I point you to SPARC, the Scholarly Publishing and Academic Resources Coalition, who tracks journal pricing and revenues).
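Those growth figures are easy to check with a little compound-interest arithmetic (assuming the 19-year 1986–2005 span, and using an illustrative 3%-per-year average inflation rate for comparison):

```python
# "Expenditures quadrupled from 1986 to 2005" as an annual growth rate.
years = 2005 - 1986                  # 19-year span
growth = 4.0 ** (1 / years) - 1      # compound annual growth rate
print(f"implied annual growth in journal spending: {growth:.1%}")

# Assumed ~3%/yr average inflation over the same period, compounded:
inflation_total = 1.03 ** years - 1
print(f"total inflation at 3%/yr over {years} years: {inflation_total:.0%}")
```

A roughly 7.6% annual rate compounding against roughly 3% inflation is exactly the quadrupling-versus-~75% gap librarians have been complaining about.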

The opportunity universities missed

Just as the serials crisis was hitting its stride in the mid-1990’s, fate handed universities an out – the internet. In the early 1990’s access to the scholarly literature almost always occurred via print journals. By the end of the decade, virtually all scholarly journals were publishing online.

This radical transformation in how scholarly works were disseminated should have been accompanied by a corresponding radical shift in the economics of journal publishing. But it barely made a dent. Publishers, who were now primarily shipping electrons instead of ink on paper, kept raising their subscription prices as if nothing had happened. And universities let them get away with it.

By failing to show even a hint of creativity or initiative in seizing the opportunity presented by the internet to reshape the system of scholarly communication in a productive way, the leaders of American universities condemned themselves to 15 more years (and counting) of rising costs, and decreasing value. Their inaction also cost them the chance to reclaim the primary role they once held (through their university presses) in communicating the output of their scholars.

But while universities did next to nothing to fix scholarly publishing, others leapt into the fray. A new economic model, which came to be known as “open access“, emerged as an alternative to the subscription journals. Under open access the costs of publishing would be borne up front by research sponsors, with the finished product freely available to all. In addition to the obvious good of greatly expanding the reach of the scholarly literature, open access was largely free of the economic inefficiencies that created the serials crisis in the first place, and enjoyed very strong support from university libraries across the country. But despite its manifold advantages, universities as a whole did little to help it succeed.

The unholy alliance between journals and universities

The biggest obstacle to the rise of open access journals was (and to a large extent still is) the major role that journal titles play in how universities evaluate candidates for jobs and promotions. In most academic disciplines, careers are built by publishing papers in prestigious journals – those that are the most selective, and therefore have the most cachet. Scholars rising through the ranks of graduate school, the job market, assistant professorships and tenure face a nearly constant barrage of messages telling them that they have to publish in the best journals if they want to succeed at the next step. Never mind that it is far less true than people believe. That people believe it is all that matters.

Almost everyone I know thinks that simply looking at journal titles is a stupid way to decide who is or is not a good researcher, and yet it remains. There are many reasons why this system persists, but the most important is that universities like it. Administrators love having something like an objective standard that can be applied to all of the candidates for a job, promotion, etc… that might allow them to compare not only candidates for one job to each other, but all candidates for any honor across the university. This is perhaps why no university that I know of has taken a forceful stand against the use of journal titles as a major factor in hiring and promotion decisions. And it is, I believe, a major reason why they are unwilling to cut off the flow of money to these journals.

It’s never too late

Although their record is pretty bad, universities could still play a major role in making scholarly publishing work better – and save themselves money in the process – with two simple actions:

  • Stop the flow of money to subscription journals. Universities should not renew ANY subscriptions. They should, instead, approach publishers with a new deal – they’ll maintain payments at current levels for 3 more years if the journals commit to being fully open access at the end of that time.
  • Introduce – and heavily promote – new criteria for hiring and promotion that actively discourage the use of journal titles in evaluating candidates.

These ideas are not new. Indeed, the basic outlines appear in a fantastic essay from the Association of Research Libraries published in March 1998, describing the serials crisis and proposing solutions to fix it:

The question inevitably asked is, “Who goes first?” Which major universities and which scholarly societies have the will, confidence, and financial resources to get the process started?

Our answer is simple and to the point. It is time for the presidents of the nation’s major research universities to fish or cut bait. Collectively, they have both opportunity and motive—and, in the Association of American Universities, they have an organization with the capacity to convene the necessary negotiations.

It’s amazing that essentially nobody took them up on the challenge the first time. Let’s hope it doesn’t take another 15 years.

Posted in open access, publishing, science | Comments closed

The weak prescriptions in Harvard’s open-access letter and how I’d fix them

Much is being made of a recent letter from Harvard’s Faculty Advisory Council on the Library to the campus community announcing their conclusion that:

major periodical subscriptions, especially to electronic journals published by historically key providers, cannot be sustained: continuing these subscriptions on their current footing is financially untenable

Judging from many of the responses, people seem to think this is some kind of major turning point in the push for universal open access. And the fact that even Harvard, with its multi-billion dollar endowment, is feeling the sting of rising journal prices does seem to have struck a chord.

But librarians have been warning about the “serials crisis” for years (see, for example, this prescient 1998 report from the Association of Research Libraries, the Association of American Universities, and the Pew Higher Education Roundtable). I’ve seen dozens of letters to faculty from librarians urging them to abandon subscription journals. But they have little effect. I think this is at least in part due to the mismatch between the strength of their argument, and the weakness of their proposed solutions – a pattern repeated in the Harvard letter.

So I thought I would try to help by editing the provided list of “things to consider” into a list of demands:

Since the Library now must change its subscriptions and since faculty and graduate students are chief users, please immediately implement the following options open to faculty and students (F) and the Library (L), state other options you think viable, and communicate your views:

1. Make sure that all of your own papers are accessible by submitting them to DASH in accordance with the faculty-initiated open-access policies (F). [NOTE: Harvard’s open access policy provides an opt-out provision for faculty – this is not acceptable]

2. Submit all of your articles to open-access journals, or to ones that have reasonable, sustainable subscription costs; move prestige to open access (F).

3. If on the editorial board of a journal involved, determine if it can be published as open access material, or independently from publishers that practice pricing described above. If not, resign (F).

4. Contact professional organizations and demand that they immediately support universal open access (F).

5. Demand that professional associations take control of scholarly literature in their field or shift the management of their e-journals to library-friendly organizations (F).

6. Tell your colleagues to stop being wimps (F).

7. ~~Sign contracts that unbundle subscriptions and concentrate on higher-use journals~~ Do not sign any contracts to access subscription-only journals (L).

8. Immediately move all journals to a sustainable ~~pay per use system,~~ open access model (L).

9. Insist on subscription contracts in which the terms can be made public (L).

10. Require that all works produced by university faculty be distributed under a Creative Commons Attribution License, no matter where they are published. 


MLB’s “Ultimate Father-Son Sweepstakes” made my baseball-loving, Star Wars-obsessed daughter cry

My six-year-old daughter loves baseball (when she was three she invited Manny Ramirez to visit her school). She also loves Star Wars (which she considers the ultimate princess movies). And she loves video games (especially those that involve smashing things).

So you can imagine how thrilled she was when, as we checked the scores at mlb.com, news of the latest Red Sox debacle was followed by an ad featuring R2-D2 and C-3PO and the new Kinect Star Wars for the Xbox 360. She was literally jumping up and down with excitement.

And then she got a puzzled look on her face as she slowly read the text, written in big, gold, Star Wars font.

“Daddy, what’s an ‘ultimate fatherson sweepstakes’?”

I didn’t answer immediately. I was dumbfounded, and was sure I must have been reading it wrong. There was no way Major League Baseball, which has made a serious effort to reach out to its female fans, would slap my daughter in the face this way. But it said what it said. ULTIMATE FATHER-SON SWEEPSTAKES.

So I explained it to her. And she began to cry. “Why,” she asked through her tears, “is it only for boys? I like baseball too.”

Like most girls her age, she’d had boys in her first-grade class tell her she couldn’t do things because she wasn’t a boy. But this was literally the first time in her life that the adult world was telling her that loving sports was not for girls. It helped a little that the fine print made it clear that the contest was not actually restricted to men and boys. But she was still confused and upset.

And she wasn’t ready to let it go. She asked me about it again on our way to school this morning. What could I do but explain that a lot of people think that sports like baseball are for boys?

“But they’re not right,” she half asserted and half asked.

“No,” I said. “Don’t listen to them. They’re wrong.”

“You mean like the white people were wrong not to let Rosa Parks sit in the front of the bus?”

(She just finished a “heroes” report on Rosa Parks, and reads a book about her to her younger sister every night.)

“Yes. It’s kind of like that.”

“But I thought those people were dead.”

I didn’t know what to say.
