Awful Library Books I Want to Read

Awful Library Books has a post on books about genetic engineering from the 1970s and '80s, saying that it's time to get rid of them because "Genetic information is dramatically different from what we knew in the 70's and 80's. No mention of Human Genome Project or Dolly, the sheep." If you're looking for information about cutting-edge genetic engineering you're probably better off not looking for a book at all, but while science is inherently focused on the future, a historical perspective is almost always valuable, especially when it comes to genetic engineering and synthetic biology.

The early 1970s were a tremendous time for molecular biology, when scientists moved beyond describing what they thought was happening at a molecular level to actually being able to cut and paste DNA sequences together, making new genes and new biological behaviors. Scientists saw the tremendous possibilities for this new technology, thinking of bacteria as tiny factories producing things like human insulin for diabetics, which could previously only be isolated from the pancreases of farm animals. Scientists also saw the potential threats that the technology posed--dangerous genes could be spliced into common bacteria, spreading through the environment and causing harm to people. Not to mention the ethical implications of creating chimeras, confusing the "tree of life," and making gross bacteria with human genes. For a great introduction to the early history of genetic engineering and biotechnology, read Carl Zimmer's post where he discusses all the benefits that emerged, as well as the fears and perceived threats of the technology, how they were dealt with, and how they never came to pass.

This history of how genetic engineering was discussed and regulated in the '70s is tremendously important to keep in mind in our discussions of synthetic biology, a term that was reappropriated in 1978 to apply to creating new biological systems through genetic engineering. In 1975, the year before Laurence E. Karp, M.D. wrote his book, a group of scientists got together at the Asilomar conference center to decide on guidelines for genetic engineering and recombinant DNA technology. The conference had an enormous impact on science policy, creating a precedent for open and self-regulating science at a large scale. The scientists decided that the potential benefit of genetic engineering technology was too great to abandon, and lifted the voluntary moratorium on recombinant DNA research that had been put in place. However, they also created a series of regulations to ensure that the technology was used properly and with the utmost safety, in order to prevent environmental contamination or physical harm to anybody. They recommended using host organisms that would be unable to compete with wild bacteria outside the lab, avoiding the cloning of genes from dangerous pathogens whose products could harm human health, and a series of other containment measures for experiments deemed to pose "low," "moderate," or "high" risk to health or the environment. Many of these regulations have since been loosened as it became apparent that researchers could safely transfer genetic material between organisms without harming themselves or the environment.

Today, genetic engineering is commonplace and the promised benefits have been successfully implemented for decades, although many of the same fears remain, especially when it comes to the technology falling into the wrong hands (for a fascinating look at biotech and security, check out Effect Measure's post "Is biotech a security problem?"). For the most part, however, synthetic biology has taken the place of recombinant DNA as the scary new technology with the potential for serious harm to the environment and human health. Can we apply some of the same concepts discussed in the books from the 1970s and at the Asilomar conference to synthetic biology? Synthetic biology is the direct descendant of genetic engineering, and many of the lab techniques and concepts are the same--moving pieces of DNA around between organisms, turning bacteria into tiny chemical factories with tremendous implications for medicine, industry, and energy, as well as potential threats to biosafety. Many of the same recommendations apply--judging the risk posed by the pieces of DNA being moved around and by their host organism, maintaining facilities adequate to contain engineered organisms depending on the level of threat to human health, and holding open discussions among scientists, the public, and policy makers on the potential risks and benefits of the new technologies.

What is different now is the ability to synthesize DNA from chemical building blocks rather than relying only on the DNA available in the organisms at hand, and this new technology has led to new policies and recommendations through discussions and conferences between law enforcement agencies, science policy makers, researchers, and DNA synthesis companies. There is broad consensus among all of these groups on guidelines that DNA synthesis companies should follow to ensure that DNA sequences coding for dangerous products like toxins, or derived from dangerous pathogens, are made and distributed safely and only to trusted researchers working on things like vaccines or medical treatments.
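To make the screening idea concrete, here is a minimal sketch of how a synthesis company might flag an incoming order against a watchlist of sequences of concern. Everything here is a placeholder: the watchlist entries are made-up fragments, and real screening pipelines use curated databases and homology search (tools like BLAST) rather than exact string matching.

```python
# Minimal sketch of biosecurity screening for a DNA synthesis order.
# The watchlist entries are made-up placeholders, not real toxin genes;
# real screening uses curated databases and fuzzy homology search.

WATCHLIST = {
    "hypothetical_toxin_fragment": "ATGGCTAAAGGTGAAACCCTG",
    "hypothetical_virulence_factor": "ATGAGCGATCGTACCGGTTTA",
}

def reverse_complement(seq: str) -> str:
    complement = {"A": "T", "T": "A", "G": "C", "C": "G"}
    return "".join(complement[base] for base in reversed(seq.upper()))

def screen_order(order_seq: str) -> list[str]:
    """Return the names of watchlist entries found in the order,
    checking both strands of the ordered sequence."""
    order = order_seq.upper()
    strands = (order, reverse_complement(order))
    hits = []
    for name, fragment in WATCHLIST.items():
        if any(fragment in strand for strand in strands):
            hits.append(name)
    return hits

if __name__ == "__main__":
    order = "GGGG" + WATCHLIST["hypothetical_toxin_fragment"] + "CCCC"
    hits = screen_order(order)
    if hits:
        print("Flag for human review:", ", ".join(hits))
    else:
        print("No sequences of concern detected.")
```

Checking both strands matters because a sequence of concern could just as easily be ordered as its reverse complement.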

As the promises of synthetic biology begin to unfold, especially in areas such as drug production and biofuels, perhaps today's discussions of the threats and potential of the technology will come to seem quaint, but they will likely still be of value to researchers in the future and whatever crazy new thing they think up.

A good safety feature would be to make synthetic organisms for biofuel production hyperthermophiles. It would make recovery of volatile fuels (like alcohol) easier, and if they did escape, they wouldn't be able to grow very fast at ambient temperatures.
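To put a rough number on that intuition, here is a sketch using the cardinal temperature growth model of Rosso and colleagues, with made-up cardinal temperatures for a hypothetical hyperthermophile. The point is simply that predicted growth falls to zero below the organism's minimum temperature, which for a hyperthermophile is far above ambient.

```python
# Cardinal temperature model with inflection (Rosso et al. 1993) for
# specific growth rate as a function of temperature. The cardinal
# temperatures below are assumed values for a hypothetical hyperthermophile.

T_MIN, T_OPT, T_MAX = 60.0, 95.0, 103.0  # degrees C (assumed)
MU_OPT = 1.0                             # growth rate at T_OPT, per hour (assumed)

def growth_rate(t: float) -> float:
    """Specific growth rate at temperature t (degrees C); zero outside
    the organism's cardinal temperature range."""
    if t <= T_MIN or t >= T_MAX:
        return 0.0
    num = (t - T_MAX) * (t - T_MIN) ** 2
    den = (T_OPT - T_MIN) * (
        (T_OPT - T_MIN) * (t - T_OPT)
        - (T_OPT - T_MAX) * (T_OPT + T_MIN - 2 * t)
    )
    return MU_OPT * num / den

# An escaped organism at ambient or body temperature shows zero growth.
for temp in (25, 37, 60, 80, 95, 103):
    print(f"{temp:>4} C: mu = {growth_rate(temp):.3f} /h")
```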

I appreciate that it increases the degree of difficulty, but not very much in the abstract (i.e. for people who don't know much about what the difficulties actually are).

Given that one of the most used methods of sterilisation (e.g. of lab waste) is heat/autoclaving/flaming, thermophiles might be harder to contain than mesophiles. What about making them dependent on a refined, environmentally rare substrate instead (the Thufir Hawat gambit)? Though that may make them less convenient for mass production.

Even hyperthermophiles are killed by temperatures that are modest by industrial standards. There are only a handful that can survive normal autoclaving (121 C). There are none known that can survive 140 C.

Pushing the limits of what a hyperthermophile could tolerate would be a very interesting project of synthetic biology.

Making them dependent on a rare substrate is going to be difficult. There are essentially no examples of that in actual organisms. Many organisms that are very difficult to culture are still very effective pathogens. TB is very difficult to culture but is also very difficult to treat.

One of the issues now with alcohol production by fermentation is the sterilization of the feed media. It is too expensive to do with heat, so producers use antibiotics, which select for antibiotic-resistant strains and contaminate the fermentation products with antibiotics, making them problematic for use as animal feed.

If you could engineer bacteria that could ferment cellulose into alcohol at 100 C, you could take the product off as vapor and use that as your cooling method. Nothing natural will ferment cellulose at those temperatures, so you don't need to sterilize your feedstock because your fermentation conditions do so all by themselves. (I think alcohol is a lousy fuel; I am just using it as an example. I like hydrocarbons better.)
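For what it's worth, a back-of-envelope energy balance suggests the vapor-cooling idea is at least in the right ballpark: the heat released fermenting a mole of glucose to ethanol (roughly 70 kJ, from standard enthalpies of formation) is close to the heat absorbed by vaporizing the two moles of ethanol produced (roughly 2 x 39 kJ). A rough sketch, ignoring sensible heat, water evaporation, and everything else a real process would involve:

```python
# Back-of-envelope energy balance for fermentation at ~100 C with the
# ethanol product taken off as vapor. Values are approximate textbook
# enthalpies; this is an order-of-magnitude check, not a process design.

DH_FERMENTATION = -69.0   # kJ per mol glucose (glucose -> 2 ethanol + 2 CO2)
DH_VAP_ETHANOL = 38.6     # kJ per mol ethanol (heat of vaporization near 78 C)
ETHANOL_PER_GLUCOSE = 2   # mol ethanol per mol glucose

heat_released = -DH_FERMENTATION                      # kJ per mol glucose
heat_absorbed = ETHANOL_PER_GLUCOSE * DH_VAP_ETHANOL  # kJ per mol glucose

print(f"Heat released by fermentation:     {heat_released:.0f} kJ/mol glucose")
print(f"Heat absorbed boiling off ethanol: {heat_absorbed:.0f} kJ/mol glucose")
print(f"Net heat to remove otherwise:      {heat_released - heat_absorbed:+.0f} kJ/mol glucose")
```

By these rough numbers, boiling off the product ethanol would absorb slightly more heat than the fermentation releases, so the scheme is not obviously crazy thermodynamically.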