A New Day for Digital DNA


Xconomy Boston — 

Digital technology and molecular biology are advancing in ever more wondrous ways, perhaps none so transformative as the biological storage of data. It sounds crazy, but we could be on the cusp of a once-unthinkable kind of convergence.

Yet, in a way, it’s not so far-fetched. All life forms have sensors to monitor and respond to their environment. Thanks to evolution, these systems are far more sophisticated than anything humans have produced with electronics. Take the human nose, for example: with 400 different sensors (so-called “receptors”) working in parallel, it is capable of distinguishing one trillion different odors.

Biology is extremely good at sensing, or “reading,” but what’s new is our ability to store those details by “writing” them to memory. Not the silicon-based memory of ever-smaller hard drives, but writing into DNA itself by taking advantage of new discoveries from the microbial world.

How could we connect biological sensors to DNA-based digital memory devices? A little background is perhaps in order.

In a way, our own DNA—written in 3 billion letters, or, as we prefer to call them, “base pairs”—is a living document. It’s a memory storage device that defines who we are, where we’ve come from, and the trials and tribulations of our journey. Our story is a long one, and rather slowly written.
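To put that storage claim in perspective, here is a back-of-the-envelope calculation (mine, not the article’s): each base pair is one of four letters, so it carries two bits, and 3 billion base pairs work out to roughly 750 megabytes.

```python
# Back-of-the-envelope: information capacity of a human genome,
# treating each base pair as one of four letters (2 bits each).
base_pairs = 3_000_000_000
bits = base_pairs * 2
megabytes = bits / 8 / 1_000_000
print(f"{megabytes:.0f} MB")  # prints "750 MB"
```

A genome’s worth of data fits on a thumb drive, in other words, though the real point of the passage is how slowly that document is written.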

The idea of a “genome” (and the word itself) is a rather recent scientific notion, and original credit is given to the University of Hamburg botanist Hans Winkler. In the 1920s, when working on a new textbook, Winkler merged the German word “gene” with “ome” (the Greek suffix for body), and, voilà, “genome” was born. But it was the seminal paper of Watson and Crick in 1953 that forever sealed the concept (and the word) into the consciousness of society at large.

For the most part, changes to genomic stories are recorded at speeds proportional to the generational time of a given organism. Mutations occur in DNA, and if the organism survives, the mutations can be passed on to the next generation.

If a change, by chance, provides the host with some type of advantage, then the host’s descendants will, over time, become more prevalent. This is the basis for Darwin’s natural selection.

But microbes, under harsh selection pressure, have evolved different genomic “memory” systems—in other words, a different way to record changes in their genomic stories. The story starts with microbial immunity.

Like animals, microbes have to contend with competing and often pathogenic life forms. But unlike animals, single-celled microbes are under pressure to remain genomically simple, so they have entirely different tools to recognize and react to their deadly competitors.

Using a family of enzymes called CRISPR/Cas, microbes capture and record the genetic fingerprints of pathogenic organisms such as invasive viruses. They “write” the details of the invader into their own genomes, then “read” it to rapidly defend themselves when the same invader returns. The CRISPR/Cas system is remarkably simple and is easily co-opted and redeployed as a potential research or therapeutic tool.
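The write-then-read loop described above can be caricatured in software. The sketch below is a loose analogy, not the actual biochemistry: it treats the microbe’s CRISPR array as an append-only log of short invader “fingerprints” (spacers), written on first encounter and matched against on the next one. All names here are my own illustration.

```python
# A loose software analogy (not the biochemistry): the CRISPR array as an
# append-only log of invader fingerprints that the cell later matches against.
class CrisprArray:
    def __init__(self):
        self.spacers = []  # recorded fragments of past invaders

    def write(self, invader_dna: str, fragment_len: int = 8):
        # "Adaptation": capture a short fragment (a spacer) from the invader.
        self.spacers.append(invader_dna[:fragment_len])

    def recognizes(self, dna: str) -> bool:
        # "Interference": defend if any recorded spacer matches the newcomer.
        return any(spacer in dna for spacer in self.spacers)

memory = CrisprArray()
memory.write("ATGCCGTAAGGCT")              # first encounter: record it
print(memory.recognizes("ATGCCGTAAGGCT"))  # True: the same invader returns
print(memory.recognizes("GGGTTTAAACCC"))   # False: an unfamiliar sequence
```

The analogy also suggests why the system is so easily co-opted: a mechanism that writes arbitrary short sequences into a genome, and later finds them again, is a general-purpose tool, not just an immune one.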

This brings us back to the convergence of biological sensors and information storage. When biological receptors are coupled to the DNA writing tools—in nature or in the lab—living information systems result. For example, bacteria could be built to monitor the presence of specific substances during transit through the gastrointestinal tract and to record the journey by writing specific details into DNA that can subsequently be read back with DNA sequencing. Photoreceptors, like those found in the human eye, could be coupled to pulsed lasers to rapidly write information into DNA. And the catalogue of biological sensors amenable to this type of reengineering is enormous.
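What would “writing details into DNA” look like at the data level? A minimal sketch, under the simplest possible assumption: map each two-bit chunk of a byte to one of the four bases (the particular mapping below is arbitrary, and real DNA-storage schemes add error correction and avoid problematic sequences).

```python
# A toy digital-to-DNA codec: each 2-bit pair maps to one base.
BASES = "ACGT"  # 00->A, 01->C, 10->G, 11->T (arbitrary assignment)

def encode(data: bytes) -> str:
    dna = []
    for byte in data:
        for shift in (6, 4, 2, 0):          # four 2-bit chunks per byte
            dna.append(BASES[(byte >> shift) & 0b11])
    return "".join(dna)

def decode(dna: str) -> bytes:
    out = bytearray()
    for i in range(0, len(dna), 4):         # four bases per byte
        byte = 0
        for base in dna[i:i + 4]:
            byte = (byte << 2) | BASES.index(base)
        out.append(byte)
    return bytes(out)

message = b"hi"
strand = encode(message)
print(strand)                    # prints "CGGACGGC"
assert decode(strand) == message # sequencing the strand recovers the data
```

The “read back with DNA sequencing” step in the paragraph above corresponds to `decode`: once the information is in base-pair form, recovering it is a matter of reading letters.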

The potential industrial and medical applications of these systems are substantial, yet they also raise legitimate concerns. The power of these tools is prompting social questions of a kind not confronted since the 1970s, when the now-famous 1975 Asilomar meeting resulted in a worldwide moratorium on recombinant DNA research until an approach for oversight was established.

Recently, the scientists who helped turn bacterial CRISPR/Cas into a gene editing system co-authored a paper with other experts in the field calling for a similar worldwide moratorium, this time on the use of their invention to engineer the human germline where pregnancies are involved. Their recommendations doubly underscore the centrality of maintaining public trust: “At the dawn of the recombinant DNA era, the most important lesson learned was that public trust in science ultimately begins with and requires ongoing transparency and open discussion. That lesson is amplified today with the emergence of CRISPR-Cas9 technology and the imminent prospects for genome engineering. Initiating these fascinating and challenging discussions now will optimize the decisions society will make at the advent of a new era in biology and genetics.”

In the past 20 years, the Internet, digital cameras, silicon memory, and email have let us create a tremendous new digital world. Add in new genetic recorders built with CRISPR/Cas technology, and the quick, cheap, and sustainable ability to store and share biologically recorded information will set the stage for new industries. Yet, at the same time, these new tools must be carefully limited to applications where ethics and an honest interpretation of our understanding can set clear boundaries.