[Editor’s Note: This is part of a series of posts from Xconomists and other technology leaders from around the country who are weighing in with the Top 5 innovations they’ve seen in their respective fields the past 10 years, or the Top 5 disruptive technologies that will impact the next decade.]
We look with amazement at strange technologies from the past. How did people function in worlds with quill pens, or connect with each other by Morse code and telegrams? Within biotechnology’s short history, we have already seen approaches from the ’90s—such as Southern blots, which measure the sizes and amounts of DNA, and antisense therapies—being replaced. I guarantee you at least 50 percent of what we think of as the enabling technologies and approaches to biological knowledge will be relegated to museum displays in the next five (OK, maybe 10) years.
Here are five that are ready to be replaced:
1. Genome-Wide Association Studies (GWAS) based on single nucleotide polymorphisms (SNPs). This is the approach that does high-speed scanning for markers across the complete sets of DNA, or genomes, of many individuals to spot small variations that might be associated with a particular disease. SNP analysis isn’t going to last long as a major driver of biologic insight. Within the next one to two years, people will wake up to “ITEGS”—“It’s the entire genome, stupid.” Technologies are poised to allow analysis of variations in thousands to even hundreds of thousands of people. Do not be surprised when all the people with a disease such as Huntington’s are analyzed for DNA alterations across their entire genomes. Groups such as the Cure Huntington’s Disease Initiative are already preparing for this world.
2. Proteomic approaches as an end solution to understanding diseases. Many people believe that quantitative proteomic analysis, which looks at a wide array of the proteins that carry out the functional instructions encoded in DNA, will be the key to the next wave of biologic insights. Many today yearn for a world where we could know the levels of all the proteins in a cell and thereby finally understand how the cell functions—as if knowing all the elements allowed one to understand all chemical structures—NOT. It’s unlikely that protein levels alone are the sufficient keys to the puzzle. More likely, they will become yet another layer of key information alongside readouts on metabolites and RNA. The real decoding of diseases will be driven by those who know what to do with the component lists—be they DNA, RNA, or proteins. The next wave of insights will be in the hands of those who can build network models of what went wrong in disease states.
3. Biomarker signatures as commercially viable, robust markers akin to cholesterol, or estrogen receptor positivity for breast cancer. Identifying signatures of certain genes or proteins is currently all the rage among those seeking to find the right drug for the right patient. For the most part, these signatures are derived from populations of hundreds to thousands of patients. Many hope to turn them into definitive markers that will guide treatment for decades. But hey VCs, you might want to try investing in other areas. It is likely that all these signatures will be replaced continually: each time a larger sample is gathered, it will allow a refinement that warrants replacing the last signature. Moreover, each time scientists can subdivide patients into more coherent subgroups, new and more predictive markers will emerge. Prepare to live in a world where the platform owners and database organizers claim a greater proportion of the value proposition. VCs would be smart to invest in platforms and in those who can offer access to the evolving models of disease from which the dynamic biomarkers will spring.
4. Indications for drugs determined by clinical trials performed by the biotech/pharma company developing the therapy. Most drugs today get approved by the FDA even if they work in only a small fraction of patients. This practice is going to end. Once a drug’s safety profile has been approved, the definition of who should get the drug may well be modified continuously by large trials organized by payers and patients. In real time, these groups will evolve the criteria of who should get which drug by participating in ongoing trials even after the drug has “been approved.” This is something to look forward to, as it will take much of the trial-and-error nature out of prescription medicine. It will be a world of real evidence-based therapies.
5. Hunter-gatherer approaches, in which large groups collect massive clinical and genomic information and expect that they, as the data generators, will also be the data analyzers. Large cohort studies like the famous Framingham Heart Study, which has been following the health of patients in Framingham, MA for more than 60 years, have been extremely valuable. But the old model for these studies assumed that the analysis would be done by the small group that collected the samples and data. This model, too, will fade as distributed groups of scientists evolve the knowledge faster and more efficiently than those who generated the data. Remember, this is already how physicists work today. Also remember Jim Gray of Microsoft Research and his ideas on “The Fourth Paradigm,” which holds that scientific theory, experimentation, and large-scale computational simulation will begin to interact and reinforce each other in ways that will speed up scientific progress.
We will need to do biology research in fundamentally different social contexts as we move into this next decade. This means biologists will need to start pooling their knowledge through social networking channels, much as computer scientists have long done in open source software development.