Chapter 1. Digital Evolution of Biotechnology

For nearly four decades, biotechnology has driven the transformation of many sectors, from healthcare to food and energy, and it has grown into a global industry. Now it is itself being transformed by its convergence with information technology (infotech). Biotechnology has been defined as “the use of living systems or molecular engineering to create and manufacture biologic therapies and products for patient care” [1], but it can be seen more broadly as the application of molecular biology across industries.

From the start, biotechnology has evolved in tandem with other sciences. A first inflection point, Watson and Crick's 1953 discovery of the structure of DNA, depended on the development of X-ray crystallography by Franklin and Wilkins. By 1986, the first automated gene sequencer, developed by Hunkapiller at Applied Biosystems, was supporting Venter's research at the National Institutes of Health (NIH); a later-generation ABI sequencer, introduced in 1998, enabled his subsequent work at Celera. This led to a second inflection point in 2000: the draft of the human genome, produced by Celera and the Human Genome Project. The synergy between computing and bioscience continued with the emergence of bioinformatics and the 2003 launch of IBM's Blue Gene supercomputer, with a focus on structural proteomics.

Another technology has evolved over the past three decades, supported by the optimization of gene sequencing: CRISPR (clustered regularly interspaced short palindromic repeats) ...
