The transition from artisanal biological research to reliable engineering holds the potential to improve the development of drugs, enzymes, crops, and materials. Traditionally, the complexity of biology has made it difficult to develop the predictable tools that engineering requires. However, AI, paired with rapid experimental iteration, presents a new paradigm for engineering complex systems.
Like language, biology is an evolved system that adapts over time in response to a changing environment. Such systems are robust, maintaining functionality despite environmental changes. Because evolved systems do not rely on brittle rules, engineering them from first principles is particularly challenging.
Nothing in biology makes sense except in the light of evolution.
- Theodosius Dobzhansky
Today, biological product development resembles specialized craftsmanship. Small, dispersed teams rely on tacit knowledge and techniques, often working in silos. This artisanal approach does not scale because it lacks general rules and reusable processes. Some challenges of the current paradigm include:
Replication: insights and techniques are siloed within specialist research groups
Scalability: experiments and evaluations are bespoke
Efficiency: time and resources are spent on redundant efforts
Speed: teams develop and optimize methods from scratch, rarely building on top of reliable tools
AI is a core component of the shift towards reliable engineering in biology. The other essential component is rapid wet-lab experimental iteration. I believe these two ingredients, provided by service companies, will reduce the time and money required to engineer biology. This business layer of “picks and shovels” may prove to be a foundation that enables compounding innovation.
The theory of empiricism
Despite advancements in biotechnology, such as mass-producing insulin and vaccines, the field remains pre-industrial. We have yet to develop a comprehensive set of functions, akin to physical laws, that can be applied to scale bioengineering. Evolved, dynamic systems—such as language and biology—require a different approach to transition from science to engineering.
Demis Hassabis recently said, "If you think of mathematics as the perfect description language for physics, then AI might be the perfect one for biology." AI holds the promise of transforming biological product development by providing a structured approach to design. However, models alone, simply paired with traditional artisanal experimental approaches, will fall short.
In a prescient 2008 piece, The End of Theory: The Data Deluge Makes the Scientific Method Obsolete, Chris Anderson argues that in fields with inherent complexity, like biology, our simplified hypotheses will continue to fail as we uncover more layers. While Anderson suggests that the scientific method may become obsolete, I believe that components of complex systems can, and should, be accurately described and understood. At a certain scale and complexity, however, these systems become too intricate for humans to derive principles that extrapolate reliably.
Bottlenecks in engineering biology
I recently wrote about how an emerging cohort of software-first, generative biodesign companies could approach the drug development value chain. These firms leverage AI to streamline the design phase of biological products, each focusing on different segments of the value chain. Expanding upon this trend, I believe it is essential for generative biodesign firms to directly address a complementary bottleneck: experimental iteration.
A major restriction in scaling biological engineering lies in the wet-lab phase of testing and validation. While AI can reduce upfront design costs and provide tools to manipulate systems, significant resources are still required for experimental feedback. This is largely because AI tools operate within narrow domains and are inherently lossy: they produce approximations, hallucinate, and often fail in out-of-domain scenarios.
Thankfully, we don’t need a sci-fi “AGI” to overcome these issues. What we need are scalable approaches to iterative experimentation. Therefore, in addition to AI-driven software, we must commoditize wet-lab assays to enable faster and more efficient experimentation. Addressing this bottleneck will be crucial for building a strong foundation for innovation.
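The loop this implies can be sketched in a few lines. The following is a toy illustration, not any company's actual pipeline: `model_score` stands in for a lossy in-silico design model, `wet_lab_assay` stands in for an outsourced, commoditized experiment, and all names, sequence lengths, and scoring rules are invented for the example. The point is structural: a cheap but imperfect model proposes candidates, and empirical measurement, not the model, decides what survives each round.

```python
import random

random.seed(0)

ALPHABET = "ACDEFGHIKLMNPQRSTVWY"  # the 20 standard amino acids

def model_score(seq):
    """Lossy in-silico proxy: rewards a *subset* of what the assay measures."""
    return sum(seq.count(aa) for aa in "ILV") / len(seq)

def wet_lab_assay(seq):
    """Stand-in for an outsourced assay: a toy 'stability' readout."""
    return sum(seq.count(aa) for aa in "ILVF") / len(seq)

def propose_candidates(score_fn, pool_size=50, keep=8):
    """Hypothetical generative step: sample sequences, keep the model's top picks."""
    pool = ["".join(random.choice(ALPHABET) for _ in range(10))
            for _ in range(pool_size)]
    return sorted(pool, key=score_fn, reverse=True)[:keep]

best = None
for _ in range(3):  # three design-test rounds
    for seq in propose_candidates(model_score):
        measured = wet_lab_assay(seq)  # only measured values update 'best'
        if best is None or measured > best[1]:
            best = (seq, measured)

print(best)
```

Because the model's proxy score and the assay disagree (here, the model ignores phenylalanine), the loop only converges on good candidates when the assay step is cheap enough to run every round, which is exactly the bottleneck commoditized wet-lab services address.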
Circular services for industrialization
Many discussions in software focus on “vertical” or “horizontal” solutions. Complementing this, I believe services that accelerate generative AI will likely be “circular,” offering specialized software for tasks such as protein design or genetic sequence engineering, alongside wet-lab experimental validation. With chatbots, users can immediately evaluate outputs. In science, we must empirically assess results. Generative biodesign services will provide purpose-built facilities to execute standard experiments that determine general properties of molecular systems, such as stability and safety.
Cloud-based laboratories hold the potential to transform bespoke assays into services that numerous businesses and research labs can outsource. Such “picks and shovels” can empower a whole layer of innovators to create their own tools and engineer solutions to more complex problems, exemplifying the compounding value of industrialization.
Design-test service providers may catalyze a shift from artisanal to industrial biology. These companies won’t be your granddaddy’s CRO. They will operate at scale, driving down costs and democratizing access to the bioeconomy.