The bioeconomy’s code base is in biodiversity

15 09 2022 | George Darrah


A lot has been written about how the 21st century will be the century in which biology rises to become the pre-eminent industrial manufacturing technology. According to a widely cited figure from McKinsey, biology could provide ‘up to 60% of the world’s physical inputs’ by 2040. However, to scale biological manufacturing well beyond high-margin, low-volume products, we will need to repeatedly design, build, and test a huge number of new enzymes, the biological machines that catalyse a vast array of different reactions.

Today, we have characterised only a tiny fraction of possible enzymes. Estimates suggest there are upwards of 1 trillion microbial species on Earth, implying that billions of novel enzymes remain to be discovered. Expanding the pool using computational approaches has had some successes, but a limited diversity of known starting points has slowed progress. By contrast, nature’s diversity appears near infinite. And now, due to massive cost reductions in biological hardware such as DNA sequencing and computational processing, we can access the full genetic wealth of nature’s biodiversity. This emerging value chain, from biodiversity into biotechnology, will be critical to sourcing novel enzymes, and therefore to growing biology’s role as a manufacturing technology. But to scale, technologists will need to build trust with a new set of stakeholders: the governments and communities who manage access to the biodiversity codebase.

Taq polymerase, the enzyme that catalyses the now ubiquitous PCR reaction, was discovered inside the extremophile Thermus aquaticus in Yellowstone National Park.

Biology has the potential to be the world’s most efficient manufacturing technology. Enzymatically catalysed processes typically occur at near-ambient temperatures and pressures and without toxic inputs or waste products. CO2 emissions can be a fraction of those from conventional chemical processes. Early pioneers of biomanufacturing at scale include Fortune 500 companies such as DuPont, Novozymes and Tate & Lyle, and emerging players like Genomatica and Solugen.

However, outside this handful of examples, biological manufacturing is predominantly limited to high-margin, low-volume products, despite massive underlying reductions in the cost of biological technologies. Key barriers include the process engineering challenge of scaling from the lab to a 100 kt/year production facility (fickle microbes often prefer 1-litre flasks to 200,000-litre reactor tanks), but also the challenge of simply finding enzymes that work effectively for the required reaction at scale.

Novamont’s 30kt/year bio-based Butanediol (BDO) facility, powered by Genomatica technology. Using Escherichia coli as their chassis, Genomatica’s engineers have introduced enzymes from other microbes to catalyse the conversion of sugar into BDO. Other industrial biotech companies such as Debut Bio, Fabric Nano and Enginzyme are now scaling with ‘cell-free’ systems, where enzymes operate without a microbial host. Credit Novamont.

Designing new enzymes computationally has immense potential but needs better input data. The complexity of enzymes should not be underestimated; tweaking an enzyme by more than 30% typically involves exploring more variations than there are atoms in the universe. Augmenting enzyme function using AI and rational design principles (where known relationships between enzyme structure and function inform tweaks to enzyme structure directly) has had some successes. However, using these techniques to design new enzymes far from a natural starting point continues to be extremely difficult. To our good fortune, nature has spent billions of years exploring possible enzyme variants, and through evolution the enzymes that ‘work’ remain and those that don’t, don’t. The only limit to the number of starting points we have is the amount of biodiversity we have explored.
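The scale of that search space follows from simple combinatorics, and a back-of-envelope calculation makes it concrete (the 300-residue enzyme length below is an illustrative assumption of typical size, not a figure from the article): mutating 30% of positions means choosing which 90 sites to change, with 19 alternative amino acids at each.

```python
import math

# Illustrative assumptions: a ~300-residue enzyme, mutating 30% of positions.
L = 300            # enzyme length in amino acids (typical order of magnitude)
k = int(0.3 * L)   # number of positions to mutate -> 90
ALTERNATIVES = 19  # each mutated position can take any of 19 other amino acids

# Variants = (ways to choose which positions to mutate) x (choices per position)
variants = math.comb(L, k) * ALTERNATIVES ** k

ATOMS_IN_UNIVERSE = 10 ** 80  # commonly cited rough estimate

print(f"~10^{math.log10(variants):.0f} variants")  # ~10^193
print(variants > ATOMS_IN_UNIVERSE)                # True
```

Even this conservative sketch lands more than a hundred orders of magnitude beyond the atoms in the observable universe, which is why exhaustive exploration is off the table and natural starting points matter so much.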

We need to source more data from nature, and it is only now that we have the tools to do this rapidly and affordably. Previously, biodiscovery efforts were limited to searching for molecules with specific functionality, almost exclusively for high-value pharmaceutical applications. Collecting samples and extracting molecules such as enzymes was operationally expensive. Then testing for molecule functionality was capital intensive, relying on large wet labs tailored to test for specific functional activity. Even with emerging high-throughput capabilities in the late 1990s and 2000s, broadening lab focus to test for other molecule functions was so expensive that >99% of the genetic data discovered in samples was discarded.

But over the past 5 years, rapid advances in technologies such as in-field long-read DNA sequencing and in-silico functional annotation have dramatically reduced this cost base. DNA can now be sequenced at the sample site. Digital sequence information (DSI) can be sent back to the lab near instantaneously, and DNA synthesised in-situ. Wet labs built for screening molecule function have shrunk in size as the ability to predict protein structure in-silico rapidly evolves, with ground-breaking new capabilities powered by DeepMind’s AlphaFold. Basecamp Research, a biodiscovery company, has used these advances to increase the number of enzymes known to science by 50% in the last three years. The tools are now ready to bridge the gap between nature’s biodiversity and biotechnology. This could be a game-changer for our ability to rapidly design and deploy effective enzymes, a key enabler for scaling biomanufacturing across our economy.

Oxford Nanopore’s MinION DNA sequencer at work in the wild laboratory, this time among the jagged peaks of Antarctica. Credit ONT.

The tech stack is ready to access nature’s genetic data but needs strong supporting governance infrastructure to scale. To build a robust value chain for new sources of enzymes, a share of the benefits must accrue to the source biodiversity providers. Concerns about ‘biopiracy’, where genetic assets of a country are taken overseas without the host country’s permission, have made many countries wary of developing their genetic assets. Countries with rich genetic resources will not open their doors to biodiscovery without fair compensation. Existing multilateral infrastructure governing the transfer of genetic resources across international borders has proved challenging to design and implement effectively. The Nagoya Protocol attempts to outline the basic common standards for countries to share genetic assets across international borders. However, implementation has been unsurprisingly difficult, and the workability of Nagoya remains unproven at scale. Further complications have been introduced by the arrival of DSI, where no physical material needs to leave the origin country, and which is not explicitly covered by Nagoya today. Many companies, governments and research institutions consider frictionless transfer of DSI essential to human and planetary health.

The biotechnology community needs to show that biodiscovery makes economic sense for biodiversity-rich countries. Harvard Law Professor Margo Bagley and others have suggested a compromise approach consistent with the language of Nagoya: DSI use should require monetary (and non-monetary) benefit sharing, but not be constrained by other more onerous Nagoya mechanisms, such as prior informed consent. But without rebuilding trust, negotiations between sources and users of genetic data are likely to continue to flounder. Reducing technology and knowledge disparities between these parties is an important first step, such as supporting source countries in building their own DNA sequencing infrastructure (a form of non-monetary benefit-sharing). This should be prioritised by the international development budgets of biotechnology superpowers such as the USA, UK and EU, but also seen as an opportunity for private sector operators to catalyse the needed data infrastructure. In this way, technologists have an opportunity to help break the Nagoya deadlock by showing a pathway to commercially viable biodiscovery partnerships.

More and better data is a key enabler in scaling biology’s potential as a critical net-zero manufacturing technology. Nature can provide that. The remarkable technological advances of the past decade mean we can now begin a new era of exploration. The prize is immense, for both the countries and companies leading this. But it needs to be done in the right way. Technologists should not wait for Nagoya to be ‘sorted’. Start building the infrastructure now. This means getting out of the lab, into the field, and doing the hard and non-sterile work of building a robust new value chain that could just be critical to planetary survival.

EMAIL

contact@systemiqcapital.earth
