A forgotten industrial revolution offers lessons on scaled innovation
Are we living through “the century of biology”? This is the triumphant rallying call that has animated the biotech world for almost two decades.
On one level, the term points to an increasing rate of fundamental breakthroughs in the life sciences. On another, it captures how these new insights are rapidly translating into high-value economic applications.
The idea that biology will finally be crowned king of the sciences may appear to be a flattering self-portrait by a field traditionally seen as less fundamental than physics and chemistry. Yet the term actually came from Nobel Prize-winning chemist Robert F. Curl in 1996.
Two years after the phrase was coined, American economist and social theorist Jeremy Rifkin wrote a New York Times op-ed laying out what a bio-centred society might look like. He predicted that technology based on cell and genetic engineering would drive all major conflicts and material advances of this century, just as physics and chemistry did in the last: genetically engineered crops, cheap bio-factories for pharmaceuticals, mass bio-printed organs for transplants.
One of the most recent iterations of the ‘century of biology’ argument was a Substack article by Elliot Hershberg, published on November 7th, 2022. While Hershberg celebrated bio-based replacements for synthetic chemicals, he argued that even this sets the bar too low. The most radical implication of the new biology, he contends, will be the biologisation of industry – an entirely new way of organising the economy.
By the biologisation of industry, he means that humans would mimic the way ecosystems produce. Ecosystems survive using only local resources and the information in their genomes, negating the need for long logistical chains. Likewise, a biologised industry would be decentralised, able to produce once-scarce complex goods at low cost anywhere in the world.
One of the most promising technological bases for such an ecosystem-mimicking economy, he argues, would be a DNA printer – something he believes is possible during our life-times.
The very first, primitive DNA printers would be capable of producing simpler products like antibodies – the ‘workhorses of modern molecular biology’. Eventually, like the digital computer, they could be programmed to make whatever biology can.
It is easy to be daunted by visions of metabolic processes toppling mechanised production based around synthesis – the only kind of industry we have ever known. For guidance on how we might make such a leap, we can look to a very similar economic transition that began around 150 years ago, one whose impacts we are still living through now.
The Chemical Revolution, 1870-1930
The history of science and technology is replete with revolutions. The chemical revolution of the 18th century set researchers on a path to uncovering the fundamental nature of matter and its transformations. The first industrial revolution of the early 19th century ushered in an economy based around mechanised labour.
Yet the period between 1870 and 1930, now called the second industrial revolution, was when science-driven industrial chemistry came into its own. New synthetic materials and processes for scaling them gave consumer society many of the goods we take for granted today: plastics, synthetic nitrogen fertilizer, and synthetic dyes among them. By reducing the costs of these functional materials, the second industrial revolution contributed to the very formation of mass consumerism as we know it.
Many global companies still around today started life in this period: in Germany, the heart of the chemical bonanza, grew firms like Bayer, Hoechst, and BASF. The US saw the rise of Standard Oil and the Carnegie Steel Company.
Lessons from History
The second industrial revolution is more useful in evaluating pathways for the century of biology than the more famous industrial revolution that swept England between 1780 and 1840.
This is because the innovations that characterised the first industrial revolution increased economic productivity within pre-existing industries and materials. Many innovations of this period centred on the textile industry: the spinning mule, the spinning jenny, and the water frame were all textile machines. The steam engine raised output from paper, flour, cotton, and iron mills.
By contrast, present-day commercial biotech and the second industrial revolution both deal in entirely novel materials, and thus share many of the same hurdles. Both are capital-intensive projects requiring investment in highly specialised forms of scientific knowledge.
For lessons on how an R&D-heavy industry can get off the ground, we can look to Germany in the latter half of the 19th century. Its dominance in industrial chemicals at the time could not be overstated. For one, Germany was where synthetic indigo dye was invented – the breakthrough that really kicked off the spurt in chemical manufacturing.
Although it might seem mundane to modern eyes, synthetic indigo was one of the most important products of the period. Until then, the textile dye had been an expensive natural extract available only to the rich. The synthetic version opened the floodgates to mass consumption.
Germany was also where chemists synthesised the red textile dye alizarin, previously extracted from madder and reliant on imports from rose madder cultivators in southeast France. German chemists developed ways of producing both colours from cheaper, readily available chemicals: indigo first from isatin and later from the cheaper naphthalene; alizarin from anthracene, a byproduct of coal-tar distillation.
One of the biggest factors that favoured German chemical advances was tight cooperation between industry, government, and the universities. This meant a shared, long-term economic vision and knowledge-sharing at multiple levels of society: within the state, between and within enterprises, between universities and industry, and between the state and private firms.
Just how critical cooperation was in spurring German industry can be seen by a comparison with England. Try as they might, no British-owned firm managed to figure out the process for synthesising indigo before World War I.
Granted, the British already had natural indigo thanks to a lucrative colonial trading network. Even so, Britain produced no other fundamental innovations in chemical products at this time. Historians have consistently attributed this to the relatively thin interconnections between higher education, the state, and business.
The Germans got it right another way: they had large companies with complex managerial hierarchies, as well as a preference for cartel agreements. Britain was still reliant on the family firm while in the US, large corporations existed but without the cartel behaviour that distinguished the German business landscape.
The quality and nature of the higher education system also mattered. The world-leading German chemicals industry was hugely indebted to institutions that taught and researched topics of interest both to academic chemists and industrial ones.
A plethora of German technical and vocational institutions offered courses teaching the fundamentals of chemistry that also held industrial relevance. By contrast, only two comparable institutions existed in Britain. German industrial chemists simply had a more consistent and formal grounding in the fundamentals of science, while also appreciating the problems of applying them in industrial contexts. This led to a situation in which the most successful entrepreneurs in the British chemical industry were actually Germans who had been educated in Germany.
Chemists in the British dye-making industry had mostly been trained as factory apprentices and lacked a formal scientific education. This limited productive dialogue between those versed in the fundamentals of chemical structures and those attuned to the technicalities of industrial scaling. Industry and academia lacked a shared language that would have allowed them to tinker collaboratively and advance industrial processes.
Where Will The Century Of Biology Begin?
Times and technologies may have changed but the same preconditions that gave rise to the synthetic chemicals industry are still relevant in assessing the fortunes of commercial biotech today.
The J. Craig Venter Institute produced a report in June 2020 comparing the prospects for bio-enterprise across a number of countries. By bio-enterprise, the report meant entire national systems of biotech innovation and production capacity.
It totted up 53 drivers that encourage growth in innovation and capacity within biotech. These covered not just the business environment or a country’s existing output, but the wider social and economic backdrop: from the presence of a national biotechnology plan to population size to total R&D expenditure relative to GDP.
The report also assigned a grade to each driver to indicate which of them were most important to boosting bio-capacity – the ability to produce large volumes of biotech products – and bio-innovation – the ability to develop new biotech tools and products.
The drivers deemed most important for growing biotech capacity were a country’s share of publications in the top biotech journals, its share of world biotech patents between 2014 and 2019, its share of publications in the top tech journals, total expenditures on R&D, current GDP, and 2050 projected GDP.
What about the biggest drivers of biotech innovations? These were the future policy orientation of the government, projected 2050 GDP per capita, and top universities relative to GDP.
In other words, the report contended that specialist knowledge in the fundamentals of biotech – the kind that tends to develop in university settings – is the best predictor of a country’s success in both bio-capacity and bio-innovation.
It also underlined the critical role that the government must play. It estimated that 70% of its bio-enterprise drivers are the kind that will respond significantly to governmental policy within a 10-to-30-year timeframe. These policies might cover regulation, intellectual property protection, and investments in R&D and public education. The remaining 30% of drivers would respond little, if at all, to government policy.
The report ranked which countries are best placed to pull ahead based on their combined bio-enterprise and bio-capacity driver score. Under a scenario in which all countries retain the advantages they hold today, the USA is expected to come first in both bio-capacity and bio-innovation by 2050, followed by China and India.
Nonetheless, 2050 outcomes also depend on government intervention. If between now and 2050, it says, China pursues an aggressive policy push for greater bio-capacity and the US’ policy remains on its current trajectory, then China may plausibly overtake the US.
Israel scored highest on bio-innovation while remaining low on capacity, with a sector skewed heavily towards medical biotech. This success owes to government funding, high business investment in R&D and venture capital, foreign investment, and large multinationals operating alongside world-class universities.
Similar country profiles hold for Singapore, South Korea, and Germany: all are expected to achieve high biotech innovation while remaining low on production capacity.
While Israel has cultivated a world-leading biotech sector by fostering a competitive business environment, South Korea has taken a more active top-down approach with policies to encourage growth in science and technology. These policies have targeted university funding, infrastructure-building, and industry-academic collaboration, with one of the world’s highest R&D funding rates at 4.6% of GDP.
Just as in Germany at the start of the last century, countries now building up their biotech industries generally display a society-wide emphasis on science and technology education, policies for fostering science-based industries, and porous boundaries between the outputs of academia and business.
Picking Up The Pieces Of The Synthetics Revolution
The rapidity and scale of the second industrial revolution mean that it offers an aspirational model for commercial biotech today. In one respect, however, today’s emerging bio-based and synbio industries are a mirror image of those that coalesced around chemical breakthroughs between 1870 and 1930. This is because many of today’s spinoff enterprises around the life sciences aim to address the ecological disasters set in motion by the chemical industrial revolution.
Between 1870 and 1930, when polymerisation and petroleum refining originated, new toxins entered the environment in greater volumes than ever before. The number and diversity of non-biodegradable materials increased exponentially. Industrial success was inseparable from environmental breakdown.
But here again, the second industrial revolution can offer some encouragement. If Western societies once installed a system of production capable of destroying the biosphere, it suggests that developed countries can mount efforts on a similar scale to restore it.
If this is to be the century of biology, one whose prime promise is that of ecological remediation, sustainable biotech will need to replicate the success of the synthetics explosion at the turn of the last century.
The building blocks needed for this are often decades in the making: universities with high quality research outputs, political stability, and a robust scientific infrastructure are prerequisites that require planning, social consensus, and vision.
In the US, the absolute bio-enterprise leader in the JCVI report, the key elements we saw in Germany during the second industrial revolution are starting to coalesce. The recent federal bioeconomy plan, on top of pre-existing biotech funding pathways and regulatory frameworks, coincides with mergers and collaborations between the largest private biotech companies in the world, including Ginkgo Bioworks, Genomatica, and Solugen.
Already, it seems clear that industrial biotechnology is in motion. Whether it will manage to pick up the pieces of the synthetics revolution remains to be seen.