How innovation works and why it flourishes in freedom (2020, by Matt Ridley)
This was a very enjoyable read from Matt Ridley. The book picks up where The Rational Optimist left off and digs into the how and the mechanics of innovation.
He has some controversial opinions in some areas (like fracking, vaping, GMOs), where you sense a somewhat one-sided treatment of the subjects, but these do not take away from the overall theme and arguments in the book.
I can't say I learned anything I didn't already know about the process of innovation. However, the book did deepen my respect for the importance of innovation as distinct from mere invention.
I recommend the book. It is worth reading for the anecdotes about the history of innovation in different fields alone. That said, it could have been shorter and better organized, and it could have made its case for why innovation flourishes in freedom more tightly.
Some of my highlights from the book
In Douglas Adams’s The Hitchhiker’s Guide to the Galaxy, Zaphod Beeblebrox’s starship Heart of Gold – a metaphor for wealth – is powered by a fictional ‘infinite improbability drive’. Yet a near-infinite improbability drive does indeed exist, but here on Planet Earth, in the shape of the process of innovation. Innovations come in many forms, but one thing they all have in common, and which they share with biological innovations created by evolution, is that they are enhanced forms of improbability.
Innovation, like evolution, is a process of constantly discovering ways of rearranging the world into forms that are unlikely to arise by chance – and that happen to be useful. The resulting entities are the opposite of entropy: they are more ordered, less random, than their ingredients were before. And innovation is potentially infinite because even if it runs out of new things to do, it can always find ways to do the same things more quickly or for less energy.
The power of the improbability drive is therefore limited only by the supply of energy.
Innovation, then, means finding new ways to apply energy to create improbable things, and see them catch on.
And here is my starting point: innovation is the most important fact about the modern world, but one of the least well understood.
The surprising truth is that nobody really knows why innovation happens and how it happens, let alone when and where it will happen next.
Serendipity plays a big part in innovation, which is why liberal economies, with their free-roving experimental opportunities, do so well. They give luck a chance.
Innovation happens when people are free to think, experiment and speculate. It happens when people can trade with each other.
I tell the stories of steam engines and search engines, of vaccines and vaping, of shipping containers and silicon chips, of wheeled suitcases and gene editing, of numbers and water closets. Let’s hear from Thomas Edison and Guglielmo Marconi, from Thomas Newcomen and Gordon Moore, from Lady Mary Wortley Montagu and Pearl Kendrick, from Al Khwarizmi and Grace Hopper, from James Dyson and Jeff Bezos.
The chief way in which innovation changes our lives is by enabling people to work for each other. As I have argued before, the main theme of human history is that we become steadily more specialized in what we produce, and steadily more diversified in what we consume: we move away from precarious self-sufficiency to safer mutual interdependence.
Most innovation is a gradual process. The modern obsession with disruptive innovation, a phrase coined by the Harvard professor Clayton Christensen in 1995, is misleading.
Innovation often disappoints in its early years, only to exceed expectations once it gets going, a phenomenon I call the Amara hype cycle, after Roy Amara, who first said that we underestimate the impact of innovation in the long run but overestimate it in the short run.
Innovation seems so obvious in retrospect but is impossible to predict at the time.
Watt realized something about Newcomen engines in general that should have been spotted much earlier: three-quarters of the energy of the steam was being wasted in reheating the cylinder during each cycle, after it had been cooled with injected water to condense the steam.
My point is simple: Watt, brilliant inventor though he undoubtedly was, gets too much credit, and the collaborative efforts of many different people too little.
But, hang on, didn’t Thomas Edison invent the light bulb? Yes, he did. But so did Marcellin Jobard in Belgium; and so did William Grove, Frederick de Moleyns and Warren de la Rue (and Swan) in England. So too did Alexander Lodygin in Russia, Heinrich Göbel in Germany, Jean-Eugène Robert-Houdin in France, Henry Woodward and Matthew Evans in Canada, Hiram Maxim and John Starr in America, and several others. Every single one of these people produced, published or patented the idea of a glowing filament in a bulb of glass ...
[Edison] then set up a laboratory in Menlo Park, New Jersey, in 1876, to do what he called ‘the invention business’, later moving to an even bigger outfit in West Orange. He assembled a team of 200 skilled craftsmen and scientists and worked them ruthlessly hard.
Parsons’s turbine was about 2 per cent efficient at turning the energy of a coal fire into electricity. Today a modern combined-cycle gas turbine is about 60 per cent efficient.
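To make that arithmetic concrete, a thirtyfold gain in efficiency means roughly thirty times more electricity from the same fuel energy. A quick back-of-envelope sketch (the fuel energy density below is my own illustrative assumption, not a figure from the book):

```python
# Back-of-envelope comparison of Parsons's early turbine (~2% efficient)
# with a modern combined-cycle gas turbine (~60% efficient), as quoted above.
# The fuel energy density is an illustrative assumption, not from the book.

FUEL_MJ_PER_KG = 50.0  # assumed: roughly the energy density of natural gas

for name, efficiency in [("Parsons's turbine (~2%)", 0.02),
                         ("Modern CCGT (~60%)", 0.60)]:
    kwh_per_kg = FUEL_MJ_PER_KG * efficiency / 3.6  # 1 kWh = 3.6 MJ
    print(f"{name}: {kwh_per_kg:.2f} kWh of electricity per kg of fuel")

print(f"Improvement factor: {0.60 / 0.02:.0f}x per unit of fuel energy")
```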
Nuclear power and the phenomenon of disinnovation: The twentieth century saw only one innovative source of energy on any scale: nuclear power. (Wind and solar, though much improved and with a promising future, still supply less than 2 per cent of global energy.) In terms of its energy density, nuclear is without equal: an object the size of a suitcase, suitably plumbed in, can power a town or an aircraft carrier almost indefinitely. The development of civil nuclear power was a triumph of applied science, the trail leading from the discovery of nuclear fission and the chain reaction through the Manhattan Project’s conversion of a theory into a bomb, to the gradual engineering of a controlled nuclear fission reaction and its application to boiling water.
Yet today the picture is of an industry in decline, its electrical output shrinking as old plants close faster than new ones open, and an innovation whose time has passed, or a technology that has stalled. This is not for lack of ideas, but for a very different reason: lack of opportunity to experiment. The story of nuclear power is a cautionary tale of how innovation falters, and even goes backwards, if it cannot evolve.
Following the Three Mile Island accident in 1979 and Chernobyl in 1986, activists and the public demanded greater safety standards. They got them. According to one estimate, per unit of power, coal kills nearly 2,000 times as many people as nuclear; bioenergy fifty times; gas forty times; hydro fifteen times; solar five times (people fall off roofs installing panels) and even wind power kills nearly twice as many as nuclear.
It was an old reactor and would have been phased out long since if Japan had still been building new nuclear reactors. The stifling of nuclear expansion and innovation through costly overregulation had kept Fukushima open past its due date, thus lowering the safety of the system.
Today America is not only the world’s biggest producer of gas; it is also the world’s biggest producer of crude oil, thanks entirely to the shale-fracking revolution. The Permian basin in Texas alone now produces as much oil as the whole of the United States did in 2008, and more than any OPEC country except Iran and Saudi Arabia. America was building huge gas import terminals in the early 2000s; these have now been converted into export terminals. Cheap gas has displaced coal in the country’s electricity sector, reducing its emissions faster than any other country. It has undermined OPEC and Russia, leaving the latter frantically supporting anti-fracking activists to try to defend its markets – with much success in innovation-phobic Europe, where shale exploitation has been largely prevented.
At first environmentalists welcomed the shale gas revolution. In 2011 Senator Tim Wirth and John Podesta welcomed gas as ‘the cleanest fossil fuel’, writing that fracking ‘creates an unprecedented opportunity to use gas as a bridge fuel to a 21st-century energy economy that relies on efficiency, renewable sources, and low-carbon fossil fuels such as natural gas’. Robert Kennedy, Jr, head of the Waterkeeper Alliance, wrote in the Financial Times that ‘In the short term, natural gas is an obvious bridge fuel to the “new” energy economy.’ But then it became clear that this cheap gas would mean the bridge was long, posing a threat to the viability of the renewable-energy industry. Self-interest demanded a retraction by Kennedy, which he duly provided, calling shale gas a ‘catastrophe’.
Recruiting some high-profile stars, including Hollywood actors such as Mark Ruffalo and Matt Damon, the bandwagon gathered pace. Accusations of poisoned water supplies, leaking pipes, contaminated waste water, radioactivity, earthquakes and extra traffic multiplied. Just as the early opponents of the railways accused trains of causing horses to abort their foals, so no charge was too absurd to level against the shale gas industry. As each scare was knocked on the head, a new one was raised. Yet despite millions of ‘frac jobs’ in thousands of wells, there were very few and minor environmental or health problems.
Yet energy itself does deserve to be singled out. It is the root of all innovation if only because innovation is change and change requires energy. Energy transitions are crucial, difficult and slow.
According to the correspondents, Emmanuel Timonius and Giacomo Pylarini, both physicians working in the Ottoman Empire, the pus from a smallpox survivor would be mixed with the blood in a scratch on the arm of a healthy person. The reports were published by the Royal Society but dismissed as dangerous superstition by all the experts in London. More likely to spark an epidemic than prevent it; an unconscionable risk to be running with people’s health; an old wives’ tale; witchcraft. Given the barbaric and unhelpful practices of doctors at the time, such as bloodletting, this was both ironic and perhaps understandable.
Lady Mary did indeed engraft her son Edward, anxiously watching his skin erupt in self-inflicted pustules before subsiding into immunized health. It was a brave moment. On her return to London she inoculated her daughter as well,
So, yet again, innovation proves to be gradual and to begin with the unlettered and ordinary people, before the elite takes the credit.
Vaccination exemplifies a common feature of innovation: that use often precedes understanding. Throughout history, technologies and inventions have been deployed successfully without scientific understanding of why they work. To a rational person in the eighteenth century, Lady Mary’s idea that exposure to one strain of a fatal disease could protect against that disease must have seemed crazy. There was no rational basis to it. It was not until the late nineteenth century that Louis Pasteur began to explain how and why vaccination worked.
After a lengthy trial the judge eventually ruled that the company had met its responsibilities by this innovation. The Jersey City case proved a turning point, a clean-water watershed. Cities all over the country and the world began using chlorination to clean up water supplies, as they do to this day. Typhoid, cholera and diarrhoea epidemics rapidly disappeared. But where did Dr Leal get the idea? From a similar experiment in Lincoln in England, he said at the trial. Like most innovators he did not claim to be the inventor.
So penicillin languished as a curiosity, undeveloped as a cure for disease, for more than a decade. Fleming was a denizen of the laboratory, not the clinic or the boardroom.
The story of penicillin reinforces the lesson that even when a scientific discovery is made, by serendipitous good fortune, it takes a lot of practical work to turn it into a useful innovation.
Polio became a worsening epidemic especially in the United States during the twentieth century. Ironically, it was mainly improved public health that caused this, by raising the age at which most people caught the virus, resulting in more virulent infections and frequent paralysis. When everybody encountered sewage in their drinking or swimming water, the population was immunized early, before the virus caused paralysis. With chlorine cleaning up the water supply, people encountered the virus later and more virulently. By the 1950s the polio epidemic in the United States was worsening every year: 10,000 cases in 1940, 20,000 in 1945, 58,000 in 1952. Enormous public interest channelled generous donations into treatment, and the search for a vaccine. Huge fame and great wealth awaited the team that reached the prize, so some corners were cut.
The contaminating virus was eventually isolated, christened SV40 and studied in detail by others. We now know that almost every single person vaccinated for polio in America between 1954 and 1963 was probably exposed to monkey viruses, of which SV40 – the fortieth to be described – was just one. That is about 100 million people. In the years that followed, the health establishment was quick to reassure the world that the risk was small, but they had little reason to be so complacent at the time. Sure enough there has been no epidemic of unusual cancer incidence among those who received contaminated vaccines, but SV40 DNA has been detected in human cancers, especially mesotheliomas and brain tumours, where it may have acted as a co-factor alongside other causes. Saying this remains unpopular to this day.
By the 1980s, with smallpox eradicated, and polio, typhoid and cholera in retreat, one stubborn disease remained the biggest killer, capable of ending hundreds of thousands of lives a year. And it was getting worse: malaria.
By 2010, 145 million nets were being delivered each year. Over a billion have been used to date. Globally, the death rate from malaria almost halved in the first seventeen years of the current century.
The greatest killer of the modern world is no longer a germ, but a habit: smoking. It directly kills more than six million people every year prematurely, perhaps contributing indirectly to another million deaths. The innovation of smoking, brought from the Americas to the Old World in the 1500s, is one of humankind’s biggest mistakes.
About 3.6 million Britons vape, compared with 5.9 million who smoke. The habit is even endorsed by public agencies, the government, charities and academic colleges, not because it is wholly safe, but because it is much safer than smoking. This is in sharp contrast to the United States, where vaping is officially discouraged, or Australia, where it is still – as of this writing – officially illegal.
‘We looked hard at the evidence and made a call,’ wrote Halpern later. ‘We minuted the PM and urged that the UK should move against banning e-cigs. Indeed, we went further. We argued we should deliberately seek to make e-cigs widely available, and to use regulation not to ban them but to improve their quality and reliability.’ That is why this innovation caught on more in Britain than elsewhere, despite furious opposition from much of the medical profession, the media, the World Health Organization and the European Commission. Strong evidence from well-controlled studies now exists that vaping’s risks, though not zero, are far lower than smoking: it contains fewer dangerous chemicals and it causes fewer clinical symptoms. One 2016 study found that after just five days of vaping, the toxicants in the blood of smokers had dropped to the same levels as those of people who quit altogether. A 2018 study of 209 smokers who switched to e-cigarettes and were followed for two years found no evidence of any safety concerns or serious health complications.
The rest of the world soon followed suit. The first railway in America began operating in 1828, in France in 1830, in Belgium and Germany in 1835, in Canada in 1836, in India, Cuba and Russia in 1837, in the Netherlands in 1839. By 1840 America already had 2,700 miles of railway, and 8,750 by 1850.
As John Daniels, the Kitty Hawk resident who took the photograph, put it, they were the ‘workingest boys I ever knew . . . It wasn’t luck that made them fly; it was hard work and common sense.’
Even when people did believe the Wright brothers, they doubted the value of what they had done.
Meanwhile at Fort Myer near Washington, Orville was also wowing the crowds with a duplicate flying machine. On 9 September he twice stayed in the air for more than an hour, circling the field more than fifty times.
Just ten years later, in June 1919, John Alcock and Arthur Brown crossed the Atlantic non-stop from Newfoundland to Ireland in sixteen hours, through fog, snow and rain. The First World War had by then given rapid impetus to the development of designs and flying skills, though much of it would have happened anyway.
To Orville Wright’s fury, the Smithsonian tried to rewrite history in 1914, resurrecting Langley’s aerodrome, secretly modifying it, flying it briefly, then removing the modifications before putting it on display along with the claim that Langley had therefore designed the first machine capable of powered flight. The Wrights’ flyer was not installed in the Smithsonian museum until 1948, after Orville’s death.
In America, you are now at least 700 times more likely to die in a car, per mile travelled, than in a plane. The decline in air accidents is as steep and impressive as the decline in the cost of microchips as a result of Moore’s Law. How has this been achieved? The answer, as with most innovation, is that it happened incrementally as a result of many different people trying many different things.
The potato was once an innovation in the Old World, having been brought back from the Andes by conquistadors. It provides a neat case history of the ease, and difficulty, with which new ideas and products diffuse through society. Potatoes are the most productive major food plant, yielding three times as much energy per acre as grain. They were domesticated about 8,000 years ago in the high Andes, above 3,000 metres, from a wild plant with hard and toxic tubers.
Slow to arrive, the potato was slow to catch on in Europe. Against it was a combination of practice and prejudice.
The crash came in 1845 when a parasitic blight fungus (Phytophthora infestans) that the potato plant had left behind in the Andes reached Ireland via the United States. That September throughout Ireland the potato crops rotted in the fields both above and below ground. Even stored potatoes turned black and putrid. Within a few years, a million people had died of starvation, malnutrition and disease, and at least another million had emigrated. The Irish population, which had reached over eight million, plunged and has still not returned to the level it was in 1840. Similar if less severe famines caused by blight drove Norwegians, Danes and Germans across the Atlantic.
That nitrogen was a limiting nutrient in the growing of crops had been known, at least vaguely, for centuries. It led farmers to beg, borrow and steal any source of manure, urea or urine they could find. Try as they might, though, they struggled to apply enough nitrogen to enable their crops to realize their full potential.
The science that explained this hunger for nitrogen came much later, with the discovery that every building block in a protein or DNA molecule must contain several nitrogen atoms, and that though the air consisted mostly of nitrogen atoms they were bound together in tight pairs, triple covalent bonds between each pair of atoms. Vast energy was needed to break these bonds and make nitrogen useful. In the tropics, frequent lightning strikes provided such energy, keeping the land a little more fertile, while in paddy-rice agriculture, algae and other plants fix nitrogen from the air to replenish the soil. Temperate farms, growing crops such as wheat, were very often nitrogen-limited, if not nitrogen-starved.
The guano boom made great fortunes, but by the 1870s it was over. It was succeeded by a boom in Chilean saltpetre, or salitre, a rich nitrate salt that could be made by boiling caliche, a mineral found in abundance in the Atacama desert, the result of desiccated ancient seas uplifted into the mountains and left undissolved by the extreme dryness of the climate. Though the mines and refineries were mostly in Peru and Bolivia it was Chileans who worked them, and in 1879 Chile declared war and captured the key provinces, cutting Bolivia off from the sea and amputating part of Peru. By 1900 Chile was producing two-thirds of the world’s fertilizer, and much of its explosive.
He chose to speak about the ‘wheat problem’, namely the looming probability that the world would be starving by 1930 unless a way could be made to synthesize nitrogen fertilizer to replace Chilean nitrate, wheat being then by far the largest crop in the world.
After the Great War, the Haber–Bosch process was used throughout the world to fix nitrogen on a grand scale. The process became steadily more efficient, especially once natural gas was substituted for coal as the source of energy and hydrogen. Today, ammonia plants use about one-third as much energy to make a tonne of ammonia as they did in Bosch’s day. About 1 per cent of global energy is used in nitrogen fixation, and that provides about half of all fixed nitrogen atoms in the average human being’s food. It was synthetic fertilizer that enabled Europe, the Americas, China and India to escape mass starvation and consign famine largely to the history books: the annual death rate from famine in the 1960s was 100 times greater than in the 2010s. The so-called Green Revolution of the 1960s and 1970s was about new varieties of crop, but the key feature of these new varieties was that they could absorb more nitrogen and yield more food without collapsing (see next section). If Haber and Bosch had not achieved their near-impossible innovation, the world would have ploughed every possible acre, felled every forest and drained every wetland, yet would be teetering on the brink of starvation, just as William Crookes had forecast.
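For reference, the chemistry behind this passage is the familiar textbook reaction below (the equation and enthalpy are standard values, not quoted from the book). The reaction itself is exothermic; the energy cost Ridley mentions lies mostly in producing the hydrogen and in the high temperatures and pressures needed to make the reaction run at a useful rate.

```latex
% Overall Haber–Bosch reaction (standard textbook form, not quoted from the book)
N_2(g) + 3\,H_2(g) \;\rightleftharpoons\; 2\,NH_3(g),
\qquad \Delta H^{\circ} \approx -92\ \text{kJ per mole of } N_2
```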
The ecologist Paul Ehrlich forecast famines ‘of unbelievable proportions’ by 1975; another famous environmentalist, Garrett Hardin, said feeding India was like letting survivors of a shipwreck climb aboard an overloaded lifeboat; the chief organizer of Earth Day, in 1970, said it was ‘already too late to avoid mass starvation’; a pair of brothers, William and Paul Paddock, one an agronomist and the other a Foreign Service official, wrote a best-seller called Famine 1975!, arguing for abandoning those countries, like India, that were ‘so hopelessly headed for or in the grip of famine (whether because of overpopulation, agricultural insufficiency, or political ineptness) that our aid will be a waste; these “can’t-be-saved nations” will be ignored and left to their fate’. Never have gloomy and callous forecasts been so rapidly proved wrong. Both India and Pakistan would be self-sufficient in grain within a decade thanks to dwarf wheat.
Mehta listened. India doubled its wheat harvest in just six years. There was so much grain there was nowhere to store it. In his acceptance speech on being awarded the Nobel Peace Prize in 1970, Norman Borlaug said that ‘man can and must prevent the tragedy of famine in the future instead of merely trying with pious regret to salvage the human wreckage of the famine, as he has so often done in the past.’
This fifty-year story of how dwarfing genes were first found in Japan, cross-bred in Washington, adapted in Mexico and then introduced against fierce opposition in India and Pakistan is one of the most miraculous in the history of humankind. Rice quickly followed suit with its own dwarf varieties and higher yields; so did other crops.
About one-third of the maize (corn) grown in the world is now insect-resistant because of introduced Bt genes as well. In America, where 79 per cent of the corn is now Bt, the cumulative benefit to farm income of this technology over twenty years comes to more than $25bn. Bizarrely, the organic-farming sector refused to approve the new plants even though they used the same molecules as their own sprays, because of an objection to biotechnology in principle.
Highly useful scientific discoveries are almost always – ridiculously often – accompanied by frenzied disputes about who deserves the credit. In no case is this more true than in the story of CRISPR, a genetic technique that the world awoke to in 2012 ... Yet arguably neither of these huge American universities with their big budgets and luxurious laboratories deserve as much credit as they seek. That should go to a couple of obscure microbiologists working on practical but unfashionable questions about bacteria, one in a university laboratory tackling a problem of interest to the salt industry, the other in an industrial food-manufacturing company.
It took Mojica more than a year to get his results published, so sniffy were the prestigious journals at the idea of a significant discovery coming from a scientific nobody in a backwater like Alicante.
After hearing about CRISPR at a conference, Horvath had a hunch that it might supply the answer. He soon showed that the bacteria with the most spacers were often the most likely to be the resistant strains, and the ones with spacers derived from a particular phage DNA were resistant to that phage. This proved Mojica right. CRISPR’s job – with the help of Cas – is to recognize a particular sequence and cut it, thus emasculating the virus.
The next step, or leap of logic, was to think ‘maybe we human beings can borrow CRISPR for our own purposes’. Replace the spacers with a gene we want to excise, perhaps combine it with a new sequence we want to insert, and adapt the microbial system as a genetic-engineering tool of uncanny precision.
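The ‘recognize a sequence and cut it’ idea is easy to picture as string matching. Here is a toy Python sketch of that analogy only; the sequences are invented, and real CRISPR-Cas biology (guide RNAs, PAM sites, DNA repair) is far more involved.

```python
# Toy illustration of the recognize-and-cut idea described above: the system
# stores "spacer" sequences and cuts any DNA that contains a matching stretch.
# String-matching analogy only; not a model of real CRISPR-Cas biology.

def cut_at_spacer(dna: str, spacer: str) -> tuple[str, str] | None:
    """Return the two fragments produced by cutting at the spacer match, or None."""
    i = dna.find(spacer)
    if i == -1:
        return None
    cut_point = i + len(spacer) // 2          # cut roughly in the middle of the match
    return dna[:cut_point], dna[cut_point:]

# Hypothetical sequences, invented for illustration.
phage_dna = "ATGCGTACGTTAGCCGATCGTACGATCG"
spacer = "TAGCCGATC"                          # "memory" of a previous infection

fragments = cut_at_spacer(phage_dna, spacer)
if fragments:
    print("Cut made:", fragments)             # the phage genome is disabled
else:
    print("No match: this phage is not recognized")
```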
In 2019 three French scientists reviewed the patenting of CRISPR products and found that Europe was already being dramatically left behind. Whereas America had taken out 872 patent families and China 858, the EU had only taken out 194 and the gap was growing. They concluded: ‘It would be a delusion not to consider the GMO bans in Europe as having had a strong negative impact on the future of biotechnology on the continent.’
Morse’s real achievement, like that of most innovators, was to battle his way through political and practical obstacles.
Once the telegraph was in use, the telephone was bound to follow at some point. In 1876, in what is often cited as a spectacular case of simultaneous invention, Alexander Graham Bell arrived at the patent office to file a patent on the invention of the telephone, and just two hours later Elisha Gray arrived at the same patent office with an application for the very same thing.
On 12 February 1931, at Marconi’s side, the Pope launched Vatican radio in a blaze of global publicity. At a reception afterwards, the Pope thanked both Marconi and God for putting ‘such a miraculous instrument as wireless at the service of humanity’. Others of a less benign intent took notice of the Vatican example. ‘It would not have been possible for us to take power or to use it in the ways we have without the radio,’ noted Josef Goebbels in August 1933.
(A similar pattern was observed in the Rwanda genocide of 1994: the more people in an area who had access to the ‘hate radio station’ RTLM, the greater the violence against Tutsis.)
The impact of radio on the polarization of society was huge – shades of what has happened more recently with social media.
*For reasons that are not entirely clear, network television had the opposite effect of radio, bringing people back towards a social consensus, sometimes stiflingly so, rather than polarizing them.* If there was a moment that encapsulates this shift, it was in April 1954, when the American people got their first glimpse of Senator Joe McCarthy via television. They did not like what they saw and McCarthy’s bubble burst immediately. ‘The American people have had a look at you for six weeks. You are not fooling anyone,’ said Senator Stuart Symington shortly afterwards. It was this centripetal effect that has gone into reverse with the arrival of social media, I think, a polarizing force like early radio.
There is no great invention, from fire to flying, that has not been hailed as an insult to some god. J. B. S. HALDANE
Liberty is the parent of science and of virtue, and a nation will be great in both in proportion as it is free. THOMAS JEFFERSON
Innovation is gradual: There is no day when you can say: computers did not exist the day before and did the day after, any more than you could say that one ape-person was an ape and her daughter was a person.
There is nobody who deserves the accolade of the inventor of the computer. There is instead a regiment of people who made crucial contributions to a process that was so incremental and gradual, cross-fertilized and networked, that there is no moment or place where it can be argued that the computer came into existence, any more than there is a moment when a child becomes an adult.
If innovation is a gradual, evolutionary process, why is it so often described in terms of revolutions, heroic breakthroughs and sudden enlightenment? Two answers: human nature and the intellectual property system. As I have shown repeatedly in this book, it is all too easy and all too tempting for whoever makes a breakthrough to magnify its importance, forget about rivals and predecessors, and ignore successors who make the breakthrough into a practical proposition.
Charles Townes, who won the Nobel Prize for the physics behind the laser in 1964, was fond of quoting an old cartoon. It shows a beaver and a rabbit looking up at the Hoover dam: ‘No, I didn’t build it myself,’ says the beaver. ‘But it’s based on an idea of mine.’
Innovation is recombinant: Every technology is a combination of other technologies; every idea a combination of other ideas.
Recombination is the principal source of variation upon which natural selection draws when innovating biologically. Sex is the means by which most recombination happens. A male presents half his genes to an embryo and so does a female. That is a form of recombination, but what happens next is even more momentous. That embryo, when it comes to make sperm and egg cells, swaps bits of the father’s genome with bits of the mother’s in a process known as crossing over. It shuffles the genetic deck, creating new combinations to pass on to the next generation. Sex makes evolution cumulative and allows creatures to share good ideas. The parallel with human innovation could not be clearer. Innovation happens, as I put it a decade ago, when ideas have sex. It occurs where people meet and exchange goods, services and thoughts. This explains why innovation happens in places where trade and exchange are frequent and not in isolated or underpopulated places: California rather than North Korea, Renaissance Italy rather than Tierra del Fuego. It explains why China lost its innovative edge when it turned its back on trade under the Ming emperors. It explains the bursts of innovation that coincide with increases in trade, in Amsterdam in the 1600s or Phoenicia 3,000 years earlier. The fact that fishing tackle in the Pacific was more diverse on islands with more trading contacts, or that Tasmanians lost out on innovation when isolated by rising sea levels, shows the intimate, mandatory connection between trade and the development of novelty. This explains too why innovation started in the first place.
DNA sequences change by errors in transcription, or mutations caused by things like ultraviolet light. These little mistakes, or point mutations, are the fuel of evolution. But, as the Swiss biologist Andreas Wagner has argued, such small steps cannot help organisms cross ‘valleys’ of disadvantage to find new ‘peaks’ of advantage.
Wagner cites numerous studies which support the conclusion that ‘recombination is much more likely to preserve life – up to a thousand times more likely – than random mutation is.’
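The contrast Wagner is drawing can be pictured with a toy sketch: a point mutation changes one letter at a time, while crossover recombines whole blocks from two parents in a single step. The sequences below are made up for illustration and model nothing about real genomes.

```python
import random

# Toy contrast between the two sources of variation described above:
# a point mutation flips one "letter"; crossover swaps large blocks
# between two parents in one step.

random.seed(0)
BASES = "ACGT"

def point_mutation(genome: str) -> str:
    """Flip a single randomly chosen base."""
    i = random.randrange(len(genome))
    new_base = random.choice([b for b in BASES if b != genome[i]])
    return genome[:i] + new_base + genome[i + 1:]

def crossover(mother: str, father: str) -> str:
    """Swap everything after a random crossover point, as in meiotic crossing over."""
    point = random.randrange(1, len(mother))
    return mother[:point] + father[point:]

mother = "AAAAAAAAAA"
father = "GGGGGGGGGG"

print("Point mutation of mother:", point_mutation(mother))    # one letter changes
print("Crossover of the two:   ", crossover(mother, father))  # whole blocks combine
```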
Innovation involves trial and error: Most inventors find that they need to keep ‘just trying’ things. Tolerance of error is therefore critical.
Innovation is a team sport: The myth of the lonely inventor, the solitary genius, is hard to shake. Innovation always requires collaboration and sharing, as exemplified by the fact that even the simplest object or process is beyond the capacity of any one human being to understand.
By contrast, the Ottoman and Mughal empires managed to ban printing for more than three centuries. Istanbul, a great city of culture on the edge of Europe administering a vast empire of Christians as well as Muslims, resisted the new technology. It did so, precisely because it was the capital of an empire. In 1485 printing was banned by order of Sultan Bayezid II. In 1515 Sultan Selim I decreed that printing by Muslims was punishable by death. This was an unholy alliance: the calligraphers defending their business monopoly in cahoots with the priests defending their religious monopoly, by successfully lobbying the imperial authorities to keep printing at bay.
David Hume, writing in the eighteenth century, already realized this truth, that China had stalled as a source of novelty because it was unified, while Europe took off because it was divided.
America may appear an exception, but in fact it proves this rule. Its federal structure has always allowed experiment.
And the really interesting thing is that cities need fewer petrol stations and miles of electrical cable or road – per head of population – as they get bigger, but have disproportionately more educational institutions, more patents and higher wages – per head of population – as they get bigger. *That is to say, the infrastructure scales at a sublinear rate, but the socio-economic products of a city scale at a superlinear rate.*
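The sub-/super-linear claim is usually written as a power law, Y ≈ Y0 × N^β, with β below 1 for infrastructure and above 1 for socio-economic outputs. A quick sketch using exponents often quoted in the urban-scaling literature (roughly 0.85 and 1.15; those numbers are my assumption, not the book's):

```python
# Sketch of the scaling claim in the highlighted sentence: Y ~ Y0 * N**beta,
# with beta < 1 for infrastructure and beta > 1 for socio-economic outputs.
# The exponents 0.85 and 1.15 are rough literature values, not from the book.

def scaled(value_small: float, pop_ratio: float, beta: float) -> float:
    """Scale a per-city quantity when population grows by pop_ratio."""
    return value_small * pop_ratio ** beta

pop_ratio = 2.0                           # compare a city to one twice its size
infra = scaled(100, pop_ratio, 0.85)      # e.g. petrol stations, km of cable
socio = scaled(100, pop_ratio, 1.15)      # e.g. patents, total wages

print(f"Doubling population -> infrastructure x{infra / 100:.2f} (less than double)")
print(f"Doubling population -> socio-economic output x{socio / 100:.2f} (more than double)")
```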
The nineteenth-century economist William Stanley Jevons discovered a paradox, since named after him, whereby saving energy only leads to the use of more energy. We react to cheaper inputs by using more of them. When electricity is cheap we leave the lights on more. But Andrew McAfee, in his book More from Less, argues that in many sectors the economy is now exhausting the Jevons paradox and beginning to bank the savings. Thus LEDs use less than 25 per cent of the electricity that incandescent bulbs use for the same amount of light, so you would have to leave them on for more than ten times as long to end up using more power: that is unlikely to happen.
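The LED point is simple break-even arithmetic: if the new bulb draws a fraction f of the old bulb's power, total energy only goes up if you run it more than 1/f times as long. A quick sketch with assumed wattages:

```python
# Break-even arithmetic behind the LED example above: if a new bulb draws a
# fraction f of the old bulb's power, total energy only rises if you run it
# more than 1/f times as long. The wattages are illustrative assumptions.

incandescent_w = 60.0
led_w = 6.0                      # assumed ~10% of incandescent for similar light
fraction = led_w / incandescent_w

break_even_multiple = 1 / fraction
print(f"LED draws {fraction:.0%} of the power")
print(f"Energy use only grows if the LED runs more than {break_even_multiple:.0f}x as long")
```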
My point is that we make a mistake if we insist that science is always upstream of technology. *Quite often scientific understanding comes from an attempt to explain and improve a technical innovation.*
If your project aims to have a monkey recite Shakespeare while on a pedestal, it’s a mistake to invent the pedestal first and leave till later the hard problem of training the monkey to speak.
By the time it was stopped, Theranos had tested almost a million people’s blood, almost certainly giving both false alarm and false reassurance to a large number of people. It was about to roll out its service on a far greater scale through more than 8,000 Walgreens stores. John Carreyrou’s investigation therefore almost single-handedly prevented a health catastrophe. He argues that a general lesson still needs to be learned: ‘hyping your product to get funding while concealing your true progress and hoping that reality will eventually catch up to the hype continues to be tolerated in the tech industry.’
Remember the golden rule of innovation – overcoming the [challenges] will be bound to require trial and error, not just clever forecasting, and may not be cheap.
We need big failures in order to move the needle. If we don’t, we’re not swinging enough. You really should be swinging hard, and you will fail, but that’s okay. JEFF BEZOS
‘Our success at Amazon is a function of how many experiments we do per year, per month, per week. Being wrong might hurt you a bit, but being slow will kill you,’ Bezos once said: ‘If you can increase the number of experiments you try from a hundred to a thousand, you dramatically increase the number of innovations you produce.’
[Bezos's] tendency to hire people into small ‘two-pizza’ entrepreneurial teams often in competition with each other, his allergy to big meetings and PowerPoint presentations and his operation of a sort of reverse-veto policy, whereby a new idea has to be referred upwards by managers even if all but one of them thinks it is rubbish. All of these were designed to encourage innovation, and effectively to allow failure to happen but relatively painlessly. It was this sort of Darwinian process that led Amazon to the discovery of an even bigger business than online retail, namely the provision of cloud computing to outsiders, which became Amazon Web Services. Google and Microsoft were slow to spot what Amazon was doing, and how much it enabled tech startups to get going.
But Astro Teller, the head of X, celebrates rather than laments such failures. In 2016 at TED in Vancouver he spoke of the ‘unexpected benefit of celebrating failure’. One day, X will perhaps generate something so spectacular that it dwarfs Google itself.
As late as the second half of the eighteenth century, Sweden tried to ban coffee no fewer than five times. The regime confiscated coffee cups from its citizens in a desperate effort to enforce the ban, and ceremonially crushed a coffee pot in 1794. ... King Gustav III set out to prove coffee was bad for people through a controlled experiment. He ordered one convicted murderer to drink nothing but coffee while another drank nothing but tea. Magnificently, both men outlived the doctors monitoring the experiment, and even the king himself. The coffee-drinking murderer lived longest of all, of course. Campaigns against coffee none the less continued in Sweden until the twentieth century. Here we see all the characteristic features of opposition to innovation: an appeal to safety; a degree of self-interest among vested interests; and a paranoia among the powerful. Recent debates about genetically modified food, or social media, echo these old coffee wars.
In 2011 the economist Alex Tabarrok argued in his book Launching the Innovation Renaissance that the American patent system, far from encouraging innovation, is now discouraging it.
Inventing something gives you a first-mover advantage, which is usually quite enough to get you a substantial reward.
Tabarrok argues for a three-tier patent system, offering two-, ten- or twenty-year patents, with short patents granted much more quickly, easily and cheaply.
These barriers to entry are designed to increase the rewards of those already practising.
In the mid-2010s, Thiel made the following observation: ‘I would say that we lived in a world in which bits were unregulated and atoms were regulated.’
Some people are arguing that we live in an age of innovation crisis: too little, not too much. The Western world, especially since 2009, seems to have forgotten how to expand its economy at any reasonable speed. The rest of the world is making up for this, with Africa in particular beginning to rival the explosive growth rates that Asia achieved in the previous two decades.
Innovation is the child of freedom and the parent of prosperity. It is on balance a very good thing. We abandon it at our peril.
The future is thrilling and it is the improbability drive of innovation that will take us there.