Matt Ridley's Blog, page 20
June 21, 2017
Post-election blues
Belated posting of my post-election Times column:
For those of us who want a clean Brexit and who champion freedom and innovation rather than socialism, the election result was a shattering disappointment. It reduced the party that most embraces free enterprise to a minority in the House of Commons and leaves us with a diminished and humiliated government less likely to win crucial concessions from a European Union emboldened to be more punitive — all against a background of teenager-murdering theocracy.
But, as the first shock fades, I am finding a few crumbs of comfort. Not optimism exactly, but glimmers of light amid the gloom. Here is my top ten.
1. When the dust settles, this was a near miss, not a crash. We came within a few thousand votes of electing a version of, and admirer of, Hugo Chávez to run the United Kingdom, but we did not do so. So this was not quite our Trump moment. Thanks to Ruth, queen of Scots, and the fact that even Labour did not realise just how many vulnerable English seats had been left unguarded by the complacent Tories, Britain elected 56 more Tory than Labour MPs.
Jeremy Corbyn, Diane Abbott and John McDonnell got close to power, but are still some distance from it. Let us remember that this upheaval came in a vote, not a revolution. For all the contempt for leaders or parties, and all the talk of uncontrollable populism, the mechanism of parliamentary democracy is still in very good health.
2. Scotland is not going independent. The defeats suffered by the Scottish National Party were deeper than anybody predicted, and the result is that indyref2 is dead and buried. The United Kingdom is less likely to break up than for several years.
3. This was not a defeat for free-market policies for the simple reason that the Conservative manifesto was its least free market since the days of mixed economies and Butskellism in the 1950s. “We do not believe in untrammelled free markets”, it said, and while that was a straw man, the message was clear that the 2017 Conservative Party believed in state interventionism almost as much as Jeremy Corbyn’s Labour Party. The way is still open to the Conservatives to turn Britain into an entrepreneurial and fast-growing economy that brings not just more prosperity but also more opportunity and generosity.
4. This was not a defeat for Brexit, however wishfully many (especially on the Continent) might see it that way. The two parties that did best were both in favour of Brexit. The parties that did worst — apart from Ukip — were the ones that wanted to reverse Brexit, especially the Liberal Democrats, the Greens and the SNP. The Conservatives have a pro-Brexit party, the Democratic Unionists, to help them in the Commons. So Brexit talks can go ahead and David Davis, the Brexit secretary, was one of the few Tories to emerge from the campaign with his reputation and influence enhanced. The talks may not reach a conclusion soon, but they won’t fail soon either.
5. The nostrum that the Brexit vote last year was all about immigration has been exposed as wrong. Theresa May’s obsessive focus on immigration, and her refusal to contemplate making an exception for students, has proved unpopular, especially with the young. There is probably a mandate for a version of Brexit that grabs the benefits of trading more with the world while remaining fairly open to immigration — just not discriminating in favour of European immigrants.
6. Jeremy Corbyn will continue to perform badly in the Commons, and his party will still be riven with factions, not least when Momentum and Unite flex their muscles against Blairites. Many sensible Labour politicians were longing to get rid of Mr Corbyn and his loony-left gang and are now saddled with extremism. They did well in the election because they were not writing a manifesto for governing, but a wildly unrealistic protest document that they (and most who voted for it) never expected to see implemented.
7. Although the Democratic Unionists tend to believe in some old-fashioned things such as young-earth creationism and that homosexuality is a sin, there is little chance of any of this making it on to the statute book and they know it. The Conservatives have passed gay marriage into law and are not about to go back on that. A shiny new bridge or two in Northern Ireland and some welfare pledges will be their price, not a Cromwellian reformation.
8. The barnacles of the 2015 manifesto have at least been scraped off the hull. One reason for the election was to escape the specific promises made then in the expectation of a coalition with the Liberal Democrats. That at least has been achieved. So while Britain has still to work out how to live within its means, and get its public finances in order, the government does at least have a bit more room for manoeuvre on tax and national insurance than it had before. Meanwhile, its precarious parliamentary position suggests it should try to depoliticise the NHS and social care by setting up independent commissions.
9. Project Fear has failed again. Let’s bury it. The next election, probably in 2018, and in all likelihood fought on better boundaries for the Tories, should be contested by the optimistic and authentic Boris Johnson or David Davis, with Ruth Davidson in charge of the manifesto. Whatever happens, Theresa May cannot be allowed to lead another campaign.
10. The political awakening of young people is a good thing, even if they appear never to have learned in school that communism was as big a disaster as fascism. Political parties urgently need to reinvent themselves for the digital age, something Douglas Carswell has been saying for a long time: more Facebook, less door-knocking. I had thought this lesson had been learnt after Vote Leave ran rings round Stronger In during the referendum last year, but obviously it needed rubbing in.
I admit I am clutching at straws a bit with this list. I would have preferred a different outcome to the election, and I still see some horrible possibilities in the present political situation: a recession leading to the fall of the Conservative government and the election of a Marxist one, for example. Nonetheless, crises create opportunities. Let us make the best of this one.
June 5, 2017
Frankenstein's anti-science message was always wrong
My Times column on the 200th anniversary of Frankenstein and the 600th of De Rerum Natura's rediscovery:
It was in May 1817, two centuries ago this month, that Mary Shelley completed the writing of Frankenstein, or The Modern Prometheus, which was published anonymously the next year. That first science-fiction novel has come to represent all that is dangerous about science. As well as being enacted almost 80 times in films, the book lends its plot to almost every film in which a scientist goes too far, as they usually do in films, from Metropolis to Jurassic Park. It inspires every campaign against biotechnology: the green movement fatally christened genetically modified crops “Frankenfoods”.
Gothic fantasy has infected reality. Those of us who argue that biological innovation — tinkering with the stuff of life — has proved a great force for good, and that the risk of hubristic disaster from research is largely a myth, could wish that Mary Shelley had not written the darned book, in which a brilliant scientist brings to life, with an electric spark, a monster made from bits of dead people.
There is a curious symmetry to this bicentenary. It is also 600 years this year since the rediscovery of the poem that could be called the anti-Frankenstein: De rerum natura (On the Nature of Things), by the mysterious Roman poet Lucretius.
There is a link between the two. Mary Wollstonecraft Godwin, as she then was, told a curious tale about how her spooky story came to be started. During the cold, wet summer of 1816 — sometimes called the “year without a summer” after the eruption of Mount Tambora in Indonesia — she was at a villa on the shores of Lake Geneva. There too were her lover (Percy Shelley), her half-sister (Claire Clairmont) and the latter’s lover (Lord Byron), gathered around the fire.
“Many and long were the conversations between Lord Byron and Shelley,” she wrote, “to which I was a devout but nearly silent listener. During one of these, various philosophical doctrines were discussed, and among others the nature of the principle of life, and whether there was any probability of its ever being discovered and communicated. They talked of the experiments of Dr Darwin . . . who preserved a piece of vermicelli in a glass case, till by some extraordinary means it began to move with voluntary motion . . . perhaps the component parts of a creature might be manufactured, brought together, and endued with vital warmth.”
This Dr Darwin was Erasmus, grandfather of Charles, who was not just a physician, inventor and scientist but also a famous poet, who would be an early influence on the Romantic poets, until they grew embarrassed at his flowery style and affection for pure reason. But what was his experiment with “vermicelli”, with pasta that came to life?
Erasmus Darwin’s last and longest poem, The Temple of Nature, was heavily influenced by Lucretius, who wrote of the spontaneous generation of life in rotting matter, which “brings forth worms”. Lucretius uses the word vermiculos for worms. Shelley too had read Lucretius. Could Mary have misheard vermiculos as vermicelli?
Darwin and Shelley were not the only poets to admire Lucretius. De rerum natura had inspired and obsessed to varying degrees Michel de Montaigne, Edmund Spenser, Ben Jonson, Francis Bacon, John Donne, Molière, John Dryden, John Milton and Alexander Pope. Indeed, Lucretius could be said to be the greatest single influence on the Enlightenment: cited, quoted or echoed by Galileo, Machiavelli, Spinoza, Newton, Hobbes, Locke, Leibniz, Rousseau and Berkeley. Voltaire called himself the latter-day Lucretius. Thomas Jefferson had five different translations in his library. Botticelli’s Venus depicts the opening scene of De rerum natura.
The reason for this fame was the daringly materialist and humanistic tone of the 7,400-line, unfinished poem. In a world struggling to escape from piety, inquisition and dogma, Lucretius’s radical free thinking was shocking and bold in early modern Europe. He argued that everything in the world consisted of nothing but atoms and voids recombined in various ways — which foreshadowed physics and chemistry, as we now know them.
He anticipated Charles Darwin in suggesting that nature ceaselessly experiments and those creatures that can adapt and reproduce will thrive.
De rerum natura argues that the body, the mind and the world are all the result of physical forces, not divine intervention. It rejects magic, mysticism, superstition and myth, calls providence a fantasy, says that there is no end or purpose to existence, no afterlife, and that there was no golden age of tranquillity and plenty in the distant past. At times Lucretius is so bluntly atheist, he makes Richard Dawkins sound like the Pope.
Written in Rome around the time of Caesar and Cicero, the poem had disappeared from sight for 14 centuries, suppressed by the Christian church as heresy. By 1417 it was only from references in Cicero and Virgil, and denunciations by St Jerome, that anybody knew of it at all. In that year, as recounted in Stephen Greenblatt’s fine book The Swerve, an underemployed papal secretary named Poggio Bracciolini found an intact copy of the entire poem, which had lain unread for centuries, in a monastic library in Germany. He transcribed the work and began disseminating it throughout Europe. It is an exaggeration to say that this caused the Renaissance or the Reformation, let alone the Enlightenment, but it was a significant influence.
Perhaps 1817 marks a sort of reaction against reason and in favour of mysticism again. With the writing of Frankenstein, and a few years earlier Goethe’s Faust, comes the first stirring of the modern mistrust of Promethean science, and dissatisfaction with materialism. Dr Frankenstein’s monstrous creation is the product of reason and experiment. He is brought to life with “galvanism”, the new discovery that dead frogs would twitch their limbs if given electric shocks.
Yet the monsters loose in the world today are the creation not of science, but of ideology and fanaticism. What Mary Shelley called “the nature of the principle of life” has indeed been discovered, in the form of natural selection, the genetic code and biochemistry. Life did turn out to have a comparatively simple cause, and knowing it did led overwhelmingly to beneficial consequences for humankind. Lucretius was right: materialism is less dangerous than superstition. So let us bury Frankenstein’s monster.
Why no mention of enterprise and innovation?
My Times column on Britain's general election and the missing optimism about innovation:
Against the background of a terrorist campaign, a Tory government under a determined woman was cruising towards an easy victory against a socialist Labour party in a June election, but stumbling badly in the campaign. It was a dangerous world, with an impulsive American president and an undemocratic Russia and China. There was a funding crisis in the NHS and dire warnings of global environmental disaster: yes, this was 1987, the year of Margaret Thatcher’s third election victory — and of the Enniskillen bombing, shortly after, which killed 12 and injured 63.
Neil Kinnock’s Labour manifesto of 1987 reads very like Jeremy Corbyn’s: in favour of nationalised utilities and more money for the NHS, against nuclear missiles. The two manifestos said “this general election on June 11 faces the British people with choices more sharp than at any time in the past 50 years” (1987), and “what makes this election different is that the choice is starker than ever before” (2017).
In the case of 1987 we know what happened next. The British people were embarking on a period of prosperity unprecedented in their history, belying the competing pessimisms of the parties’ campaigns. Over the past three decades, per capita GDP has risen to more than one-and-a-half times its 1987 level in real terms, from £18,033 to £28,488 (in 2016 prices). In 1987 Britain’s GDP per capita was lower than that of both Italy and France. Today it is higher than both.
Despite an increase in population, an increase in women working and the loss of old-style jobs to automation, the employment rate in Britain is higher today, at 75 per cent, than it was then, at 69 per cent. And although 1987 was considered a boom year, unemployment then was 10.6 per cent, compared with 4.6 per cent today.
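The headline comparison can be checked with a couple of lines of arithmetic; a minimal sketch using only the figures quoted above:

```python
# Quick check of the 1987-vs-today figures quoted above.
gdp_1987, gdp_2016 = 18_033, 28_488    # real GDP per head, GBP, 2016 prices
emp_1987, emp_now = 0.69, 0.75         # employment rates
unemp_1987, unemp_now = 0.106, 0.046   # unemployment rates

print(f"Real GDP per head: {gdp_2016 / gdp_1987:.2f}x its 1987 level")  # ~1.58x
print(f"Employment rate: {emp_1987:.0%} -> {emp_now:.0%}")
print(f"Unemployment:    {unemp_1987:.1%} -> {unemp_now:.1%}")
```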
Compared with three decades ago, hourly wages are up, manufacturing production up, working hours down, food and clothing prices down. The tax threshold is much higher, the top tax rate lower and more of the country’s tax is paid by the richest few per cent. Income inequality is about the same. London has gone from sleepy commercial backwater to the world’s financial centre. Its cuisine is unrecognisably transformed.
These are extraordinary changes for the better. There is more. There are twice as many university places today. In those three decades, Britain’s carbon dioxide emissions have declined from ten to seven tonnes per capita. Although there are nearly twice as many cars on the road, more than twice as many rail passengers and more than three times as many air passengers, the air is much cleaner today.
NHS expenditure has more than trebled since 1987 in real terms. Life expectancy has increased from 75 to 82 years. Age-adjusted cardiovascular death rates among women have halved, as has age-adjusted lung cancer mortality among men. The number of crimes in the Crime Survey for England and Wales (excluding fraud) has halved.
And I have not started on the improvements in technology. A mobile telephone in 1987 weighed 1.5 kilograms, cost £2,000 (£5,000 in today’s money) and had half an hour of battery life after ten hours of charging. There was no internet outside a few institutions; the search engine had not been invented, let alone email or social media. Air travel cost more than double what it does today, telephone calls even more than that.
My point is that none of this promising possibility merited a mention in the manifestos of the day. There were no competing visions of making Britain great, to borrow a phrase. Instead, the discourse then as now was dominated by doom and gloom about the future.
It was in 1987 that Gro Harlem Brundtland, the former Norwegian prime minister, released her eponymous report to the United Nations on environment and development. She warned: “This ‘greenhouse effect’ may by early next century have increased average global temperatures enough to shift agricultural production areas, raise sea levels to flood coastal cities, and disrupt national economies.” She warned that the deserts were advancing, forests were disappearing, acid rain was devastating ecosystems and the ozone layer was in trouble “to such an extent that the number of human and animal cancers would rise sharply and the oceans’ food chain would be disrupted”.
None of which happened, partly because action was taken but mostly because the scares were wildly overblown. Deserts have been retreating for years now, partly thanks to global greening caused by higher levels of carbon dioxide. The net rate of forest loss has fallen to approximately zero according to the UN Food and Agriculture Organisation. Sea-level rise has been modest. The number of people living in extreme poverty more than halved. Famine largely disappeared. Population growth rates have fallen steeply.
So imagine now looking back from 30 years hence, in 2047. We may all have been incinerated by asteroids or war. If Jeremy Corbyn’s socialism comes to power this week, we may have become the world’s basket case, like today’s Venezuela or Cuba. (“Chávez showed us that there is a different and a better way of doing things,” Mr Corbyn said when Hugo Chávez died; he called Fidel Castro a “champion of social justice” when that homophobic brute died.)
But the overwhelming probability is that by 2047 there will have been a vast improvement in living standards, opportunities and knowledge, and that Britain will have enjoyed more than its fair share of this thanks to its language, its law, its science, its open economy and its decision to leave a protectionist and dirigiste Brussels regime to become a global champion of free trade. With the likely exception of our ballooning £2 trillion national debt, many of the problems that preoccupy us today will have been solved or ameliorated. If we are steadfast and sensible, the Islamist terrorists will eventually fail just as the Irish nationalists did.
And this will have come about, not because of what politicians promise to do to the country, but because of what a law-abiding and free people can achieve and do for each other through enterprise, innovation and exchange. The reason that Theresa May has stumbled so badly in this campaign is because she has failed to set out an optimistic vision of mutual prosperity creation in a liberal society, and has instead allowed herself to be drawn into a mean-spirited, zero-sum bidding war funded by taxation.
Notes:
On desertification: "Despite an increasingly sophisticated understanding of dryland environments and societies, the uses now being made of the desertification concept in parts of Asia exhibit many of the shortcomings of earlier work done in Africa. It took scientists more than three decades to transform a perceived desertification crisis in the Sahel into a non-event. This book is an effort to critically examine that experience and accelerate the learning process in other parts of the world." http://www.springer.com/us/book/9783642160134
On forests: "Meanwhile, the net annual rate of forest loss has slowed from 0.18 percent in the early 1990s to 0.08 percent during the period 2010-2015." http://www.fao.org/news/story/en/item/326911/icode/
On global greening: "Results showed that carbon dioxide fertilization explains 70 percent of the greening effect, said co-author Ranga Myneni, a professor in the Department of Earth and Environment at Boston University. 'The second most important driver is nitrogen, at 9 percent. So we see what an outsized role CO2 plays in this process.' " https://www.nasa.gov/feature/goddard/2016/carbon-dioxide-fertilization-greening-earth
May 25, 2017
Nobody knows how best to tackle obesity
My Times column on obesity:
Even optimists admit that some things are undoubtedly getting worse: things like traffic jams, apostrophe use — and obesity. The fattening of the human race, even in middle-income countries, is undeniable. “Despite sustained efforts to tackle childhood obesity, one in three adolescents is still estimated to be overweight or obese in Europe,” said a report last week to the World Health Organisation. That means more diabetes and possibly a reversal of the recent slow fall in age-adjusted cancer and heart disease death rates.
Perhaps we should remind ourselves first that it is a good problem to have, a symptom of abundance. In Britain a century ago and in much of Africa today, the poorest people were or are the thinnest people. For hundreds of thousands of years it was very difficult to get fat, and very easy to starve or be stunted by hunger and malnutrition. Let’s be thankful that, despite quadrupling the global population in less than a century, we now have a problem of obesity, because of a global cornucopia of fine food unimaginable to past generations.
In western countries, obesity is worst among the poor, so it cannot be a matter of affluence alone. Urban areas of England with the highest levels of income deprivation are also the places with the highest obesity rates among young children. By contrast, among the most affluent people, anorexia is a more lethal disorder, and is increasing fast.
At the weekend Tam Fry of the National Obesity Forum claimed implausibly that obesity now costs the state £24 billion a year. The Institute of Economic Affairs puts the cost at less than £2.5 billion, and argues that “while claims of a crippling cost are a good way to get media attention . . . they irresponsibly incite resentment of a vulnerable group”. Also, if you die younger, you cost the state less, so the financial perspective is the wrong way to look at it.
Recognising that something is a problem is not the same as knowing what to do about it. Obesity is one of those cases where “demands for urgent action” go unheeded, not because of the callousness of our leaders but because there’s no agreement on what action to take. The range of suggestions for dealing with obesity — sugar taxes, bans on junk food on public transport, bans on junk food advertising before 9pm, health warnings on fast food, mock-up pictures of what kids will look like as fat adults, gastric balloons — only serves to remind us that nobody knows how best to reverse the obesity trend. Jamie Oliver, the TV chef, argues that the proposed Conservative policy of means-testing free hot school lunches for infants would worsen obesity.
Advising, hectoring and bribing people to eat less and exercise more appears to be ineffective. We have just about tested that idea to destruction. It isn’t working, and it probably will only work if it becomes fully totalitarian, with police raids on home kitchens to seek out and destroy secret stashes of biscuits.
The one thing we do know is that the simple equation so beloved of the medical profession is not the answer. It is not as simple as an in-out calorie balance sheet: eat less than you burn and lose weight. This fails to take into account a thing called appetite, and the way some people lay down fat while eating not very much, while others burn it easily while eating quite a lot.
As Gary Taubes, the heretical science writer who has made a career out of this issue, put it in the British Medical Journal a few years ago, “efforts to cure the problem by inducing under-eating or a negative energy balance, either by counselling patients to eat less or exercise more, are remarkably ineffective”. Even The Handbook of Obesity, the doctors’ textbook, admits that the result of such dietary therapy is “poor and not long-lasting”.
We all know friends who have shed the pounds through superhuman efforts of self-denial, and then gradually put them back on again afterwards. The public health lobby hardly helps by censoriously attacking all “fat and sugar”, or all “processed food”, and often “red meat” too. Which leaves a diet as depressing as it is unrealistic: steamed cod and boiled kale. The public health lobby must make up its mind whether it thinks carbs are bad or fat is bad: attacking both is silly.
Having spent decades urging people to adopt low-fat diets and watched obesity explode, the nannies cannot bring themselves to admit that this was terrible advice which almost certainly made the problem worse. Why? Because fat is satiating in a way that carbohydrates are not, and the body generally synthesises fat from dietary carbs, not from dietary fat. In the Stone Age, eating fat probably signalled a time of plenty, when laying down stores around the midriff was not urgent.
Logically, the heritability of obesity is almost certainly rising. In a world of food shortages, the only way to get fat was to be rich, so obesity was mainly an environmentally determined trait. In a world where so many can afford lots of cheap food, the ones to get fat will often be the ones who inherit some tendency to eat more or lay down more of their food as fat. Given ad-lib food, a greyhound will stay slim while a labrador balloons — it’s in their genes. Not all the variation in obesity between individuals will be explained by genetics but, statistically speaking, there will be greyhound tendencies and labrador tendencies.
Frankly, we just do not know why some people lay down fat more easily than others. Is it because they burn fewer calories even when not exercising? Is their digestion more efficient? Is their appetite greater, so they do eat more? Do they seek out carbs? Is the difference genetic, with some people having variants of genes that encourage fat deposition? Is it because fat people’s gut bacteria are different — a real possibility supported by increasingly persuasive experiments and transplants? All of these theories have something going for them. But not enough to justify the moralising tone and adamantine certainty that so often accompanies medical professionals’ pronouncements on the topic of obesity. We do not know enough.
What should a government do when there’s great uncertainty about both causes and the right course of action? Experiment, of course. Come up with five policies, ask for volunteers in five different parts of the country, and carefully measure the waistlines of people affected.
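What might that comparison look like in practice? A minimal, purely illustrative sketch (the regions and data below are invented placeholders generated at random; nothing here comes from the column):

```python
# Illustrative sketch: compare five regional policy pilots against a control
# region by mean change in waistline. All data below is randomly generated
# placeholder data, not real measurements.
import random
from statistics import mean

def policy_effects(measurements):
    """Mean waistline change in each pilot region, relative to the control region."""
    baseline = mean(measurements["control"])
    return {region: mean(vals) - baseline
            for region, vals in measurements.items() if region != "control"}

# Synthetic demo data: five hypothetical pilot regions plus a control.
random.seed(0)
fake = {region: [random.gauss(mu, 0.5) for _ in range(200)]
        for region, mu in [("control", 1.0), ("pilot A", 0.6), ("pilot B", 0.9),
                           ("pilot C", 0.3), ("pilot D", 1.1), ("pilot E", 0.7)]}

for region, effect in policy_effects(fake).items():
    print(f"{region}: {effect:+.2f} cm waistline change vs control")
```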
The conceptual penis as a social construct
My Times column on an academic hoax:
The latest university prank is embarrassing to academia and hilarious for the rest of us. Philosophy professor Peter Boghossian and mathematician Dr James Lindsay made up a learned paper on the “conceptual penis” as a “gender-performative, highly fluid social construct” that is “the conceptual driver behind much of climate change”, stuffed it full of random jargon and fake references and then got it through peer review into an academic journal.
True, it was a low-grade, pay-to-publish journal of the kind that has proliferated recently as a money-making venture, but the authors were recommended to try that journal by a serious journal, and the peer review was genuine. As the authors have written of their own work: “We don’t understand it either. Nobody does. This problem should have rendered it unpublishable in all peer-reviewed, academic journals.”
This happened last year, too, when Professor Mark Carey published an even more absurd paper arguing that “a critical but overlooked aspect of the human dimensions of glaciers and global change research is the relationship between gender and glaciers” and introducing “feminist glaciology”. In that case, however, the professor continues to insist, against all evidence, that he was serious. Science magazine gave him a lengthy, softball interview to justify his work after it was laughed at on the internet. I still think he’s a joker in deep cover.
Neither paper would have been published if it had not fitted the prejudices of much of academia: leftist, postmodern, relativist, feminist and moralising. “The academy is overrun by left-wing zealots preaching dangerous nonsense,” says Boghossian. “They’ve taught students to turn off their rational minds and become moral crusaders.”
As a system of ensuring quality in research, peer review is in deep trouble. It allows established academics to defend their pet ideas and reward their chums. Its one-sided anonymity, in which the referee retains his anonymity but the author does not, could hardly be better designed to ensure cronyism.
Worse, as a recent report by Donna Laframboise, a Canadian investigative journalist, concluded: “A journal’s decision to publish a paper provides no assurance that its conclusions are sound . . . Fraudulent research makes it past gatekeepers at even the most prestigious journals. While science is supposed to be self-correcting, the process by which this occurs is haphazard and byzantine.”
Peer review’s flaws now allow people with an axe to grind to dismiss even the most rigorous and careful of science along with the nonsense. It’s time for science, and the softer social sciences in particular, to get their house in order.
May 21, 2017
The Red Queen race against computer viruses
My Times column on malware, ransomware and the battle against viruses:
The WannaCry ransomware cyberattack of last week, which briefly crippled much of the National Health Service, may be the biggest, but it will not be the last outbreak of cybercrime. Remember your Through the Looking-Glass. The Red Queen lives in a world where, she says: “It takes all the running you can do, to keep in the same place. If you want to get somewhere else, you must run at least twice as fast as that.” We, the good guys, are locked in a Red Queen race with hackers, just as we, the human race, are locked in a race with real viruses, and with antibiotic resistance.
It is a race in which permanent victory is impossible, but so is permanent defeat. Perpetual struggle is inevitable. I say this with confidence because for once the biological analogies are apt. The right way to think about cybersecurity is epidemiological. Indeed, the similarity between a computer virus and a real virus is more than a metaphor: both are pieces of linear digital information (one made of binary electronic digits, the other of quaternary DNA bases) capable of getting themselves replicated and spread. One leading theory is that sexual intercourse evolved, a billion years ago, as a security patch against parasites.
The fact that malware is manmade while maladies are not makes little difference. So long as there are enough actors out there experimenting, both will evolve, through mutation, recombination and selection — through trial and error. That this latest cyberweapon may have been enhanced with something called EternalBlue stolen from America’s National Security Agency is again not altogether surprising to a biologist. Parasites have a habit of stealing good genetic ideas from their hosts.
Computer viruses are as old as computing. The first widespread one, Elk Cloner, spread through Apple computers in 1981 via floppy disks. Ransomware first appeared in 1989, with a trojan horse called AIDS. By the early 1990s you could buy anti-virus software. Especially bad outbreaks occurred in 2003 (the “slammer worm”) and 2009 (the “conficker worm”), just like the bad plague years of AD541, 1346 and 1665. But apocalyptic warnings that computer worms and viruses would eventually win proved wide of the mark. I recall business seminars around the turn of the millennium at which the audience was effectively told that the problem of computer viruses was insoluble so the end of the web was near. This was around the time we were told that computers would fail, and social order would collapse, because software could not cope with the start of a new millennium. In practice, anti-virus protection has evolved just as fast.
Perhaps we were just lucky, then. Despite the supposed heroic but accidental action last week of MalwareTech, an anonymous 22-year-old, I don’t think it is luck. Here is why the good guys will always be able to defeat the bad guys — temporarily: the former can operate in the daylight, the latter must stay in the dark. This was brought home to me about ten years ago when my laptop was infected by a virus and I quickly found a website on which people were freely sharing the latest features of this virus and how to deal with it. Such open sharing is not available to hackers, however large the dark web gets.
Thus Microsoft had already released a patch in March for the vulnerability that WannaCry exploits, having been alerted perhaps by the NSA itself. That some organisations, such as the NHS, have plainly done a terrible job of keeping their computer security updated is reprehensible but no great surprise. It is a bit like a community that relaxes its vaccination rate.
Minnesota is currently experiencing a measles outbreak: about 50 people have gone down with the virus, mostly from the Somali immigrant community. This is entirely because vaccination rates in that community have halved thanks to the recent influence of the anti-vaxxer movement and its autism theory. Drop your vaccination guard and the Red Queen will strike. Don’t update your cybersecurity and ditto.
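The point about patching rates can be made concrete with a toy epidemic model. This is a minimal sketch of my own, not anything from the column: a crude SIR-style simulation in which the "vaccinated" fraction stands in for machines that have applied a security patch; all parameter values are arbitrary assumptions.

```python
# Toy SIR-style outbreak model: "vaccinated" hosts = patched machines.
# Illustrative only; parameter values are arbitrary assumptions.
import random

def outbreak_size(n_hosts=10_000, patched_fraction=0.5,
                  contacts=8, p_infect=0.25, seed=1):
    random.seed(seed)
    # A host is susceptible unless patched; seed one infection among the unpatched.
    susceptible = [random.random() > patched_fraction for _ in range(n_hosts)]
    infected, recovered = set(), set()
    infected.add(next(i for i in range(n_hosts) if susceptible[i]))
    while infected:
        newly = set()
        for _ in infected:
            for _ in range(contacts):             # random contacts per step
                j = random.randrange(n_hosts)
                if susceptible[j] and j not in infected and j not in recovered:
                    if random.random() < p_infect:
                        newly.add(j)
        recovered |= infected
        infected = newly
    return len(recovered)

# Outbreaks fizzle at high patch coverage and explode at low coverage.
for cover in (0.9, 0.7, 0.5, 0.3):
    print(f"patched {cover:.0%}: ~{outbreak_size(patched_fraction=cover):,} hosts infected")
```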
Here’s another parallel. Antibiotic resistance is also a Red Queen phenomenon, in which new antibiotics must continually be introduced to counter antibiotic resistance. In failing to invent new antimicrobials, it is as if we have been failing to update our pharmacological security software.
Notice, too, that hospitals are the epicentres of antimicrobial resistance, plagued by MRSA and C. difficile. This is largely because they are full of ill and vulnerable people, some with fresh holes cut into them — tempting buffets for bacteria. Hence hospitals use lots of antibiotics, putting selection pressure on bacteria to evolve resistance. It is a curious coincidence that hospital computer systems likewise have to be open to sharing data with many partners, making them vulnerable to digital invaders, as we now know.
There is one computer system that is so clogged with old malware that there is hardly any space left for the real programs. Of its code, almost half consists of so-called transposable elements. Some are full viruses, some are attenuated and abbreviated relics of viruses, and some are small vestiges of viruses that piggyback on viruses — parasites of parasites. The entire thing is infested with digital parasites. I am describing the human genome, the computer system inside each of your many trillion cells, the one Mother Nature programmed.
Fortunately the vast majority of these transposable elements and endogenous retroviruses are in a quiescent state, shut down and harmless. Occasionally, though, they seem to wake up and proliferate like real viruses. One called AluJ was last active 65 million years ago, another called AluS is 30 million years old, while a third called AluY sometimes springs to life today, messing up genes when it does so. Take some comfort from the fact that Shakespeare wrote Hamlet and Einstein discovered relativity using mental computers inside whose cells were millions of such digital viruses. Fun fact: birds, which have a greater need to control their weight so they can fly, do a better job of cleaning spam out of their genes than mammals do.
The lesson of this week is eternal vigilance: update your software regularly, keep back-ups, filter mail and be suspicious of attachments. Don’t expect the problem to go away, or to find a silver bullet that kills the problem for ever, but don’t expect malware to defeat us either.
May 15, 2017
Wind is an irrelevance to the energy and climate debate
My Spectator article on the futile numbers behind wind power:
The Global Wind Energy Council recently released its latest report, excitedly boasting that ‘the proliferation of wind energy into the global power market continues at a furious pace, after it was revealed that more than 54 gigawatts of clean renewable wind power was installed across the global market last year’.
You may have got the impression from announcements like that, and from the obligatory pictures of wind turbines in any BBC story or airport advert about energy, that wind power is making a big contribution to world energy today. You would be wrong. Its contribution is still, after decades — nay centuries — of development, trivial to the point of irrelevance.
Even put together, wind and photovoltaic solar are supplying less than 1 per cent of global energy demand. From the International Energy Agency’s 2016 Key Renewables Trends, we can see that wind provided 0.46 per cent of global energy consumption in 2014, and solar and tide combined provided 0.35 per cent. Remember this is total energy, not just electricity, which is less than a fifth of all final energy, the rest being the solid, gaseous, and liquid fuels that do the heavy lifting for heat, transport and industry.
[One critic suggested I should have used the BP numbers instead, which show wind achieving 1.2% in 2014 rather than 0.46%. I chose not to do so mainly because that number is arrived at by falsely exaggerating the actual output of wind farms threefold in order to take into account that wind farms do not waste two-thirds of their energy as heat; also the source is an oil company, which would have given green blobbers an excuse to dismiss it, whereas the IEA is unimpeachable. But it's still a very small number, so it makes little difference.]
Such numbers are not hard to find, but they don’t figure prominently in reports on energy derived from the unreliables lobby (solar and wind). Their trick is to hide behind the statement that close to 14 per cent of the world’s energy is renewable, with the implication that this is wind and solar. In fact the vast majority — three quarters — is biomass (mainly wood), and a very large part of that is ‘traditional biomass’; sticks and logs and dung burned by the poor in their homes to cook with. Those people need that energy, but they pay a big price in health problems caused by smoke inhalation.
Even in rich countries playing with subsidised wind and solar, a huge slug of their renewable energy comes from wood and hydro, the reliable renewables. Meanwhile, world energy demand has been growing at about 2 per cent a year for nearly 40 years. Between 2013 and 2014, again using International Energy Agency data, it grew by just under 2,000 terawatt-hours.
If wind turbines were to supply all of that growth but no more, how many would need to be built each year? The answer is nearly 350,000, since a two-megawatt turbine can produce about 0.005 terawatt-hours per annum. That’s one-and-a-half times as many as have been built in the world since governments started pouring consumer funds into this so-called industry in the early 2000s.
At a density of, very roughly, 50 acres per megawatt, typical for wind farms, that many turbines would require a land area [half the size of] the British Isles, including Ireland. Every year. If we kept this up for 50 years, we would have covered every square mile of a land area [half] the size of Russia with wind farms. Remember, this would be just to fulfil the new demand for energy, not to displace the vast existing supply of energy from fossil fuels, which currently supply 80 per cent of global energy needs. [para corrected from original.]
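As a rough check on that arithmetic, here is a minimal sketch using the round figures quoted above; the British Isles land area is my own approximation, and with these round numbers the turbine count comes out at about 400,000, the article's "nearly 350,000" reflecting slightly different rounding of the per-turbine output.

```python
# Back-of-the-envelope check of the turbine and land arithmetic above,
# using the round figures quoted in the article.
annual_demand_growth_twh = 2_000   # new global energy demand per year, TWh
per_turbine_twh = 0.005            # annual output of one 2 MW turbine, TWh
turbine_mw = 2.0
acres_per_mw = 50                  # rough land footprint of a wind farm
british_isles_km2 = 315_000        # Great Britain + Ireland, approximate
acres_per_km2 = 247.1

turbines_per_year = annual_demand_growth_twh / per_turbine_twh
land_acres = turbines_per_year * turbine_mw * acres_per_mw
share_of_isles = land_acres / (british_isles_km2 * acres_per_km2)

print(f"Turbines needed per year: {turbines_per_year:,.0f}")        # ~400,000
print(f"Land required per year: {land_acres / 1e6:.0f}m acres "
      f"(~{share_of_isles:.0%} of the British Isles)")              # ~half
```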
Do not take refuge in the idea that wind turbines could become more efficient. There is a limit to how much energy you can extract from a moving fluid, the Betz limit, and wind turbines are already close to it. Their effectiveness (the load factor, to use the engineering term) is determined by the wind that is available, and that varies at its own sweet will from second to second, day to day, year to year.
As machines, wind turbines are pretty good already; the problem is the wind resource itself, and we cannot change that. It’s a fluctuating stream of low–density energy. Mankind stopped using it for mission-critical transport and mechanical power long ago, for sound reasons. It’s just not very good.
As for resource consumption and environmental impacts, the direct effects of wind turbines — killing birds and bats, sinking concrete foundations deep into wild lands — are bad enough. But out of sight and out of mind is the dirty pollution generated in Inner Mongolia by the mining of rare-earth metals for the magnets in the turbines. This generates toxic and radioactive waste on an epic scale, which is why the phrase ‘clean energy’ is such a sick joke and ministers should be ashamed every time it passes their lips.
It gets worse. Wind turbines, apart from the fibreglass blades, are made mostly of steel, with concrete bases. They need about 200 times as much material per unit of capacity as a modern combined cycle gas turbine. Steel is made with coal, not just to provide the heat for smelting ore, but to supply the carbon in the alloy. Cement is also often made using coal. The machinery of ‘clean’ renewables is the output of the fossil fuel economy, and largely the coal economy.
A two-megawatt wind turbine weighs about 250 tonnes, including the tower, nacelle, rotor and blades. Globally, it takes about half a tonne of coal to make a tonne of steel. Add another 25 tonnes of coal for making the cement and you’re talking 150 tonnes of coal per turbine. Now if we are to build 350,000 wind turbines a year (or a smaller number of bigger ones), just to keep up with increasing energy demand, that will require 50 million tonnes of coal a year. That’s about half the EU’s hard coal–mining output.
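The coal figure can be checked the same way; a minimal sketch using the article's own numbers:

```python
# Rough check of the coal arithmetic above, using the article's figures.
turbine_mass_t = 250       # mass of a 2 MW turbine, mostly steel (tower, nacelle, rotor, blades)
coal_per_t_steel = 0.5     # tonnes of coal per tonne of steel, global average
coal_for_cement_t = 25     # additional coal for the concrete base
turbines_per_year = 350_000

coal_per_turbine = turbine_mass_t * coal_per_t_steel + coal_for_cement_t  # 150 t
annual_coal_mt = turbines_per_year * coal_per_turbine / 1e6
print(f"Coal per turbine: {coal_per_turbine:.0f} t")
print(f"Coal per year for new turbines: {annual_coal_mt:.1f} Mt")         # ~50 Mt
```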
Forgive me if you have heard this before, but I have a commercial interest in coal. Now it appears that the black stuff also gives me a commercial interest in ‘clean’, green wind power.
The point of running through these numbers is to demonstrate that it is utterly futile, on a priori grounds, even to think that wind power can make any significant contribution to world energy supply, let alone to emissions reductions, without ruining the planet. As the late David MacKay pointed out years back, the arithmetic is against such unreliable renewables.
MacKay, former chief scientific adviser to the Department of Energy and Climate Change, said in the final interview before his tragic death last year that the idea that renewable energy could power the UK is an "appalling delusion", for the simple reason that there is not enough land.
The truth is, if you want to power civilisation with fewer greenhouse gas emissions, then you should focus on shifting power generation, heat and transport to natural gas, the economically recoverable reserves of which — thanks to horizontal drilling and hydraulic fracturing — are much more abundant than we dreamed they ever could be. It is also the lowest-emitting of the fossil fuels, so the emissions intensity of our wealth creation can actually fall while our wealth continues to increase. Good.
And let’s put some of that burgeoning wealth in nuclear, fission and fusion, so that it can take over from gas in the second half of this century. That is an engineerable, clean future. Everything else is a political displacement activity, one that is actually counterproductive as a climate policy and, worst of all, shamefully robs the poor to make the rich even richer.
May 14, 2017
The argument for controlling badgers
My Times article on badger culling:
If Theresa May is happy to see a return of foxhunting, she must be consistent and face down the misguided animal welfare lobby with a pledge to cull more badgers. There are three reasons that a continuing, wider and bigger badger cull is the right thing to do for humane, as well as financial and environmental, reasons.
First, badger culls work. They worked in Ireland, where bovine tuberculosis has been largely eliminated. Recent badger culls in Britain, though apparently designed by timid bureaucrats to fail and thereby frighten off politicians, have almost certainly been a success, resulting in a big drop in tuberculosis among cattle. True, the government has been slow to publish this officially — the data are working their way through the scientific journals — but the anecdotal evidence is now strong.
Farmers say that dozens of farms in the cull zones that had been closed down by TB for decades are now going clear, while a few farms that refused to take part in the cull are going down with TB again. Remember this disease has caused suffering among not just cattle but badgers, other wildlife and — emotionally and commercially — farmers themselves.
Second, a wider badger cull would save the hedgehog. Any rise in badgers is by far the most reliable predictor of the disappearance of hedgehogs. Badgers are now invading suburbs, the last refuge of the hedgehog. Controlling badger numbers is the best way to bring hedgehogs back to the countryside. Likewise, bumble bees would benefit — badgers are their biggest enemies.
Third, human beings should not shirk their duty as the apex predator. Having long got rid of the wolf and the lynx, people have unleashed middle-ranking “meso-predators” such as badgers and foxes to reach unnatural densities with devastating effects on other species. To restore an ecological balance, they need to control the numbers of these animals.
We recognise this need with foxes, stoats, weasels, gulls, cormorants, magpies and crows, all of which are culled by conservationists under licence for the benefit of lapwings, curlews, songbirds, hares, fish and the environment in general. There is no conceivable reason we should not do the same with badgers. They were granted protection for conservation reasons, not because they are any more likely to suffer when shot than these other species.
The celebrity-led badger lobby dogmatically insists there be no control, and wages war on the culling policy with propaganda, threats and violence. It does not speak for rural Britain, the vast majority of whose inhabitants know that nature conservation requires balanced management of predators.
May 9, 2017
The Paris climate treaty is weak, so why do climate activists defend it?
My Times column on the Paris climate deal:
President Trump will decide shortly whether to pull the US out of the Paris agreement on climate change. By all accounts, his instincts and his campaign promises encourage him to do so while his daughter Ivanka and his secretary of state Rex Tillerson want him not to. He has already started rolling back the “clean power plan”, which was Barack Obama’s way of meeting America’s commitment under the Paris agreement.
If he does pull out, or send the agreement to the Senate for ratification on the grounds that it is a “treaty” — something Obama took great pains to try to deny so that he would not have to send it to the Senate — there will be a fresh paroxysm of rage among his critics. Climate scepticism ranks high among the reasons the left hates Trump. By contrast, it is one of the few things on which I half agree with him.
I am not quite sure why his critics mind so much. Indeed, if I were one of those who thought climate change the biggest threat to humankind bar none, then I would be far more critical of the Paris agreement than I actually am. I would rail against the fact that it is a futile gesture, neither legally binding enough to be enforceable, nor of sufficient scale to make a difference to climate change. It’s those people who most worry about global warming who should be most critical of Paris.
To understand why, wind the clock back eight years to the Copenhagen climate conference in 2009, where world leaders were humiliated by their inability to reach agreement on a replacement for the Kyoto treaty of 1997. Kyoto’s fundamental flaw was beyond remedy: it put a heavy burden on the developed world, while rapidly growing countries, such as China, were unaffected. After Copenhagen the whole Kyoto process unwound. The US failed to ratify, Japan said it would never commit to a similar treaty in the future, Russia rejected the second round of commitments and Canada pulled out completely.
Determined not to repeat the Copenhagen fiasco, the climatocracy dominated by the United Nations, the European Union and the green NGOs set about building a new approach. In 2011, at Durban, they got world leaders to agree an EU plan to put in place by 2015 a legally binding treaty that would apply to the whole world and come into force by 2020. Connie Hedegaard, European commissioner for climate action, boasted that “the EU has achieved its key goal for the Durban climate conference” of agreeing to a “roadmap towards a new legal framework by 2015”.
But others were worried. Greenpeace observed that “the Durban Platform still includes wording that could be exploited by the US and its allies to push a voluntary rather than binding approach, and risks locking in the current inadequate level of carbon cuts for a decade”.
So it was clear that the 2015 Paris treaty was to be legally binding, not voluntary, and extreme, not modest, or it would be a failure. In November of that year, as Paris approached, this was reiterated repeatedly. “The Paris agreement must be an international legally binding agreement,” said the EU’s spokesman. The French foreign minister, Laurent Fabius, even rebuked John Kerry, the US secretary of state, for casting doubt on whether a legally binding treaty was possible. Mr Kerry was “confused”, he said.
However, Mr Kerry was right, and during the Paris meeting it became clear that no such agreement was possible. Instead of admitting another failure, the envirocrats decided to change tack: they abandoned any pretence of a legally binding agreement, called for voluntary offers of emission reduction, but covered this all up with a full-volume declaration of victory. When I pointed out the volte-face in a speech in the House of Lords, I was told by my own front bench that only North Korea agreed with me. Said the minister: “If it really was, as he perhaps implies, just a piece of paper and not worth the paper it is written on, why was it so hard an agreement to reach?” I do not follow this logic.
Yet not only was the Paris agreement not legally binding, it was also deeply impractical. The intended nationally determined contributions (INDCs) that each country offered turned out to be feeble. In a peer-reviewed paper, the economist Bjorn Lomborg calculated how much the pledges would reduce warming, using standard models and generous assumptions about how quickly the reductions would be achieved and how long they would be sustained.
He found that all the promises made by the US, China, the EU and the rest of the world, if implemented from the early 2000s to 2030, and then sustained through the rest of the century, would reduce the expected rise in global temperature by only 0.17°C in the year 2100. That is to say, instead of rising by 2, 3 or 4 degrees or so by the time our great grandchildren are adults, world average temperature would rise by 1.83, 2.83 or 3.83 degrees. Lomborg put it this way: “Current climate policy promises will do little to stabilise the climate and their impact will be undetectable for many decades”. A different study by scientists at MIT came to similar conclusions. The INDCs add up to the square root of zilch.
However, and this is the crucial point, Lomborg also points out this invisible achievement would come at a staggering cost, somewhere between $1 trillion and $2 trillion a year: “Paying $100 trillion for no good is not a good deal”. Who could disagree? Lomborg wants Trump to can the Paris agreement, which he rightly judges to be a feelgood gesture that distracts attention from aggressive research into low-emitting, cost-effective energy technologies, which is the only realistic way to reduce fossil fuel consumption.
Thus Paris embodies precisely what the green movement worried about after Copenhagen: that a weak and non-binding agreement would be worse than futile. Yet the disastrous Kyoto story is repeating itself; adherence to Paris has become a totem of global determination to tackle climate change while the agreement seems purpose-built to prevent the very economic sophistication on which any low-carbon future depends.
Britain, meanwhile, remains the only major economy legally bound by statute to reduce carbon dioxide emissions, under its 2008 Climate Change Act: "The Climate Change Act requires the government to set legally-binding ‘carbon budgets’."
May 8, 2017
Britain should adopt the Innovation Principle
Here's my recent Times column:
An open letter to George Freeman MP, chairman of the government’s policy board.
Dear George, as a former biotech venture capitalist, you are a passionate champion of innovation. It has pulled an average of 137,000 people out of extreme poverty each and every day of the past 25 years. It’s the only thing that can pay off our £1.9 trillion national debt while keeping our grandchildren prosperous. You are on record as saying: “We have a once-in-a-generation chance to seize the opportunities to make the UK the innovation capital of the world, defying the doubters and being clear that we will go on leading the world in science and technology.”
I don’t know what it is like down your end of the parliamentary corridor, but at ours we spend very little time talking about how to encourage and reward innovation, despite its clear importance. We are too busy instead with regulation and spending. Although it hardly features in debates or coverage of British politics, there is an entrepreneurial revolution going on in Britain, with about 600,000 business start-ups a year, more than in any other European country — driven by some excellent pro-enterprise policies started under David Cameron.
However, most of this is in the digital economy, where innovation is comparatively “permissionless” — for now. Elsewhere, there is a real problem with bad, inconsistent or burdensome regulation stifling innovation. So here is a suggestion for the Conservative Party manifesto that I think could help unleash beneficial innovation in years to come, or at least prevent it grinding to a halt. It is called the innovation principle.
It was proposed by the European Risk Forum and is actively supported by BusinessEurope and the European Round Table of Industrialists. In essence, it says: examine every policy, plan or political strategy for the impact it could have on innovation, and if you find evidence that the policy is going to impede it, then drop, change or rethink the thing.
Twenty-two chief executives from some of the world’s more innovative companies signed a letter to Jean-Claude Juncker in 2014 asking him to adopt the innovation principle, and the Dutch prime minister, Mark Rutte, endorsed it last year during his country’s presidency of the EU, saying: “I particularly support . . . the innovation principle. This principle requires assessing — as standard — the impact of proposed new legislation on innovation. In my view, we should position the innovation principle more prominently in the years ahead, to make the EU economy stronger and more agile.” So there is a chance that the European Union will get the point, and remember Britain tends to over-apply regulations anyway.
The innovation principle would complement rather than replace the precautionary principle, which is incorporated into the Lisbon treaty. At its best the precautionary principle is a good thing, preventing future thalidomide tragedies. It says: think about the risks before you adopt something new. At its worst, it does huge harm, because it says: banish potential hazards without considering the benefits of an innovation, while ignoring the hazards of an existing technology, and therefore don’t do anything new.
For example, European certification of genetically modified crops is so impossibly slow, uncertain and politicised as to have frightened off all applications in recent years. The result is that Europe has missed out on the organic, insect-resistant “Bt” revolution in plant breeding and is far more reliant on pesticides instead. The newer technology of gene editing, which is being pioneered in British laboratories, has been given a green light by US regulators, while the EU has — get this — asked member states to avoid making a decision at all. What signal does that send?
One chemical manufacturer tells me he estimates that to develop a new herbicide that could safely tackle black-grass, an increasingly problematic weed, it would be necessary to comply with between 15,000 and 40,000 pages of regulations and scientific guidance documents. Few companies in the world have this capability.
Next month sees the implementation of new European rules on vaping, which restrict e-cigarette advertising, limit tank and refill sizes, and cap the strength of liquids, making it harder for this much safer new technology to continue to displace smoking at a rate of knots. Thus, in the name of the precautionary principle, the EU has been insisting on danger.
Last December, BusinessEurope produced a long catalogue of cases in which EU regulation had affected innovation. The list includes two cases where regulation stimulated innovation (waste policies and sustainable mobility), but far more where it hampered change by introducing legal uncertainty, inconsistency with other regulations, technology-prescriptive rules, burdensome packaging requirements, high compliance costs or excessive precaution. For example, the EU medical devices directive has greatly increased the cost and reduced the supply of new medical devices.
Don’t take comfort from our many digital start-ups. As a chemical industry executive put it to me last week, the digital industry does not realise what’s about to hit it. The General Data Protection Regulation (GDPR) comes into force next year, along with new ePrivacy regulation. The British Information Commissioner’s Office has issued draft guidance that gold-plates the GDPR, with an unduly strict and burdensome interpretation that tilts the balance against the data industry in a manner not required by the regulation. Its proposed rules on opt-outs and consents would make direct marketing as presently practised here impossible. Data analysis is a growing industry in which we have the potential to be a world leader.
An abstract principle, however pious, is not going to change anything. The only way to get momentum for reform is to create a vested interest, preferably with a budget. Precaution has a wealthy constituency in the vast green lobby, whereas innovation’s benefits are dispersed among consumers. So here’s a suggestion. Why not pledge to set up an innovation commission, guided by the innovation principle, whose job is to examine policies and report on their likely impact on innovation? New Zealand has tried something similar, a productivity commission, which “considers whether laws, policies, regulations and institutions best support the wellbeing of New Zealanders”.
Without this, we will be in the economic slow lane, which will hamper our ability to generate funds for social welfare, defence, foreign aid, education and everything else.
Yours optimistically, Matt.