Matt Ridley's Blog
October 19, 2016
Response to Bob Ward's letter
I have sent the following letter to the president of the Royal Society and the Chairman and director of the Global Warming Policy Foundation in response to a highly misleading letter to me that was copied to them.
To Sir Venki Ramakrishnan FRS, Lord Lawson and Dr Benny Peiser
19.10.2016
Dear Sir Venki, Lord Lawson and Dr Peiser,
You have been sent a letter by Bob Ward of the Grantham Institute at the London School of Economics complaining about what I said about Dr Ranga Myneni in my lecture at the Royal Society.
Dr Ranga Myneni has already responded to my lecture and does not make the complaints of misrepresentation that Mr Ward makes.
http://sites.bu.edu/cliveg/mynenis-response-to-lord-ridleys-gwpf-talk/.
Dr Myneni does, however, in his response, make two entirely false accusations against me, saying that I go “on to ignore 30+ years of IPCC assessments!”, when in fact I discussed several such assessments and quoted verbatim from two, one in 1990 and one in 2014; and that I “argue that thousands and thousands of scientists are somehow in cahoots to push the global warming hoax on innocent people of the world …”, when I made no such argument and specifically detailed how my position was different from those who think global warming is a hoax.
Turning to Mr Ward’s own complaints, he correctly notes that I explained that I became aware of Dr Myneni’s work from a 2012 talk, but that I quote from one delivered in 2013. I may not have made this fact very clear in my spoken remarks, but everything I said was properly sourced (including on my slides) and correctly quoted, and in any case the use of the later talk is entirely appropriate since, as Mr Ward also notes, Dr Myneni’s estimates of the amount and attribution of greening changed between late 2012 and mid 2013. This is something I was well aware of, but the change in no way contradicts anything I said. Indeed, it reinforces it. The 2012 version of the talk was based on data suggesting a greening of 20.5% of the land, which Mr Ward quotes; the equivalent slides in the 2013 version, one of which I reproduced, support a greening of 30.87% of the land area, which was the estimate to which I referred.
Mr Ward’s letter specifically confirms the accuracy of my claims that, at various times, Dr Myneni said that 31% of the land area had greened, that the planet had greened by 14%, and that 70% of the greening could be attributed to carbon dioxide fertilization. Dr Myneni has not claimed he was misquoted on these points.
As I stated in my lecture, Dr Myneni stated in 2015 that “[Ridley] falsely claims that CO2 fertilisation is responsible for the greening of the earth”. Yet a few months later he himself published evidence that “CO2 fertilisation explains 70% of the greening trend”.
I used the word “might” in my suggestion that the publication of these results might have been delayed lest they give sceptics a field day, so there was no accusation, as Mr Ward claims. Dr Myneni says the delay was mainly due to the senior author on the paper returning to China. I remain doubtful that these data would have taken so long to publish if they had shown bad news.
As for Mr Ward’s complaint that I misrepresented Dr Richard Betts, he destroys his own case by quoting another part of the IPCC assessment report where greening is very briefly mentioned, and which I confess I missed because it was so brief and dismissive:
“Warming (and possibly the CO2 fertilisation effect) has also been correlated with global trends in satellite greenness observations, which resulted in an estimated 6% increase of global NPP, or the accumulation of 3.4 PgC on land over the period 1982–1999 (Nemani et al., 2003).”
Since published data (Donohue et al., 2013) already pointed to a larger greening over a longer period, and my point was that the mentions of global greening were brief, doubtful and downplayed the effect, this extra quote beautifully illustrates my point. As I put it,
“If that’s a clear and prominent statement that carbon dioxide emissions have increased green vegetation on the planet by 14% and are significantly reducing the water requirements of agriculture, then I’m the Queen of Sheba.”
I will happily add this extra quotation to the written version of my lecture online since it illustrates my point even better.
I stand by my lecture. Mr Ward is confirming the accuracy of my work while continuing to try to smear my name.
In my lecture I stated that “These days there is a legion of well paid climate spin doctors. Their job is to keep the debate binary: either you believe climate change is real and dangerous or you’re a denier who thinks it’s a hoax. But there’s a third possibility they refuse to acknowledge: that it’s real but not dangerous.”
With best wishes
Yours sincerely
Matt Ridley
October 18, 2016
The misapplication of Malthus
I gave a lecture recently at Haileybury College (the successor to the East India College where the economist Robert Malthus taught), on the topic of "The Misapplication of Malthus". It was based on a chapter of my book The Evolution of Everything:
Parson Malthus casts a long shadow over the past 200 years. He was a good man without a cruel bone in his body. But great cruelty has been done in his name and is still being done in his name. That’s the paradox I wish to explore this evening.
Malthus’s finest legacy is to have sparked Charles Darwin into action. In September 1838, shortly after returning from the voyage of the Beagle, Darwin read, or re-read, Malthus’s essay on population and was struck by the notion of a struggle for existence in which some thrived and others did not, an idea which helped trigger the insight of natural selection.
Why did he read it then? I have a theory. Through his brother Erasmus, Darwin was very friendly at the time with Harriet Martineau, a firebrand radical who campaigned for the abolition of slavery and also for the “marvellous” free-market ideas of Adam Smith. So friendly that his father feared Erasmus or Charles might marry this rather terrifying lady. Martineau was a close confidant of Malthus. I suspect it was at her urging that Darwin read Malthus in 1838.
A wealthy English mathematician, teacher and clergyman with a fine literary style, Malthus is known today for just one short document, the Essay on Population, first published in 1798 and frequently revised in the following years. He is a bit of a hero to many in the environmental movement to this day for his emphatic insistence that there are limits to growth – that population growth must lead to misery, starvation and disease when the land, the food, the fuel or the water runs out.
According to his epitaph in Bath Abbey, he was known for the “spotless integrity of his principles, the equity and candour of his nature, his sweetness of temper, urbanity of manners and tenderness of heart, his benevolence and his piety”. He was clearly not a nasty man, and his chief remedy for overpopulation – late marriage – was not a cruel one. But he did warn that, if people could not be persuaded to delay marriage, then cruel policies would be needed to halt population growth: we would have to encourage famine and “reprobate specific remedies for ravaging diseases”.
Unfortunately, the lesson most people have taken from Malthus is that you have to be cruel to be kind. This runs right through the eugenic and population-control movements, and is alive and well today. When I write or speak about falling child mortality in Africa today, I can be sure I will get a response along exactly these Malthusian lines: surely it’s a bad thing to stop poor people dying? What’s the good of bringing economic growth to Africa? They will only have more babies.
Let’s call it Malthusian misanthropy. And it is 180 degrees wrong. The way to get population growth to slow, it turns out, is to keep babies alive, to bring health, prosperity and education to all.
The way to reduce the human impact on the planet is to get more technological, not less – to use metal and plastic and gas instead of wood, water and fodder.
There were plenty who thought Malthus’s recommendations cruel in his lifetime and afterwards. Pierre-Joseph Proudhon called Malthusianism “the theory of political murder; of murder from motives of philanthropy and for love of God.”
Britain’s new Poor Law of 1834, which attempted to ensure that the very poor were not helped except in workhouses, and that conditions in workhouses were not better than the worst in the outside world, was based explicitly on Malthusian ideas – that too much charity only encouraged breeding, especially illegitimacy or “bastardy”.
The Irish potato famine of the 1840s was made infinitely worse by Malthusian prejudice shared by the British politicians in positions of power. The prime minister, Lord John Russell, was motivated by “a Malthusian fear about the long-term effect of relief”, according to a biographer.
The Assistant Secretary to the Treasury, Charles Trevelyan, had been a pupil of Malthus here at Haileybury: famine, he thought, was an “effective mechanism for reducing surplus population” and a “direct stroke of an all-wise and all-merciful Providence” sent to teach the “selfish, perverse and turbulent” Irish a lesson. Trevelyan added: “Supreme Wisdom has educed permanent good out of transient evil.”
As recounted by Robert Zubrin in his book Merchants of Despair, in 1877 a dreamy and Bohemian poet with an opium habit named Robert Lytton was serving as viceroy of India, sent there by his friend the prime minister, Benjamin Disraeli.
Lytton (who was my great-great-grandfather) may sound like a harmless if high-born hippy, but unfortunately he or his advisers were Malthusians. A drought afflicted some parts of the country. There was still plenty of food in India as a whole – food exports doubled and doubled again in two years – but taxes and the devaluation of the rupee left the hungry unable to afford relief. Lytton quoted almost directly from Malthus in his response: “the Indian population has a tendency to increase more rapidly than the food it raises from the soil”. His policy was to herd the hungry into camps where they were fed on – literally – starvation rations, so that 94% died each month. Lytton specifically halted several private attempts to bring relief to the starving. Up to ten million people died.
Even Darwin, that most gentle and compassionate of men, was at least briefly tempted by the idea that his beloved natural selection should be a prescription rather than a description. In an explicitly Malthusian passage in the Descent of Man he notes that the “imbecile, the maimed and the sick” are saved by asylums and doctors; and that the weak are kept alive by vaccination. “Thus the weak members of civilized species propagate their kind”, something that any cattle breeder knows is “injurious to the race”. He went on to lament the fact that “the very poor and reckless, who are often degraded by vice, almost invariably marry early, whilst the careful and frugal, who are generally otherwise virtuous, marry late in life”.
It was a hint that was enthusiastically embraced by several of Darwin’s followers, notably his cousin Francis Galton. Galton wanted people to choose their marriage partners more carefully, so that the fit would breed and the unfit would not. “What nature does blindly, slowly, and ruthlessly,” he argued, “man may do providently, quickly and kindly.”
Galton’s followers were soon outdoing each other in their prescriptive rush to nationalize marriage, license reproduction and sterilize the unfit. Many of the most enthusiastic eugenicists, such as Sidney and Beatrice Webb, George Bernard Shaw, Havelock Ellis and H G Wells, were socialists, who thought the power of the state would be necessary to implement this programme of selective human breeding. As Jonah Goldberg wrote in his book Liberal Fascism: “Almost all the leading progressive intellectuals interpreted Darwinian theory as a writ to ‘interfere’ with human natural selection. Even progressives with no ostensible ties to eugenics worked closely with champions of the cause. There was simply no significant stigma against racist eugenics in progressive circles.” To be against eugenics was to be uncaring about the future of the human race.
In Germany, Ernst Haeckel, Darwin’s translator, took the Malthusian struggle in a quasi-religious direction, trying to fuse Darwinian and Christian insights in a theory that he called Monism. In a lecture at Altenburg in 1892, Haeckel used phrases from both Malthus and Thomas Hobbes: “Here it was Darwin, especially, who thirty-three years ago opened our eyes by his doctrine of the struggle for existence, and his theory of selection founded upon it. We now know that the whole of organic nature on our planet exists only by a relentless war of all against all.” [my emphasis]. There’s that exact phrase, struggle for existence, first used by Malthus in Chapter 3 of his population essay, and then by Darwin to describe the lesson he took from Malthus.
In 1905 four of Haeckel’s followers founded the German society for racial hygiene, a step that would lead pretty well directly to the Nuremberg laws, the Wannsee conference and the gas chambers. It is therefore not at all hard to trace a clear path that leads from Malthus’s followers’ insistence that we intervene in selective survival to the ash of Birkenau. This is not to blame the innocent mathematician-clergyman for the sins of the Nazis. There is nothing morally wrong in describing a struggle for existence as a feature of human population. What is wrong is prescribing it as deliberate policy. The sin that is committed is one of active interference, of ends justifying means.
It mattered little that scientific support for this policy was weak in the extreme. In fact the discoveries of Gregor Mendel, which became known to the world in 1900, ought to have killed eugenics stone dead. Particulate inheritance and recessive genes made the idea of preventing the deterioration of the human race by selective breeding greatly more difficult and impractical. How were those in charge of breeding the human race supposed to spot the heterozygotes who carried but did not express some essence of imbecility or unfitness? It would take centuries. Yet the genetic facts made no difference to the debate. Driven by a planning fantasy, the political classes, left and right, agitated to nationalize reproduction to prevent the spread of unfit blood lines.
The first International Congress of Eugenics assembled in London in 1912 under the presidency of Leonard Darwin, son of Charles. It was attended by three ambassadors, as well as the Lord Chief Justice and the First Lord of the Admiralty – one Winston Churchill. In his presidential address Leonard Darwin made no bones about the switch from description to prescription: “As an agency making for progress, conscious selection must replace the blind forces of natural selection”. Fortunately, Britain, the founder of the eugenic movement, never enacted a specifically eugenic law, thanks largely to a bloody-minded member of Parliament, Josiah Wedgwood, a great-great-nephew of Charles Darwin, who spotted the danger and filibustered a eugenic bill in the House of Commons.
In the United States it was a different story. The Eugenics Record Office, established at Cold Spring Harbor, New York, in 1910 by the energetic eugenicist Charles Davenport, with funding from the widow of the railroad magnate E. H. Harriman, soon began to exert a powerful influence on policy. The second International Congress of Eugenics assembled in New York in 1921, under the honorary presidency of Alexander Graham Bell, presided over by the grandee Henry Fairfield Osborn and with the invitations sent out by the State Department. This was no fringe event. Leonard Darwin was too unwell to attend but sent a message expressing the “firm conviction…that if wide-spread eugenic reforms are not adopted during the next hundred years or so, our Western Civilization is inevitably destined to such a slow and gradual decay as that which has been experienced in the past by every great ancient civilization.”
Davenport eventually persuaded 30 states to pass laws allowing for the compulsory sterilization of the feeble-minded, insane, criminalistic, epileptic, inebriate, diseased, blind, deaf, deformed and dependent. By the time such laws were struck down in the early 1970s, some 63,000 people had been forcibly sterilised and many more persuaded to accept voluntary sterility.
California was especially enthusiastic about eugenics. By 1933 it had forcibly sterilized more people than all other states combined.
So when the Third International Congress of Eugenics gathered at the American Museum of Natural History in New York in 1932 under the presidency of Charles Davenport, and Davenport asked “can we by eugenical studies point the way to produce the superman and the superstate?”, it was to California that the superman-worshipping German delegates looked for an answer.
One of them, Ernst Rudin of the German society of racial hygiene, was elected to head the International Federation of Eugenics Organisations. Within months, Rudin would be appointed Reichskommissar for eugenics by the incoming Nazi government. By 1934, Germany was sterilizing more than 5,000 people per month. The California conservationist Charles Goethe, who combined a pioneering passion for protecting wild landscapes with an equal passion for sterilizing psychiatric patients without their consent, returned from a visit to Germany overjoyed that the Californian example had “jolted into action a great government of 60 million people."
What happened next has not lost its power to shock. Nazi Germany sterilized 400,000 people in the six years after Hitler came to power, including schizophrenics, depressives, epileptics, and disabled people of all kinds. In 1939, the Nazi government went a step further and began to kill disabled and mentally ill people mainly with lethal injections. In 1941 it began to herd the “unfit” into concentration camps for mass extermination, along with homosexuals, Gypsies, political prisoners and millions of Jews. Under pressure of propaganda, many ordinary Germans inverted their consciences, becoming ashamed of any feelings of sympathy with their Jewish or disabled friends: the morally correct thing to do, they thought, was to override such feelings. Six million human beings died.
After the second world war and with the revelation of the horrendous results of these policies taken to extremes, eugenics fell from fashion.
Or did it? Surprisingly quickly and surprisingly blatantly, the very same arguments resurfaced in the movement to control world population. The son of the prominent pre-war eugenicist Henry Fairfield Osborn, called Fairfield Osborn, published a book in 1948 entitled Our Plundered Planet, which revived Malthusian concerns about the rapid growth of the human population, the depletion of resources, the exhaustion of soil, the over-use of DDT, an excessive reliance on technology and a rush to consumerism. “The profit motive, if carried to the extreme,” wrote the wealthy Osborn, “has one certain result – the ultimate death of the land.” Osborn’s book was reprinted eight times in the year it was published and translated into 13 languages.
At almost the same time William Vogt, a biologist driven by a passion for wildlife conservation, published a similar book The Road to Survival, in which the ideas of the “clear-sighted clergyman” Malthus were even more explicitly endorsed. “Unfortunately,” wrote Vogt, (yes unfortunately!), “in spite of the war, the German massacres, and localized malnutrition, the population of Europe, excluding Russia, increased by 11,000,000 people between 1936 and 1946”. In India, he thought, British rule had contributed to making famines ineffectual, which was a pity because it led to more babies, or to Indians “breeding with the irresponsibility of codfish”.
The population control movement was, to an uncomfortable extent, the child of the eugenics movement.
The link was just as explicit on this side of the Atlantic. Sir Charles Galton Darwin, nephew of Leonard and grandson of Charles, who was a distinguished physicist, published his own pessimistic book in 1952 entitled The Next Million Years. He wrote: “To summarize the Malthusian doctrine, there can never be more people than there is food for. Those who are most anxious about the Malthusian threat argue that the decrease of population through prosperity is the solution of the population problem. They are unconscious of the degeneration of the race implied by this condition, or perhaps they are willing to accept it as the lesser of two evils.”
By the 1960s these ideas had converted many people in positions of power. Osborn’s and Vogt’s books had been read by a generation of students including Paul Ehrlich and Al Gore.
The most influential disciple was General William Draper, whose commission on foreign aid reported to President Eisenhower in 1959 that aid should be tied explicitly to birth control in order to decrease the supply of recruits to communism. Eisenhower did not buy this; and nor did his Catholic successor John F. Kennedy.
But Draper did not give up. His Population Crisis Committee eventually won over Lyndon Johnson in 1966, and population control became an official part of American foreign aid.
Under its ruthless director, Reimert Ravenholt, the Office of Population grew its budget until it was larger than the rest of the American aid budget. Ravenholt bought up defective birth control pills, unsterilized intrauterine devices and unapproved contraceptives for distribution as aid in poor countries. He made no bones about his view that the prevention of infant mortality in Africa was “enormously harmful to African societies when the deaths prevented thereby are not balanced by prevention of a roughly equal number of births…Many infants and children rescued from preventable disease deaths by interventionist programs during the 1970s and 1980s have become machete-wielding killers”.
Some Western commentators thought starvation was a better course of action. William and Paul Paddock wrote a best seller in 1967 called Famine 1975!, which argued that a time of famine was imminent and food aid was futile. America, they said, must divide the underdeveloped nations into three categories: those that could be helped, the walking wounded who would stagger through without help and “those so hopelessly headed for or in the grip of famine (whether because of overpopulation, agricultural insufficiency, or political ineptness) that our aid will be a waste; these ‘can't-be-saved nations’ will be ignored and left to their fate”. India, Egypt and Haiti should be left to die in this way.
A year later, Paul Ehrlich’s The Population Bomb was almost as callous. India could never feed itself, he decided. An unabashed advocate of coercion to achieve population control, he compared humanity to a cancer and recommended surgery. “The operation will demand many apparently brutal and heartless decisions. The pain may be intense.” Population control at home would require “compulsion if voluntary methods fail”. He suggested adding sterilants to the water supply to achieve “the desired population size”. As for overseas, he wanted food aid made conditional on forcible sterilization of all those who had three or more children in India: “Coercion in a good cause” he called it.
When Mrs Gandhi asked the World Bank for loans in 1975, she was told that stronger efforts to control population were a precondition. She turned to coercion, her son Sanjay running a program that made many permits, licences, rations and even housing applications conditional on sterilization. Slums were bulldozed and poor people rounded up for sterilization. Violence broke out repeatedly. In 1976, when eight million Indians were sterilized in a year, Robert McNamara, president of the World Bank, visited the country and congratulated it: “At long last India is moving effectively to address its population problem.”
Yet here is the astounding thing. Birth rates were already falling in India and elsewhere. Food production was rising far faster than population, in a reverse of Malthusian predictions -- thanks to synthetic nitrogen fertilizer and new short-strawed varieties of cereals: the Green Revolution. The answer to the population explosion turned out not to be coercion, or the encouragement of infant mortality, but the very opposite. By far the best way to slow down population growth was to keep babies alive, because then people would have fewer of them as they planned smaller families.
And the even more shocking fact is that this solution was already known to some at the very start of the panic. Even at the very birth of neoMalthusian population alarm in the 1940s there were some who saw how horribly wrong both the diagnosis and the cure were. Far from more babies causing more hunger, they argued that it was the other way round. People increased their birth rate in response to high child death rates. Make them richer and healthier and they would have fewer babies as had already happened in Europe where prosperity had led birth rates down, not up.
The Brazilian diplomat Josue de Castro, in his book The Geopolitics of Hunger, argued that “the road to survival, therefore, does not lie in the neo-Malthusian prescriptions to eliminate surplus people, nor in birth control, but in the effort to make everybody on the face of the earth productive.”
In the 1970s Paul Ehrlich’s brand of population pessimism was attacked by the economist Julian Simon in a series of articles and books. Simon argued that there was something badly wrong with a thesis that the birth of a baby is a bad thing, but the birth of a calf is a good thing. Why were people seen as mouths to feed, rather than hands to help? Was not the truth of the past two centuries that human wellbeing had improved as population had expanded?
Famously, in 1980 Simon challenged Paul Ehrlich to a bet about future prices of raw materials. Ehrlich and a colleague, eager to take up the offer, chose copper, chrome, nickel, tin, and tungsten as examples of materials that would grow scarcer and more expensive over ten years. Simon bet against him. Ten years later, grudgingly and while calling Simon an “imbecile” in public, Ehrlich sent Simon a check for $576.07: all five metals had fallen in price in both real and nominal terms. (One of my proudest possessions is the Julian Simon award, made from those five metals.)
The price of commodities has gone down and down while the population has gone up and up. Exactly the opposite of the Malthusian prediction.
Yet the neo-Malthusians are undaunted. Lester Brown is a famous environmentalist who has made a name for himself by repeatedly predicting human starvation. Over the past half century the total harvest of wheat, maize and rice – the three biggest crops and the source of 60% of our calories – has trebled even though the amount of land cultivated has hardly changed. Yet Lester Brown has repeatedly told us doom was imminent.
“Farmers can no longer keep up with rising demand for food, and famine is inevitable”
Lester Brown, 1974
“Global food insecurity is increasing”
Lester Brown, 1981
“The slim margin between food production and population growth continues to narrow”
Lester Brown, 1984
“Population growth is exceeding farmers’ ability to keep up”
Lester Brown, 1989
“Seldom has the world faced an unfolding emergency whose dimensions are as clear as the growing imbalance between food and people”
Lester Brown, 1994
“Cheap food may now be history”
Lester Brown, 2007
The solution to the population explosion turned out to be the green revolution and the demographic transition: emergent phenomena rather than coercion and planning. People started having smaller families because they were richer, healthier, more urban, more liberated and more educated. Not because they were told to. There is only one country where population control was sufficiently coercive to achieve its end – China – and yet all the evidence suggests that coercion there was counterproductive.
China’s one-child policy derives directly from western neo-Malthusian writing.
Though Mao Zedong was a mass murderer, his approach to population was relatively restrained and humane: known as “Later, Longer, Fewer”, it encouraged lower fertility by delaying marriage, spacing births and stopping at two, but in a flexible and non-prescriptive way. That is roughly what Malthus himself had advocated.
Whether for that reason or because of improving child mortality, China’s birth rate halved between 1971 and 1978.
Then after Mao’s death came a turn to a much more rigid and prescriptive approach.
As Susan Greenhalgh, a Harvard anthropologist, recounts in her book Just One Child, in 1978 Song Jian, a guided missile designer with expertise in control systems, attended a technical conference in Helsinki. While there he heard about two books by neo-Malthusian alarmists linked with a shadowy organization called the Club of Rome. One was Limits to Growth, published in the United States; the other A Blueprint for Survival, published in Britain.
Limits to Growth was a book that sold ten million copies and purported to prove with computer models that humanity was doomed because of overpopulation and the exhaustion of resources.
A Blueprint for Survival was the British equivalent. Heavily influenced by Malthusian ideas, written by the wealthy businessman Edward Goldsmith but signed by a veritable Who’s Who of the scientific establishment, including Sir Julian Huxley, Sir Peter Medawar and Sir Peter Scott, it oozes snobbish disdain for the fact that consumer society, with its “shoddy” goods, is coming within reach of ordinary people. As for the global poor, “it is unrealistic to suppose that there will be increases in agricultural production adequate to meet forecast demands for food.” The authors then command that governments must acknowledge the population problem “and declare their commitment to ending population growth; this commitment should also include an end to immigration”. It is a highly reactionary document, of the kind that would embarrass a fringe right-wing party today.
These were the two books that Song Jian, the father of the one-child policy, picked up in Helsinki. Limits to Growth had applied control systems theory, of the kind in which Song was an expert, not to the trajectory of missiles but to the trajectory of population and resource use. Song returned to China, where he republished the main themes of both books in Chinese under his own name, and shot to fame within the regime. Song was proposing social engineering in the most literal sense. At a conference in Chengdu in December 1979 Song silenced his critics worried about the humanitarian consequences, and persuaded the party to adopt coercion.
General Qian XingZhong was put in charge of the policy. He ordered the sterilization of all women with two or more children, the insertion of IUDs into all women with one child (removal of the device being a crime), the banning of births to women younger than 23, and the mandatory abortion of all unauthorized pregnancies right up to the eighth month. Those who tried to flee and have babies in secret were tracked down and imprisoned. In some cases their communities were fined, encouraging betrayal of neighbours. The brutal campaign of mass sterilisation, forced abortion and infanticide was exacerbated by the voluntary murder of baby girls on a genocidal scale as parents tried to ensure that their one legal child was a boy.
And fertility actually rose!
What was the international reaction to this holocaust? The United Nations Secretary General awarded a prize to General Qian in 1983 and recorded his “deep appreciation” for the way in which the Chinese government had “marshaled the resources necessary to implement population policies on a massive scale”. Eight years later, even though the horrors of the policy were becoming ever more clear, the head of the United Nations Family Planning Agency said that “China has every reason to feel proud of its remarkable achievements” in population control, before offering to help China teach other countries how to do it.
A benign view of this authoritarian atrocity continues to this day. The media tycoon Ted Turner told a newspaper reporter in 2010 that countries should follow China’s lead in instituting a one-child policy to reduce global population over time.
In conclusion, Malthus was a good and clever man. But Malthusian ideas pursued in his name are cruel and wrong.
The poor laws were wrong; British attitudes to famine in India and Ireland were wrong; eugenics was wrong; the Holocaust was wrong; India’s sterilization programme was wrong; China’s one-child policy was wrong. These were sins of commission, not omission.
Malthusian misanthropy – the notion that you should harden your heart, approve of famine and disease, feel ashamed of pity and compassion, for the good of the race – was wrong pragmatically as well as morally. The right thing to do about poor, hungry and fecund people was always, and still is, to give them hope, opportunity, freedom, education, food and medicine, including of course contraception, for not only will that make them happier, it will enable them to have smaller families.
Leave the last word to Jacob Bronowski, speaking at the end of his television series The Ascent of Man.
Standing in a pond at Auschwitz-Birkenau, where many of his relatives died, he reached down and lifted some mud: “Into this pond were flushed the ashes of some four million people. And that was not done by gas. It was done by arrogance, it was done by dogma, it was done by ignorance. When people believe that they have absolute knowledge, with no test in reality, this is how they behave. This is what men do when they aspire to the knowledge of gods.”
October 16, 2016
Let in more scientists, not fewer
My Times column on skilled versus unskilled migration and Brexit:
Michael Kosterlitz, one of the four British-born but American-resident winners of Nobel prizes in science this year, is so incensed by Brexit that he is considering renouncing his British citizenship: “The idea of not being able to travel and work freely in Europe is unthinkable to me.” He has been misled — not by Leavers but by Remainers.
It’s not just that the overseas press have consistently portrayed Brexit as a nativist retreat, despite Boris Johnson, Michael Gove and Daniel Hannan consistently saying the very opposite. Throughout the referendum campaign — and, shamefully, since — academics have been told by their lobby groups (such as Universities UK) that Brexit probably means losing access to European research funds, European scientific collaborations and European talent.
They knew, and know, that this was, to borrow a word thrown at Leavers a lot, “a lie”. The main European research funding programme, Horizon 2020, includes as members Norway, Iceland, Tunisia, Georgia, Turkey, Israel, Serbia and eight other non-EU countries. Project co-ordinators, who control the money, are appointed from Iceland more often — not less often — than from any EU country, in proportion to population. Switzerland and Israel get the most grant funding, per capita, from the prestigious European Research Council.
The major European science collaborations — in particle physics (CERN), molecular biology (EMBO), nuclear fusion (ITER), space research (ESA) — are nothing to do with the European Union and include non-EU member nations. Throughout the referendum campaign, the leaders of universities either did not know or chose to ignore these facts. No wonder academics were so alarmed when the result came in.
Where was the contingency planning to reassure their colleagues in the (highly possible) event that the country voted Leave? To say: “Although we hope Britain votes to remain, if it votes to leave, this should not affect our membership of European research collaborations, and it is highly likely we will still be able to access Horizon 2020 funds by joining the programme in the same way as 15 other non-EU European countries. So please don’t worry: we are unlikely to be excluded from a club that includes Albania and Moldova.”
After the vote there was a vogue for research laboratories to take group photographs of their members with flags denoting their country of origin, to show how international they were. This backfired because so many of the flags were Asian, South or North American, African or Australasian — exactly the point we Leavers had been making, that science is a global, not a regional, activity. It’s a fair bet that foreign researchers in British labs who come from continents beginning with A outnumber those from continents beginning with E. Professor Kosterlitz is a prime example of the fact that the top destination for ambitious British scientists heading overseas is America.
Now, to make matters worse, comes the home secretary, Amber Rudd, and her clumsy floating of the idea — which Justine Greening and Michael Fallon have rowed back on — of threatening to name and shame employers, presumably including universities, who employ too many foreigners. This could not have been better calculated to reinforce fears among scientists. Never once during the campaign did I hear anybody prominent on the Leave side ask for anything remotely as xenophobic as this.
Roland Rudd (her brother), a leader of the Remain campaign, said “those of us who want a sensible Brexit, who want Britain to remain a beacon of tolerance and who find the denigration of non-British workers appalling have a duty to speak out”. Steve Hilton, a leader of the Leave campaign, called the proposals “divisive, repugnant and insanely bureaucratic”.
At the very least, there is clearly a problem of people who campaigned for Remain having caricatured the Leave argument among themselves and then believed the caricature. (It’s a given of political debates that people don’t read their opponents’ views so much as their friends’ accounts of their opponents’ views.) Ms Rudd seems to think that northern Britain thinks the way north London intellectuals think it thinks.
It doesn’t. I recall a conversation over a garden fence on a housing estate in Gateshead on referendum day with a shaven-headed, tattooed and pierced, Rottweiler-restraining gentleman. Have you voted, I asked him. Yes, Leave, he replied. Mind if we ask why, said my friend: was it about immigration? No, he replied, I don’t mind immigrants, though I think we’ve been unfair on the Commonwealth ones, wouldn’t mind seeing more of them, and we’ve let in too many east Europeans instead. No, it’s about making our own laws.
Ipsos Mori conducted a poll last year asking people if they wanted more, fewer or the same number of different kinds of immigrants. While 63 per cent of people wanted fewer low-skilled immigrants, just 24 per cent wanted fewer immigrant university students, and only 15 per cent wanted fewer immigrant scientists and researchers.
This is a rational response. For the average Briton, as opposed to the wealthy customers of waiters, plumbers and nannies, unskilled migrants are a potential threat, putting downward pressure on wages and upward pressure on waiting lists for housing and services. Migrant scientists and students, let alone doctors or nurses, by contrast, are unambiguously good for the economy.
People voted to escape the subordination of our laws to unelected Brussels bureaucrats on June 23. To the extent that they also minded about immigration, the great majority of them wanted it controlled, not halted, let alone reversed by discrimination and repatriation. And biased towards people who bring skills, investment and ideas, and away from people who compete for public services: that’s what the “points system” argument was all about.
The impression left by the Conservative Party conference was that foreigners are less welcome, even if they are about to start a business or win a Nobel prize. Where is the expedited academic talent visa, like America has? America, Canada and Australia have a higher proportion of overseas researchers than Britain, Germany or France — alongside stricter general immigration policies.
If we are to thrive, Britain must redouble its efforts to be a beacon for talent, attracting ambitious students, scientists and entrepreneurs from India, China and elsewhere, in the same way that America attracted Michael Kosterlitz and his fellow Nobel laureates.
October 4, 2016
Britain's chance to be the global champion of free trade
My Times column on free trade after Brexit:
The prime minister wants Britain to be “the most passionate, most consistent, most convincing advocate for free trade”. Under either Donald Trump or Hillary Clinton, and with world trade stagnating, it looks as if the job is increasingly likely to be vacant in March 2019, so Britain has both a vital duty and a golden opportunity. It worked for us before.
Next year sees the 200th anniversary of David Ricardo’s insight of “comparative advantage” — the counterintuitive idea that trade benefits “uncompetitive” countries as much as efficient ones. Even if one country is better at making both cloth and wine than another, it can still pay it to specialise: to make extra cloth and swap it for the other’s wine. Or, as somebody once put it, even if Winston Churchill is a very good bricklayer (he was), it still makes sense for him to write books or run governments, and pay somebody else to build his walls.
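Ricardo’s point can be checked with the standard textbook numbers (labour-hours per unit of output; these illustrative figures are the classic Portugal-and-England example, not drawn from the column above):

```python
# Ricardo's classic two-country, two-good illustration:
# labour-hours needed to produce one unit of each good.
hours = {
    "Portugal": {"wine": 80, "cloth": 90},    # better at making both
    "England":  {"wine": 120, "cloth": 100},
}

# Without trade, each country makes one unit of each good itself.
autarky = {c: g["wine"] + g["cloth"] for c, g in hours.items()}

# With trade, each specialises where its *relative* cost is lowest:
# Portugal in wine (80/90 < 120/100), England in cloth. Each now makes
# two units of its speciality and swaps one for the other's product.
specialised = {
    "Portugal": 2 * hours["Portugal"]["wine"],   # 160 hours vs 170
    "England":  2 * hours["England"]["cloth"],   # 200 hours vs 220
}

for c in hours:
    saved = autarky[c] - specialised[c]
    print(f"{c}: saves {saved} labour-hours by specialising and trading")
```

World output is unchanged (two units of each good), yet both countries spend fewer hours getting it, including the one that is worse at everything.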
So the government’s view of trade should be: the more the better, the freer the better, and unilateral is fine. There is no episode in history of a country opening itself more to world trade without getting richer. The Phoenicians, Athens, Gujarat and Bengal, Venice, the Portuguese, the Dutch, the Victorian British, America, Singapore, Hong Kong, China after Deng Xiaoping — in every single case, countries that opened to trade got much richer very fast.
Dan Hannan, MEP, laments that we seem to have forgotten this, that young people are under the impression that trade is a zero-sum game and makes the rich richer and poor poorer: “We need to make the case for unrestricted commerce, as its earlier advocates did, in the kind of ethical language that the Occupy crowd will hear. Free trade is the ultimate instrument of poverty alleviation, conflict resolution and social justice.” Free trade is the enemy of complacent corporations and their crony conspiracies with corrupt politicians. It ensures that business serves its customers more than its owners.
Yet most of the current debate seems to be locked in a mercantilist view that trade is something governments arrange. As John Longworth, the former head of the British Chambers of Commerce and co-chairman of the new pressure group Leave Means Leave, points out: “Trade consists of a willing seller and a willing buyer. If the buyer wants a product and it is the right quality and price there will be trade. Governments do not trade, they only get in the way.”
Being in the EU has meant having no trade deals at all with America, China, Russia, India, Brazil or just about any large economy, despite decades of desultory negotiation. This hasn’t stopped us trading with them; hasn’t stopped them getting “access” to the single market — that is, selling the same product throughout Europe; hasn’t required them to join the protectionist single regulatory zone; won’t stop us having such access. The truth is, if it’s trade deals you want, the EU is the worst possible entity to be part of: it’s done fewer trade deals than most countries.
If the EU decides to punish its citizens by slapping a tariff on imports from Britain, that’s their funeral. Given the current devaluation of the pound our exports are about 10 per cent more competitive compared with the continent than they were before the referendum. Adding a tariff of roughly 3.5 per cent under WTO rules would — as John Redwood puts it — still leave us 6.5 per cent more competitive, and them 13.5 per cent less competitive. So, as Peter Lilley argues in a new Legatum/Centre for Social Justice pamphlet, written by four ex cabinet ministers: “We should simply announce that for the time being we will maintain our zero tariffs on imports from the EU — unless they choose to impose WTO tariffs on us, in which case we will reciprocate.”
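The Redwood arithmetic is simple addition of percentage points, and a short sketch makes the asymmetry explicit (the 10 per cent and 3.5 per cent figures are those quoted above, treated as additive points rather than compounded):

```python
devaluation = 10.0  # % competitiveness swing from the weaker pound
wto_tariff = 3.5    # % average tariff under WTO rules

# A tariff imposed by the EU only partly offsets the devaluation for
# UK exporters, while a reciprocal UK tariff compounds it for EU
# exporters, who face the tariff on top of the currency move.
uk_edge = devaluation - wto_tariff   # UK still 6.5 points more competitive
eu_gap = devaluation + wto_tariff    # EU 13.5 points less competitive

print(uk_edge, eu_gap)
```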
Britain’s comparative advantage lies in high-value-added things such as investment banking, accountancy, law, advertising, design, research, higher education — things the rest of the world wants. As the financier Miles Morland put it in a recent essay, there is more investment banking expertise on the Isle of Dogs than in the whole of continental Europe put together.
The world’s biggest advertising agency is British and its nearest rival is American. Seven of the top ten accountancy firms are headquartered in Britain, and only one elsewhere in the EU; five of the top ten law firms, and none in the rest of the EU; three of the top ten universities (versus none). London’s peers, partners and rivals are primarily the other nine of the top ten financial centres: New York, Hong Kong, Singapore, Tokyo, Seoul, Zurich, Toronto, San Francisco and Washington. That’s where the opportunity for growth, and the competition for business, lie.
We just don’t know how competitive British science, technology, medicine, even precision agriculture could be, once freed from the top-down dirigisme of Brussels with its myriad rules designed to stifle innovation and protect big companies from upstart competition. The distinguished medical scientist Sir John Bell says: “Britain is more inclined towards a relatively liberal risk-based regulatory environment that allows fields to move quickly — to reflect on ethical issues but not to over-regulate. The EU, by contrast, has a record of deep regulatory conservatism, attempting to legislate and control many aspects of science that are not deemed here in the UK to present a significant danger.”
If Britain were to become the global champion of free trade, as it did once before at the behest of Adam Smith, Ricardo and Richard Cobden, there would be creative destruction, for sure, and pain to go with the greater gain. But there’s little doubt that we could find ourselves growing at 4 or 5 per cent a year. Think how that would transform our public services.
It is now more than three months since we voted to leave the EU. Neither Mark Carney’s promised recession nor George Osborne’s promised punishment budget has happened, or looks likely to. Every stock index, purchasing managers’ index, growth forecast, employment rate and house price index that was supposed to plummet has gone up instead. Service sector growth is much higher than forecast. Only the pound is down, giving us a perfect competitiveness boost without significant inflationary risk.
I am reminded of what happened in 1992 when we were ejected by the markets from the exchange-rate mechanism and all the pointy-heads said we faced disaster, but we had a boom instead.
September 27, 2016
Mental illness is the greatest research challenge
My Times column on the Chan-Zuckerberg initiative in basic medical science:
Mark Zuckerberg, chief executive of Facebook, and his wife Priscilla Chan, a paediatrician, have announced their intention to spend $3 billion over ten years on medical research. Having met them last year, I thought I would take the liberty of making a suggestion as to how they spend their money.
Dear Priscilla and Mark,
Your plan to fund medical research and challenge humanity to “cure, prevent or manage all diseases by the end of the century” has certainly caught the world’s attention. As Bill Gates has done with malaria, mainly through the use of bed-nets, and Jimmy Carter with guinea worm, mainly through water filters, a famous philanthropist financing fresh approaches can make a big difference, even with limited funds.
But it will be important to get the most bang for your buck. So I want to suggest a sort of triage: avoid the problems that are too hard and too easy; tackle the ones that are tough but soluble. That means focusing on the brain. Naming the distinguished neuroscientist Cori Bargmann as president of science at the Chan-Zuckerberg initiative is the right move.
You’ve rightly stated that there are four main categories of disease: heart disease, cancer, neurological and infectious diseases. By 2100, with the right policies, heart disease and infectious disease will be easily beatable without your money, or much new research; while most cancer will be undefeated by the end of the century, however well you and others spend research money. It’s neurological disease where you should focus.
Infectious diseases are already in headlong retreat. Deaths from them are now rare in the western world and falling rapidly in the developing world. Malaria mortality worldwide fell by 60 per cent in the first 15 years of this century. HIV mortality is down by 25 per cent in six years. All the tools exist to make the same happen to every other virus, bacterium, fungus, worm and protozoan that used to kill us by the million. Vaccines and drugs are getting better every year.
The ebola epidemic showed how quickly we can get a lethal virus under control even in very poor countries, despite some early mistakes. Influenza will cause scares from time to time, but with modern genomics we can sequence it and design countermeasures faster and faster. Tuberculosis will fight a defiant rearguard action, but we know what to do. Drug resistance will be a recurring problem, but not an insuperable one. Emergent infections such as zika and dengue will need fresh approaches, but they are beatable with the right policies. We’ll crack the common cold eventually.
We will need research to understand another category of disease that is getting worse still: allergies and auto-immune diseases. They are mostly a consequence of our success against infectious disease. The immune system used to “expect” to be down-regulated by parasites and without them it over-reacts.
Heart disease is the leading cause of death in the world, causing about one-third of all deaths, but the mortality rate is falling fast in most countries, and we know how to make it fall faster: less smoking, healthier living and prompt treatment. Much the same is true of stroke. With enough funding and political will, these diseases can be defeated.
By contrast, although we’ve pretty well cracked the causes of cancer, we have failed to work out how to cure it since declaring “war” on cancer half a century ago. Genetic mutations in dividing cells (caused by cigarette smoke or other factors, including bad luck) result in uncontrolled cell division. And then Darwin gets involved: within each tumour there is a ferment of genetic experimentation, as cell lines compete for ways to grab resources, grow and divide, despite the signals from the body telling them not to. Eventually, the competition is for ways to survive and counter the chemotherapy drugs, too, which is why initial cure is so often followed by relapse.
The core of the problem is that cancer risk is a byproduct of ageing. A fascinating paper published by a team from Harvard Medical School last month comes to the remarkable conclusion that the risk of cancer doubles every eight years — for all kinds of cancer — and that this is also the rate at which all other risks of ageing double.
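Under that doubling rule, relative risk grows exponentially with age. A minimal sketch (the eight-year doubling time is the figure reported above; the ages chosen are my own illustration):

```python
def relative_risk(age, reference_age, doubling_years=8):
    """Risk at `age` relative to `reference_age`, assuming risk
    doubles every `doubling_years` years."""
    return 2 ** ((age - reference_age) / doubling_years)

# An 80-year-old versus a 40-year-old: five doublings, so 32x the risk.
print(relative_risk(80, 40))  # 32.0
```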
Short of genetically engineering people so they have extra copies of the tumour-suppressor gene p53 (as elephants do), there’s probably nothing we can do about this. We can cure childhood and young-adult cancers and expect to see people live for a long time (childhood leukaemia death rates are down by 90 per cent), but we’re never going to get rid of the increasing risk of cancer in old age.
Neurological disorders are different. Many strike young, and we are still fumbling in the dark with most of them, treating the symptoms and not the causes. We still struggle to explain what triggers the onset of Alzheimer’s or Parkinson’s diseases. Plaques form, yes, but that’s like saying spots form on measles patients. Deeper down, something prevents the proper folding of proteins, or their solubility, or both: but what? My money’s on the study of telomeres (the caps at the end of each strand of DNA that protect our chromosomes), for what it’s worth.
In the case of multiple sclerosis, I’m increasingly confident that we will soon agree that the cause is “endogenous retroviruses” — that is, ancient DNA copies of past infectious agents’ genes, copied into our genomes and reawakened by something, but by what, and how? In the case of schizophrenia, autism and other mental illnesses, we’re even further from a full understanding. Yet there is every reason to think that once we can understand them we can cure them.
Treatments for neurological diseases are still in the dark ages. Talking cures work a bit, sometimes. Some drugs work well, others not so much. But there is every reason to think these problems are soluble. If something goes wrong in a brain it should be fixable, once you can identify the cell and the molecule where the problem is happening. By 2100, I predict, neurology and psychology should be unrecognisably better than today.
So if you want value for money, focus on the diseases of the brain: that’s where we need research breakthroughs. At the moment Britain spends about four times as much on cancer research as on mental health research. That’s probably the wrong way round.
Good luck and best wishes,
Matt.
September 14, 2016
Junk science on statins, snus and vaping
My Times column on statins, snus and vaping:
One of the most salutary examples of people in authority getting risks wrong is a paper written in 1955 by the first head of the environmental cancer section of the US National Cancer Institute, Wilhelm Hueper. The title was “Lung Cancers and Their Causes”, and he was absolutely convinced that “cigarette smoking is not a major factor in the causation of lung cancer”, because he thought this was a cheap shot by the chemical industry to divert attention away from pesticides.
We now know that smoking is a major cause of lung cancer, whereas pesticides are not. History is littered with examples of experts being too reassuring about some risks and too alarmed about others. Washing hands between dissecting women who died in childbirth and delivering babies? No risk, said the nineteenth-century medical establishment, ostracising Ignaz Semmelweis, who had had the temerity to suggest otherwise. Dietary fats cause heart attacks, insisted the medical establishment for the best part of five decades until very recently. It was once the consensus that tonsils should be removed; no longer.
So I am no fan of arguing from authority on matters of risk. It’s evidence that counts, especially randomized controlled trials, the gold standard of science. The study published last week finding that statins are relatively safe, and that prescribing them to healthy people at even low risk of heart attacks should save many lives, made this point. The authors argued that the pro-statin evidence comes from randomized controlled trials, while the anti-statin arguments “reflect a failure to recognise the limitations of other sources of evidence”, a polite way of saying they are health-fad claptrap.
The argument is not that statins are risk-free, but that the benefits outweigh the risks: it’s the relative risk that counts. Large, randomised trials find that statins greatly reduce the risk of heart attacks and strokes during each year that they continue to be taken. That benefit far outweighs the relatively modest risks from side-effects. Notice that it’s the evidence that is convincing, not the authority of those presenting it. I could not care less that there is a medical consensus that statins are safe unless that consensus is based on good evidence.
The opposition to vaping provides a contrasting example, where prejudice against tobacco products seems to have closed the minds of the medical establishment. Until recently most public health experts made precisely the same mistake with electronic cigarettes that the opponents of statins make – thinking only of the risk of vaping, not the benefit. Recent headlines claiming that “Vaping as bad as fags” (the Sun) or “Vaping as bad for your heart as smoking cigarettes, study finds” (Telegraph) were based not on a randomized controlled trial measuring the risks and benefits of both, but on one small, limited study of one largely irrelevant effect reported at a conference.
Worse, that study compared 30 minutes of vaping with five minutes of smoking and concluded that both can induce temporary arterial stiffening – a well-known effect of nicotine that has not been linked to disease, and that is also produced by caffeine. The damaging effect of smoking, both to the heart and in terms of cancer risk, comes not from nicotine at all but from combustion products. So the headlines were based on junk science. They could have read: “Vaping no worse for you than coffee, study finds”.
Tobacco-control advocates say that the “end-game” for smoking will have been reached in a country when its smoking rate falls below 5%. This is a very long way off as global smoking rates continue to rise and smoking remains common even in countries with strong tobacco-control policies such as Britain. Here 26% of young men still regularly smoke.
However, one country is fast heading towards achieving the end-game result. According to Swedish government figures, only 6% of Sweden’s young men regularly smoked in 2015. The only plausible explanation for this extreme outlier is that 25% of Swedish men use “snus”, a sort of tobacco tea-bag pressed against the gums – and do so instead of smoking.
As a result, not only is Sweden on the way to becoming the first country to get below 5% of men smoking, but it is the country with the lowest rate of lung cancer among men in Europe (it’s higher among Swedish women, who don’t use snus much), as well as low rates of other smoking-related illnesses such as heart disease. A spectacular case of harm reduction that puts even statins in the shade.
Acknowledging this data, America’s Food and Drug Administration last year authorized snus to help protect public health. The European Union has not followed suit. This health-giving product is not just discouraged, it is still wholly illegal throughout the EU – except in Sweden. Herein lies an interesting tale. When Sweden joined the EU in 1995, its people’s snus habit became an issue in the negotiation. The country was reluctant to accept a ban on such a popular product, which, even then, was obviously proving a safer and cleaner alternative to smoking.
So the EU was forced to make an exemption for Sweden. To this day, 21 years later, you can legally buy and sell snus in the European Union only in Sweden. Britain, incidentally, led the charge to ban snus within the EU, one Edwina Currie being the minister at the time, and our government remains impervious to the evidence cited above. Many millions of lives could have been saved without that prohibition.
In preparation for a big conference on the subject, the World Health Organisation last week set out a list of things it wants countries to do to hamper the growth of vaping, such as a ban on advertising. It says not a word about trying to overturn the total bans on vaping in many countries, even though it is now very clear that vaping is mainly done by smokers and that it reduces the risks of smoking-related illnesses drastically – by 95% according to Public Health England.
There is no fundamental difference between vaping (or snus) and statins. All treat conditions that carry high future risks – smoking addiction and high cholesterol. All carry risks of their own, but ones far outweighed by their benefits. Just as taking statins is better for you than not taking statins if you are at risk of heart disease, so switching to vaping or snus is far better for you than continuing to smoke.
September 12, 2016
Invasion of the alien species
My essay on invasive species in the Wall Street Journal:
In July, the New Zealand government announced its intention to eradicate all rats, stoats and possums from the entire country by 2050 to save native birds such as the kiwi. It’s an ambitious plan, perhaps impossible to pull off with the methods available today, but it’s a stark reminder that invasive alien species today constitute perhaps the greatest extinction threat to animal populations world-wide.
Birdlife International, a charity that works to save endangered birds, reckons that of the 140 bird species confirmed to have gone extinct since 1500, invasive alien species were a factor in the demise of at least 71—an impact greater than hunting, logging, agriculture, fire or climate change.
Rats, cats and diseases were the biggest culprits, contributing to the extinction of 41, 34 and 16 species, respectively. Most of these were on islands. The dodo on Mauritius, emblematic of extinction, was wiped out less by hungry sailors than by the rats, pigs, dogs and cats they brought with them. Hawaii once had 55 species of honeycreeper; today just 17 remain, thanks largely to rats and avian malaria, transmitted by alien mosquitoes brought by people. Guam has lost nine species of bird to an introduced snake.
But continents aren’t immune to invasion by alien species. In the Mississippi River, it is Asian carp; in the Everglades, Burmese pythons; in the Great Lakes, Russian zebra mussels; in the South, Indochinese kudzu vine. In Australia, cane toads from South America; in Lake Victoria in Africa, water hyacinth from the Amazon; in Germany, Chinese mitten crabs; in the Caribbean, lionfish from the Pacific. A fungus spread by African clawed toads (used in laboratories) has wiped out frogs in Central America.
On my farm in Northern England, three native species of animal are being extinguished by alien invaders from North America: the white-clawed crayfish by the signal crayfish; the water vole by the mink; and the red squirrel by the gray squirrel. Himalayan balsam flowers and Japanese knotweed infest the woods.
Aliens turn into pests away from home because they encounter naive and ill-equipped competitors or prey, and they leave behind their diseases and predators. Globalization is increasing the flow. An insect that would have struggled to survive a long journey by ship can stow away on board a plane. Today only Australia and New Zealand, whose isolated fauna and flora are especially vulnerable to invasives, take biosecurity really seriously.
European countries, by contrast, are lax in allowing exotic pets. In Britain, pet raccoons (native to North America) and raccoon dogs (native to China) have escaped into the wild and may one day establish breeding populations that would devastate native wildlife.
A paper published last month by a team of ecologists, led by Regan Early of the University of Exeter in Britain, points out that whereas most invasive alien species (IAS) have affected rich countries so far, the developing world is increasingly at risk: “Many of the global biodiversity hot spots that are highly vulnerable to invasion are found in countries that our results suggest have little capacity to respond to IAS (in particular Central America, Africa, Central Asia and Indochina).”
None of this is to say that invasive species are always a threat. They can bring positive effects, too, by increasing biodiversity within a region. Ascension Island in the Atlantic was once a barren volcanic rock, but is now much greener thanks to a deliberate policy, suggested by Charles Darwin, of bringing in plants from elsewhere in the tropics to create a forest ecosystem. Dov Sax of Brown University points out that New Zealand once had approximately 2,000 native plant species, has gained approximately 2,000 nonnative species that now have self-sustaining populations, and yet has lost fewer than 10 native plant species.
Another positive effect is that invasive species sometimes improve, rather than harm, ecosystem services—the quality of water, soil or air. Zebra mussels were so effective in filtering the water of Lake Erie that they made its water clear. In the American Southwest, the endangered willow flycatcher has taken to nesting on alien tamarisk bushes, embarrassing conservationists who spent millions trying to eradicate the plant for the sake of the bird.
The best way to fight invasive aliens is often with other aliens: Go back to their native country, find an insect or fungus that eats them, and bring it in to help. Early horror stories when alien predators introduced to control alien prey turned on native wildlife instead—cane toads in Australia, stoats in New Zealand—have given way to much more cautious and careful scientific introductions of highly specific control organisms. Done right, such biological control is indispensable.
The Centre for Agriculture and Biosciences International is an international agency that scours the native homes of invasive alien pests for predators that can control them. It found a rust fungus that has reduced the infestation of rubbervine weed from Madagascar in Queensland, Australia—by up to 90% in some areas. The Centre used two parasitic wasps to control the mango mealybug from Asia, which did huge damage to mango trees in Benin in Africa.
Vaccines that cause sterility are another promising weapon. Spreading food coated with such a vaccine could render a species sterile, causing its numbers to fall. This approach is working well in the lab with pigs—invasive species in various places—and may soon help to fight gray squirrels in Britain.
Genomics is the latest weapon. The Aedes mosquito that spreads dengue and Zika in the Americas is an invasive alien, from Africa. A biotech firm called Oxitec has devised a way of suppressing its population using mass releases of genetically modified males (males don’t bite), which father offspring that cannot mature. In trials in Brazil, this method has achieved more than 90% suppression of numbers.
The next step is even craftier. Using a mechanism called “gene drive,” it is possible in the laboratory to create a genetic variant that will gradually infect an entire population of a species with infertility. Whether such a technique would work in the wild, and how it could be safely controlled, or reversed if it began to affect the species back in its native range, are still unanswered questions.
Many nonnative species are here to stay, and many are welcome additions to the biodiversity of a country. But scientists are going to be very busy over the next few decades working to reverse the damage done by some and to prevent the arrival of others.
August 29, 2016
An ice-free Arctic Ocean has happened before
My Times column on how the Arctic sea ice has melted in late summer before, between 10,000 and 6,000 years ago:
The sea ice in the Arctic Ocean is approaching its annual nadir. By early September each year about two thirds of the ice cap has melted, then the sea begins to freeze again. This year looks unlikely to set a record for melting, with more than four million square kilometres of ice remaining, less than the average in the 1980s and 1990s, but more than in the record low years of 2007 and 2012. (The amount of sea ice around Antarctica has been increasing in recent years, contrary to predictions.)
This will disappoint some. An expedition led by David Hempleman-Adams to circumnavigate the North Pole through the Northeast and Northwest passages, intending to demonstrate “that the Arctic sea ice coverage shrinks back so far now in the summer months that sea that was permanently locked up now can allow passage through”, was recently held up for weeks north of Siberia by, um, ice. They have only just reached halfway.
Meanwhile, the habit of some scientists of predicting when the ice will disappear completely keeps getting them into trouble. A Nasa climate scientist, Jay Zwally, told the Associated Press in 2007: “At this rate, the Arctic Ocean could be nearly ice-free at the end of summer by 2012.” Two years later Al Gore quoted another scientist that “there is a 75 per cent chance that the entire north polar ice cap, during the summer months, could be completely ice-free within five to seven years” — that is, by now.
This year Professor Peter Wadhams of Cambridge University has a new book out called Farewell to Ice, which gives a “greater than even chance” that the Arctic Ocean will be ice-free next month. Not likely.
He added: “Next year or the year after that, I think it will be free of ice in summer . . . You will be able to cross over the North Pole by ship.” The temptation to predict a total melt of the Arctic ice cap, and thereby get a headline, has been counterproductive, according to other scientists. Crying wolf does not help the cause of global warming; it only gives amusement to sceptics.
Would it matter if it did all melt one year? Here’s the point everybody seems to be missing: the Arctic Ocean’s ice has indeed disappeared during summer in the past, routinely. The evidence comes from various sources, such as beach ridges in northern Greenland, never unfrozen today, which show evidence of wave action in the past. One Danish team concluded in 2012 that 8,500 years ago the ice extent was “less than half of the record low 2007 level”. A Swedish team, in a paper published in 2014, went further: between 10,000 years ago and 6,000 years ago, the Arctic experienced a “regime dominated by seasonal ice, ie, ice-free summers”.
[Here is the abstract of the latter paper:
Arctic Ocean sea ice proxies generally suggest a reduction in sea ice during parts of the early and middle Holocene (∼6000–10,000 years BP) compared to present day conditions. This sea ice minimum has been attributed to the northern hemisphere Early Holocene Insolation Maximum (EHIM) associated with Earth's orbital cycles. Here we investigate the transient effect of insolation variations during the final part of the last glaciation and the Holocene by means of continuous climate simulations with the coupled atmosphere–sea ice–ocean column model CCAM. We show that the increased insolation during EHIM has the potential to push the Arctic Ocean sea ice cover into a regime dominated by seasonal ice, i.e. ice free summers. The strong sea ice thickness response is caused by the positive sea ice albedo feedback. Studies of the GRIP ice cores and high latitude North Atlantic sediment cores show that the Bølling–Allerød period (c. 12,700–14,700 years BP) was a climatically unstable period in the northern high latitudes and we speculate that this instability may be linked to dual stability modes of the Arctic sea ice cover characterized by e.g. transitions between periods with and without perennial sea ice cover.]
This was a period known as the “early Holocene insolation maximum” (EHIM). Because the Earth’s axis was tilted away from the vertical more than today (known as obliquity), and because we were then closer to the Sun in July than in January (known as precession), the amount of the Sun’s energy hitting the far north in summer was much greater than today. This “great summer” effect was the chief reason the Earth had emerged from an ice age, because hot northern summers had melted the great ice caps of North America and Eurasia, exposing darker land and sea to absorb more sunlight and warm the whole planet.
The effect was huge: about an extra 50 watts per square metre at 80 degrees north in June. By contrast, the total effect of man-made global warming will reach 3.5 watts per square metre (but globally averaged) only by the end of this century.
To put it in context, the EHIM was the period during which agriculture was invented in about seven different parts of the globe at once. Copper smelting began; cattle and sheep were domesticated; wine and cheese were developed; the first towns appeared. The seas being warmer, the climate was generally wet so the Sahara had rivers and forests, hippos and people.
That the Arctic sea ice disappeared each August or September in those days does not seem to have done harm (remember that melting sea ice, as opposed to land ice, does not affect sea level), and nor did it lead to a tipping point towards ever-more rapid warming. Indeed, the reverse was the case: evidence from stalagmites in tropical caves, sea-floor sediments and ice cores on the Greenland ice cap shows that temperatures gradually but erratically cooled over the next few thousand years as the obliquity of the axis and the precession of the equinoxes changed. Sunlight is now weaker in July than January again (on global average).
Barring one especially cold snap 8,200 years ago, the coldest spell of the past ten millennia was the very recent “little ice age” of AD1300-1850, when glaciers advanced, tree lines descended and the Greenland Norse died out.
It seems that the quantity of Arctic sea ice varies more than we used to think. We don’t really know how much ice there was in the 1920s and 1930s — satellites only started measuring it in 1979, a relatively cold time in the Arctic — but there is anecdotal evidence of considerable ice retreat in those decades, when temperatures were high in the Arctic.
Today’s melting may be man-made, but the EHIM precedent is still relevant. Polar bears clearly survived the ice-free seasons of 10,000-6,000 years ago, as they cope with ice-free summers or autumns in many parts of their range today, such as Hudson Bay. They need sea ice in spring when they feed on seal pups and they sometimes suffer if it is too thick, preventing seals from breeding in an area.
Meanwhile, theory predicts, and data confirms, that today’s carbon-dioxide-induced man-made warming is happening more at night than during the day, more during winter than summer and more in the far north than near the equator. An Arctic winter night is affected much more than a tropical summer day. If it were the other way around, it would be more harmful.
Some time in the next few decades, we may well see the Arctic Ocean without ice in August or September for at least a few weeks, just as it was in the time of our ancestors. The effect on human welfare, and on animal and plant life, will be small. For all the attention it gets, the reduction in Arctic ice is the most visible, but least harmful, effect of global warming.
August 20, 2016
Whatever happened to Adam Smith?
My Times column on economic libertarianism:
Last week both Hillary Clinton and Donald Trump set out their economic policies in set-piece speeches. Mr Trump’s, delivered in Detroit, so far as one could tell from the fractured syntax and the digressions into invective, involves a trade policy designed to punish consumers and protect producers, a recipe for recession. But Mrs Clinton’s, also delivered in Michigan, was even worse. She too wants to pursue the old mercantilist fallacy of restricting imports and helping exports, but while spending more money, unleashing a blizzard of new regulations and doubling the minimum wage.
Never have the American people been faced with such a paternalist, protectionist and authoritarian pair of options. The United States, long a beacon of economic libertarianism, is now being offered a choice between two forms of growth-killing, deficit-boosting, zero-sum, big-government economic nationalism. Long gone are the days when both Republicans and Democrats subscribed to some form of free-market economic philosophy while differing mainly over how to fight the cold war and the culture wars.
It’s true that back in the days of Barry Goldwater or Ronald Reagan, the purer forms of economic libertarianism came awkwardly packaged with social and military authoritarianism. And the likes of Richard Nixon, Jimmy Carter and Bill Clinton were only lukewarm in their support of free markets, while being more socially libertarian. But at least liberty was on the menu.
This may be why Gary Johnson, the former two-term governor of New Mexico, and his running mate William Weld, former governor of Massachusetts, are riding high on the Libertarian Party ticket. They are close to the 15% threshold in the opinion polls in several states that would force the presidential television debates to include them. Johnson wants the government out of both the bedroom (he’s pro-choice, anti-war and for drug decriminalization) and the boardroom: he wants small government and low taxes. He climbed Everest, so there is no doubting his toughness, but there is no plausible scenario in which he could win the presidency.
It is the same around the world. Economic liberty is out of fashion. There is almost no country trying the sort of free-market reforms – tax cuts, deregulation, privatisation – that so many countries achieved in the 1980s and 1990s. China and Russia, liberalised briefly in the late twentieth century, seem to be heading back to Big Brother. Brazil has seen its market reforms congeal into crony-corporatism. India and Japan are hardly paragons of small-government economic liberalism. Even here in Britain, I doubt Theresa May took Hayek’s “Road to Serfdom” to Switzerland as holiday reading.
Is Adam Smith’s influence fading? This is what the sage of Kirkcaldy said: “Little else is requisite to carry a state to the highest degree of opulence from the lowest barbarism, but peace, easy taxes, and a tolerable administration of justice; all the rest being brought about by the natural course of things. All governments which thwart this natural course, which force things into another channel, or which endeavour to arrest the progress of society at a particular point, are unnatural, and to support themselves are obliged to be oppressive and tyrannical.”
“Laissez faire, laissez passer” is the most tolerant of all creeds. As Smith insisted, it’s the very opposite of “pro-business” or pro-inequality; the market loves to disrupt complacent cartels. Yet to listen to most of the intelligentsia, you would think that freedom to exchange goods and services – which they prefer to call by the Marxist word “capitalism” – has done terrible harm in the world and needs taming by virtuous government. Further, that small-government philosophy has been terminally discredited, not least by the financial crisis of 2008.
But the financial markets were heavily regulated cartels in the run-up to the crisis. The insurance giant AIG, whose credit default swaps went belly up, had been, in George Gilder’s words, “supervised and pettifogged by federal, state, local, and global beadles galore, in fifty states and more than a hundred countries”. The explosion in sub-prime lending, far from being the product of deregulation, was the direct result of mandates passed by Congress to increase mortgage lending to low-income and minority people. These mandates were imposed on government-sponsored enterprises (Fannie Mae and Freddie Mac), enforced by law and encouraged by two presidents. George W. Bush added regulations to the US economy at the rate of up to 78,000 pages a year.
Show me a country suffering from too much economic freedom. Somalia? No: it has too much government – competing forms of it called warlords. Haiti? No: its red tape is the despair of investors and aid donors. Chile? It has a socialist president. The experiment of too much economic liberty has not been tried.
There is a long list of countries that were transformed by free-market reforms: post-war Germany under Ludwig Erhard, China under Deng Xiaoping, New Zealand under Roger Douglas, America under Ronald Reagan, Britain under Margaret Thatcher, Estonia under Mart Laar, India under Manmohan Singh. South Korea, Taiwan, Vietnam, Peru... Or Peel and Gladstone’s Britain, seventeenth-century Holland, the city states of Renaissance Italy, Song dynasty China, ancient Athens, Tyre and Sidon under the Phoenicians. In every case, trade did that.
Hong Kong is probably the most successful economy of the last half century, going from abject poverty to opulence without a natural resource of any kind. It did so largely because one man, Sir John Cowperthwaite, the financial secretary of the colony in the 1960s, insisted on minimal government interference in commerce, on low taxes and little regulation, infuriating his LSE-educated superiors in London with his refusal to follow their socialist plans. Yet when I was in Hong Kong recently and met the free-market Lion Rock think-tank, I was struck by how pessimistic they felt about winning the argument for small government, even there.
By contrast, I can point you to a list as long as your arm of countries ruined by too much government. Venezuela, North Korea, Belarus and Zimbabwe are top of the list today, but Hitler, Mao, Stalin and Pol Pot (plus most empires) are egregious reminders that government is a more dangerous toy than markets ever could be.
Why is economic libertarianism out of favour? Unlike welfare-socialism and crony-capitalism, it fails to create vested interests dependent on its subsidies. The whole point of running for president is to be able to hand other people’s money to your favourite causes and generate grateful patronage. Laissez-faire robs you of that treat.
August 12, 2016
Getting the rich to pay for conservation
My Spectator article on the similarity between trophy hunting in Africa and grouse shooting in Durham. Both have huge benefits for non-target species of wildlife.
The vast Bubye Valley Conservancy in southern Zimbabwe is slightly larger than County Durham, as well as much hotter and drier. Yet both contain abundant wildlife thanks almost entirely to the hunting of game. In Bubye Valley, it’s lions and buffalo that are the targets; in the Durham dales, it’s grouse. But the effect is the same — a spectacular boost to other wildlife, privately funded.
Bubye Valley was a cattle ranch, owned by Unilever, until 1994 when it was turned over to wildlife. A double electric fence was put round the entire 850,000-acre reserve. Gradually the buffalo, giraffe, wildebeest, zebra and antelope numbers grew. Elephants and rhinos were moved there from areas more vulnerable to poaching, and the conservancy now has the third-largest black rhino population in the world. Seventeen lions were introduced and there are now more than 500 — so many that they are reducing the numbers of cheetahs and wild dogs as well as their normal prey, and may need to be culled.
Being hot, dry and featureless, the thick bush of Bubye Valley does not make good photo-tourism country, so the reserve derives income from selling licences to rich hunters to stalk and shoot buffalo, lion and other species. Shocking? No: the income from the licences — as well as the meat — is shared with local communities, and goes to build clinics and schools. The conservancy also employs hundreds of people. This is a self-funding conservation triumph in which the rich pay and the poor benefit.
It’s the same all over the world: properly controlled hunting provides an incentive and a reward for conservation. One cold, foggy morning in early May this year, at 4 a.m., I was sitting in a little tent on a Durham moor watching what I think is Britain’s most spectacular wildlife sight: the lek, or communal display, of male black grouse. As more than 25 black-and-white cocks with red eyebrows strutted their stuff just a few yards from where I sat, I was serenaded by curlews, golden plovers, lapwings, snipe, redshanks, oyster-catchers, red grouse, partridge, skylarks, cuckoos and many other birds.
The reason for this extraordinary abundance, which you would not find in the hills of Wales, Dartmoor or the Lake District, is simple — gamekeepers. Grouse moors have a zero-tolerance policy towards foxes, crows, magpies and stoats, all of which eat the eggs and chicks of ground-nesting birds. In Britain today, where the number of crows and foxes has rocketed because of road kill, landfill and a lack of natural predators, ground-nesting birds cannot thrive without human intervention.
It is thanks to gamekeepers that black grouse numbers, after declining for decades, have almost doubled in recent years in the Pennines. With the help of the Game and Wildlife Conservation Trust, black grouse are beginning to recolonise dales from which they were lost decades ago.
On another North Pennine moor, a survey of breeding birds was carried out this spring. The results have gobsmacked conservationists. On this one grouse moor, there were at least 400 pairs of curlews breeding. This is about as many as in the whole of Wales. There were 800 pairs of lapwings, 100 pairs of golden plovers, 50 pairs of oyster-catchers, 40 pairs of redshanks, 200 pairs of snipe, 50 pairs of woodcocks, 60 pairs of common sandpipers.
The report stated: "To conclude a very special place for birds in my opinion & the figures speak for themselves as does the visual spectacle of the waders in the breeding areas. A very well managed estate & home for a lot of key bird species."
and the person who did the survey commented privately to the estate owner:
"I travel widely around the globe studying breeding birds including the arctic circle which is the bread basket for all wader breeding in the world & even there I have never seen the density of wader breeding in such a small area ! Dartmoor is now down to a single pair of curlew & 3 pair of lapwing on the whole moor (365 SQ miles) which always fail due to corvids ,foxes etc ! Only on the high moor are there about 4 pair of dunlin but that is it. Golden plover extinct breeder & merlins among others. This is the stark contrast & reality of a moorland with no control & management in place compared with the jewel you have created and the sanctuary for our endangered wader species ! A superb place for birds across the board & a credit to the team involved !"
In the early 2000s, at Otterburn in Northumberland, the trust did a neat experiment in which two areas had gamekeepers and two did not, then they swapped for four years. The results were astonishing. With gamekeepers, the breeding success of golden plovers, curlews and lapwings more than doubled, and their numbers rocketed. This is from the GWCT's website:
"The breeding success of some species was significantly improved with predator removal (see Figure 1). For lapwing, golden plover, curlew, red grouse and meadow pipit, on average a three-fold improvement was seen, from 23% of pairs fledging young without predator removal to 64% of pairs fledging young with predator removal...
For all the wader species (curlew, golden plover, lapwing), we found increases in abundance with predator removal (mean annual change +37%) and decreases in abundance with no predator removal (mean annual change -28%), but the changes in abundance were statistically significantly different only for curlew (with a three-year lag) and lapwing."
All these birds thrive in the Pennines (and the Angus glens of Scotland) precisely because red grouse, the sportsman’s quarry, also thrive in the absence of their nest predators. As well as tasting delicious, red grouse are ornithologically special: arguably the British Isles’ only endemic bird, found nowhere else in the world. Officially a subspecies of the willow grouse, they look very different (refusing to go white in winter, for example) and have uniquely adapted to a diet of heather, a plant that also thrives here more than anywhere else.
To have developed techniques of moorland management, including heather-burning and the control of ticks and parasitic worms, that allow red grouse to thrive wild in numbers that rival the wildebeest of the Serengeti is an achievement in itself. To have done so while benefiting other wildlife and providing employment and welcome income to the dales and glens is magnificent.
Grouse shooting has saved this special habitat, with its rare mosses, spiders and moths, from being ruined, while making very little demand on the taxpayer — indeed while paying hefty taxes. Managing heather moorland for grouse means not planting it with subsidised Sitka spruce trees, or over-grazing it with subsidised sheep, or wrecking it with subsidised windmills.
It is a myth, by the way, that moorland is worse at retaining water and preventing flooding than forest. Spruce plantations in the peaty uplands lack absorbent mossy undergrowth and are scarred by deep ditches, which increase the rate of runoff during storms. Moorland owners have blocked the ditches they were bribed to dig in the 1970s. They now look with bemusement across the fence at the Forestry Commission digging new ditches in deep peat — for which they would be prosecuted if they did it.
Other rare, red-listed birds are also thriving on this one moor, which is typical of all the North Pennine grouse moors in its wildlife: cuckoos, ring ouzels, tree pipits, whinchats, pied flycatchers, marsh tits. And merlins — Britain’s smallest and rarest falcon is doing well here, watched over by gamekeepers in case egg collectors come calling. Merlins rarely succeed in breeding if foxes or crows are on the prowl.
The same is true of hen harriers, the bird that the RSPB makes a huge fuss about because it alleges that gamekeepers persecute them — as some may occasionally do, because hen harriers like to congregate in small breeding colonies and eat grouse. Hen harriers, which are ground-nesting birds vulnerable to foxes, are thriving in Scotland, especially on fox-free islands such as the Orkneys or in places where gamekeepers control fox numbers. But they have struggled to recolonise England. An experiment in the 1990s at Langholm in southern Scotland proved the point: when hen harriers increased, grouse numbers crashed, so gamekeepers lost their jobs, at which point foxes returned and hen harrier numbers also crashed.
Last year 12 hen harriers nested in England. The seven nests under the control of the Royal Society for the Protection of Birds reared just one chick. The other five nests, not controlled by the RSPB, reared 17 chicks. Yet the RSPB has the cheek to lecture private landowners on how to protect hen harriers.
You will hear little of this from the BBC. An independent review of the BBC’s impartiality by Heather Hancock, now chairman of the Food Standards Agency, in 2014 found that BBC newsrooms had an unhealthy dependence on the RSPB’s vast press operation for all of their countryside coverage. Chris Packham, a BBC presenter and vice-president of the RSPB, goes around demanding that the ‘evil’ pastime of grouse shooting be banned, apparently indifferent to the collapse in numbers of curlews and black grouse that would undoubtedly follow. (The BBC has now announced that it has launched an investigation into Packham’s outbursts, following a complaint from the Countryside Alliance.)
Throughout the world, hunters benefit conservation. In America, this is explicitly recognised, with hunting licence fees used to fund conservation. Even the International Union for the Conservation of Nature recognises that ‘recreational hunting can contribute to biodiversity conservation’. In Africa, the contrast is stark between Kenya, which banned hunting in 1977 and has seen its wildlife populations plunge by 70 per cent, and Namibia, which encourages trophy hunting and has seen wildlife populations increase steadily.
Here is an extract from a 2014 article about Kenya's wildlife:
"Kenya’s much-praised ban on hunting, in fact, has had an impact opposite to its intent: wild animals are disappearing at an accelerating rate. “Charismatic megafauna” — elephants, lions, rhinos, the larger antelopes — are in a true death spiral. When Kenya’s hunting ban was passed in 1977 in response to the “Ivory Wars” that were ravaging the nation’s elephants, it was hailed as a new and progressive paradigm for wildlife management. With the hunting pressure off, animal lovers opined, the game would bounce back. And it’s true that elephants did recover modestly over the ensuing two decades. But now the slaughter has begun anew, driven by an unrelenting demand from a prosperous Asia for ivory objets d’art. Meanwhile, everything else is going down the tubes, including carnivores and antelopes. By best estimates, Kenya’s wildlife has declined by more than 70 percent over the past 20 years.
What happened? While the ban played well in the developed world, it was catastrophic for the people who lived in the rural hinterlands of Kenya – the places where wildlife actually exists. Basically, folks out in the bush had the responsibility for maintaining wildlife on their lands, but they were deprived of any benefit from the animals. Such a situation is intolerable for subsistence pastoralists and farmers.
Subsequent to the ban, they could not respond – legally – when an elephant raided their maize and stomped their goats, or when a lion killed a cow. But laws made in Nairobi are seldom if ever applied with rigor in the Kenyan bush. Even as animal rights groups lionized Kenya’s no-kill policy and urged its adoption across Africa, the killing has continued unabated. Carnivores are poisoned, antelope snared, elephants speared and shot: Crops can thus be raised and the livestock grazed in peace.
Michael Norton-Griffiths, who has served as the senior ecologist for Tanzania’s Serengeti National Park and the manager of the Eastern Sahel Program for the International Union for the Conservation of Nature, likened the situation to owning a goat. Assume, says Norton-Griffiths that you’re a poor pastoralist in rural Kenya, and your assets consist of a goat. You can eat this goat, or milk it. You can sell it, gaining hard currency that you can use to buy necessities. Or you can breed it, increasing your asset base in the form of another goat. But now imagine that a law is passed that forbids you to eat, sell, or breed that goat. In fact, the only thing you can do with it is allow tourists to take pictures of it. Even then, you obtain no benefit; the money derived from the tourists photographing the goat goes to the owner of the “eco-lodge” they are patronizing. By substituting wildlife for the goat, says Norton-Griffiths, you have the situation that exists in Kenya today."
Yet far from being thanked for their efforts, the conservationists of Bubye Valley and the Durham dales are under almost constant attack from animal-welfare groups and their allies in government and the media. The critics hate the idea that rich people are pouring millions into saving the best habitats and species in exchange for harvesting some free-range meat. They seem to prefer conservation fully nationalised and bloodless, even if unsuccessful.
Neither lions nor grouse can control their populations when habitat is limited, other than by starvation and disease. Human hunting, especially if the rich pay and the income is captured by locals and invested in wildlife, is morally, economically and environmentally a better solution. Many of those who shoot and eat grouse this month will do so precisely because of, not despite, the fact that they are keen conservationists.