Matt Ridley's Blog, page 40

May 11, 2014

E-cigarettes are making tobacco obsolete, so why ban them?

My article in the Spectator (3 May) on vaping
versus smoking:



If somebody invented a pill that could cure a disease that kills
five million people a year worldwide, 100,000 of them in this
country, the medical powers that be would surely encourage it, pay
for it, perhaps even make it compulsory. They certainly would not
stand in its way.



A relentless stream of data from around the world is showing
that e-cigarettes are robbing tobacco companies of today’s
customers — and cancer wards of their future patients. In
Britain alone two million now use these devices regularly. In study
after study, scientists are finding e-cigarettes to be effective at
helping people quit, to show no signs of luring non-smokers into
tobacco use and to be much safer than their noxious
competitors.



So what in heaven’s name explains the fact that Dame Sally
Davies, the government’s chief medical officer, when asked by
the New Scientist in March what was the biggest
health challenge we face in Britain, named three things, one of
which was the electronic cigarette? That’s like criticising
contraception because you prefer abstinence.



The NHS is confident that these devices are about 1,000 times
less harmful than cigarettes. The government confirmed this figure
in a parliamentary answer to me. It’s the tar in smoke that kills,
not the nicotine — a substance that is about as harmful as
caffeine.



We know vaping (as it’s known) works better than any other
method of giving up smoking. A forthcoming study by Professor
Robert West of University College London finds that e-cigarettes
proved 60 per cent more successful as a method of quitting than
nicotine patches, gums or going cold turkey. By a country mile,
free enterprise devices are outstripping the health results of
medicinally regulated devices. And for many vested interests that
is the problem.



We know that most people use e-cigarettes to cut down or give up
smoking. This has been confirmed by three big surveys, the latest
of which, conducted by Ash, the anti-smoking group, was published
this week: two thirds of users in the survey were smokers and one
third were ex-smokers. That means in the few years since the
products first appeared, hundreds of thousands of people have used
them to give up or cut down.








We know that e-cigarettes are not proving to be a gateway into
tobacco. In the biggest global survey, 0.4 per cent of vapers were
non-smokers and not one of them went on to smoke. In the UK, 20 per
cent of 15-year-olds are regular smokers: they are mostly the ones
who try vaping, so even in the young the technology is a gateway
out of smoking, not into it. (And it makes snogging taste better.)
Yet what is the UK government’s main legislative response to
e-cigarettes so far? To ban sales of e-cigarettes to
children.



Do the maths. If e-cigarettes are 1,000 times less harmful than
cigarettes, then for every youngster who goes from smoking to
vaping, there would have to be a thousand going the other way before
there is net harm. If anything, the ratio is the other way round:
in one American study, nine out of ten school-age vapers had
started as smokers.
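That back-of-envelope arithmetic can be sketched in a few lines of Python. This is a hypothetical illustration using only the article's 1,000-fold figure, and it reads "going the other way" as non-smokers taking up vaping:

```python
RELATIVE_HARM = 1 / 1000  # vaping's harm relative to smoking (the article's figure)

def net_harm(smokers_switching_to_vaping, nonsmokers_taking_up_vaping):
    """Net change in total harm, measured in smoker-equivalents.

    A smoker who switches drops from 1 unit of harm to 1/1000;
    a non-smoker who starts vaping goes from 0 to 1/1000.
    """
    saved = smokers_switching_to_vaping * (1 - RELATIVE_HARM)
    added = nonsmokers_taking_up_vaping * RELATIVE_HARM
    return added - saved

# One smoker switching roughly offsets a thousand non-smokers taking it up:
assert net_harm(1, 1000) < 0.01
```

On these assumptions the harm saved by a single switcher is almost exactly the harm added by a thousand new vapers, which is the column's point.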



The firms that make e-cigarettes — which are mostly small
start-up companies, the technology having come from China
— are not allowed to claim they can save your life. Imagine
what they could sell if they could. Instead their adverts try to
hint that vaping is cool, which feeds the puritan suspicion that
somebody somewhere might be enjoying themselves.



This argument that vaping is going to ‘renormalise’ smoking is
the one the British Medical Association has been pushing, and, as
Ash is now saying, it is clearly nonsense. With that gone, what
arguments are left to justify regulating the advertising, public
use and product strength of this life-saving technology to the
point of discouraging it?



Some medics probably just hate the thought that a
near-miraculous cure for a big cause of death came from the private
sector and not from the nanny state. The people selling these
things are doing so for — gasp! — profit, not because
they want to save lives.



In several conversations I have had with senior medics, they
immediately raised the horrifying fact that the tobacco industry
has recently started producing e-cigarettes. For them this was a
clinching argument against the technology.



No, I replied, that is the best news of all. The fact that even
the tobacco industry is going to be competing against tobacco is
great news. It shows that big tobacco can read the writing on the
wall and is trying to get out of selling smoke before it goes the
way of Kodak film. The number of people smoking is falling fast.
Imperial Tobacco recorded a 16 per cent decline in UK sales last
year. One US investment broker reckons vaping will be bigger than
smoking by 2023. The tobacco industry is panicking.



It means you have won, I tell medics. Forget your bans on
smoking in cars with children in them, or on brand names on
packets. These were never going to make more than a marginal
difference anyway. The cigarette is going the way of the top hat
and the crinoline, if we encourage the safer, cleaner alternative.
Here’s a life–saving technology on a massive scale that needs no
funding. Are you sure that you — swearers of the Hippocratic
oath — want to be the last people standing in its way, when
everybody else can see the benefits?



The opposition to vaping has had an unfortunate result already.
By insisting on including e-cigarettes in the EU’s tobacco products
directive, the opponents have left them unregulated until the
directive comes into force in 2017, and over-regulated after that,
pushing up prices and reducing choice. So unless
the UK government makes its own helpful intervention, for the
next two-and-a-half years there is little to stop rogue operators
importing fake or adulterated vaping fluids from some crook. Plus
the battle over regulation, as so often, helps the big guys and
hurts the little guys.



By the way, where’s the left in all this? Smoking is
increasingly concentrated in lower socioeconomic groups. How can we
get e-cigarettes into the hands of the poor quickly? The high
up-front costs of e-cigarettes (followed by lower ‘running’ costs)
mean that their take-up by poorer people has been slower. Why are
libertarians doing all the hard work?



Next time you hear somebody say that they worry about the
potential risks of e-cigarettes, remind them of Voltaire’s dictum
— don’t let the best be the enemy of the good.

Published on May 11, 2014 03:53

May 4, 2014

Technology is often the mother of science, not vice versa

My Times column is on the relationship between
science and technology, especially in the UK:



The chancellor, George Osborne, made a speech on science in Cambridge last
week
in which he contrasted Britain’s “extraordinary”
scientific achievements with “our historic weakness when it comes
to translating those scientific achievements into commercial gain”.
It’s a recurring complaint in British science policy that we
discover things, then others make money out of them.



Britain’s astonishing ability to gather scientific firsts — we
are second only to the US in Nobel prizes — shows no sign of
abating. We have won 88 scientific Nobel prizes, 115 if you add
economics, literature and peace. This includes 12 in the past ten
years and at least one in each of the past five years. But we filed
fewer patents last year than the US, Japan, Germany, France, China
or South Korea, and we have seen many British discoveries
commercialised by others: graphene, DNA sequencing, the worldwide
web, to name a few. So yes, we are good at science but bad at
founding new industries.



The government’s response to this is to encourage clusters of
high-tech companies, such as the one around Cambridge, with its
1,500 technology-based firms, and to support technology “catapults”
to help infant industries to get going in eight technologies in
which Britain has a potential lead. An example is the manufacturing
centre in Coventry with its 3D printing expertise.



Most of the firms in the Cambridge cluster moved there to get
close to the university; they did not spin out of the university.
Technology tends to come from technology, and to use science to
help the process along, rather than to be born of science. The idea
that innovation happens because you put science in one end of the
pipe and technology comes out the other end goes back to Francis
Bacon, and it is largely wrong. History shows that (public) science
is the daughter of (private) technology at least as much as it is the
mother. Good universities recognise this and adjust their research
programmes to what interests industry.



The steam engine led to the insights of thermodynamics, not vice
versa. The dye industry drove chemistry. The centrifuge and X-ray
crystallography, developed for the textile industry, led to the
structure of DNA. DNA sequencing (a British technology) led to
genomics (an international science). The development of mobile
telephones, horizontal drilling for oil, and search engines owed
almost nothing to university research. Sure, the firms that made
these breakthroughs later went to universities in search of
educated staff, and to help to solve problems through contracted
research. But the breakthroughs owed less to the philosophical
ruminations of scientists than to the tinkering of engineers.



Twenty years ago the economist Partha Dasgupta pointed out that the “republic of science”,
with its insistence that results must be shared and that rewards
come in the form of prizes and prestige, was very different from
the privatised world of technology, where patents and profits were
what mattered. In a paper with Paul David, he said: “Modern
societies need to have both communities firmly in place and attend
to maintaining a synergistic equilibrium between them.”



So perhaps the British problem is that we are good at the public
bit, but not the private bit. Being good at science, we share our
results with the world, rather than benefiting from them ourselves.
It is not quite that simple. Terence Kealey, the vice-chancellor of
Buckingham University, one of the foremost authorities on the
economics of science, has made a strong case — recently buttressed
with that badge of economic respectability, a mathematical model — that science is not a
pure “public good”, like light from a lighthouse.



Although knowledge is shared among scientists, it is still not
automatically accessible to “any passer-by or person of average
curiosity”, say Professor Kealey and his co-author, Martin
Ricketts. To join the conversation you need the tacit knowledge
that comes from training in the particular field of science itself.
And that’s why private firms are keen to cluster round Cambridge,
to get the expertise and contacts, and to eavesdrop on the
scientific chat in the university as well as in each other’s coffee
rooms.



The mathematical model shows that there is a “pinch point” in
research, where researchers need encouragement to start sharing
knowledge with each other. There are lots of examples of government
giving that encouragement and doing it well. So Professor Kealey
says that inasmuch as Mr Osborne is creating new institutions of
knowledge sharing and trust between previously separate entities
(particular universities and industry), he is doing something that
both theory and history show can be of benefit: it is one area
where government action can be shown to have been a good thing.



Incidentally, history provides little support for the commonly
held view that munificent funding of science by government results
in faster economic growth. In the late 19th and early 20th century,
France and Germany provided much public funding to science while
Britain and America did not. The anglophone countries grew faster.
And for every example of an unexpected spin-off from public funding
of science — the worldwide web was invented by Sir Tim Berners-Lee
so that physicists could share their results — there are plenty of
cases of private funding having public effects. Sputnik, the
pioneering Russian satellite, relied extensively on prewar
research privately funded by Robert Goddard, supported by the
Guggenheims.



In 2003 the OECD published a paper on “sources of growth in OECD
countries” between 1971 and 1998, finding to its explicit surprise
that, whereas privately funded research and development stimulated
economic growth, publicly funded research had no economic impact
whatsoever. It is possible that government spending on the wrong
kind of science stops people working on the right kind of science
as far as economic growth is concerned — too many esoteric projects
of no interest to nearby industries.



Not that this means the public funding of science should cease.
Given that the government takes close to half of GDP and spends it
on many things, it would be a shame if none of that money found its
way back to science, one of the great triumphs of our culture. The
American physicist Robert Wilson, when asked in a congressional
hearing during the cold war how a particle accelerator would
contribute to national defence, replied that it had “nothing to do
directly with defending our country except to help make it worth
defending”.

Published on May 04, 2014 11:56

April 30, 2014

Why most resources don't run out

My Saturday essay in the Wall Street Journal on
resources and why they get more abundant, not less:



How many times have you heard that we humans are "using up" the
world's resources, "running out" of oil, "reaching the limits" of
the atmosphere's capacity to cope with pollution or "approaching
the carrying capacity" of the land's ability to support a greater
population? The assumption behind all such statements is that there
is a fixed amount of stuff—metals, oil, clean air, land—and that we
risk exhausting it through our consumption.



"We are using 50% more resources than the Earth can sustainably
produce, and unless we change course, that number will grow fast—by
2030, even two planets will not be enough," says Jim Leape, director general of the World
Wide Fund for Nature International (formerly the World Wildlife
Fund).



But here's a peculiar feature of human history: We burst through
such limits again and again. After all, as a Saudi oil minister
once said, the Stone Age didn't end for lack of stone. Ecologists
call this "niche construction"—that people (and indeed some other
animals) can create new opportunities for themselves by making
their habitats more productive in some way. Agriculture is the
classic example of niche construction: We stopped relying on
nature's bounty and substituted an artificial and much larger
bounty.



Economists call the same phenomenon innovation. What frustrates
them about ecologists is the latter's tendency to think in terms of
static limits. Ecologists can't seem to see that when whale oil
starts to run out, petroleum is discovered, or that when farm
yields flatten, fertilizer comes along, or that when glass fiber is
invented, demand for copper falls.



That frustration is heartily reciprocated. Ecologists think that
economists espouse a sort of superstitious magic called "markets"
or "prices" to avoid confronting the reality of limits to growth.
The easiest way to raise a cheer in a conference of ecologists is
to make a rude joke about economists.



I have lived among both tribes. I studied various forms of
ecology in an academic setting for seven years and then worked at
the Economist magazine for eight years. When I was an ecologist (in
the academic sense of the word, not the political one, though I
also had antinuclear stickers on my car), I very much espoused the
carrying-capacity viewpoint—that there were limits to growth. I
nowadays lean to the view that there are no limits because we can
invent new ways of doing more with less.



This disagreement goes to the heart of many current political
issues and explains much about why people disagree about
environmental policy. In the climate debate, for example,
pessimists see a limit to the atmosphere's capacity to cope with
extra carbon dioxide without rapid warming. So, if economic growth
continues, a continuing rise in emissions will eventually
accelerate warming to dangerous rates. But optimists see economic
growth leading to technological change that would result in the use
of lower-carbon energy. That would allow warming to level off long
before it does much harm.



It is striking, for example, that the Intergovernmental Panel on
Climate Change's recent forecast that temperatures would rise by
3.7 to 4.8 degrees Celsius compared with preindustrial levels by
2100 was based on several assumptions: little technological change,
an end to the 50-year fall in population growth rates, a tripling
(only) of per capita income and not much improvement in the energy
efficiency of the economy. Basically, that would mean a world much
like today's but with lots more people burning lots more coal and
oil, leading to an increase in emissions. Most economists expect a
five- or tenfold increase in income, huge changes in technology and
an end to population growth by 2100: not so many more people
needing much less carbon.



In 1679, Antonie van Leeuwenhoek, the great Dutch microscopist,
estimated that the planet could hold 13.4 billion people, a number
that most demographers think we may never reach. Since then, estimates have bounced around
between 1 billion and 100 billion, with no sign of converging on an
agreed figure.



Economists point out that we keep improving the productivity of
each acre of land by applying fertilizer, mechanization, pesticides
and irrigation. Further innovation is bound to shift the ceiling
upward. Jesse Ausubel at Rockefeller University calculates that the amount of land required to
grow a given quantity of food has fallen by 65% over the past 50
years, world-wide.



Ecologists object that these innovations rely on nonrenewable
resources, such as oil and gas, or renewable ones that are being
used up faster than they are replenished, such as aquifers. So
current yields cannot be maintained, let alone improved.



In his recent book "The View from Lazy Point," the ecologist
Carl Safina estimates that if everybody had the living standards of
Americans, we would need 2.5 Earths because the world's
agricultural land just couldn't grow enough food for more than 2.5
billion people at that level of consumption. Harvard emeritus
professor E.O. Wilson, one of ecology's patriarchs, reckoned that
only if we all turned vegetarian could the world's farms grow
enough food to support 10 billion people.



Economists respond by saying that since large parts of the
world, especially in Africa, have yet to gain access to fertilizer
and modern farming techniques, there is no reason to think that the
global land requirements for a given amount of food will cease
shrinking any time soon. Indeed, Mr. Ausubel, together with his
colleagues Iddo Wernick and Paul Waggoner, came to the startling conclusion that, even
with generous assumptions about population growth and growing
affluence leading to greater demand for meat and other luxuries,
and with ungenerous assumptions about future global yield
improvements, we will need less farmland in 2050 than we needed in
2000. (So long, that is, as we don't grow more biofuels on land
that could be growing food.)



But surely intensification of yields depends on inputs that may
run out? Take water, a commodity that limits the production of food
in many places. Estimates made in the 1960s and 1970s of water
demand by the year 2000 proved far too high: The world
used half as much water as experts had projected 30 years
before.



The reason was greater economy in the use of water by new
irrigation techniques. Some countries, such as Israel and Cyprus,
have cut water use for irrigation through the use of drip
irrigation. Combine these improvements with solar-driven
desalination of seawater world-wide, and it is highly unlikely that
fresh water will limit human population.



The best-selling book "Limits to Growth," published in 1972 by
the Club of Rome (an influential global think tank), argued that we
would have bumped our heads against all sorts of ceilings by now,
running short of various metals, fuels, minerals and space. Why did
it not happen? In a word, technology: better mining techniques,
more frugal use of materials, and, where scarcity causes price
increases, substitution by cheaper materials. The gold plating on
computer connectors is 100 times thinner than it was 40 years
ago. The steel content of cars and buildings keeps on falling.



Until about 10 years ago, it was reasonable to expect that
natural gas might run out in a few short decades and oil soon
thereafter. If that were to happen, agricultural yields would
plummet, and the world would be faced with a stark dilemma: Plow up
all the remaining rain forest to grow food, or starve.



But thanks to fracking and the shale revolution, peak oil and
gas have been postponed. They will run out one day, but only in the
sense that you will run out of Atlantic Ocean one day if you take a
rowboat west out of a harbor in Ireland. Just as you are likely to
stop rowing long before you bump into Newfoundland, so we may well
find cheap substitutes for fossil fuels long before they run
out.



The economist and metals dealer Tim Worstall gives the example of tellurium, a key
ingredient of some kinds of solar panels. Tellurium is one of the
rarest elements in the Earth's crust—one atom per billion. Will it
soon run out? Mr. Worstall estimates that there are 120 million
tons of it, or a million years' supply altogether. It is
sufficiently concentrated in the residues from refining copper
ores, called copper slimes, to be worth extracting for a very long
time to come. One day, it will also be recycled as old solar panels
get cannibalized to make new ones.



Or take phosphorus, an element vital to agricultural fertility.
The richest phosphate mines, such as on the island of Nauru in the
South Pacific, are all but exhausted. Does that mean the world is
running out? No: There are extensive lower grade deposits, and if
we get desperate, all the phosphorus atoms put into the ground over
past centuries still exist, especially in the mud of estuaries.
It's just a matter of concentrating them again.



In 1972, the ecologist Paul Ehrlich of Stanford University came
up with a simple formula called IPAT, which stated that the impact of
humankind was equal to population multiplied by affluence
multiplied again by technology. In other words, the damage done to
Earth increases the more people there are, the richer they get and
the more technology they have.
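The formula is simple enough to write down directly. Here is a minimal sketch, with purely illustrative units:

```python
def ipat(population, affluence, technology):
    """Ehrlich's I = P x A x T: environmental impact as the product
    of population, affluence and a technology factor."""
    return population * affluence * technology

# In the formula every factor multiplies impact upward, so
# doubling affluence doubles the notional damage:
assert ipat(7e9, 2.0, 1.0) == 2 * ipat(7e9, 1.0, 1.0)
```

The paragraphs that follow dispute exactly this structure: in practice the technology term can fall below one, shrinking impact as countries grow richer.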



Many ecologists still subscribe to this doctrine, which has
attained the status of holy writ in ecology. But the past 40 years
haven't been kind to it. In many respects, greater affluence and
new technology have led to less human impact on the planet, not
more. Richer people with new technologies tend not to collect
firewood and bushmeat from natural forests; instead, they use
electricity and farmed chicken—both of which need much less land.
In 2006, Mr. Ausubel calculated that no country with a GDP per head
greater than $4,600 has a falling stock of forest (in density as
well as in acreage).



Haiti is 98% deforested and literally brown on satellite images,
compared with its green, well-forested neighbor, the Dominican
Republic. The difference stems from Haiti's poverty,
which causes it to rely on charcoal for domestic and industrial
energy, whereas the Dominican Republic is wealthy enough to use
fossil fuels, subsidizing propane gas for cooking fuel specifically
so that people won't cut down forests.



Part of the problem is that the word "consumption" means
different things to the two tribes. Ecologists use it to mean "the
act of using up a resource"; economists mean "the purchase of goods
and services by the public" (both definitions taken from the Oxford
dictionary).



But in what sense is water, tellurium or phosphorus "used up"
when products made with them are bought by the public? They still
exist in the objects themselves or in the environment. Water
returns to the environment through sewage and can be reused.
Phosphorus gets recycled through compost. Tellurium is in solar
panels, which can be recycled. As the economist Thomas Sowell wrote
in his 1980 book "Knowledge and Decisions," "Although we speak
loosely of 'production,' man neither creates nor destroys matter,
but only transforms it."



Given that innovation—or "niche construction"—causes ever more
productivity, how do ecologists justify the claim that we are
already overdrawn at the planetary bank and would need at least
another planet to sustain the lifestyles of 10 billion people at
U.S. standards of living?



Examine the calculations done by a group called the Global Footprint Network—a think tank founded
by Mathis Wackernagel in Oakland, Calif., and supported by more
than 70 international environmental organizations—and it becomes
clear. The group assumes that the fossil fuels burned in the
pursuit of higher yields must be offset in the future by tree
planting on a scale that could soak up the emitted carbon dioxide.
A widely used measure of "ecological footprint" simply assumes that
54% of the acreage we need should be devoted to "carbon
uptake."



But what if tree planting wasn't the only way to soak up carbon
dioxide? Or if trees grew faster when irrigated and fertilized so
you needed fewer of them? Or if we cut emissions, as the U.S. has
recently done by substituting gas for coal in electricity
generation? Or if we tolerated some increase in emissions (which
are measurably increasing crop yields, by the way)? Any of these
factors could wipe out a huge chunk of the deemed
ecological overdraft and put us back in planetary credit.



Helmut Haberl of Klagenfurt University in Austria is a rare
example of an ecologist who takes economics seriously. He points out that his fellow ecologists have been
using "human appropriation of net primary production"—that is, the
percentage of the world's green vegetation eaten or prevented from
growing by us and our domestic animals—as an indicator of
ecological limits to growth. Some ecologists had begun to argue
that we were using half or more of all the greenery on the
planet.



This is wrong, says Dr. Haberl, for several reasons. First, the
amount appropriated is still fairly low: About 14.2% is eaten by us
and our animals, and an additional 9.6% is prevented from growing
by goats and buildings, according to his estimates. Second, most
economic growth happens without any greater use of biomass. Indeed,
human appropriation usually declines as a country industrializes
and the harvest grows—as a result of agricultural intensification
rather than through plowing more land.



Finally, human activities actually increase the production of
green vegetation in natural ecosystems. Fertilizer taken up by
crops is carried into forests and rivers by wild birds and animals,
where it boosts yields of wild vegetation too (sometimes too much,
causing algal blooms in water). In places like the Nile delta, wild
ecosystems are more productive than they would be without human
intervention, despite the fact that much of the land is used for
growing human food.



If I could have one wish for the Earth's environment, it would
be to bring together the two tribes—to convene a grand powwow of
ecologists and economists. I would pose them this simple question
and not let them leave the room until they had answered it: How can
innovation improve the environment?

Published on April 30, 2014 00:49

April 26, 2014

We can't wreck the climate unless we get rich, but if we get rich, we won't wreck the climate

My Times column is on economic projections for the
year 2100.



In the past 50 years, world per capita income
roughly trebled in real terms, corrected for inflation. If it
continues at this rate (and globally the great recession of recent
years was a mere blip) then it will be nine times as high in 2100
as it was in 2000, at which point the average person in the world
will be earning three times as much as the average Briton earns
today.
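The extrapolation is ordinary compounding; a quick sketch, assuming (as the column does) a clean trebling every 50 years:

```python
def project_income(income_today, factor_per_50yr=3, fifty_year_periods=2):
    # Growth compounds multiplicatively: two 50-year periods
    # span the century from 2000 to 2100.
    return income_today * factor_per_50yr ** fifty_year_periods

# Trebling twice gives a nine-fold rise by 2100 relative to 2000:
assert project_income(1.0) == 9.0
```

The same one-liner gives a 27-fold rise by 2150 if the trend were to run a third period, which shows how quickly such extrapolations escalate.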



I make this point partly to cheer you up on Easter Monday about
the prospects for your great-grandchildren, partly to start
thinking about what that world would be like if it were to happen,
and partly to challenge those who say with confidence that the
future will be calamitous because of climate change or
environmental degradation. The curious thing is that they only
predict disaster by assuming great enrichment. But perversely, the
more enrichment they predict, the greater the chance (they also
predict) that we will solve our environmental problems.



Past performance is no guide to future performance, of course,
and a well aimed asteroid could derail any projection. But I am not
the one doing the extrapolating. In 2012, the Intergovernmental
Panel on Climate Change (IPCC) asked the Organisation for Economic
Cooperation and Development (OECD) to generate five projections for the economy of
the world, and of individual countries, in 2050 and 2100.



They make fascinating reading. The average per capita income of
the world in 2100 is projected to be between three and 20 times
what it is today in real terms. The OECD’s “medium” scenario, known
as SSP2, also known as “middle of the road” or “muddling through”,
sounds pretty dull. It is a world in which, in the OECD’s words,
“trends typical of recent decades continue” with “slowly decreasing
fossil fuel dependency”, uneven development of poor countries,
delayed achievement of Millennium Development Goals, disappointing
investment in education and “only intermediate success in
addressing air pollution or improving energy access for the
poor”.



And yet this is a world in which by 2100 the global average
income per head has increased eight-fold [corrected from 13-fold]
to $60,000 [corrected from $100,000 in the original article] (in
2005 dollars) compared with $7,800 today. Britain will have trebled
its income per head. According to this middling scenario, the
average citizen of the Democratic Republic of Congo, who today
earns $300 a year, will then earn $42,000, or roughly what an
American earns today. The average Indonesian, Brazilian or Chinese
will be at least twice as rich as today’s American.



Remember this is in today’s money, corrected for inflation, but
people will be spending it on tomorrow’s technologies, most of
which will be cleverer, cleaner and kinder to the environment than
today’s — and all for the same price. Despite its very modest
assumptions, it is an almost unimaginable world: picture Beverly
Hills suburbs in Kinshasa where pilotless planes taxi to a halt by
gravel drives (or something equally futuristic). Moreover, the OECD
reckons that inequality will have declined, because people in poor
countries will have been getting rich faster than people in rich
countries, as is happening now. All five storylines produce a
convergence, though at different rates, between the incomes of poor
and rich countries.



Can the planet survive this sort of utopian plutocracy?
Actually, here it gets still more interesting. The IPCC has done
its own projections to see what sort of greenhouse gas emissions
these sorts of world would produce, and vice versa. The one that
produces the lowest emissions is the one with the highest income
per head in 2100 — a 16-fold increase in income but lower emissions
than today: climate change averted. The one that produces the
highest emissions is the one with the lowest GDP — a mere trebling
of income per head. Economic growth and ecological improvement go
together. And it is not mainly because environmental protection
produces higher growth, but vice versa. More trade, more innovation
and more wealth make possible greater investment in low-carbon
energy and smarter adaptation to climate change. Next time you hear
some green, doom-mongering Jeremiah insisting that the only way to
avoid Armageddon is to go back to eating home-grown organic lentils
cooked over wood fires, ask him why it is that the IPCC assumes the
very opposite.



In the IPCC’s nightmare high-emissions scenario, with almost no
cuts to emissions by 2100, they reckon there might be north of 4
degrees of warming. However, even this depends on models that
assume much higher “climate sensitivity” to carbon dioxide than the
consensus of science now thinks is reasonable, or indeed than their
own expert assessment assumes for the period to 2035.



And in this storyline, by 2100 the world
population has reached 12 billion, almost double what it was in
2000. This is unlikely, according to the United Nations: 10.9
billion is reckoned more probable. With sluggish economic growth,
the average income per head has (only) trebled. The world economy
is using a lot of energy, improvements in energy efficiency having
stalled, and about half of it is supplied by coal, whose use
has increased tenfold, because progress in other technologies such
as shale gas, solar and nuclear has been disappointing.



I think we can all agree that this is a pretty unlikely future.
It’s roughly like projecting forward from 1914 to a wealthy 2000
but with more people, lots more horse-drawn carriages and
coal-fuelled steamships, and no clean-air acts. But the point is
that making these sorts of assumption is the only way you can get
to really high levels of carbon dioxide in 2100. And even so,
remember, the average person is three times as rich. If the food
supply had collapsed and fossil fuels had run out, then there would
hardly be 12 billion people burning ten times as much coal and
living like kings, would there? You cannot have it both ways.



These IPCC and OECD reports are telling us clear as a bell that
we cannot ruin the climate with carbon dioxide unless we get a lot
more numerous and richer. And they are also telling us that if we
get an awful lot richer, we are likely to have invented the
technologies to adapt, and to reduce our emissions, so we are then
less likely to ruin the planet. Go figure.



[Post-script: Bjorn Lomborg arrives at similar conclusions - that the
IPCC's own figures show clearly that the cure is worse than the
disease.]

Published on April 26, 2014 04:46

April 13, 2014

Britain's employment and productivity puzzle

My column in last week's Times was on the rise in
employment, reforms to welfare and the productivity puzzle in
Britain:



 



Successful innovations are sometimes low-tech:
corrugated iron, for example, or the word “OK”. In this vein, as
Iain Duncan Smith will say in a speech today in South London, a
single piece of paper seems to be making quite a difference to
Britain’s unemployment problem. It’s called the “claimant
commitment” and it has been rolling out to job centres since
October last year; by the end of this month it will be
universal.



The claimant commitment is really a contract between an
unemployed person and his or her “work coach” at Jobcentre Plus,
though government lawyers refused to let it be called a “contract”.
It turns out that requiring people to sign a paper that commits
them to certain actions — such as producing a CV, or checking
regularly for jobs — has had an instant effect.



Some people walk out, not wanting to perjure themselves about
cash-in-hand jobs they already have. Others get serious about
looking for work: evidence suggests that the introduction of the
claimant commitment has doubled the number of job searches people
do.



Work coaches also take their responsibilities more seriously
after signing the paper. And when they “sanction” a delinquent
client by withholding benefits, they can point to the commitment:
“You agreed to do these things and you haven’t.” Usually such
sanctions have to happen only once and the lesson is learnt. Mr
Duncan Smith’s hope is that unemployed people should think they
have a job like anybody else — the job being to find work — and an
employer like anybody else: the boss being the work coach.



It’s too soon to credit the claimant commitment for much impact,
but Britain’s employment statistics grow ever more startling.
Considering the length and depth of the recession, the rise in
unemployment was smaller and the subsequent fall faster than almost
every expert expected. After years of economic pain, economic
inactivity and the number of workless households ought to be
stubbornly high. In fact, they are at record low levels.
Total employment is at record high levels.



This is all the more surprising when seen in an international
context. Not only are most eurozone countries, outside Germany,
stuck with high unemployment, the United States, usually such a
fast generator of jobs, seems to be experiencing a disappointingly
jobless recovery, with labour-force participation at a level not
seen since Jimmy Carter was president. In the past year Britain’s
employment rate has grown faster than those of Germany, France,
Canada, Italy, the United States and the G7 as a whole. Some of the
American pioneers of “workfare” reforms that began in Wisconsin in
the 1990s are beginning to say that Britain is ahead of them.



Seen from a local angle, too, Britain’s employment picture is
surprising. The North East of England, where I live, depends
heavily on public sector employment, a shrinking category. Yet
unemployment in the North East, while still much higher than in the
South East, is falling. The region is a long way from booming, but
it has surprised everybody by not busting. The country as a whole
has added 1.7 million private sector jobs and lost 382,000 public
sector jobs since the election.



However, if employment has been more buoyant than economic
growth suggests then productivity must be stagnant or falling,
which in the long run is bad news for prosperity. There are plenty
of explanations floating around for this “productivity puzzle”,
among them that in the uncertain years of 2011-2013, the private
sector was reluctant to invest its growing cash pile in new
machinery. Another possibility, popular in the Treasury, is that
output is being systematically underestimated because so much of it
is now online.



My preferred explanation is that 2 per cent growth while the
public sector is shrinking is a heck of a lot faster than 2 per
cent growth while the public sector was growing, as it was in the
mid-2000s: some of that growth was actually increased public
spending.



I remember complaining to a Bank of England grandee in the
Noughties that the growth of the public sector in the North East
was starving the private sector of talent; employers were
complaining that people would not apply for jobs in private
companies, preferring to wait for better paid, better pensioned
ones within Leviathan. She said the Bank had the opposite problem —
unable to hire good people because they all wanted to work for
Goldman Sachs. (We were, of course, both right, in different
regions.)



There does seem to be something peculiarly job-ful about our
current recovery and maybe it does reflect the welfare reforms. It
is getting harder to ignore the argument that this Government is a
great deal better than the last one at getting people off welfare
and into work. Mr Duncan Smith likes to point out that under
Blair-Brown, in times of boom, 1.4 million people spent most of the
last decade on out-of-work benefits. They cannot all have been
unable to find work.



The Labour Party is running out of ways to see the employment
glass as half empty. It can no longer claim that the work being
created is all part-time jobs: full-time employment is up 430,000
in a year. It can no longer emphasise long-term unemployment and
youth unemployment, which are both falling.



It can say that there’s a long way to go before Britain’s
unemployment rate is anything like as low as Germany’s, and it can
say that many people are on low wages. Better that, says IDS, than
out of the habit of work altogether; and most people do not stay
long on low wages, but progress relatively fast to better pay.
Universal Credit, while much delayed by computer snafus, is
starting to roll out and promises to help people over the benefits
chasm where the rewards for working slightly longer hours are
virtually nil.



Labour also points to the growth of food banks as evidence that
welfare reform is hitting poor families. But the OECD’s recent
“social indicators” survey found that the percentage of families
saying they cannot afford food is lower today in Britain than it
was in 2007 and is the lowest in all 24 countries of the OECD
survey.



No Labour government has ever left office with unemployment
lower than when it started, despite the name of the party. The
coalition, for all its faults, will almost certainly avoid this
ignominious fate.

Published on April 13, 2014 11:21

April 8, 2014

A rough ride to the future

My review for The Times of James Lovelock's new
book, A Rough Ride to the Future.



 



This book reveals that James Lovelock, at 94, has not lost his
sparkling intelligence, his lucid prose style, or his cheerful
humanity. May Gaia grant that we all have such talents in our tenth
decades, because the inventor of gadgets and eco-visionary has
lived long enough to recant some of the less sensible views he
espoused in his eighties.



Eight years ago, at the height of global warming alarmism,
Lovelock turned uncharacteristically pessimistic in his book The
Revenge of Gaia. He’d been got at by the greens. Despite all our
efforts, he thought, “we may be unable to prevent a global decline
into a chaotic world ruled by brutal warlords on a devastated
Earth”. Billions would die, he said, and the few breeding pairs of
human beings who survived would be in the Arctic.



In his new book, he now thinks he “tended to exaggerate the
immediacy of global warming”, that “we may muddle through into a
strange but still viable new world”, and that we can “keep our cool
as the Earth gently warms, and even enjoy it when we can”. He
admits that “the global average temperature has not risen as
expected”, having “hardly warmed at all since the millennium”, and
that he was “led astray” by the ice cores that seemed to imply
changes in carbon dioxide were the dominant cause of changes in
temperature. He thinks it is a mistake to take the
Intergovernmental Panel on Climate Change’s “projections almost as
if written in stone”; instead we “need to stay sceptical about the
projections of climate models”.



For those of us who have been saying such things for a while,
and who were told more than once (as I was, by the head of the
Science Museum among others) that if Lovelock was very worried then
so should we be, this is delicious to read. Welcome to the Lukewarmer
Society, Jim.



He regrets that huge sums have been “squandered on the renewable
energy sources”, many of which are “ugly and hopelessly
impractical” and threaten a “green satanic change” to Britain’s
landscape. Yup. He thinks that Greenpeace is “a great and powerful
negative feedback on all that enlightened technological progress
stands for”. Amen to all that.



He still thinks climate change will happen, of course, as I and
most people do, but he expects us to adapt to it, especially in the
design of our cities. Singapore, he points out, is a very habitable
city in a climate far warmer than expected for most of the world by
the end of the century. He expects us, by combining our biological
and our electronic brains, to “give Gaia the wisdom to proceed
to the next step, whatever that may be, with or without us as the
lead species”.



Ah, Gaia. Lovelock famously borrowed this name from Greek
mythology to label his idea that life alters the physics and
chemistry of the planet in ways that are self-regulating. If the
planet gets too hot, for instance, living things get whiter, which
cools it down. I have always had difficulty with Gaia, because I am
never sure how seriously Lovelock wants us to take her. If he means
by Gaia that the Earth has a tendency to self-correct, which has
kept it lukewarm for billions of years through changes in the
atmosphere unconsciously aided by evolution among life forms, I’m
with him. But I never quite feel he does enough to disavow the idea
that in some sense this tendency has become conscious or mystical.
The book does little to clear up my confusion, but there are some
fascinating ideas to enjoy along the way.



One of these is that he thinks Thomas Newcomen’s atmospheric
steam engine, invented in 1712, marks a turning point in the
history of the planet — when we began to tap the almost limitless
energy of fossil fuels, accessing cheap and abundant energy.
Thereby we began to transform not only our population and our
prosperity, but the ecology of the planet itself. I agree, and
would go further, because I think Lovelock misses the fact that
this was in effect the first occasion on which we linked heat with
work.



Till Newcomen we had heat energy, from wood and so on, and work
energy (motion mainly), from wind, oxen and so on, but the twain
did not meet — except instantaneously in the barrel of a gun. Today
nearly all the work done in the world starts out as heat. That is
what has enabled cultural evolution to change at a breakneck
pace.



Lovelock is a lone scientist, a species that he says is now “as
rare as ectoplasm”, and he values the independence to think that
comes with loner status. He comes up with plenty of thoughts that I
happen to think are bunk, but no matter: there are lots of marvellous
ideas too. As the autobiographical snippets in this fine book
illustrate, he is at least as much an inventor as a scientist,
exemplifying in his career the fact that technology drives science
at least as much as vice versa.



Roll on Lovelock’s eleventh decade: he’s getting better all the
time.

Published on April 08, 2014 04:39

April 6, 2014

Adapting to climate change

My Spectator article on the IPCC's new emphasis
on adaptation:



Nigel Lawson was right after all. Ever since the Centre for
Policy Studies lecture in 2006 that launched the former chancellor
on his late career as a critic of global warming policy, Lord
Lawson has been stressing the need to adapt to climate change,
rather than throw public money at futile attempts to prevent it.
Until now, the official line has been largely to ignore adaptation
and focus instead on ‘mitigation’ — the misleading term for
preventing carbon dioxide emissions.



That has now changed. The received wisdom on global warming,
published by the Intergovernmental Panel on Climate Change, was
updated this week. The newspapers were, as always, full of stories
about scientists being even more certain of environmental
Armageddon. But the document itself revealed a far more striking story: it emphasised, again and again, the need to adapt
to climate change. Even in the main text of the press release that
accompanied the report, the word ‘adaptation’ occurred ten times,
the word ‘mitigation’ not at all.



The distinction is crucial. So far, the debate has followed a
certain bovine logic: that global warming is happening, so we need
to slow it down by hugely expensive decarbonisation strategies —
green taxes, wind farms. And what good will this do? Is it possible
to stop global warming in its tracks? Or would all these green
policies be the equivalent of trying to blow away a hurricane? This
question — just how much can be achieved by mitigation — is one not
often addressed.



There is an alternative: accepting that the planet is warming,
and seeing if we can adjust accordingly. Adaptation means investing
in flood defences, so that airports such as Schiphol can continue
to operate below existing (and future) sea level, and air
conditioning, so that cities such as Houston and Singapore can
continue to grow despite existing (and future) high temperatures.
It means plant breeding, so that maize can be grown in a greater
range of existing (and future) climates; better infrastructure, so
that Mexico or India can survive existing (and future) cyclones;
and more world trade, so that Ethiopia can get grain from Australia
during existing (and future) droughts.



Owen Paterson, the Secretary of State for the Environment, in
repeatedly emphasising the need to adapt to climate change in this
way, has been something of a lone voice in the government. But he
can now count on the support of the mighty IPCC, a United Nations
body that employs hundreds of scientists to put together the
scientific equivalent of a bible on the topic every six years or
so. Whereas the last report had two pages on adaptation, this one
has four chapters.



Professor Chris Field is the chairman of Working Group 2 of the
IPCC, the part devoted to the effects of climate change rather than
the cause. ‘The really big breakthrough in this report,’ he says, ‘is the new idea of thinking about
managing climate change.’ His co-chair Vicente Barros adds: ‘Investments in better preparation can
pay dividends both for the present and for the future … adaptation
can play a key role in decreasing these risks’. After so many
years, the penny is beginning to drop.



In his book An Appeal to Reason, Lawson devoted a
chapter to the importance of adaptation, in which he pointed out
that the last IPCC report in 2007 specifically assumed that humans
would not adapt. ‘Possible impacts,’ the report said, ‘do not take into account any changes or
developments in adaptive capacity.’ That is to say, if the world
gets warmer, sea levels rise and rainfall patterns change, farmers,
developers and consumers will do absolutely nothing to change their
habits over the course of an entire century. It is a ludicrous
assumption.



But this assumption was central, Lawson pointed out, to the
estimated future cost of climate change the IPCC reported. A
notorious example was the report’s conclusion that, ‘assuming no
adaptation’, crop yields might fall by 70 per cent by the end of
the century — a conclusion based, a footnote revealed, on a single
study of peanut farming in one part of India.




Lawson pointed out that adaptation had six obvious benefits as a
strategy, which mitigation did not share. It required no
international treaty, but would work if adopted unilaterally; it
could be applied locally; it would produce results quickly; it
could capture any benefits of warming while avoiding risks; it
addressed existing problems that were merely exacerbated by
warming; and it would bring benefits even if global warming proves
to have been exaggerated.



Ask yourself, if you were a resident of the Somerset Levels,
whether you would prefer a government policy of adapting to
anything the weather might throw at you, whether it was exacerbated
by climate change or not, or spending nearly £50 billion (by 2020)
on low-carbon technologies that might in a few decades’ time, if
adopted by the whole world, reduce the exacerbation of floods, but
not the floods themselves.



It is remarkable how far this latest report moves towards
Lawson’s position. Professor Field, who seems to be an eminently
sensible chap, clearly strove to emphasise adaptation, if only
because the chance of an international agreement on emissions looks
ever less likely. If you go through the report chapter by chapter
(not that many people seem to have bothered), amid the usual
warnings of potential danger, there are many sensible, if
jargon-filled, discussions of exactly the points Lawson made.



Chapter 17 concedes that ‘adaptation strategies … can yield
welfare benefits even in the event of a constant climate, such as
more efficient use of water and more robust crop varieties’.
Chapter 20 even acknowledges that ‘in some cases mitigation may
impede adaptation (e.g., reduced energy availability in countries
with growing populations)’. A crucial point, this: that preventing
the poor from getting access to cheap electricity from coal might
make them more vulnerable to climate change. So green policies may
compound the problem they seek to solve.



In short, there is a great deal in this report to like. It has,
moreover, toned down the alarm considerably. Even
the New Scientist magazine has noticed that the report ‘backs off from
some of the predictions made in the previous report’, and that,
despite the urgings of Ed Davey to sex up the summary during last
week’s meeting in Yokohama,
‘the report has even watered down many of the more confident
predictions that appeared in the leaked drafts’.



For instance, references to ‘hundreds of millions’ of people
being affected by rising sea levels were removed from the summary,
as were statements about the impact of warmer temperatures on
crops. The report bravely admits that invasive alien species are a
far greater threat to species extinction than climate change
itself. Even coral reefs, the report admits, are threatened mostly
by pollution and overfishing, which might be exacerbated at the
margin by climate change. So why don’t we have intergovernmental
panels on invasive species and overfishing?



As these examples illustrate, perhaps most encouraging of all,
the report firmly states that the impact of climate change will be
small relative to other things that happen during this century:
‘For most economic sectors … changes in population, age structure,
income, technology, relative prices, lifestyle, regulation and
governance will be large relative to the impacts of climate
change.’ So yes, the world is heating up. But in many ways, it will
be a better world.



The report puts the global aggregate economic damage from
climate change at less than 2.5 per cent of income by the latter
years of the century. This is a far lower number than Lord Stern
arrived at in his notorious report of 2006, and this is taking the
bleak view that there will be a further 2.5°C rise from recent
levels. This is the highest of nine loss estimates; the average is
only 1.1 per cent.



And the IPCC is projecting two thirds more warming per increment
of carbon dioxide than the best observationally based studies now
suggest, so the warming the IPCC outlines is not even likely with
the highest emissions assumption.



In other words, even if you pile pessimism upon pessimism,
assuming relatively little decarbonisation, much global enrichment
and higher climate ‘sensitivity’ than now looks plausible — leading
to more rapid climate change — you still, on the worst estimate,
hurt the world economy in a century by only about as much as it
grows every year or two. Rather than inflict an awful economic
toll, global warming would make our very rich descendants — who are
likely to be maybe eight or nine times as rich as we are today, on
global average — a bit less rich.



To avoid this little harm, we could go for adaptation — let poor
people get as rich as possible and use their income to protect
themselves and their natural surroundings against floods, storms,
potential food shortages and loss of habitat. Or we could go for
mitigation, getting the entire world to agree to give up the fossil
fuels that provide us with 85 per cent of our energy. Or we could
try both, which is what the IPCC now recommends.



But the one truly bonkers thing to do would be to go
unilaterally into a policy of subsidising the rich to install
technologies that drive up the cost of energy, desecrate the
countryside, kill golden eagles, clear-cut swamp forests in North
Carolina, turn grain into motor fuel, so driving up the price of
food and killing people, and prevent poor people in Africa getting
loans to build coal-fired, cheap power stations instead of inhaling
smoke from wood fires cut from virgin forests.



All this we are doing in this country, with almost no prospect
of cutting carbon emissions enough to affect the climate. That’s
the very opposite of adaptation — preventing the economic growth
that would enable us to adapt while failing to prevent any climate
change.



The report is far from ideal (don’t worry, Professor Field, I
know that endorsement from the likes of me would kill your career).
As Rupert Darwall, author of The Age of Global
Warming, has pointed out, it systematically ignores the
benefits of climate change and makes the unsupported claim that
crop yields have been negatively affected by climate change, its
only evidence being recent spikes in crop prices — a big cause of
which was climate policy, not climate change, in the shape of
biofuels programmes that diverted 5 per cent of the world’s grain
crop into fuel.



Did you gather from the press that the report warns of rising
deaths from storms and droughts, falling crop yields, spreading
diseases, and all the usual litany? Did you conclude from this that
deaths from storms will increase, crop yields will fall, and
diseases will kill more people? Oh, how naive can you get!



No, no, no — what they mean is that the continuing fall in
deaths from storms, floods and disease may not be as steep as it
would be without climate change, that the continuing rise in crop
yields may not be as fast as it would be without climate change,
and that the continuing retreat of malaria might not be as rapid as
it would be without climate change. In other words, the world will
probably heat up — but it’s not going to end. It’s going to be
healthier and wealthier than ever before, just a tad less wealthy
than it might otherwise have been. Assuming we do not adapt, that
is.

Published on April 06, 2014 08:47

April 5, 2014

The Tyranny of Experts

My review of William Easterly's book The Tyranny of Experts for The Times:



 



Imagine, writes the economist William Easterly, that in 2010
more than 20,000 farmers in rural Ohio had been forced from their
land by soldiers, their cows slaughtered, their harvest torched and
one of their sons killed — all to make way for a British forestry
project, financed and promoted by the World Bank. Imagine that when
the story broke, the World Bank promised an investigation that
never happened.



That is, says Easterly, what occurred in Mubende District in
Uganda. It exemplifies all that is wrong with development in
Easterly’s view. It is too top-down, too crony with despots, too
remotely technocratic and too indifferent to the political and
economic freedom of local people. It is run by a tyranny of
experts.



This book is not an attack on aid from rich to poor. It is an
attack on the unthinking philosophy that guides so much of that aid
from poor taxpayers in rich countries to rich leaders in poor
countries, via outsiders with supposed expertise. Easterly is a
distinguished economist and he insists there is another way, a path
not taken, in development economics, based on liberation and the
encouragement of spontaneous development through exchange. Most
development economists do not even know they are taking the
technocratic, planning route, just as most fish do not know they
swim in a sea.



Easterly traces the history of this mistake back to the first
half of the 20th century, when semi-colonial Western powers in
China, in order to preserve their interests, used big charitable
donations to support an autocratic regime under Sun Yat-sen and
then Chiang Kai-shek, who got the message that development was the
card to play in justifying despotism.



In the 1930s, the British had to scramble to find a new excuse
for their colonies — whose occupation had always been justified on
grounds of racial superiority, an argument looking threadbare as
the depression and Nazism made pith-helmeted district commissioners
seem less god-like. A retired colonial office civil servant named
Lord Hailey came up with a technocratic justification instead —
that we were guiding the development of India and Africa. He called
for “a far greater measure of both initiative and control on the
part of the central government”.



During the Second World War Hailey got the Americans to go along
with this by suggesting a line similar to the one used to uphold southern
segregation — economic betterment would come first; political
liberation could wait. The Cold War meant a new justification for
the same policy in Latin America: use aid to prop up dictators.



The consequence was that it was assumed that the newly liberated
Third World was best ruled by autocrats. “The masses of the people
take their cue from those who are in authority over them,” said
the United Nations Primer for Development in 1951. Nanny
state knew best. Top-down development by LSE graduates was not just
the best way; it was the only way. And it was frequently
disastrous.



To this day, the head of the World Bank tours China, praising
its “leadership” and “steady implementation with a determined
will”, as atrocities abound. Tony Blair’s African Government
Initiative believes in “strengthening the government’s capacity to
deliver programs” in its poster-boy of Ethiopia, a country whose
ruler uses aid to crush opposition and grab land through
“villagisation”. Nobody seems to mind.



Easterly believes history undermines the argument that
dictatorship, even of a benevolent kind, is necessary for economic
development. The story of the West’s rise, the roaring of the East
Asian tigers and China’s sudden growth surge are actually cases
of spontaneous order, unplanned innovation and liberation from
top-down rule, not central planning.



For instance, Deng Xiaoping gets the kudos for China’s miracle
when all he did was recognise after the fact a spontaneous
rebellion against the continuing failure of collective farms. And
Lee Kuan Yew of Singapore was sensible enough not to prevent (and
then to take the credit for) an organic improvement in a city state
exposed to world trade and populated by mercantile Fujian
Chinese.



The decades-old view that conscious policy design offers the
best hope for ending poverty is just another form of
creationism, embodying the fallacy of intelligent design – that
because something is ordered and intricate, it must have been
ordained by an intelligent mind. In fact, as Adam Smith and
Friedrich Hayek (and Charles Darwin) realised, no expert can ever
know enough to rival the information that emerges from the
spontaneous interactions of many people.



Technocrats also tend to have a “Blank Slate” view that the
history of a country does not matter much; have traditionally
neglected trade; and have often ignored regional or individual
trends in favour of national ones. Easterly describes the success
of the Mourides from Senegal as a rebuke to the experts. Go up to
an African street retailer in New York, Paris, Madrid or Milan and
ask him where he comes from. The chances are he is a Mouride, a
merchant embedded in a supportive web of credit, trust and
remittances that this religious brotherhood maintains — a bit like
Jews in medieval Europe. The Mourides were practising microfinance
for decades before the development industry discovered it. But
partly because they don’t fit inside a country, conventional
development economics misses such folk.



“It was an unhappy accident,” writes Easterly, “that development
thinking stressed development at the unit of the nation and was
scornful of trade at the moment of independence of many new nation
states.”



Easterly is a fluent writer and a good economic historian, as much
at home describing the differences between Friedrich Hayek (a
proponent of bottom-up development) and Gunnar Myrdal (top-down)
as he is recounting the history of one particular block in New York
City, which he has studied as a case history of spontaneous
development. This group of houses on Greene Street was once a
freed-slave small-holding, then part of a larger farm, then a
brothel, then a garment factory, then an artist’s studio and is now
full of posh apartments and an Apple store.



The book’s weakness is that having set up a strong historical
and theoretical argument against technocracy and for bottom-up
development, Easterly does not then follow through with some
examples of how the latter might work in practice. Nor does he
tackle the question of whether at least some parts of the modern
aid industry, especially among NGOs and charities, might be getting
rather better at helping in bottom-up ways. It would have been good
to see a manifesto for how Easterly would run the World Bank or for
that matter the Gates Foundation.

Published on April 05, 2014 01:07

April 4, 2014

There is no simple explanation for the missing airliner

My Times column is on the missing airliner and
Occam's razor.



 



The tragic disappearance of all 239 people on
board flight MH370 in the Indian Ocean has one really peculiar
feature to it: none of the possible explanations is remotely
plausible, yet one of them must be true.



The usual rule on these occasions – choose the simplest
explanation or, as William of Ockham taught, make the fewest
assumptions – simply does not work. There is no simple explanation.
Whether the cause was an accidental decompression, a terrorist act
or a suicide, all three require us to assume that an outlandish and
bizarre sequence of events happened.



I don’t know about you, but I have had conversations about MH370
with many people recently, some of whom were fairly confident that
they knew what had happened. Yet every story they told was baroque
in its contrivance to the point of implausibility, requiring a
chain of events that stretched my credulity. Yet, as I say, one
such story will turn out to be right.



Consider the sequence of events. Very shortly after entering an
air-traffic control dead-spot, somebody switches off both
communication devices and changes course. He then changes course
twice more, possibly rising very high and then dropping low (this
does not seem to be established for certain), and heads for a
region of sea with no hope of landing, apparently choosing to run
out of fuel slowly rather than land or crash sooner.



An onboard accident of any kind seems highly unlikely, because
somebody was in control for at least some of the time. Yet whoever
this was remained silent. That surely makes it unlikely that a
terrorist took control. A secretive disappearance with no message
to the world and no claim of responsibility seems an unlikely way
for a terrorist group to act.



There was no emergency radio message, so a struggle between
pilots and terrorists for control of the plane seems equally
unlikely. A conspiracy between the two pilots seems still more
unlikely and nothing in their backgrounds suggests such a thing.
That’s five unlikely things already.



We are back to suicide, of the pilot or co-pilot, with only the
thinnest of motives (the pilot is said to have been upset following
the break-up of his marriage and the conviction of the opposition
leader for sodomy), the narrowest of opportunities (his colleague
left for the toilet just as the plane left Malaysian air-traffic
control?), an implausibly callous indifference to the happiness of
passengers and their families, and the oddest of methods – who
wants to kill themselves very slowly over seven hours and why would
he care to leave no trace? All this, too, sounds utterly
ridiculous. Yet it may prove to be the least ridiculous theory and
thus satisfy Ockham’s razor.



I’ve never had a great deal of time for Sherlock
Holmes’s bon mots, which are just that little bit too
smug. One of the smuggest is the put-down of the long-suffering Dr
Watson in The Sign of Four, which seems appropriate
to this case: “You will not apply my precept,” Holmes says to
Watson, shaking his head. “How often have I said to you that, when
you have eliminated the impossible, whatever remains, however
improbable, must be the truth? We know that he did not come through
the door, the window, or the chimney. We also know that he could
not have been concealed in the room, as there is no concealment
possible. Whence, then, did he come?”



This seems to sum up the problem of MH370 rather well, but
Arthur Conan Doyle proves to be no help either, for the answer to
Holmes’s question turns out to be that the villain came through a
trap-door in the roof. This hardly qualifies as more improbable
than an impossibility, as Sherlock had suggested. It’s rather a
simple (and irritatingly deus-ex-machina)
explanation, after all.



Indeed, I am struggling to find any unsolved case of mass
disappearance that is remotely as baffling as this one, even from
before the age of satellites. Take Flight 19, the five torpedo
bombers that vanished off the coast of Florida on a routine
training exercise in 1945, and whose planes have never been
found.



The case was much beloved of the UFO crowd (the pilots return to
Earth — still youthful — in the film Close Encounters of the
Third Kind), but it takes only a brief internet search to
find out that there is nothing mysterious here at all. The flight
leader was on the radio admitting he was lost and saying he thought
he was over the Florida Keys, rather than the Bahamas. This led him
to fly further out to sea, rather than back towards the coast.



The disappearance of Sir John Franklin’s expedition to find the
North-West Passage after 1845 seemed mightily mysterious to the
Victorian public because two whole ships and 129 men vanished
altogether, and nine years of searching by lots of expeditions
turned up nothing.



As with MH370, they were looking in the wrong place at first.
But eventually buttons, medals, spoons and clothing in the
possession of Inuit natives led searchers to the west shore of King
William Island, where they found a trail of artefacts, bodies, two
messages and a boat.



These eventually told a coherent, if confused, story of two
winters stuck in the same stretch of pack ice, deaths from scurvy
(abetted by lead poisoning, as 20th-century analysis showed) and
the abandonment of the ships by the 105 survivors in an attempt to
reach mainland Canada, dragging the boats overland intending to row
them up a river. Mysteries remain, including why the boat that was
found on a sled was apparently being dragged back towards the ships
and contained not just bodies but also soap, silk handkerchiefs,
silver, 40lb of chocolate and several books, including a copy of
The Vicar of Wakefield.



But these are minor mysteries — not like the Malaysian airliner.
The black box will stop transmitting in about a week, so we may
never find it or learn what actually happened.



If the cause remains mysterious for years, as now seems
possible, that need not prevent us from learning lessons. Planes
will surely now be fitted with satellite tracking devices. And
given that some kind of human intervention seems to be at the root
of the disappearance, as I argued two weeks ago, pilotless planes
may come to be seen as less dangerous than piloted ones.

Published on April 04, 2014 00:31

March 31, 2014

Muting the alarm on climate change

The United Nations' Intergovernmental Panel on Climate Change
will shortly publish the second part of its latest report, on the
likely impact of climate change. Government representatives are
meeting with scientists in Japan to sex up—sorry, rewrite—a summary
of the scientists' accounts of storms, droughts and diseases to
come. But the actual report, known as AR5-WGII, is less frightening than its predecessor seven
years ago.



The 2007 report was riddled with errors about Himalayan
glaciers, the Amazon rain forest, African agriculture, water
shortages and other matters, all of which erred in the direction of
alarm. This led to a critical appraisal of the report-writing
process from a council of national science academies, some of whose
recommendations were simply ignored.



Others, however, hit home. According to leaks, this time the
full report is much more cautious and vague about
worsening cyclones, changes in rainfall, climate-change refugees,
and the overall cost of global warming.



It puts the overall cost at less than 2% of GDP for a 2.5
degrees Centigrade (or 4.5 degrees Fahrenheit) temperature increase
during this century. This is vastly less than the much heralded
prediction of Lord Stern, who said climate change would cost 5%-20%
of world GDP in his influential 2006 report for the British
government.



The forthcoming report apparently admits that climate change has
extinguished no species so far and expresses "very little
confidence" that it will do so. There is new emphasis that climate
change is not the only environmental problem that matters and on
adapting to it rather than preventing it. Yet the report still
assumes 70% more warming by the last decades of this century than
the best science now suggests. This is because of an overreliance
on models rather than on data in the first section of the IPCC
report—on physical science—that was published in September
2013.



In this space on Dec. 19, 2012, I forecast that the IPCC was
going to have to lower its estimates of future warming because of
new sensitivity results. (Sensitivity is the amount of warming due
to a doubling of atmospheric carbon dioxide.) That article,
"Cooling Down Fears of Climate Change" (Dec. 19), led to a storm
of protest, in which I was called "anti-science," a "denier" and
worse.



The IPCC's September 2013 report abandoned any attempt to
estimate the most likely "sensitivity" of the climate to a doubling
of atmospheric carbon dioxide. The explanation, buried in a
technical summary not published until January, is that "estimates
derived from observed climate change tend to best fit the observed
surface and ocean warming for [sensitivity] values in the lower
part of the likely range." Translation: The data suggest we
probably face less warming than the models indicate, but we would
rather not say so.



The Global Warming Policy Foundation, a London think tank,
published a careful survey of all the reliable studies
of sensitivity on March 5. The authors are British climate
scientist Nic Lewis (who has no academic affiliation but a growing
reputation since he discovered a glaring statistical distortion
that exaggerated climate sensitivity in the previous IPCC report)
and the Dutch science writer Marcel Crok. They say the IPCC's
September report "buried good news about global warming," and that
"the best observational evidence indicates our climate is
considerably less sensitive to greenhouse gases than climate
scientists had previously thought."



Messrs. Lewis and Crok argue that the average of the best
observationally based studies shows the amount of warming to be
expected at the moment carbon dioxide levels reach double their
pre-industrial level, after about 70 years of rising
concentrations, is "likely" to be between one and two degrees
Centigrade, with a best estimate of 1.35C (or 2.4F). That's much
lower than the IPCC assumes in its forthcoming report.



In short, the warming we experienced over the past 35
years—about 0.4C (or 0.7F) if you average the measurements made by
satellites and those made by ground stations—is likely to continue
at about the same rate: a little over a degree a century.
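The extrapolation in the paragraph above is simple arithmetic; a minimal sketch, using only the 0.4C and 35-year figures quoted in the text:

```python
# Extrapolate the observed warming trend quoted above to a per-century rate.
observed_warming_c = 0.4   # degrees C over the 35-year window cited in the text
period_years = 35          # length of that observation window

rate_per_century = observed_warming_c / period_years * 100
print(f"{rate_per_century:.2f} C per century")  # roughly 1.1C: "a little over a degree a century"
```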



Briefly during the 1990s there did seem to be warming that went
as fast as the models wanted. But for the past 15-17 years there
has been essentially no net warming (a "hiatus" now conceded by the
IPCC), a fact that the models did not predict and now struggle to
explain. The favorite post-hoc explanation is that because of
natural variability in ocean currents more heat has been slipping
into the ocean since 2000—although the evidence for this is far
from conclusive.



None of this contradicts basic physics. Doubling carbon dioxide
cannot on its own generate more than about 1.1C (2F) of warming,
however long it takes. All the putative warming above that level
would come from amplifying factors, chiefly related to water vapor
and clouds. The net effect of these factors is the subject of
contentious debate.
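The roughly 1.1C no-feedback figure cited above follows from the standard logarithmic forcing approximation and the Planck response; a hedged sketch, where the 5.35 W/m² coefficient and the ~0.3 K per W/m² no-feedback sensitivity are widely used textbook values, not figures from the article itself:

```python
import math

# Radiative forcing from a doubling of CO2, using the common
# logarithmic approximation: F = 5.35 * ln(C/C0) W/m^2.
forcing_w_per_m2 = 5.35 * math.log(2)   # about 3.7 W/m^2

# No-feedback (Planck) climate sensitivity parameter,
# roughly 0.3 K of surface warming per W/m^2 of forcing.
planck_sensitivity = 0.3

warming_no_feedback = forcing_w_per_m2 * planck_sensitivity
print(f"{warming_no_feedback:.2f} C")   # about 1.1 C, matching the figure in the text
```

Everything beyond this baseline depends on the sign and size of the feedbacks, which is precisely the contested part of the debate.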



In climate science, the real debate has never been between
"deniers" and the rest, but between "lukewarmers," who think
man-made climate change is real but fairly harmless, and those who
think the future is alarming. Scientists like Judith Curry of the
Georgia Institute of Technology and Richard Lindzen of MIT have
moved steadily toward lukewarm views in recent years.



Even with its too-high, too-fast assumptions, the recently
leaked draft of the IPCC impacts report makes clear that when it
comes to the effect on human welfare, "for most economic sectors,
the impact of climate change will be small relative to the impacts
of other drivers," such as economic growth and technology, for the
rest of this century. If temperatures change by about 1C
between now and 2090, as Mr. Lewis calculates, then the effects
will be even smaller.



Indeed, a small amount of warming spread over a long period
will, most experts think, bring net improvements to human welfare.
Studies such as those by the IPCC author and economist Professor
Richard Tol of Sussex University in Britain show that global
warming has probably done so already. People can adapt to such
change—which essentially means capturing the benefits while
minimizing the harm.
Satellites have recorded a roughly 14% increase in greenery on the
planet over the past 30 years, in all types of ecosystems, partly
as a result of man-made CO2 emissions, which enable plants to grow
faster and use less water.



There remains a risk that the latest science is wrong and rapid
warming will occur with disastrous consequences. And if renewable
energy had proved by now to be cheap, clean and thrifty in its use
of land, then we would be right to address that small risk of a
large catastrophe by rushing to replace fossil fuels with
first-generation wind, solar and bioenergy. But since these forms
of energy have proved expensive, environmentally damaging and
land-hungry, it appears that in our efforts to combat warming we
may have been taking the economic equivalent of chemotherapy for a
cold.



Almost every global environmental scare of the past half century
proved exaggerated, including the population "bomb," pesticides,
acid rain, the ozone hole, falling sperm counts, genetically
engineered crops and killer bees. In every case, institutional
scientists gained a lot of funding from the scare and then quietly
converged on the view that the problem was much more moderate than
the extreme voices had argued. Global warming is no different.

Published on March 31, 2014 11:45
