Matt Ridley's Blog
October 28, 2014
What is WHO up to?
My Times column is on the World Health
Organisation's odd priorities: its early complacency about ebola,
while it attacks a new technology that saves the lives of smokers
by getting them off tobacco, and obsesses about climate change:
Is there a connection between ebola and
e-cigarettes? I don’t mean to imply that vaping has caused the
epidemic in west Africa. But the World Health Organisation (WHO)
now has serious questions to answer about its months of complacency
over ebola. WHO’s director-general, Margaret Chan, made a speech
only two weeks ago implying that tobacco control and the fight
against e-cigarettes are more important issues.
On October 13 Dr Chan gave her apologies for not being able to
attend a conference on ebola and made a speech instead at a WHO summit in Moscow on
tobacco. This is what she said there: “Some people speculated that
I would not attend this meeting because I am so busy with so many
other outbreaks of communicable diseases [ebola was third on her
list, after flu and Mers coronavirus]. No. No. No. I will not cancel
my attendance at this meeting because it is too important . . .
Tobacco control unquestionably is our biggest, surest and best
opportunity to save some millions of lives . . . The next challenge
is that the tobacco industry is increasing its dominance over the
market for electronic cigarettes.”
The $20 million Moscow meeting happened behind closed doors,
with even accredited journalists excluded. High on the agenda was
vaping. WHO has long been trying to define e-cigarettes as tobacco
products, though they are not, so as to bring them under the aegis
of its tobacco “framework convention”.
The outcome of the Moscow meeting was the suggestion that more
countries should ban e-cigarettes, despite the lack of scientific
evidence that they do harm and ignoring the growing evidence that
they save lives. Such bans would be convenient for pharmaceutical
companies, with which WHO has close links, whose sales of nicotine
gum and patches have been in free fall because of e-cigarettes.
WHO is not wrong to fight tobacco; but in fighting e-cigarettes,
it is protecting tobacco and doing real harm. Louise Ross, of the
NHS Stop Smoking Service in Leicester, reports a drop in interest in vaping as a
method of quitting cigarettes at the time the WHO report came out.
Urban myths say e-cigarettes are no better than cigarettes. People
tell Ms Ross: “I’ve seen it on telly, it’s safer to carry on
smoking.”
Yet Philip Morris reports that countries that are liberal
about vaping, such as Poland, have seen tobacco sales fall by more
than 10 per cent in the past year alone. Countries that discourage
vaping, such as Italy and Spain, have seen falls of less than 2 per
cent. A French government survey of 13,000 Parisian
schoolchildren has found a startling 45 per cent reduction in
tobacco smoking among young teenagers in three years, since
e-cigarettes have rebranded tobacco as “dirty and nerdy”. This
technology is a breakthrough in the fight against smoking. Yet
WHO’s director-general regards fighting it as a priority.
Meanwhile, WHO’s sluggish response to ebola has infuriated many
health experts. In April Médecins Sans Frontières issued a stark warning that the outbreak was
out of control, only to be contradicted by a WHO spokesman. In June Dr
Chan was warned that WHO was hindering the fight against the virus,
but little changed till August.
A leaked report says that WHO’s “failure to see that
conditions for explosive spread were present right at the start”
was a big mistake. In April public health measures would have
stopped the disease; now it could be too late and we may have to
wait many months for vaccines instead. Big international agencies
have unique responsibilities: if the WHO was reassuring, why would
anybody else react with urgency?
Vaping was not the only thing on Dr Chan’s mind this autumn. In
September, when the world had at last woken up to what is going on
in west Africa, she was keen to comment on another great crisis,
almost as important as tobacco: climate change.
Writing on The Huffington Post website, she
called climate change the “defining issue of the 21st century” and
even seemed to imply it was more important than ebola, because “it
cannot be contained by doctors in hazmat suits, patients in
isolation wards, or hopes that a vaccine or cure is somewhere on
the horizon.” The article was so hysterical about climate change it
would have embarrassed a green pressure group. It claimed
inaccurately that “records for extreme weather events are being
broken a record number of times” and it abandoned the current
scientific consensus by suggesting that worsening malaria will
result from climate change. This is out of date. Scientists now agree that efforts to combat malaria are
likely to progress despite any small changes in the range of
mosquitoes as a result of climate change.
The article was especially naughty in citing the figure of seven
million people dying each year from air pollution. More than half of these die from indoor air
pollution because they cook over (renewable) wood and dung fires,
the phasing out of which is being delayed worldwide by the
increasing reluctance of western governments to fund fossil-fuel
energy, lest it contribute to climate change. So some of these
dying people are the victims of climate change policy, rather than
climate change itself. WHO’s own figures suggest indoor smoke kills
30-150 times as many people as global warming does.
Dr Chan rightly says that the ebola epidemic underscores the
need for better public health infrastructure in poor countries. But
the lack of these in Sierra Leone and Liberia is a symptom of
poverty, not climate change (let alone nicotine). That lack should
have alerted WHO to the risk of an ebola epidemic much earlier.
Lest you think I am unfairly selecting Dr Chan’s pronouncements,
go on WHO’s home page and look under “director-general” where on
Sunday only two recent speeches were listed: the Moscow one, and
another delivered in Tunis eight days ago. On that
occasion, ebola, though discussed, was still given third billing in
the causes of shocks humanity is facing: “whether caused by extreme
weather events in a changing climate, armed conflict or civil
unrest, or a deadly and dreaded virus spreading out of control.”
The delay of the United Nations’ key agency in raising the alarm
about ebola appears to have stemmed at least partly from its
obsession with other, politically fashionable and scientifically
dubious, priorities. Is WHO fit for purpose?
October 22, 2014
Cheaper oil is good news
My Times column on the falling oil price:
So ingrained is the bad-news bias of the
intelligentsia that the plummeting price of oil has mostly been
discussed in terms of its negative effect on the budgets of oil
producers, both countries and companies. We are allowed to rejoice
only to the extent that we think it is a good thing that the
Venezuelan, Russian and Iranian regimes are most at risk, which
they are.
Yet by far the greater benefit of the oil price fall comes from
the impact on consumers. Making this essential resource cheaper
allows everybody, whatever their nationality, to spend less money
on dull things like heat, transport, metal and plastic, which
leaves them more money for things like movies, holidays and pets,
which gives other people new jobs, which raises everybody’s living
standards.
The oil price peaked at almost $150 a barrel in 2008, just
before the financial crisis. That is probably no coincidence.
Although the crisis was fuelled by a credit bubble, rocketing oil
prices helped
trigger the bust. All over the world, but especially in
America, people were saddling themselves with longer and longer
commutes to find houses they could almost afford, a phenomenon
known among American mortgage brokers as “drive till you qualify”.
The doubling of fuel prices in the US between 2005 and 2008 killed
that strategy and began the collapse of the housing market.
The price of Brent crude oil has fallen from about $115 a barrel
in June to about $85 today. That will make a tank of petrol cheaper
(though not by as much as it should, because of taxes) but it will
also make everything from chairs to chips to chiropody cheaper,
because the cost of energy is incorporated into the cost of every
good and service we buy. The impact of this cost deflation will
dwarf any effect of, say, a fall in the price of BP shares in your
pension plan.
It is true that part of the reason oil prices are falling is
that world economic growth is slowing. But economists reckon that
every 10 dollars off the price of a barrel of crude oil transfers
0.5 per cent of world GDP from countries that export oil to
countries that import it — and the latter tend to spend the money
more quickly, accelerating the velocity of money and encouraging
investment and innovation.
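That rule of thumb is easy to sanity-check. Here is a minimal back-of-envelope sketch in Python, using round 2014 figures of my own (roughly 92 million barrels a day of world oil consumption and roughly $75 trillion of world GDP), not numbers from the column:

```python
# Back-of-envelope check of the "0.5 per cent of world GDP per $10" rule.
# Assumed round numbers (mine, not the column's): ~92m barrels/day of
# world oil consumption, ~$75 trillion of world GDP in 2014.

BARRELS_PER_DAY = 92e6
WORLD_GDP = 75e12      # US dollars
PRICE_DROP = 10.0      # dollars per barrel

annual_transfer = BARRELS_PER_DAY * 365 * PRICE_DROP
print(f"Annual transfer: ${annual_transfer / 1e9:.0f} billion")
print(f"Share of world GDP: {annual_transfer / WORLD_GDP:.2%}")
# -> about $336 billion a year, or roughly 0.45% of world GDP,
#    close to the economists' 0.5 per cent rule quoted above.
```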
The industrial revolution itself was built around abundant cheap
energy, mainly in the form of coal, which enabled mechanisation,
which vastly amplified the productivity of the average worker and
therefore his income. Today a typical British family of four uses
as much energy as if it had 400 slaves in the back room pedalling
eight-hour shifts on exercise bicycles. It would use even more if
it also fed those slaves!
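That comparison also survives a back-of-envelope check. A minimal sketch, assuming figures of my own (a cyclist sustaining about 100 watts, and a British family of four accounting for roughly 120,000 kWh of primary energy a year), not numbers from the column:

```python
# How many pedalling "energy slaves" would match a family's energy use?
# Assumed round numbers (mine, not the column's): 100 W per cyclist,
# 8-hour shifts, ~120,000 kWh/year of primary energy for a family of four.

WATTS = 100
HOURS_PER_SHIFT = 8
FAMILY_KWH_PER_YEAR = 120_000

kwh_per_slave = WATTS * HOURS_PER_SHIFT * 365 / 1000   # kWh per year
print(f"Each cyclist delivers {kwh_per_slave:.0f} kWh a year")
print(f"Cyclists needed: {FAMILY_KWH_PER_YEAR / kwh_per_slave:.0f}")
# -> about 292 kWh per cyclist, so roughly 410 of them:
#    the same order of magnitude as the 400 quoted above.
```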
The falling oil price is largely the Americans’ fault. By
reinventing the extraction process for first gas, then oil, with
horizontal drilling and hydraulic fracturing, engineers have almost
doubled the country’s output of oil in six years. That ingenuity
was made possible by the high price of oil, which promised fabulous
riches to those who could get oil out of shale, but it is no longer
dependent on the high price of oil. It is often said that the cure
for high oil prices is high oil prices and so it has proved.
The International Energy Agency (IEA) says that most shale oil
production remains profitable at
$80 a barrel. One North Dakota oilman tells me that his rate of
return on fracked wells drops to 10 per cent only when oil prices
reach $55, and that’s without taking into account the falling price
of fuel for his vehicles. At the moment American oil production is rising
by 100,000 barrels a day every month, and that is not going to
change for a year or so whatever happens to the price of oil. The
number of rigs drilling for oil will start to drop off if the oil
price falls further, but only slowly. Of course, unlike the
nationalised decisions of Opec, oil production in America is the
sum of the investment decisions of hundreds of independent
companies.
The IEA reckons that only about 3 per cent of world production
needs a break-even price above $80. Much of that is in China,
Indonesia, Malaysia, Nigeria and Russia, where costs are high
largely because of big government taxes and royalties. It is
governments, in other words, that are most likely to take the
spring out of the consumer’s step, with Opec’s impending
decision on whether to constrain production being the best hope of
the spoilsports.
Talking of spoilsports, here in Britain the opposition has
managed to deprive people of the benefits of lower energy prices.
Ed Miliband’s promise that he would freeze the price of energy from
2015 came just before world energy prices began to fall smartly.
The energy utilities, reluctant to have their prices frozen when
they are low, have therefore done their utmost to avoid dropping
their prices in case Mr Miliband becomes prime minister.
As Paul Massara, head of
npower, told the regulator Ofgem in August: “We are acutely
aware that if the Labour party were to implement their proposed
price freeze, we will be living with the consequences of our
standard rate tariff price for a very long time and beyond the
level of risk that we could manage in the wholesale market.”
Not that the Liberal Democrat part of the coalition government
has covered itself in glory on energy costs either. In 2010 Chris
Huhne’s Department of Energy and Climate Change assumed that gas
prices would double by 2020, which would have allowed renewable energy
subsidies to wither away. Instead gas prices have fallen. In a normal market
that would mean lower electricity prices, but in DECC’s wonderland,
we have to pay the difference between gas prices and the costs of
wind. In other words, this country’s official energy policy
insulates consumers and manufacturers to some extent from the
benefits of falling costs. That way lies uncompetitiveness.
But is cheap fossil fuel not bad news for the climate? A new
paper in
Nature magazine argues that when the gas boom sparked by
fracking goes global, prices will fall fast, economic growth will
accelerate and so we will end up using more energy and producing
more emissions than before, even if we give up coal. It forgets to
mention that if we get that much richer, we will also abolish much
more poverty, disease and misery, and have the investment funds to
invent new, cheap and low-carbon forms of energy too.
October 19, 2014
Ebola needs beds on the ground
My Times column on Ebola:
It is not often I find myself agreeing with
apocalyptic warnings, but the west African ebola epidemic deserves
hyperbole right now.
Anthony Banbury, head of the UN ebola emergency response
mission, says: “Time is our enemy. The virus is far
ahead of us.” Dr David Nabarro, special envoy of the UN
secretary-general, says of ebola: “I have never encountered a
public health crisis like this in my life.”
However, this is a case where the hype could serve a purpose if
it motivates action and thereby proves itself wrong.
Two things could happen over the next few months. The more
probable is that the brave aid workers, soldiers and medical teams
heading for the region, and brave local health workers and burial
teams, will gradually get on top of the epidemic in the three
affected countries, Sierra Leone, Guinea and Liberia, the infection
rate will peak and start to drop, and the crisis will pass.
There will be cases in other countries, including Britain (and
panicky reactions), but they will peter out. The epidemic’s worst
legacy will be an upsurge in the death rate in west Africa from
malaria and other diseases that are going untreated now for lack of
spare beds and doctors.
The other possibility is that the number of cases will continue
doubling every four weeks, as it is now, so that hundreds of
thousands are dead by early next year. Superstitious fear of
doctors and treatment centres will worsen, civil society will
collapse in the region and all hope of fighting the epidemic by
isolating victims will be lost.
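The arithmetic behind that grim scenario is simple compounding. A minimal sketch, assuming a starting point of about 10,000 cumulative cases in mid-October 2014 (my round-number assumption, roughly the reported total at the time):

```python
# Exponential projection of cases doubling every four weeks.
# The starting figure of ~10,000 cumulative cases is an illustrative
# assumption, roughly the reported total in mid-October 2014.

cases = 10_000
for week in range(0, 29, 4):            # about seven months ahead
    print(f"Week {week:2d}: ~{cases:,} cumulative cases")
    cases *= 2
# Week 16 passes 160,000 cases and week 28 passes a million, which
# is why "hundreds of thousands dead by early next year" was a
# live possibility at the case fatality rates then being reported.
```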
Ebola-carrying refugees will spread the virus to mega-cities
such as Lagos, and even to areas controlled by terrorists such as
Boko Haram. In the rich world, travel restrictions will multiply
and people will start staying away from transport hubs, plunging
the world economy into trouble.
If that happens, and the infection rate is still accelerating at
the turn of the year, I fear the ground war may be lost and the
world may have to wait for a vaccine or new supplies of ZMapp
monoclonal antibodies (made, please note, by genetic engineering in
plants).
Even if these work, manufacturing them faster than the epidemic
manufactures new cases will be tough and it may take years for
cures to overtake cases. That’s why the public-health ground battle
is so crucial. If we lose that in Sierra Leone, Liberia and Guinea,
then we are not facing paper tigers such as Sars or bird flu, but
something much more like the great plague of Justinian, in AD 541,
or the Black Death eight centuries later.
Compared with those bubonic plague pandemics, we have enormous
advantages. We can know with certainty that epidemics are caused by
germs and not by Jews or sin. We can deploy protective clothing,
gloves, disinfectant, rehydration therapy and blood transfusions
from survivors. We can identify, sequence and probe the
vulnerabilities of the pathogen. We can fly well-trained health
workers around the world. We really should be able to cope.
However, modernity also means that the virus can fly from one
continent to another in hours, meeting hundreds of strangers along
the way. And the conditions under which most of those in Monrovia
and Freetown live are far too similar for comfort to the conditions
of Constantinople in 541 or Pisa in 1348. Ebola has never got a
hold in an urban setting before, let alone in three of the very
poorest countries in the world — in cities without many hospitals
or doctors, without reliable sewage systems, running water or
electricity. The lesson is clear: prosperity is the best
disinfectant.
The world’s complacency when ebola appeared in Guinea last
December was understandable. In every one of the 33 previous
outbreaks of ebola, public health measures proved able to contain
it; and that means essentially isolation of patients and their
contacts.
Though unusually lethal, this is not a very contagious disease.
Whereas each case of measles in an unvaccinated population can lead
to 17 more, even in this epidemic each ebola case is resulting in
fewer than two more. Get that number below one with a few simple,
low-tech precautions, and you soon get ebola under control. Hence
the failure of governments to order a vaccine.
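The power of that threshold is easy to demonstrate with a toy calculation (illustrative numbers only, not epidemiological estimates): if each case causes R new ones, the nth generation of transmission has R to the power n times the initial cases, so everything hinges on whether R sits above or below one.

```python
# Toy illustration of the reproduction number R: each case infects R
# others on average, so case numbers scale as R**generation.
# Numbers are illustrative, not epidemiological estimates.

def nth_generation(r: float, initial: int = 100, n: int = 10) -> float:
    """Expected new cases in the nth generation of transmission."""
    return initial * r ** n

for r in (1.8, 0.9):    # roughly ebola-like vs. post-intervention
    print(f"R = {r}: generation 10 has ~{nth_generation(r):,.0f} new cases")
# R = 1.8 -> ~35,700 new cases in generation 10; R = 0.9 -> ~35.
# Push R below one and the epidemic fizzles out of its own accord.
```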
The virus has now killed nearly three times as many people as in all
previous outbreaks of ebola put together. That has another sinister
implication. Ebola has now spent ten months jumping from one human
being to another, outside its natural habitat of fruit-bat blood,
much longer than it has ever lived in our species.
Natural selection means that it is bound to be getting better at
spreading among people, not because it is becoming airborne
(viruses very rarely change their mode of transmission) but perhaps
because it will find ways to become infectious before its victim
becomes seriously ill. At present, ebola victims are unusual in
that they do not spread the virus till they are really quite ill,
and corpses are especially infectious. That is why healthcare
workers are so at risk. If ebola evolves lower virulence, it may
become endemic, like HIV and malaria.
Having terrified myself with such thoughts, I can also reassure
myself by reciting the numbers that are going in the right
direction and that suggest the ground war is still winnable.
Liberia has increased its burial teams from six to 54. By
mid-November the Americans should have installed 1,700 treatment
beds in Liberia, and the British 700 in Sierra Leone, more than
trebling the capacity of the two countries to treat cases.
That point is still a month away. There are far more cases than
can be treated in treatment centres, which is why there is now a push to supply families with kits
including gloves and disinfectant to treat cases at home. Hundreds
of thousands of such kits are now being distributed. These may not
save many lives of those already infected but if they can lower the
infection rate from live victims, and better equipped burial teams
can lower the infection rate from dead bodies, then it should be
possible to slow and then halt the spread of the virus.
Because we neglected to develop a vaccine when we had time, beds
on the ground will have to win this war, not high technology
or medical science.
Science can only be the back-up plan if the disease takes
root.
October 12, 2014
Bees and pesticides
My Times column on how banning neonicotinoid
pesticides is proving counter-productive for bees:
The European Union’s addiction to the
precautionary principle — which says in effect that the risks of
new technologies must be measured against perfection, not against
the risks of existing technologies — has caused many perverse
policy decisions. It may now have produced a result that has proved
so utterly foot-shooting, so swiftly, that even Eurocrats might
notice the environmental disaster they have created.
All across southeast Britain this autumn, crops of oilseed rape
are dying because of infestation by flea beetles. The
direct cause of the problem is the two-year ban on pesticides
called neonicotinoids brought in by the EU over British objections
at the tail end of last year. The ban was justified on the
precautionary ground that neonics might be causing the mass decline
of bees. There is, by the way, no mass decline of bees, as I shall
explain.
Neonics are primarily used as a seed dressing: seeds are soaked
in the chemical so that the plant grows up protected from pests and
— crucially — often does not need to be sprayed. The beauty of this
is that it targets pests, such as flea beetles, that eat the plant,
but not the bystanders such as other insects. In the laboratory,
bees exposed to high doses of neonics do indeed die or become
confused. So they should — that’s what the word “insecticide”
means.
Yet large-scale field studies and real world evidence
consistently demonstrate that rape pollen does not contain a high
enough dose to have an impact on bee colonies. The Department for
Environment, Food and Rural Affairs report on the subject concluded
that lab studies used to justify the EU ban severely overdosed
their bees and that bees are not affected by neonics under normal
conditions. Australian regulators claim that neonics have actually
improved the environment for bees by replacing older pesticides.
And in the US, the Department of Agriculture and the Environmental
Protection Agency have so far resisted calls to ban neonics for
much the same reason.
Even though there was literally no good science linking neonics
to bee deaths in fields, they were banned anyway for use on
flowering crops in Europe. Friends of the Earth, which lobbied for
the ban, opined that this would make no difference to farmers. Dave
Goulson, a bee activist and author of a fine book on bumblebees
called A Sting in the Tale, was widely quoted as
saying that farmers were wasting their money on neonics anyway;
though how he knew this was not clear. Presumably he thinks farmers
are stupid.
Well, the environmentalists were wrong. The loss of
the rape crop this autumn is approaching 50 per cent in
Hampshire and not much less in other parts of the country.
Farmers in Germany, the EU’s largest producer of rape, are also
reporting widespread damage. Since rape is one of the main flower
crops, providing huge amounts of pollen and nectar for bees, this
will hurt wild bee numbers as well as farmers’ livelihoods.
Farmers are instead reluctantly using pyrethroids. These older
insecticides are less effective against pests (flea beetles are
becoming resistant to them), more dangerous to other insects,
especially threatening to aquatic invertebrates when they seep into
streams and less safe to handle. So the result will be more insect
deaths. In a panic, Defra has just announced that it will allow the
use of two neonics, but — and here you have to laugh or you would
cry — both are
sprayed on the flowering crop, rather than used to dress seed!
So they definitely can harm bees.
The ban was brought in entirely to placate green lobby groups,
which have privileged and direct access to unelected European
officials in policymaking. They hotted up their followers, using
the misleading lab studies, to bombard politicians on the topic.
The former health commissioner, Tonio Borg, felt so inundated by
emails that he had to do something. Owen Paterson, as environment
secretary, received 85,000 emails to his parliamentary address
alone. Yet he warned colleagues that a ban was unjustified and
would be counterproductive. He was
right.
Back to bees. What decline? The number of honeybee hives in the
world is at a record high. The number in Europe is higher than it
was in the early 1990s when neonics were introduced. Hive mortality
in Britain was unusually low in the year before the neonic ban.
It’s a myth that honeybees are in dire straits.
That’s not to say beekeepers don’t have
problems. There was a severe problem eight years ago caused by
the mysterious colony collapse disorder — a phenomenon that has
happened throughout history and seems once again to have
disappeared. Greens tried to blame it on genetically modified
crops, but it happened in countries with no GM crops. The battle
against the varroa mite continues to be hard. A newly virulent
strain of tobacco ringspot virus has made the rare leap from
infecting plants to infecting bees.
What about wild bees, and bumblebees in particular? Having read
again and again of the terrible decline of bumblebees, I set out to
find some graphs or tables. I came away empty-handed. In Britain
some species contracted their ranges and some expanded during the
20th century. The specialist species seem to have suffered while
the generalists have thrived. But claims of a continuing fall in
the abundance of bumblebees over the past 20 years seem to be
entirely anecdotal.
As Dr Goulson recounts in his book, it’s hard to study bumblebee
nests because so many get destroyed by badgers. The huge expansion
of the badger population in recent years cannot have helped the
populations of their favourite prey.
Full disclosure: I have a farm. My oilseed rape is looking all
right this year, but the farmer is not happy at having to use
pyrethroids and nor am I. The local beekeeper is hopping mad about
the neonic ban, which he thinks has done more harm than good. And
he’s genuinely worried about a new threat to honeybees from the
small hive beetle, which is spreading in Italy, a major source of
honeybees and queens for Britain. Currently there is free movement
of potentially contaminated bees from Italy into the UK. In short,
nobody’s taking any precautions about the real threats.
October 9, 2014
Bitcoin and blockchain could transform the world
My Times column on who started bitcoin and what
it means:
Amid the hurly-burly of war, disease and politics,
you might be forgiven for not paying much attention to bitcoin, the
electronic form of money favoured by radical libertarians and drug
dealers. Yet it is possible that when the history of these days
comes to be written, bitcoin’s story will loom large. Unnoticed
except by the tech-obsessed, the technology behind bitcoin may be
slowly giving birth to a brave new world, with eventual
implications well beyond money.
So argues a new book (Bitcoin:
The Future of Money?) by the financial commentator and
comedian Dominic Frisby. He makes the case that it is just possible
that bitcoin and its rivals — known as altcoins — and the
“blockchain” technology that lies behind them have the potential to
spark a radical decentralisation of society itself. They could
change the way governments finance themselves, make banks redundant
and transform the ways companies are run. In the words of Jeff
Garzik, a bitcoin developer, bitcoin could be “the biggest thing
since the internet — a catalyst for change in all areas of our
lives”.
If he is right, then the founder of bitcoin will take his place
alongside the great inventors. So who is he? To this day he remains
carefully anonymous, and though Frisby makes a strong case for
having unmasked him, the man he has identified is scarcely more
visible than the disguise he uses. It is an alluring thought that
history could be changed anonymously.
Bitcoin went live in January 2009, on the day that the British
government announced a second bailout of the banks, an event
referred to in a segment of computer code hidden inside the first
bitcoins: it quoted a headline from The Times:
“Chancellor on brink of second bailout for banks”. That is
significant for two reasons. Satoshi Nakamoto, the pseudonym of
bitcoin’s founder, was clearly of the view that bitcoin’s purpose
was to replace a flawed system of banking and currency, and he was
also hinting that he was British and read The
Times.
Satoshi’s Britishness extends to his language, which uses
British phrases and spelling, and to the fact that his various
messages are always time-stamped in Greenwich Mean Time. In fact he
was doing his utmost not to sound like the American he is. The
timing of his postings on a bitcoin forum suggested he was either a
late riser on the US east coast or an early riser on the west
coast.
Meanwhile “cypherpunks” were a group of programmers who had come
together in Santa Cruz under the auspices of the computer pioneer
Tim May in 1992. Their aim was to undermine what they saw as
creeping government and corporate control over the nascent internet
by inventing methods of encryption that people could use. “Arise!
You have nothing to lose but your barbed wire fences,” May told
them. One of their first obsessions was a reliable form of
electronic cash, a substance that the economist Milton Friedman had
predicted would be the internet’s most important gift to
humanity.
Satoshi seems to have emerged from this group, which narrows
Frisby’s search. After analysing Satoshi’s writing style, typing
habits, computer coding skills and likely age, he eventually
concludes that the main and perhaps only person behind Satoshi is a
Californian with a degree in computer science and a doctorate in
law named Nick Szabo. Of course, Szabo has denied it on
Twitter.
Before his writing dried up around the time that bitcoin was
being created, Szabo wrote a long essay on the history of money,
called Shelling Out. In it he explored a
throwaway remark by the evolutionary biologist Richard Dawkins that
“money is a formal token of delayed reciprocal altruism”, and drew
upon other books in evolutionary psychology (including one of
mine).
What Szabo was after and what Satoshi achieved was to emulate
online the difficulty of mining precious metals. It requires vast
amounts of computing power to “mine” each bitcoin today — and
mining consists of the solving of massive mathematical puzzles by
hard computer grind. He also set out to emulate the trustworthiness
of paper money without a third party such as a bank or government
to verify it, which is the real genius of bitcoin.
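That "hard computer grind" is what engineers call proof of work. A toy version in Python shows the idea: finding a number whose hash meets an arbitrary target takes brute force, while checking a claimed answer takes a single hash. The parameters here are illustrative, not the real protocol's.

```python
# Toy proof-of-work: find a nonce whose SHA-256 hash of (data + nonce)
# starts with a run of zero digits. Real bitcoin mining is this idea
# at vastly greater difficulty; these parameters are illustrative.
import hashlib
import itertools

def mine(data: str, difficulty: int = 5) -> tuple[int, str]:
    """Return the first nonce giving `difficulty` leading hex zeros."""
    target = "0" * difficulty
    for nonce in itertools.count():
        digest = hashlib.sha256(f"{data}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce, digest

nonce, digest = mine("block payload")
print(f"nonce={nonce}  hash={digest[:16]}...")
# Finding a solution takes ~16**5 (about a million) tries on average;
# verifying it takes one hash. That asymmetry is what makes the
# ledger expensive to forge but cheap to check.
```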
Furthermore, the system is designed so it cannot produce more
than 21 million bitcoins, so debauching the currency by printing
money is impossible. The number that can be mined halves every four
years. Approximately 13 million have been mined so far, worth $6
billion today. Satoshi owns a large chunk of them and has not
cashed in lest it reveal his identity.
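The 21 million ceiling is not an arbitrary decree written into a constant; it falls out of the halving schedule as a geometric series. A minimal sketch of the arithmetic, using the protocol's actual published parameters (an initial reward of 50 bitcoins per block and a halving every 210,000 blocks, roughly four years at ten minutes a block):

```python
# Why bitcoin tops out near 21 million: total supply is the geometric
# series 210,000 * (50 + 25 + 12.5 + ...), truncated at the smallest
# unit (one satoshi, 1e-8 BTC).

BLOCKS_PER_HALVING = 210_000
reward = 50.0

total = 0.0
while reward >= 1e-8:
    total += BLOCKS_PER_HALVING * reward
    reward /= 2

print(f"Maximum possible supply: {total:,.2f} BTC")
# -> just under 21,000,000, which is where the famous cap comes from.
```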
How does bitcoin achieve these feats? Frankly, I don’t fully
understand it (I am not sure anybody outside computer science does)
and computer scientists seem unable to translate it into simple English. But in
very broad outline, bitcoin is in effect a public ledger — a
compendium of previous transactions, stored by bitcoin users all
over the world. To participate in mining bitcoins you create a new
block in that ledger and share it with others, sealed by a cryptographic hash.
This then makes bitcoin infallible as a register of who has
transferred value to whom: every bitcoin carries its ancestral
history in its code as verification that it is what it says it is.
No third party is involved.
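That "ancestral history" can be illustrated with a toy hash chain. This is a deliberately cartoonish sketch of the principle, not the real bitcoin data structure: each block's identifier depends on the previous block's identifier, so tampering with any old entry breaks every later link.

```python
# Toy hash chain: each block's ID is a hash of its contents plus the
# previous block's ID, so altering any old transaction invalidates
# every block after it. A cartoon of the principle, not real bitcoin.
import hashlib

def block_id(prev_id: str, transactions: str) -> str:
    return hashlib.sha256((prev_id + transactions).encode()).hexdigest()

chain = []
prev = "0" * 64                              # genesis placeholder
for txs in ["alice->bob:5", "bob->carol:2", "carol->dave:1"]:
    prev = block_id(prev, txs)
    chain.append((txs, prev))

for txs, bid in chain:
    print(f"{txs:15s} block {bid[:12]}...")
# Change "alice->bob:5" to "alice->bob:50" and every subsequent block
# ID changes too: that is how tampering is detected without a bank.
```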
A new currency is just one application of such an idea. The
blockchain’s special feature is that it cuts out the need for
somebody else to verify that something is what it says it is. This
opens the possibility of self-enforcing “smart contracts” and
“distributed autonomous organisations”, which the digital expert
Primavera de Filippi describes as networks that “once they have
been created and deployed on to the blockchain . . . no longer need
(nor heed) their creators”.
A simple and primitive example is Twister, a blockchain-based
social network with no corporate headquarters for a repressive
regime to shut down. The next step might be this: imagine in the
future summoning a taxi that is not only driverless but ownerless.
It belongs to a network that has raised funds, signed contracts and
taken delivery of vehicles autonomously, even though its
“headquarters” is distributed all over the net.
Perhaps you can now see why Satoshi Nakamoto would not want to
take credit for all this. Governments get jealous of people who
make them look irrelevant. Look what happened to Bernard von
NotHaus, who started openly selling tokens called “liberty dollars”
in 1998 to people who wanted a hedge against inflation. As the
economist Kevin Dowd recounts in a fine essay on the future of money, after nine years of
tolerating this, suddenly the US federal government arrested and
prosecuted him on the flimsiest of grounds for counterfeiting,
fraud and conspiracy. His real crime was to show the devaluing of
real dollars. No wonder Satoshi keeps his head down.
September 30, 2014
How we got to now
My review of Steven Johnson's book How We Got To Now appeared in the Times:
The meteorologist Edward Lorenz famously asked, in the title of
a lecture in 1972: “Does the flap of a butterfly’s wings in Brazil
set off a tornado in Texas?”, and the phrase “the butterfly effect”
entered the language. If Steven Johnson’s book How We
Got to Now catches on — and it deserves to — then the
“hummingbird effect” will also become common parlance.
Hummingbirds exist because flowers needed to find a way to
spread pollen over long distances, and they invented nectar to
attract insects. Birds were not part of the deal at all until much
later. That the evolutionary emergence of flowers would lead to a
radical redesign of the anatomy of some birds could not have been
foreseen.
Likewise, the history of human innovation is riddled with
examples of unexpected consequences of new technologies. As Johnson
tells it, Gutenberg made printed books cheap, which triggered a
rise in literacy, which created a market for spectacles, which led
to the invention of microscopes and telescopes, which led to the
discovery that the earth went round the sun. Then, during the
American Civil War, the Union blockade of southern ports led to a
shortage of ice, which created a market for the newly invented
refrigeration machine, which later enabled a man named Clarence
Birdseye to get very rich after inventing flash-frozen food. The
invention of the railway led to the standardisation of time. The
invention of flash photography led to campaigns for improving the
living conditions in New York tenements.
Johnson is one of the world’s best chroniclers of innovation and
in this book he brings a plethora of insights to the history of
glass, refrigeration, sound, hygiene, time and light. The
unintended consequences, for good and ill, that follow each
innovation form only one of these insights.
He points out that inventions are nearly always “ripe” or
inevitable in the sense that many people come up with the same idea
around the same time. The basic idea behind the light bulb, for
example, occurred to more than 20 different people; Edison proved
best at turning it into a business success mainly because he
understood that innovation is about bringing together different
ideas and skills.
From this Johnson then draws the conclusion that “the more we
build up vast repositories of scientific and technological
understanding, the more we conceal them”. For instance, your
ability to tell the time today depends on somebody understanding
how electrons circulate within cesium atoms; the knowledge of how
to send signals to satellites; the ability to trigger steady
vibrations in blocks of silicon dioxide; and much more. None of
which you need to know as you glance at your watch or
smartphone.
Johnson is a fluent writer and knows the value of telling
stories about people to bring history to life. Inventors make for a
rich cast of characters. This book is written to accompany a
television series, which is perhaps why it consists of a series of
discrete episodes, but they all illustrate similar themes, so the
whole hangs together well.
In the telling, the history of technology has tended to be the
poor relation of the history of science. Brilliant geniuses had
great ideas and clumsy tradesmen put them into action. Johnson is
one of a new breed of authors who are turning this upside down by
showing just how independent of science most innovation was. More
often than not it enabled science, rather than sprang from it.
And compared with political and military history, the history of
innovation is not just “one damned thing after another”; it
chronicles genuine, irreversible and magnificent changes in
society. Take the story of a New Jersey doctor named John Leal who
got a job managing water supplies for Jersey City and set out to do
something to make them safe. In secret, without permission and
against the law, he decided to try adding a strong poison called
calcium hypochlorite, a procedure known as “chlorination” today.
When dilute, it killed bacteria but not people. Fortunately he got
the dose right and nobody died. Interrogated in court, he adamantly
insisted that his experiment had worked, that Jersey City’s water
was now the safest in the world and that he was not in it for the
money: his refusal to patent it led to the adoption of chlorination
all over the world. The court agreed and exonerated him of
wrongdoing.
The impact of Leal’s innovation was extraordinary. Between 1900
and 1930 chlorination cut total mortality in the average American
city by 43 per cent and infant mortality by 74 per cent. Almost
nothing has done more to reduce misery. Chlorination went on to
make swimming pools safe and popular, which led, Johnson argues, to
changes in fashion, reinventing attitudes towards how much of the
shape of the female body could be revealed in polite society. A
hummingbird effect.
September 29, 2014
English devolution
My Times column on English devolution following
the Scottish independence referendum:
As part of the 1 per cent of England’s population
that lives north of Hadrian’s Wall, I have found the past few weeks
more than usually intriguing. It was fascinating to find that
nearly everybody in the media seems to think the wall is the
Scottish border; some news takes 1,500 years to reach the
metropolis. And we northeasterners have been banging on for decades
about the unfairness of the Barnett formula, which guarantees
£1,600 extra in public spending per Scottish head per year, so it’s
nice to see the rest of England waking up to that one, too.
Labour needs to be reminded of its biggest electoral defeat. Ten
years ago, almost to the day, the northeast was asked by John
Prescott if it wanted an assembly and it said “no” in the most
emphatic way imaginable — by 78 per cent to 22 per cent in a
referendum. That’s not a landslide, that’s an entombment.
Labour’s attempt to squirm off the hook — on which the prime
minister impaled it last week with his call for English home rule —
will probably include giving the regions more power. That’s what
Brussels wants too (and therefore the Liberal Democrats): the
European Union’s notorious map of a future in which power lies at
the European and regional level does not recognise England as a
region. Only by breaking down England into fragments does the
nation’s disproportionate size become compatible with
federalism.
Yet if anywhere in England should feel ripe for semi-detached
regional devolution it would be us in the northeast. We are as
chippy as they come about southern condescension, we live farther
from London than any other English people, our cities are so
isolated by sheep-infested hills from other English cities they
might as well be on an island. We speak a patois that southerners
claim to find impenetrable, our patriotic regional anthems are
about a fictional bus crash and a large worm, we wear very little
on a Friday night and we spent hundreds of years joining any doomed
rebellion against the crown that was on offer.
Nonetheless in 2004 the people of the northeast spoke with one
voice, or at least by a margin of almost four to one, against the
idea of a regional assembly. Why? Because, although they like
localism, they feel loyalty to England rather than any artificial
entity called the northeast. The inhabitants of Sunderland or
Berwick or Stockton have less than no desire to be governed from
Newcastle. In 2004 they knew a bureaucratic white elephant when
they saw one.
The Labour party and the European Commission (and the Liberal
Democrats for that matter) just do not get this. Any plan to
imitate feckless Scottish or Welsh semi-detachment with gleaming
new buildings to house self-important “assemblies” in Newcastle,
Birmingham, Norwich, Bristol and Liverpool will go against the
grain of England. The one in Cardiff, cut off from much of Wales by
miles of sheep and gorse, was put there by just one in four Welsh
voters. It has developed a reputation for incompetence where it is
relevant at all.
For the Conservatives, the penny has now dropped that English
devolution means English votes on English laws inside the Palace of
Westminster. I’ve never understood why people find the West Lothian
question so hard. We solve it every day in practice: British
ministers and civil servants already have no powers over Scottish
education, Scottish agricultural subsidies, Scottish health service
priorities, Scottish sentencing policy. It works fine.
It’s perfectly possible to exclude Scottish MPs from voting and
speaking on these and many other matters, too. There will be the
odd moment of confusion when an inebriated Glaswegian MP wanders
into the wrong committee debate, but so what? No need to build an
over-budget, ugly building in Sheffield and fill it with jobsworth
“EMPs”.
Sure, there would be a constitutional crisis if a Labour prime
minister were elected with a British but not an English majority,
and found himself regularly outvoted, but we have a well-tried
solution to such crises — a temporary coalition with another party
or a vote of confidence and another general election.
One big advantage of more democratic decision making at the
level of the four nations would be to encourage competition between
them in tax policy and in the provision of services. We are already
seeing glimmers of this in the effect of Wales’s poor and declining
results in international school league tables as well as its
underperforming health service.
Northern Ireland offers an illuminating example of how
devolution should work. Whereas the Scots have refused to use the
tax-varying powers they already have, Stormont may be on the brink
of leading the way.
Seeing how the Irish Republic’s dramatic cut in corporation tax
to 12.5 per cent attracted businesses, Owen Paterson, when Northern
Ireland secretary, argued for a similar cut in Northern Ireland. He
persuaded all parties there to back the idea and got the Treasury
on board by suggesting that it knock the corporation tax income off
the province’s central government block grant, making the change
revenue-neutral as far as Whitehall was concerned.
The change should happen soon. That is a key lesson for how to
do real devolution as opposed to the spend-and-whine version
favoured by the Scottish nationalists. Mr Paterson argues that
devolution must restore the link between tax, services and votes.
England’s antipathy to regionalism need not preclude more
localism.
Proper financial accountability at the level of the county,
rural or metropolitan, would transform local democracy and attract
better councillors. Single-tier counties (many of which are bigger
than some American states) would start to compete on price or on
quality of service instead of competing, as they do now, on their
ability to extract largesse from central government. We should
emulate the way America uses state government as a laboratory to
test policy.
Of course, there is a heck of a lot that counties (and nations)
would not be allowed to do by Brussels: compete on VAT, abolish
agricultural subsidies and so forth. But at least we would flush
this out. At the moment nobody realises just how many of the
decisions that politicians pretend to take are in fact handed down
by unelected Eurocrats to unelected Sir Humphreys with a token nod
through parliament. Genuine English home rule would soon clash with
the technocratic version offered by Brussels. Another reason to
like it.
September 25, 2014
The ozone hole was exaggerated as a problem
My recent Times column argued that the alleged healing
of the ozone layer is exaggerated, but so was the impact of the
ozone hole over Antarctica:
The ozone layer is healing. Or so said the news
last week. Thanks to a treaty signed in Montreal in 1987 (in force since 1989) to get rid
of refrigerant chemicals called chlorofluorocarbons (CFCs), the
planet’s stratospheric sunscreen has at last begun thickening
again. Planetary disaster has been averted by politics.
For reasons I will explain, this news deserves to be taken with
a large pinch of salt. You do not have to dig far to find evidence
that the ozone hole was never nearly as dangerous as some people
said, that it is not necessarily healing yet and that it might not
have been caused mainly by CFCs anyway.
The timing of the announcement was plainly political: it came on
the 25th anniversary of the treaty’s entry into force, and just before a big United
Nations climate conference in New York, the aim of which is to push
for a climate treaty modelled on the ozone one.
Here’s what was actually announced last week, in the words of a
Nasa scientist, Paul Newman: “From 2000 to 2013, ozone levels
climbed 4 per cent in the key mid-northern latitudes.” That’s a
pretty small change and it is in the wrong place. The ozone
thinning that worried everybody in the 1980s was over
Antarctica.
Over northern latitudes, ozone concentration has been falling by
about 4 per cent each March before recovering. Over Antarctica,
since 1980, the ozone concentration has fallen by
40 or 50 per cent each September before the sun rebuilds
it.
So what’s happening to the Antarctic ozone hole? Thanks to a
diligent blogger named Anthony Watts, I came across a press release
also from Nasa about nine months ago, which said: “
Two new studies show that signs of recovery are not yet
present, and that temperature and winds are still driving any
annual changes in ozone hole size.”
As recently as 2006, Nasa announced, quoting Paul Newman again,
that the Antarctic ozone hole that year was “the largest ever
recorded”. The following year a paper in Nature
magazine from Markus Rex, a German scientist, presented new
evidence that suggested CFCs may be responsible for less than 40
per cent of ozone destruction anyway. Besides, nobody knows for
sure how big the ozone hole was each spring before CFCs were
invented. All we know is that it varies from year to year.
How much damage did the ozone hole ever threaten to do anyway?
It is fascinating to go back and read what the usual
hyperventilating eco-exaggerators said about ozone thinning in the
1980s. As a result of the extra ultraviolet light coming through
the Antarctic ozone hole, southernmost parts of Patagonia and New
Zealand see about 12 per cent more UV light than expected. This
means that the weak September sunshine, though it feels much the
same, has the power to cause sunburn more like that of latitudes a
few hundred miles north. Hardly Armageddon.
The New York Times reported “an increase
in Twilight Zone-type reports of sheep and
rabbits with cataracts” in southern Chile. Not to be outdone, Al
Gore wrote that “hunters now report finding blind rabbits;
fisherman catch blind salmon”. Zoologists briefly blamed the near
extinction of many amphibian species on thin ozone.
Melanoma in people was also said to be on the rise as a
result.
This was nonsense. Frogs were dying out because of a fungal
disease spread from Africa — nothing to do with ozone. Rabbits and
fish blinded by a little extra sunlight proved to be as mythical as
unicorns. An eye disease in Chilean sheep was happening outside the
ozone-depleted zone and was caused by an infection called pinkeye —
nothing to do with UV light. And melanoma incidence in people
actually levelled out during the period when the
ozone got thinner.
Then remember that the ozone hole appears when the sky is dark
all day, and over an uninhabited continent. Even if it persists
into the Antarctic spring and spills north briefly, the hole lets
through one-fiftieth as much ultraviolet light as would hit your skin
at sea level on the equator (let alone at high altitude in the
tropics). So it would be bonkers to worry about UV as you sailed
round Cape Horn in spring, say, but not when you stopped at the
Galapagos: the skin cancer risk is 50 times higher in the latter
place.
This kind of eco-exaggeration has been going on for 50 years. In
the 1960s Rachel Carson said there was an epidemic of childhood
cancer caused by DDT; it was not true — DDT had environmental
effects but did not cause human cancers.
In the 1970s the Sahara desert was said to be advancing a mile a
year; it was not true — the region south of the Sahara has grown
markedly greener and more thickly vegetated in recent decades.
In the 1980s acid rain was said to be devastating European
forests; not true — any local declines in woodland were caused by
pests or local pollution, not by the sulphates and nitrates in
rain, which may have contributed to an actual increase in the
overall growth rate of European forests during the decade.
In the 1990s sperm counts were said to be plummeting thanks to
pollution with man-made “endocrine disruptor” chemicals; not true —
there was no fall in sperm counts.
In the 2000s the Gulf Stream was said to be failing and
hurricanes were said to be getting more numerous and worse, thanks
to global warming; neither was true, except in a Hollywood
studio.
The motive for last week’s announcement was to nudge world
leaders towards a treaty on climate change by reminding them of how
well the ozone treaty worked. But getting the world to agree to
cease production of one rare class of chemical, for which
substitutes existed, and which only a few companies mainly in rich
countries manufactured, was a very different proposition from
setting out to decarbonise the whole economy, when each of us
depends on burning carbon (and hydrogen) for almost every product,
service, meal, comfort and journey in our lives.
The true lesson of the ozone story is that taking precautionary
action on the basis of dubious evidence and exaggerated claims
might be all right if the action does relatively little economic
harm.
However, loading the entire world economy with costly energy,
and new environmental risks based on exaggerated claims about what
might in future happen to the climate makes less sense.
September 7, 2014
Whatever happened to global warming?
My op-ed in the Wall Street Journal addresses the
latest explanations for the "pause" in global warming and their
implications. I have responded to an ill-informed critique of the
article below.
On Sept. 23 the United Nations will host a party for world
leaders in New York to pledge urgent action against climate change.
Yet leaders from China, India and Germany have already announced
that they won't attend the summit and others are likely to follow,
leaving President Obama looking a bit lonely. Could it be that they
no longer regard it as an urgent threat that some time later in
this century the air may get a bit warmer?
In effect, this is all that's left of the global-warming
emergency the U.N. declared in its first report on the subject in
1990. The U.N. no longer claims that there will be dangerous or
rapid climate change in the next two decades. Last September,
between the second and final draft of its fifth assessment report,
the U.N.'s Intergovernmental Panel on Climate Change
quietly downgraded the warming it
expected in the 30 years following 1995, to about 0.5 degrees
Celsius from 0.7 (or, in Fahrenheit, to about 0.9 degrees, from
1.3).
Even that is likely to be too high. The climate-research
establishment has finally admitted openly what skeptic scientists
have been saying for nearly a decade: Global warming has been at a
standstill since shortly before this century began.
First the climate-research establishment denied that a pause
existed, noting that if there was a pause, it would invalidate
their theories. Now they say there is a pause (or "hiatus"), but
that it doesn't after all invalidate their theories.
Alas, their explanations have made their predicament worse by
implying that man-made climate change is so slow and tentative that
it can be easily overwhelmed by natural variation in temperature—a
possibility that they had previously all but ruled out.
When the climate scientist and geologist Bob Carter of James
Cook University in Australia wrote an article in 2006 saying that
there had been no global warming since 1998 according to the most
widely used measure of average global air temperatures, there was
an outcry. A year later, when David Whitehouse of the Global
Warming Policy Foundation in London made the same point, the environmentalist
and journalist Mark Lynas said in the New Statesman that
Mr. Whitehouse was "wrong, completely wrong," and was
"deliberately, or otherwise, misleading the public."
We know now that it was Mr. Lynas who was wrong. Two years
before Mr. Whitehouse's article, climate scientists were already
admitting in emails among themselves that
there had been no warming since the late 1990s. "The scientific
community would come down on me in no uncertain terms if I said the
world had cooled from 1998," wrote Phil Jones of the University of
East Anglia in Britain in 2005. He went on: "Okay it has but it is
only seven years of data and it isn't statistically
significant."
If the pause lasted 15 years, they conceded, then it would be so
significant that it would invalidate the climate-change models upon
which policy was being built. A report from the National
Oceanic and Atmospheric Administration (NOAA) written in 2008 made
this clear: "The simulations rule out (at the 95% level) zero
trends for intervals of 15 yr or more."
Well, the pause has now lasted for 16, 19 or 26 years—depending
on whether you choose the surface temperature record or one of two
satellite records of the lower atmosphere. That's according to a
new statistical calculation by Ross McKitrick,
a professor of economics at the University of Guelph in Canada.
It has been roughly two decades since there was a trend in
temperature significantly different from zero. The burst of warming
that preceded the millennium lasted about 20 years and was preceded
by 30 years of slight cooling after 1940.
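The statistical question behind those numbers, whether a fitted temperature trend differs significantly from zero, can be shown with a toy test. This sketch uses synthetic data and a naive ordinary-least-squares confidence interval; McKitrick's actual calculation is more careful, in particular about autocorrelation in the temperature series.

```python
# Toy version of "is the trend significantly non-zero?": fit a line
# by OLS and ask whether zero lies inside rough 95% bounds.
# Synthetic data only; the published calculations are more rigorous.
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1997, 2014)
temps = 0.002 * (years - 1997) + rng.normal(0, 0.1, years.size)

slope, intercept = np.polyfit(years, temps, 1)
resid = temps - (slope * years + intercept)
se = np.sqrt(resid.var(ddof=2) / ((years - years.mean()) ** 2).sum())

lo, hi = slope - 2 * se, slope + 2 * se
print(f"Trend {slope:+.4f} C/yr, ~95% interval [{lo:+.4f}, {hi:+.4f}]")
print("Significantly non-zero?", not (lo <= 0 <= hi))
# With a small true trend buried in noise, the interval typically
# straddles zero: "no trend significantly different from zero".
```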
This has taken me by surprise. I was among those who thought the
pause was a blip. As a "lukewarmer," I've long thought that
man-made carbon-dioxide emissions will raise global temperatures,
but that this effect will not be amplified much by feedbacks from
extra water vapor and clouds, so the world will probably be only a
bit more than one degree Celsius warmer in 2100 than today. By
contrast, the assumption built into the average climate model is
that water-vapor feedback will treble the effect of carbon
dioxide.
But now I worry that I am exaggerating, rather than
underplaying, the likely warming.
Most science journalists, who are strongly biased in favor of
reporting alarming predictions rather than neutral facts, chose to
ignore the pause until very recently, when there were explanations
available for it. Nearly 40 different excuses for the pause have
been advanced, including Chinese economic growth that supposedly
pushed cooling sulfate particles into the air, the removal of
ozone-eating chemicals, an excess of volcanic emissions, and a
slowdown in magnetic activity in the sun.
The favorite explanation earlier this year was that strong trade
winds in the Pacific Ocean had been taking warmth from the air and
sequestering it in the ocean. This was based on a few sketchy
observations, suggesting a very tiny change in water temperature—a
few hundredths of a degree—at depths of up to 200 meters.
Last month two scientists wrote in Science that they had instead
found the explanation in natural fluctuations in currents in the
Atlantic Ocean. For the last 30 years of the 20th century, Xianyao
Chen and Ka-Kit Tung suggested, these currents had been boosting
the warming by bringing heat to the surface, then for the past 15
years the currents had been counteracting it by taking heat down
deep.
The warming in the last three decades of the 20th century, to
quote the news release that accompanied
their paper, "was roughly half due to global warming and half to
the natural Atlantic Ocean cycle." In other words, even the modest
warming in the 1980s and 1990s—which never achieved the 0.3 degrees
Celsius per decade necessary to satisfy the feedback-enhanced
models that predict about three degrees of warming by the end of
the century—had been exaggerated by natural causes. The man-made
warming of the past 20 years has been so feeble that a shifting
current in one ocean was enough to wipe it out altogether.
Putting the icing on the cake of good news, Xianyao Chen and
Ka-Kit Tung think the Atlantic Ocean may continue to prevent any
warming for the next two decades. So in their quest to explain the
pause, scientists have made the future sound even less alarming
than before. Let's hope that the United Nations admits as much on
day one of its coming jamboree and asks the delegates to pack up,
go home and concentrate on more pressing global problems like war,
terror, disease, poverty, habitat loss and the 1.3 billion people
with no electricity.
Post-script. After the article was published, an
astonishing tweet was sent by the prominent economist Jeffrey Sachs
saying
"Ridley climate ignorance in WSJ
today is part of compulsive lying of Murdoch media gang. Ridley
totally misrepresents the science."
Curious to know how I had lied or "totally misrepresented" the
science, I asked Sachs to explain. There was a deafening
silence.
There then appeared at the Huffington Post (a media outlet owned
by a person with strong views, by the way) an article under Sachs's
name. Its style was quite unlike that of Sachs, and strongly
resembled the style and debating technique of a spin doctor
employed by Lord Stern at the London School of Economics, who
writes to newspapers furiously denouncing the author of any article
on climate change that he does not like. Indeed that same spin
doctor, Bob Ward, alerted me to the Huff Post article in a tweet.
The piece purported to -- in the spin doctor's words -- expose
"The Wall Street Journal Parade of
Climate Lies - @JeffDSachs destroys daft
@mattwridley article
in @WSJ".
However, it does nothing of the sort. It is all bluster and
careful misdirection; it contradicts nothing in my article, let
alone produces evidence of lies. Paragraph by paragraph, I
will expose its daftness, which truly shocked me given that I had
respect for Jeffrey Sachs as a scholar before reading this. Here
are the key paragraphs:
Ridley's "smoking gun" is a paper
last week in Science Magazine by two scientists
Xianyao Chen and Ka-Kit Tung, which Ridley somehow believes refutes
all previous climate science. Ridley quotes a sentence fragment
from the press release suggesting that roughly half of the global
warming in the last three decades of the past century (1970-2000)
was due to global warming and half to a natural Atlantic Ocean
cycle. He then states that "the man-made warming of the past 20
years has been so feeble that a shifting current in one ocean was
enough to wipe it out altogether," and "That to put the icing on
the case of good news, Xianyao Chen and Ka-Kit Tung think the
Atlantic Ocean may continue to prevent any warming for the next two
decades."
Notice the quote marks around "smoking gun", implying that I
used the phrase. I did not. In any case, the Chen and Tung paper
was only one of the pieces of evidence I cited.
The Wall Street
Journal editors don't give a hoot about the nonsense they
publish if it serves their cause of fighting measures to limit
human-induced climate change. If they had simply gone online to
read the actual paper, they would have found that the paper's
conclusions are the very opposite of Ridley's.
In his writing the real Mr Sachs does not often use phrases like
"don't give a hoot".
In any case, he's plain wrong about the contradiction. The quote
I gave from the press release is accurate. And I have read the
paper and can assure Mr "Sachs" that its conclusions are not the
opposite of what I have said. As further confirmation, how about
asking the paper's lead author himself? This is what he wrote to Professor Judith Curry in response
to her questions:
Dear Judy,
The argument on the roughly
50-50 attribution of the forced vs unforced warming for the last
two and half decades of the 20th century is actually quite simple.
If one is blaming internal variability for canceling out the
anthropogenically forced warming during the current hiatus, one
must admit that the former is not negligible compared to the
latter, and the two are probably roughly of the same magnitude.
Then when the internal cycle is of the different sign in the latter
part of the 20th century, it must have added to the forced
response. Assuming the rate of forced warming has not changed
during the period concerned, then the two combined must be roughly
twice the forced warming during the last two and half decades of
the 20th century.
In other words, as I said, the warming of 1975-2000 was only
half caused by man-made emissions and half by natural causes, and
natural causes were enough to cancel man-made forcing in the years
after 2000.
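The algebra of that argument is simple enough to restate in a few lines of toy arithmetic. The numbers below are invented (my notation, not the paper's); only the proportions matter.

```python
# A toy restatement of Tung's argument. F is a steady man-made warming
# rate and N the amplitude of the natural Atlantic cycle, both in
# hypothetical degrees C per decade.
F = 0.08
N = F                    # the hiatus (F - N ~ 0) implies N is about as big as F

hiatus_rate = F - N      # cycle opposing the forcing: ~0, i.e. the pause
late_20th_rate = F + N   # cycle reinforcing the forcing: ~2F
forced_share = F / late_20th_rate

print(hiatus_rate)       # 0.0  -> no net warming after 2000
print(late_20th_rate)    # 0.16 -> roughly twice the forced rate
print(forced_share)      # 0.5  -> the "roughly 50-50" attribution
```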
To continue with the "Sachs" article:
First, the paper makes perfectly
clear that the Earth is warming in line with standard climate
science, and that the Earth's warming is unabated
in recent years. In the scientific lingo of the paper (its very
first line, so Ridley didn't have far to read!), "Increasing
anthropogenic greenhouse-gas-emissions perturb Earth's radiative
equilibrium, leading to a persistent imbalance at the top of the
atmosphere (TOA) despite some long-wave radiative adjustment." In
short, we humans are filling the atmosphere with carbon dioxide
from fossil-fuel use, and we are warming the planet.
Mr "Sachs" did not have far to read in my own article to find
this is in complete agreement with what I wrote also:
I've long thought that
man-made carbon-dioxide emissions will raise global temperatures,
but that this effect will not be amplified much by feedbacks from
extra water vapor and clouds, so the world will probably be only a
bit more than one degree Celsius warmer in 2100 than
today.
Instead of using words like "unabated", why not give numbers? I
did.
The warming during 1975-2000, even if you cherry-pick the end
points, was about 0.4 degrees C if you average the five main global
data sets, and if half of that was natural, then man-made forcing
was going at a rate of less than 1 degree per century, rather
less than what I said.
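For what it is worth, the back-of-envelope arithmetic behind that figure, using only the numbers quoted in the paragraph above:

```python
# Checking the arithmetic: inputs are the figures quoted in the text.
total_warming = 0.4       # degrees C over 1975-2000, five-dataset average
years = 25
natural_fraction = 0.5    # Chen and Tung's roughly 50-50 attribution

manmade_rate = total_warming * (1 - natural_fraction) / years  # deg C per year
print(f"{manmade_rate * 100:.1f} degrees C per century")       # 0.8 -> less than 1
```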
Second, the total warming is
distributed between the land and ocean surface on the one hand and
the ocean deep water on the other. The total rise of ocean heat
content has continued unabated, while the proportion of heat
absorbed at the surface and in the deeper ocean varies over time.
Again, in the scientific lingo of the paper, "[T]his forced total
OHC [ocean heat content] should be increasing monotonically over
longer periods even through the current period of slowed warming.
In fact, that expectation is verified by observation ...". In other
words, the ocean has continued to warm in line with predictions of
just such a phenomenon seen in climate models.
This is highly misleading. Yes, as I clearly stated in my
article, the ocean could start to transfer heat to the air again.
So the quote from the paper does not contradict me at all. In any
case, remember, the data on ocean heat content is highly ambiguous.
As Judith Curry summarised it recently:
The main issue of interest is
to what extent can ocean heat sequestration explain the hiatus
since 1998. The only data set that appears to provide support
for ocean sequestration is the ocean reanalysis, with the Palmer
and Domingues 0-700 m OHC climatology providing support for
continued warming in the upper ocean.
All in all, I don’t see a very
convincing case for deep ocean sequestration of heat. And
even if the heat from surface heating of the ocean did make it into
the deep ocean, presumably the only way for this to happen involves
mixing (rather than adiabatic processes), so it is very difficult
to imagine how this heat could reappear at the surface in light of
the 2nd law of thermodynamics.
Back to the Sachs article:
Third, it is the "vertical
distribution" of the warming, between the surface and deep water,
which affects the warming observed on land and at the sea surface.
The point of the paper is that the allocation of the warming
vertically varies over time, sometimes warming the surface rapidly,
other times warming the deeper ocean to a great extent and the
surface water less rapidly. According to the paper, the period of
the late 20th century was a period in which the surface was warmed
relative to the deeper ocean. The period since 2000 is the
opposite, with more warming of the deeper ocean. How do the
scientists know? They measure the ocean temperature at varying
depths with a sophisticated system of "Argo profiling floats,"
which periodically dive into the ocean depths to take temperature
readings and resurface to transmit them to the data centers.
I have no problem with this paragraph, which merely reiterates
what I said about the Chen and Tung paper, with a bit more detail
about the Argo floats etc.
So, what is Ridley's "smoking gun"
when you strip away his absurd version of the paper? It goes like
this. The Earth is continuing to warm just as greenhouse gas theory
holds.
Check, I agree. But the atmosphere is not continuing to warm
right now.
The warming heats the land and the
ocean. The ocean distributes some of the warming to the surface
waters and some to the deeper waters, depending on the complex
circulation of ocean waters.
Check. Could not have said it better myself.
The shares of warming of the surface
and deeper ocean vary over time, in fluctuations that can last a
few years or a few decades.
Check.
Where's the contradiction with what I wrote? There is none. If
Mr "Sachs" had bothered to read my article properly, he would find
that his description of what is happening is pretty well exactly
the same as mine. Except that he gives no numbers. What I did was
to show that if Chen and Tung are right, and half the warming in the
last part of the last century was natural, then the "rapid" warming
of those three decades was still too slow for the predictions made
by the models and, if it resumes, will give us a not very alarming
future. And if it does not resume for some time, as Chen and Tung
speculate that it might not, then the future is even less
alarming.
And no, again, I did not use the phrase "smoking gun". I used
several other arguments, all of which Mr "Sachs" fails to address
at all, so presumably he agrees that there has been a "pause", that
it was denied for many years by the climate establishment, that
there was general agreement among them that a pause of more than 15
years would invalidate their models, and so on.
He goes on:
If the surface warming is somewhat
less in recent years than in the last part of the 20th century, is
that reason for complacency? Hardly. The warming is continuing, and
the consequences of our current trajectory will be devastating
unless greenhouse gas emissions (mainly carbon dioxide) are stopped
during this century. As Chen and Tung conclude in their Science
paper, "When the internal variability [of the ocean] that is
responsible for the current hiatus [in warming] switches sign, as
it inevitably will, another episode of accelerated global warming
should ensue."
I hardly think it was complacent of me to ask world leaders to
address the much more urgent issues of war, terror, disease,
poverty, habitat loss and the 1.3 billion people with no
electricity.
Again, I said that warming may well resume. The only
disagreement is whether it will be devastating, and that is a
prediction, not an empirical fact. I cannot yet be "wrong" about
it.
When, oh when, will Mr "Sachs" get around to including a number,
any number? He surely cannot be under the impression that
lukewarmers like me think there is no greenhouse effect? He surely
knows that the argument is not about whether there is warming, but
how fast.
And where did I lie, or misrepresent? Where, Mr Ward, did he
"destroy" me, pray? He did not.
Mr "Sachs", who is usually a careful academic, has published a
lot of wild accusations against me and "totally" (his word) failed
to stand them up. How did this come about? Perhaps, being a busy
man, he asked somebody else to ghost-write much of the piece for
him and did not check it very thoroughly. If so, no problem: a
quick tweet apologising to me, admitting that nothing in his
article contradicts anything in mine and that we merely disagree on
the predictions of dangerous warming, and I will consider the
matter closed.
September 4, 2014
Government begins as a monopoly on violence
My Times column last week was on the historical
roots of government:
Nobody seems to agree whether Islamic State is
best described as a gang of criminals, a terrorist organisation or
a religious movement. It clearly has a bit of all three. But don’t
forget that it aspires, for better or worse, to be a government. A
brutal, bigoted and murderous government, its appeal is at least
partly that it seems capable of imposing its version of “order” on
the territory it controls, however briefly. It reminds us that the
origin and defining characteristic of all government is that it is
an organisation with a monopoly on violence.
The deal implicit in being governed is at root a simple one: we
allow the people who govern us to have an exclusive right to commit
violence, so long as they direct it at other countries and at
criminals. In almost every nation, if you go back far enough,
government began as a group of thugs who, as Pope Gregory VII put it in 1081, “raised themselves up above
their fellows by pride, plunder, treachery, murder — in short by
every kind of crime”.
Was Canute, or William the Conqueror, or Oliver Cromwell really
much different from the Islamic State? They got to the top by
violence and then violently dealt with anybody who rebelled. The
American writer Albert Jay Nock in 1939 observed: “The idea that
the state originated to serve any kind of social purpose is
completely unhistorical. It originated in conquest and confiscation
— that is to say, in crime . . . No state known to history
originated in any other manner, or for any other purpose.”
Henry VII, the monarch who managed, after a century of gang
warfare, to establish a monopolistic central government in England,
funded his administration largely by extorting money from rich
merchants with the threat of violence. That is to say, he ran a
protection racket as blatant as any mafia don or IRA commander: pay
up or lose your kneecaps.
Organised crime tends to evolve into government; and government
begins as organised crime. The mafia emerged as a way of imposing
order on a Sicilian society racked by arbitrary violence. People
turn to gangs for protection.
A fascinating new book by the King’s College London political
economist David Skarbek, The Social Order of the Underworld,
documents how the huge expansion of the American prison population
in the 1970s led to the breakdown of the simple “convict code” by
which prisoners tended to keep each other under control, and
brought instead, within a few years, the emergence of prison gangs
in 30 different prisons.
Gangs proliferated in prisons, he argues, because they imposed a
form of rudimentary governance that suppressed violence, increased
trade in drugs and other goods, lowered prices and generally
improved the inmates’ and prison officers’ lives — so long as
nobody crossed or cheated the gang leaders.
The same thing has not yet happened in women’s prisons, which
are still much smaller. Skarbek thinks this echoes what happened in
early civilisations, where the simple, interpersonal norms that
kept people relatively nice in hunter-gatherer society stopped
working when society passed a certain scale, and government was
invented by whichever gang managed to impose a monopoly of
violence.
Caesar Augustus emerged from decades of civil war and mayhem as
the man with the monopoly on violence in Rome. As Ian Morris
observes in his book War: What is it Good For?, there was a
paradoxical logic at work in the Pax Romana that he achieved.
Because everybody knew that Augustus could send in the legions, he
almost never had to.
Thus, the state’s monopoly of violence fades from view if all
goes well. Should Islamic State’s caliphate endure, it is a fair
bet that it will eventually become a bureaucracy peopled with
functionaries rather than assassins. That is what happened in the
original caliphate, after all. But the monopoly of violence is
always there even in western society. If you doubt it, try not
paying your taxes and see what happens when you resist arrest.
One of the great peculiarities of the United States is that it
never quite managed to impose a state monopoly on powerful
weaponry. The right to bear arms was a reaction to the presence of
redcoats as an occupying army before 1783. The government got to
own the tanks and aircraft carriers, but never pointed them at its
own people, who were allowed to own guns much more freely than in
other countries.
This is what makes the kit that the police displayed in
Ferguson, Missouri, this month so alarming. With their camouflage
uniforms, armoured vehicles and heavy-calibre machine guns, “law
enforcement” cops looked less like a constabulary and more like an
occupying army. In recent years, largely by exploiting the “war” on
terror and the “war” on drugs, the American police have indeed been
radically militarised.
In 2013 the United States Department of Homeland Security set out to buy 1.6 billion rounds of
ammunition for law enforcement, some of it hollow-point — that is
to say, forbidden by international law for use in war. That’s
enough to shoot the entire population five times over. The US
government has armed many of its agencies, from the
Social Security Administration to the Internal Revenue Service to
the Department of Education to the Bureau of Land Management, even
the National Oceanic and Atmospheric Administration.
The Republican senator Rand Paul commented in Time magazine that
the federal government had incentivised the militarisation of local
police, funding municipal governments to “build what are
essentially small armies”. Evan Bernick, of the Heritage
Foundation, warned last year that “the Department of Homeland
Security has handed out anti-terrorism grants to cities and towns
across the country, enabling them to buy armoured vehicles, guns,
armour, aircraft”. The Pentagon actually donates military equipment
to the police, including tanks.
We have not yet gone so far in this country. Ofsted and the Met
Office — as far as I know — do not yet arm their inspectors and
forecasters. But the days when the state’s monopoly on violence was
merely hinted at by a policeman’s uniform are long gone. You see
police with sub-machineguns everywhere, and the Met is about to
purchase water cannon to keep us in order. I hope that in combating
violent gangs, our governments do not themselves turn back into
violent gangs.