Matt Ridley's Blog

January 7, 2015

Digital government begins

In December, I omitted to post my Times column on
government IT and digital policy:



 



The travel chaos last Friday was a reminder of
just how much life depends on Big Software doing its job. The
air-traffic control centre at Swanwick was six years late and
hundreds of millions over budget when it opened in 2002 in shiny
new offices, but with software still based on an upgraded, old
system. Unnoticed and unsung, however, this government may actually
have found a way to bring the horrid history of big, public IT projects to
an end.



Remember the chaotic launch of Obamacare in October 2013, when only
one in five users could access the healthcare.gov website? Or the NHS
patient record system, abandoned last year after costing the taxpayer
nearly £10 billion? Likewise, the BBC wrote off £100 million last year
after five years of failing to make its “digital media initiative”
work. It’s equally bad in many big companies: McKinsey found in 2012
that 17 per cent of IT projects budgeted at more than $15 million fail
so badly that they threaten the company’s very existence.



There is something uniquely troublesome about big IT projects.
So it may surprise you to hear that I think a genuine success story
of this government is that it has finally learnt how to prevent
such problems and design new systems so that they work. In essence,
it has begun to adopt the principles of evolution, rather than
creationism.



Largely unheralded, the Government Digital Service (GDS) is one of the
current administration’s success stories. Tirelessly championed by
Francis Maude, as minister for the Cabinet Office, egged on by
Baroness Lane-Fox, and run by Mike Bracken, brought in from
outside, the GDS is not just trying to make government services
online as easy as shopping at Amazon or booking an airline ticket.
It is also reshaping the way the public sector does big IT projects
to make sure cost and time overruns are history.



What Mr Bracken calls the “waterfall” approach to such big
projects in the past consisted of “writing most when you know
least”. The people in charge wrote enormous documents to try to
specify the comprehensive requirements of the end users, did not
change them as technology changed and issued vast, long and
lucrative contracts to big companies.



Instead, Mr Maude and Mr Bracken are teaching the civil service
to start small, fail fast, get feedback from users early and evolve
the thing as you go along. So those designing an online service
begin with a discovery phase, lasting six to 12 weeks, then build
an “alpha” prototype of a working software service in less than
three months, followed by commissioning a private “beta”, to be
used by a limited audience of specialist users. Only once
rigorously tested is this opened up to the public, sometimes in a
controlled way. And only later is the old service turned off.
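
As a rough sketch of the delivery pipeline described above (the phase names are those used in the column; the code itself is purely illustrative and not anything GDS publishes), the approach amounts to a short, gated sequence in which nothing is switched off until the end:

# Illustrative Python sketch of the phased, evolve-as-you-go approach
# described above. Each phase must finish before the next begins, and the
# old service is only retired at the very end.
PHASES = [
    "discovery (6-12 weeks learning what users need)",
    "alpha prototype (a working service, built in under three months)",
    "private beta (tested by a limited audience of specialist users)",
    "public release (opened up, sometimes in a controlled way)",
    "retire the old service",
]

def next_phase(phases_completed):
    """Return the next phase to start, or 'done' if the rollout is finished."""
    return PHASES[phases_completed] if phases_completed < len(PHASES) else "done"

print(next_phase(0))  # begin with discovery
print(next_phase(2))  # after discovery and alpha comes the private beta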



This is exactly the sort of recipe for success championed by the
economist Tim Harford in his book Adapt. Harford
pointed out that whether pacifying Iraq, designing an aircraft or
writing a Broadway musical, those who succeed allow for plenty of
low-cost trial and error and incremental change. It’s the mechanism
Charles Darwin discovered that Mother Nature uses. Rather than a
grand “creationist” plan or a big leap, natural selection
incrementally discovers success through trial and failure. From the
English language to an airliner, everything successful has emerged
by small steps.



The successful IT systems we all use, from Facebook to BBC News,
were all built this way. Yet government kept trying to do things by
grand plan. The history of information technology explains how we
went wrong. In the beginning all things related to the web, in
public and private sectors, became the property of the high priests
in the IT department, as the only people who understood the
technology. They thought mainly of the needs of producers of
content, rather than users. Much of the private sector wrestled
digital content out of the hands of the IT department long ago, but
in much of Whitehall that’s where it still lay until recently.



Mr Bracken and his lieutenants have turned Whitehall upside
down, collapsing the profession of chief information officer (head
of IT) altogether. These CIOs had often been recruited from big
consultancy firms, and they were in the habit of outsourcing big
projects to those firms. The plan was to outsource the risk, but it
never happened. Indeed, in many instances the big IT consultancies
royally ripped off the taxpayer by designing big systems in the
waterfall fashion so that they overran their budgets and
timetables.



Mr Maude began by centralising controls so that he had to sign
off any IT contract of more than £1 million (now raised to £5
million), then built up an in-house capability to offer cheaper and
better design, and opened procurement to smaller companies.
Government contracts with outside IT suppliers are now shorter and
smaller. Some of the savings on offer were so vast that civil
servants refused to believe them. In one case, 98.5 per cent of the
cost of an existing contract was saved by letting a contract to a
small British business rather than an incumbent multinational IT
firm, and it worked better. With much of the system still to
tackle, Mr Maude reckons he will have saved £4 billion a year by
2019-20.



The big question being asked in Whitehall right now is whether
Universal Credit will prove to have been the last of the old or the
first of the new. It began as an old-fashioned waterfall project,
but is being reformulated in parallel in the incremental and
digital fashion.



It is not just me who is starstruck by what Mr Maude and Mr
Bracken are doing. Other governments are noticing. Last week the
government hosted an international summit called the D5, an
exclusive club with Estonia, Israel, South Korea and New Zealand in
it. Along with Britain, these countries see themselves as by far
the most advanced in digitising government services in a way that
makes them easy to use. The United States government is
deliberately emulating Britain’s approach.



The rapid roll-out of the new government web platform, gov.uk,
now used by 300 government organisations, won the prestigious
Design of the Year award last year, beating the Shard. Mr Bracken
relentlessly demands simplicity from government websites, reversing
a 15-year build-up of useless information, banishing 40 per cent of
the content to the National Archives, purifying forms, closing
1,800 separate government websites and saving 60 per cent of costs.
One-click government at last.


January 1, 2015

Britain's best years

My Times column is on the UK's high standard of
living and social freedoms:



Years ending in 15 (or 65) have often been good
ones to be British. In January, we celebrate 750 years since Simon
de Montfort first summoned Parliament to Westminster. In June, we
mark the 800th anniversary of making kings subject to the law in
Magna Carta. Three days later it’s off to Waterloo for the 200th
birthday of the battle.



There’s more. In October, we cry God for Harry, England and St
George, and beat the French again at the 600th anniversary of
Agincourt. November, for those with any fireworks left, marks the
300th anniversary of arguably the last battle fought on English
soil — at Preston, where the Old Pretender’s last hopes died.



Unlike this year’s remembrance of 1914, these are cheerful and
somewhat British events — with the exception of Waterloo, where
Blücher’s Prussian army and Wellington’s Hanoverian troops deserve
a large chunk of the credit. Indeed, Waterloo and Preston excepted,
they are English events (I presume the French do not celebrate
Agincourt or Waterloo). So it is not a bad time to remind ourselves
how lucky we are to live on this damp little island.



I don’t mean this in a jingoistic way, and certainly when you
look closely there is little to recommend Henry V’s brutal French
raid. What there is to celebrate, of course, is Shakespeare’s
poetic rendering of the campaign. It is our literary, scientific,
technological, economic, political and philosophical achievements,
rather than just our military milestones that we should
occasionally pause to remember, amid our usual self-criticism.



All my life I have been told that Britain is in decline. But
stand back and take a long, hard look. Even by relative standards,
it just is not true. We have recently overtaken France (again) as
the fifth largest economy in the world and are closing on Germany.
We have the fourth largest defence budget in the world, devoted
largely to peace-keeping. We disproportionately contribute to the
world’s literature, art, music, technology and science.



We have won some 123 Nobel prizes, more than any other country
bar America (and more per capita than America), and we continue to
win them, with 18 in this century so far. In the field of genetics,
which I know best, we discovered the structure of DNA, invented DNA
fingerprinting, pioneered cloning and contributed 40 per cent of
the first sequencing of the human genome.



On absolute measures, we are in even better shape. Income per
capita has more than doubled since 1965 — in real terms. In those
days, three million households lacked or shared an inside lavatory,
most houses did not have central heating and twice as many people
as today had no access to a car. When they did, it was expensive,
unreliable and leaked fumes.



In the 1960s even though there were fewer people in Britain,
rivers were more polluted, the air was dirtier, and there were
fewer trees, otters and buzzards. Budget airlines, mobile phones,
search engines and social media were as unimaginable as unicorns.
Sure, there was less obesity and fewer traffic jams, but there were
more strikes, racism and nylon clothing. People spent twice as much
of their income on food. There may be political angst about
immigrants, but Britain is far more at ease with its multicultural
self today than we might have dared to hope in the 1960s.



Even the things that were getting worse turned around. After
1965, levels of murder and other crimes rose for a while, but then
fell back and are now lower than they were then. The number of cars
produced in Britain fell as our industrial relations deteriorated,
then rose again and will soon break the record set in 1972. Britain
is making more cars than France [corrected from Germany], 80 per
cent of them for export.



Likewise, London’s role as a capital of global finance shrank
for a while, but then boomed as never before, giving us an
unrivalled role in international service industries — confounding
the pessimists who warned us that we would become an irrelevance in
the world, especially if we did not join the euro. London’s
population shrank, then boomed as it became the city everybody’s
rich people wanted to live in. That brings rising house prices. But
rather that than Detroit’s urban deterioration.



Compared with many other countries, we have enviable
opportunities ahead of us. We are sitting on one of the world’s
largest shale-gas fields. Our economy is growing faster than any
other in the western world. Unemployment is falling faster than
anybody predicted and is less than half that of France, one
quarter that of Spain. According to the 2015 Global Entrepreneurship Index,
published last month, we are the most
entrepreneurial country in Europe and the fourth most
entrepreneurial in the world, our highest-ever ranking.



Don’t forget our natural advantages: a Goldilocks climate with
none of the brutal cold or blistering heat that most countries
experience at one season or another. Enough rain to keep the
country green throughout the year, unlike most countries, but not
so much (on the whole) as to annoy. And sufficiently unpredictable
weather to be worth talking about, unlike in many countries. That
it is a bit too dark at this time of year is a price worth paying
for those long summer evenings.



Then there is a stunning coastline that is never more than 70
miles away, few snakes, bears, mosquitoes, tornadoes or earthquakes
and no poison ivy to worry about when walking or gardening.



Plus a huge variety of landscapes crammed into a small land area
and an amazingly rich architectural heritage. In short, we have an
economy to rival America in a culture to rival Italy on a landscape
to rival France with social cohesion to rival Germany.



Then we have a democratic tradition as strong as any in the
world and an adherence to defending liberty that — for all the
threats — is still far more robust than most people in the world
can experience. And to cap it all, a brilliantly neutral and
beneficent head of state who this coming September becomes our
longest reigning monarch.



There is one giant fly in the ointment: the huge and rapidly
growing national debt, alongside our steep levels of personal debt.
Rightly, that will obsess us in an election year. Even so, let’s
pause at New Year to contemplate what might go right in
Britain.


December 24, 2014

Polygamy fuels violence

My column in The Times:



When the Kurdish peshmerga forces broke the siege
of Mount Sinjar last week, there was no trace of the 5,000 Yazidi women
and children abducted from the area in August. It is thought that
they have been mostly sold as concubines to jihadist fighters of
Islamic State. When The Times posed as two
British girls interested in joining Islamic State, they were told: “The only way to guarantee
being together is marrying the same man.” The 219 girls still
missing in Nigeria after being abducted in April have been “married
off”, according to the leader of Boko Haram.



My point in connecting these incidents is that throughout
history polygamy has fuelled violence. Might it be worth suggesting
to Muslim leaders, religious and secular, that they push for
monogamous norms as one way to reduce violence and bring more peace
to the Middle East and to north and west Africa? Of course,
polygamy is not the only or the main cause of violence in such
places, but it almost certainly contributes.



The correlation between violence and polygamy (strictly,
polygyny — being married to more than one wife at the same time —
as having more than one husband is much rarer) is not just about
violence to women. It is also about violence among men. From Troy
to Brigham Young, from Genghis Khan to Islamic State, there has
been a tendency for nations that allow polygamous marriage to
exhibit more crime and more warfare than those that do not. The
cause is increased competition for mates. Polygamy results in more
unmarried young men, and such men commit most of the violence.



Even moderate polygamy can produce large imbalances. Imagine
that in a village of 50 men and 50 women, two men have four wives,
four men have three wives and fourteen have two wives: that leaves
30 men chasing the remaining two women. A recipe for trouble.
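
For readers who want to check the arithmetic, here is a minimal Python sketch, using the hypothetical village numbers from the paragraph above, of how quickly even modest polygyny empties the pool of potential wives:

# Illustrative only: the marriage counts are the hypothetical ones in the text.
def mating_imbalance(men, women, marriages):
    """marriages: list of (number_of_husbands, wives_each) pairs."""
    married_men = sum(h for h, _ in marriages)
    married_women = sum(h * w for h, w in marriages)
    return men - married_men, women - married_women

bachelors, unmarried_women = mating_imbalance(
    men=50, women=50,
    marriages=[(2, 4), (4, 3), (14, 2)],  # 2 men with 4 wives, 4 with 3, 14 with 2
)
print(bachelors, unmarried_women)  # 30 bachelors left chasing 2 unmarried women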



A fascinating 2012 paper called The Puzzle of Monogamous Marriage,
by the anthropologist Joe Henrich and his
colleagues, detailed the historical correlation between polygamy
and crime, chillingly explaining it thus: “Faced with high levels
of intra-sexual competition and little chance of obtaining even one
long-term mate, unmarried, low-status men will heavily discount the
future and more readily engage in risky status-elevating and
sex-seeking behaviours. This will result in higher rates of murder,
theft, rape, social disruption, kidnapping (especially of females),
sexual slavery and prostitution.”



The authors argue that the gradual and erratic imposition over
many centuries of “normative monogamy” in Europe and then much of
the rest of the world was motivated largely by rulers wanting to
suppress crime and violence. Or perhaps societies that suppressed
polygamy proved more successful, displacing those that didn’t.



Professor Henrich even argues that the advance of monogamy
played a part in the industrial revolution. Reducing the pool of
unmarried men and levelling the reproductive playing field not only
decreased crime, but spurred commerce and innovation. Once men stop
striving to achieve marriage (or double marriage) they invest their
energy in more productive ambitions.



The story of Figaro and the plot of
Richardson’s Pamela both testify to the
box-office success of tales about punishing powerful men for
indulging in droit de seigneur in the 18th
century, a last echo of the time when rich men in the west could
get away with having concubines. By the time of Queen Victoria, not
even the richest man could have a harem — unless in secret.



The longer history of polygamy and monogamy is a rollercoaster.
In the Stone Age, hunter-gatherers were (and where they exist still
are) usually only slightly polygamous, reflecting a high death rate
of males in wars and hunts. With the advent of agriculture,
polygamous marriage flourished, resulting in vast harems for Bronze
Age emperors in Egypt, China, India, Peru and Mexico. Then while
polygamy faded in Christendom, highly unbalanced polygamy persisted
in pastoral societies in central Asia and the Middle East, perhaps
because having a hundred sheep is not much harder than having ten,
so wealth inequality brought marital rewards.



Little wonder that pastoralists exploded out of central Asia at
regular intervals, if only to satisfy the need of their low-status,
unmarried men to kill men and abduct women: hence Huns, Tartars,
Mongols, Turks, Moghuls. In 1401 Tamerlane marched through what is
now northern Iraq, sacked Baghdad and ordered his men to produce
two enemy skulls each or lose their own. Women were taken as sexual
slaves. Not much different from what Islamic State did near there
last year.



Today most of the world’s officially sanctioned polygamy
coincides with Islam. The rest is in sub-Saharan Africa (and
Burma). Some Muslim countries, such as Tunisia and Turkey, forbid
it, while others, such as Egypt, discourage it. In Indonesia, the
world’s most populous Muslim country, polygamy is legal (up to four wives) but faces
stiff opposition and is hedged about with restrictions: a man must
treat his wives equally and support them financially. It is
forbidden to civil servants and the military. It is no coincidence
that women’s rights are stronger in Indonesia than in some Muslim
countries and women less cloistered and veiled.



The Kurds passed a law restricting polygamy in 2008 — it is
forbidden unless the current wife agrees, punishable by up to a
year in prison (this is also the law in Libya and Morocco).
Certainly, Kurds have gone further in promoting women’s rights than
some Muslim societies, as their rather competent snipers fighting
Islamic State illustrate. The law remains controversial however,
with some arguing that monogamy leads to more adultery and deprives
widows of opportunities to remarry.



Sharia does not discourage polygamy, but nor does the Old
Testament and nor did some early Christian fathers, and they proved
capable of change. By one estimate there may be as many as 20,000
polygamous Muslim men in Britain: a television programme in
September featured Nabilah Philips, a Malaysian-born Cambridge PhD
student who willingly became the second wife of a Muslim convert
who promptly added a third during filming. Mr Philips spends three
nights with each wife.



Still, polygamy is probably in very slow retreat in the Muslim
world. Even in Saudi Arabia, there is growing reluctance among
women to become second wives. Islamic State is, of course, a
reactionary exception.


December 22, 2014

Coal interest

I have had enquiries about my interest in coal mining, and am
happy to make the following statement:



The following has been on my website since its inception:



“I have a financial interest in coal mining on my family's land.
The details are commercially confidential, but I have always been
careful to disclose that I have this interest in my writing when it
is relevant; I am proud that the coal mining on my land contributes
to the local and national economy; and that my income from coal is
not subsidized and not a drain on the economy through raising
energy prices. I deliberately do not argue directly for the
interests of the modern coal industry and I consistently champion
the development of gas reserves, which is a far bigger threat to
the coal-mining industry than renewable energy can ever be. So I
consistently argue against my own financial interest.”



In response to recent inaccurate reports, I can add further
detail:



“I have declared my interest in the coal mining on my family’s
land whenever and wherever relevant both in my writing and in
Parliament. However, I generally argue in favour of gas, which is
coal’s main competitor, and do not argue directly in favour of the
modern coal industry, though I have commented on the role coal has
played over the course of history. There is a long and proud
tradition of coal mining in the North-east of England. The coal
under my family’s land belongs to the state, being nationalized, so
royalties go to the government, not the landowner. Only part of the
coal mining operated by the Banks Group at the Shotton and Brenkley
mines is on my family’s land. I do not own or operate the mines
themselves. I consider the mining operation an excellent local
employer, which provides affordable energy to UK industry and
electricity consumers, without subsidy, and in a situation where
the UK imports the majority of the coal it burns. It also
contributes generous taxes as well as funding welcome environmental
benefits and numerous community projects. I receive no financial
benefit other than a wayleave fee in exchange for providing access
to the land. The details are commercially confidential and involve
several parties, but the wayleave is very small indeed in relation
to the value of the coal mined from my family’s land. It is partly
shared with local residents and the remainder, after paying tax, is
almost entirely reinvested in the maintenance and improvement of
the property. The coal industry has never tried to influence my
views on climate science or policy.”


December 9, 2014

Policy-based evidence making

My column in the Times, with post-scripts:



 



As somebody who has championed science all his
career, carrying a lot of water for the profession against its
critics on many issues, I am losing faith. Recent examples of bias
and corruption in science are bad enough. What’s worse is the
reluctance of scientific leaders to criticise the bad apples.
Science as a philosophy is in good health; science as an
institution increasingly stinks.



The Nuffield Council on Bioethics published a report last week that found
evidence of scientists increasingly “employing less rigorous
research methods” in response to funding pressures. A 2009 survey
found that almost 2 per cent of scientists admitted to having
fabricated results, while 14 per cent said that their colleagues had
done so.



This month has seen three egregious examples of poor scientific
practice. The most recent was the revelation in The
Times last week that scientists appeared to scheme to get
neonicotinoid pesticides banned, rather than open-mindedly
assessing all the evidence. These were supposedly “independent”
scientists, yet they were hand in glove with environmental
activists who were receiving huge grants from the European Union to
lobby it via supposedly independent reports, and they apparently
had their conclusions in mind before they gathered the evidence. Documents that have recently come to light
show them blatantly setting out to make policy-based evidence,
rather than evidence-based policy.



Second example: last week, the World Meteorological Organization
(WMO), a supposedly scientific body, issued a press release stating that this is likely to
be the warmest year in a century or more, based on surface
temperatures. Yet this predicted record would be only one hundredth of a degree above
2010 and two hundredths of a degree above 2005 — with an error
range of one tenth of a degree. True scientists would have said:
this year is unlikely to be significantly warmer than 2010 or 2005
and left it at that.
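
To make the statistical point concrete, here is a minimal sketch in Python (illustrative only, using the rounded figures quoted above) showing that a lead of one or two hundredths of a degree cannot be distinguished from zero when the stated error range is a tenth of a degree:

# Hedged illustration using the figures quoted in the column.
margin_of_error = 0.10                 # roughly one tenth of a degree, as quoted
leads = {"2010": 0.01, "2005": 0.02}   # claimed margins over earlier years

for year, lead in leads.items():
    significant = lead > margin_of_error
    print(f"Lead over {year}: {lead:.2f}C; outside the error range? {significant}")
# Both lines print False: the claimed record is lost inside the uncertainty.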



In any case, the year is not over, so why the announcement now?
Oh yes, there’s a political climate summit in Lima this week. The
scientists of WMO allowed themselves to be used politically. Not
that they were reluctant. To squeeze and cajole the data until they
just crossed the line, the WMO “reanalysed” a merger of five data
sets. Maybe that was legitimate but, given how the institutions
that gather temperature data have twice this year been caught
red-handed making poorly justified adjustments to “homogenise” and
“in-fill” thermometer records in such a way as to cool down old
records and warm up new ones, I have my doubts.



In one case, in Rutherglen, a town in Victoria, a recorded
cooling trend of minus 0.35C became a reported warming trend of plus 1.73C
after “homogenisation” by the Australian Bureau of Meteorology. It
claimed the adjustment was necessary because the thermometer had
moved between two fields, but could provide no evidence for this,
or for why it necessitated such a drastic adjustment.



Most of the people in charge of collating temperature data are
vocal in their views on climate policy, which hardly reassures the
rest of us that they leave those prejudices at the laboratory door.
Imagine if bankers were in charge of measuring inflation.



Third example: the Royal Society used to be the gold standard of
scientific objectivity. Yet this month it issued a report on resilience to extreme
weather that, in its 100-plus pages, could find room for not a single graph to show
recent trends in extreme weather. That is because no such graph
shows an upward trend in global frequency of droughts, storms or
floods. The report did find room for a graph showing the rising
cost of damage by extreme weather, which is a function of the
increased value of insured property, not a measure of weather.



The Royal Society report also carefully omitted what is perhaps
the most telling of all statistics about extreme weather: the
plummeting death toll. The global probability of being killed by a
drought, flood or storm is down by 98 per cent since the 1920s and
has never been lower — not because weather is less dangerous but
because of improvements in transport, trade, infrastructure, aid
and communication.



The Royal Society’s decision to cherry-pick its way past such
data would be less worrying if its president, Sir Paul Nurse, had
not gone on the record as highly partisan on the subject of climate
science. He called for those who disagree with him to be
“crushed and buried”, hardly the language of Galileo.



Three months ago Sir Paul said: “We need to be aware of those who mix up
science, based on evidence and rationality, with politics and
ideology, where opinion, rhetoric and tradition hold more sway. We
need to be aware of political or ideological lobbyists who do not
respect science, cherry-picking data or argument, to support their
predetermined positions.”



If he wishes to be consistent, he will therefore condemn the
behaviour of the scientists over neonicotinoids and the WMO over
temperature records, and chastise his colleagues’ report, for these
are prime examples of his point.



I am not hopeful. When a similar scandal blew up in 2009 over
the hiding of inconvenient data that appeared to discredit the
validity of proxies for past global temperatures based on tree
rings (part of “Climategate”), the scientific establishment closed
ranks and tried to pretend it did not matter. Last week a further instalment of that story came to light,
showing that yet more inconvenient data (which discredit
bristlecone pine tree rings as temperature proxies) had
emerged.



The overwhelming majority of scientists do excellent, objective
work, following the evidence wherever it leads. Science remains (in
my view) our most treasured cultural achievement, bar none. Most of
its astonishing insights into life, the universe and everything are
beyond reproach and beyond compare. All the more reason to be less
tolerant of those who let their motivated reasoning distort data or
the presentation of data. It’s hard for champions of science like
me to make our case against creationists, homeopaths and other
merchants of mysticism if some of those within science also
practise pseudo-science.



In all the millions of scientific careers in Britain over the
past few decades, outside medical science there has never been a
case of a scientist convicted of malpractice. Not one. Maybe that
is because — unlike the police, the church and politics —
scientists are all pure as the driven snow. Or maybe it is because
science as an institution, like so many other institutions, does
not police itself properly.



 



Postscript:



 



For those interested in further details of some of the incidents
mentioned in this article, here follow some links and quotes.



1. On neonicotinoids, here is what the scientists wrote:




"Based on the results of the meeting
in Paris the following was agreed that the will be published in
peer-reviewed journals. Building on these papers a research paper
will be submitted to Science (first choice) or Nature (second
choice) which would introduce new analyses and findings across the
scientific disciplines to demonstrate as convincingly as possible
the impact of neonicotionoides on insects, birds, other species,
ecosystem functions, and human livelihoods. This high-impact paper
would have a carefully selected first author, a core author team of
7 people or fewer (including the authors of the initial four
papers), and a broader set of authors to give global and
interdisciplinary coverage. A significant amount of the supporting
evidence will be in the official Supporting Online Material
accompanying the paper. A parallel « sister » paper (this would be
a shorter Policy Forum paper) could be submitted to Science
simultaneously drawing attention to the policy implications of the
other paper, and calling for a moratorium in the use and sale of
neonicotinoid pestcides. We would try to pull together some major
names in the scientific world to be authors of this paper. If we
are successful in getting these two papers published, there will be
enormous impact, and a campaign led by WWF etc could be launched
right away. It will be much harder for politicians to ignore a
research paper and a Policy Forum paper in Science The most urgent
thing is to obtain the necessary policy change to have these
pesticides banned, not to start a campaign. A stronger scientific
basis for the campaign will hopefully mean a shorter campaign. In
any case, this is going to take time, because the chemical industry
will throw millions into a lobbying exercise."



2. On Rutherglen, an Australian weather station, the Bureau of
Meteorology added a page of explanation after the scandal was drawn
to their attention. Unfortunately this made things worse by
admitting that the record "does not list a site move" and that
"no firm evidence exists [of] exact location". Moreover, the page
reveals precisely nothing about how the adjustment was done to
account for this unknown site move. As Jo Nova commented at the time:



"The BOM has added a page listing
“Adjustments”. It’s two years late, inadequate and incomplete.
Skeptics shouldn’t have had to ask for it in the first place, and
we still don’t have the algorithms and codes, or rational answers
to most questions.  No one can replicate the mystery black box
homogenisation methods of the BOM — and without replication, it
isn’t science. There is still no explanation of why an excellent
station like Rutherglen should change from cooling to
warming, except for vague “statistics”, or why any station
should be adjusted without documentary evidence, based on
thermometers that might be 300km away."



The new “adjustments” page doesn’t resolve much at all. There are still
blatant errors — the changes to long term trends in minima are not
“neutral”, but increase the trend by nearly 50% (see Ken Stewart’s site
and the finished set). The hottest day in Australia was almost certainly
not in Albany in 1933 (which remains uncorrected at 51C). Many maximums
have been adjusted and become lower than minimums. Those mistakes did
not exist in the raw data. The homogenisation has created them, like the
new discontinuity in Deniliquin.

The adjustments page is just a glorified rehash of the same old excuses.

Effectively the bureau is saying “we need large mysterious
transformations of data to make Australian trends look like
international trends”. What serious climate scientist thinks Australia
is supposed to get hotter, colder, wetter, drier, or cloudier with the
exact same timing and patterns as the rest of the world? Even high
schoolers know that when it rains on the East Coast with El Ninos, it’s
not raining on the other side of the Pacific. Just because other
homogenizations have produced the same trends by blending data to the
point where it is unrecognisable does not make it “good” science.

Lots of international bankers were marketing the same overrated mortgage
bundles. Anyone want to buy subprime science — I have a collateralized
trend for sale.

Three years ago the independent audit team, with Senator Cory Bernardi,
asked for an ANAO audit of the BOM’s “High Quality” HQ data set. The BOM
was not enthused. They dumped the HQ set that they had previously lauded
and set up a new one called “ACORN”. We listed some of the errors in
June 2012. Two years on, nothing much appears to have changed. They
still haven’t released the algorithms used in the homogenization
process. They are still using stations more than 100km away, some
600 km away, to “adjust” temperatures. The mystery black box adjustments
are still producing inexplicable nonsense, and the BOM still can’t
explain why — on individual stations like Rutherglen and Bourke — anyone
should find their adjustments necessary and scientifically justified.
There is no documentation showing Rutherglen has moved. But there is
documentation suggesting perhaps Bourke’s deleted “hottest” day really
might have been 125F in 1909.

The BOM’s active silence on the long hot records of the late 1880s and
1890s suggests they are more interested in promoting one message — “it’s
warming” — than being custodians of the real and more complicated
history of the Australian climate.



 



3. On tree-ring proxies, the new data concerns "out of sample"
data on tree rings from bristlecone pines on Sheep Mountain in
California. This data set gives a sharp hockey stick up till 1980,
implying rapidly increasing growth of trees as 1980 approached. But
critics have alleged that this is unlikely to be down to
temperature, and more likely a "strip-bark" phenomenon, whereby the
tree regrows rapidly after damage by goats and sheep. The absence
of data from after 1980 has always been puzzling. Well, as Steve
McIntyre now reports, the data have now been collected and they
show low growth rates during the warm years of 1980-2000. Once
again, Jo Nova summarises the story especially well:



The  obvious message is that
these particular proxies don’t work now and probably never did, and
that this hockeystick shape  depends
on not using tree rings after 1980.



More important than the details of
one proxy, is the message that the modern bureaucratized
monopolistic version of “science” doesn’t work. Real scientists,
who were really interested in the climate, would have published
updates years ago. (Indeed, would never have published the
hockeystick graph in the first place. Its dysfunctional combination
of temperatures and truncated proxies is mashed through a maths
process so bad it produces a hockey stick most of the time even if
the data is replaced by red noise.)



The screaming absence of this
obvious update for so long is an example of what I call the “rachet
effect” in science — where only the right experiments, or the right
data, gets published. It’s not that there is a conspiracy, it’s
just that no one is paid to find the holes in the theory and the
awkward results sit buried at the bottom of a drawer for a
decade.  The cortex soaked in confirmation-bias couldn’t
figure out how to explain them.



See Climate Audit for McIntyre’s view on Salzer et al 2014.




"The new results of Salzer et al 2014 (though not candid on the
topic) fully demonstrate this point in respect to Sheep Mountain.
 In the warm 1990s and 2000s, the proxy not only doesn’t
respond linearly to higher temperatures, it actually goes the wrong
way.   This will result in very negative RE values for
MBH-style reconstructions from its AD1000 and AD1400  networks
when brought up to date, further demonstrating these networks have
no real “skill” out of sample.



We’ve also heard over and over about how “divergence” is limited
to high-latitude tree ring series and about how the Mann
reconstruction was supposedly immune from the problem.
 However, these claims mostly relied on stripbark chronologies
(such as Sheep Mountain) and the validity of such claims is very
much in question.



As previously discussed on many occasions, stripbark
chronologies have been used over and over in the canonical IPCC
reconstructions, with the result that divergence problems at Sheep
Mountain and other sites do not merely impact Mann et al 1998-99,
but numerous other reconstructions.  Even the recent PAGES2K
North America reconstruction uses non-updated Graybill stripbark
chronologies.  It also ludicrously ends in 1974.  So
rather than bringing the Mann et al network up-to-date, it is even
less up-to-date."




Note that bristlecone pines were
never supposed to be used as climate proxies anyway. They are a
rather unusual species — their growth was thought to be CO2-limited
rather than limited by temperature or moisture, so they responded
well at first to the increase in CO2 in the 20th century, though
obviously something else is going on after 1980. This graph and
these results apply only to one situation — not all tree rings. But
the  failure of review applies to the whole scientific
community.



The IPCC adopted the hockeystick for
their logo shortly after Mann produced it, but long since dropped
it. Where was the all-marvelous, hallowed, IPCC “expert”
review?



 



PPS After this article was published an extraordinary series of
tweets appeared under the name of Richard Betts, a scientist at the
UK Met Office and somebody who is normally polite even when
critical. He called me “paranoid and rude” and made a series of
assertions about what I had written that were either inaccurate or
stretched interpretations to say the least. He then advanced the
doctrine that politicians should not criticize civil servants. The
particular sentence he objected to was:



Most of the people in charge of
collating temperature data are vocal in their views on climate
policy, which hardly reassures the rest of us that they leave those
prejudices at the laboratory door.



He thought this was an unjustified attack on civil servants.
However, if you read what I said in that sentence, it is that (1)
people in charge of collating temperature data are vocal in support
of certain policies – which is not a criticism, just a statement;
and (2) that we need reassurance that they do not let that
consciously or unconsciously influence their work, which again is
not a criticism, let alone an attack, merely a request for
reassurance. Certainly there is no mention of civil servants, let
alone by name, and nothing to compare with an attack on me by name
calling me paranoid and rude.



Is the first assertion true? I had in mind Jim Hansen, who was
in charge of GISS, a data set for which serious questions have been
raised about adjustments made that warm the present or cool the
past, and who is prepared to get himself arrested in protest
against fossil fuels. I also had in mind Phil Jones, partly in
charge of HADCRUT, who also is not shy with his views. I was not
thinking of Julia Slingo of the Met Office, because I do not think
of the Met Office as a collator of temperature data, but perhaps I
should have been. And then there’s Australia’s BoM. And indeed the
RSS data, whose collator, Dr Carl Mears, fumes at the way
“denialists” talk about his data. Hardly objective language.



Is my request for reassurance reasonable? In view of the
Australian episodes, the GISS adjustments, the USHCN story from
earlier this year (see here) – all of which raised doubts about the
legitimacy of adjustments being made to the temperature data – then
yes, I think I am. Do I think the data are fatally flawed? No, I
don’t. I happily accept that all the data sets show some warming in
the 1980s and 1990s and not much since and that this fits with the
satellite data. But do I think such data can be used to assert that
this is the warmest year, by 0.01 degrees, a month before the year
ends? No, I don’t. I think people like Dr Betts should say as
much.



As of this writing, Dr Betts’s latest tweet is:



If @mattwridley wants to criticise climate policy then he's got
every right, but attacking scientists is wrong.



Well, if by attacking he means physically or verbally abusing,
then yes, I agree, but I don’t do it. I don’t call people by name
“paranoid”, for example. But criticizing scientists should be
allowed surely? And asking for reassurance? Come on, Richard.



The WMO “re-analysed” a data set to get its 0.01 degree warmest
year. What was that reanalysis and has it been independently
checked? I would genuinely like to know. I stopped taking these
things on trust after the hockey stick scandal.



The thrust of my article was that the reputation of the whole of
science is at risk if bad practices and biases are allowed to
infect data collection and presentation, and that science like
other institutions can no longer take public trust for granted. A
reaction of bluster and invective hardly reassures me that science
takes my point on board. For the moment, I remain of the view
that



The overwhelming majority of
scientists do excellent, objective work, following the evidence
wherever it leads. Science remains (in my view) our most treasured
cultural achievement, bar none. Most of its astonishing insights
into life, the universe and everything are beyond reproach and
beyond compare.



But Dr Betts’s reaction has weakened my confidence in this
view.


December 3, 2014

Pilotless planes and driverless cars

My column in The Times:



The Civil Aviation Authority is concerned that
pilots are becoming too reliant on automation and are increasingly
out of practice in what to do when the autopilot cannot cope. We
now know that a fatal Air France crash in the Atlantic in 2009 was
caused by confused co-pilots reacting wrongly when the autopilot
disengaged during turbulence. They put the nose of the plane up
instead of down.



But there is another way to see that incident: the pilot was
asleep at the time, having spent his time in Rio sightseeing with
his girlfriend instead of sleeping. When roused as the plane
stalled, he woke slowly and reacted too groggily to correct the
co-pilots’ mistakes. Human frailty crashed the plane, not mistakes
of automation.



Human error, or sabotage, also seems most likely (though we
cannot yet be sure) to have disabled and diverted the Malaysian
Airlines jet that vanished over the Indian Ocean in March. Human
action certainly caused 9/11. For every occasion on which a Chesley
Sullenberger brilliantly and heroically landed a plane on the
Hudson River after a flock of geese went into the engines, there
have been many more where people caused catastrophe. Human error is
the largest cause of crashes in the sky, as it is on the
ground.



That is, I suggest, why we will embrace the inevitability of
pilotless aeroplanes at some point in the not so distant future.
Already, automated systems are better at landing planes than
pilots, even on to aircraft carriers: they react quicker. Drones
are crashing less often when allowed to land themselves rather than
be guided in by ground-based pilots. Even Hudson River heroism
could possibly be automated. I confess I am probably an outlier
here and that most people will be horrified by the prospect of
boarding pilotless planes for a while yet. But I think they will
come round.



Driverless ground transport will help to assuage our fears. I
took a driverless train between terminals at Heathrow last week,
and Transport for London has begun tendering for driverless Tube
trains, to predictable fury from the unions. Prototype driverless
cars are proving better and safer than anybody expected. It cannot
be long before they seem preferable to an occasionally distracted,
risk-taking, radio-playing or grandee-teasing taxi driver.



Google’s prototype self-driving cars have now covered more than
700,000 miles on public roads with only one accident — which
happened when a human took the controls. They may be commercially
available after 2017. Testing of self-driving cars will begin on
British roads next month.



Getting out of a driverless car, after a restful journey working
and reading, then telling it to park and come back when you need
it, would bring the luxury of the chauffeured plutocrat within
reach of ordinary people. Driverless lorries on the motorways could
be confined to night-time operation, leaving the roads clear for
cars in the day.



In the air, small drones are now commonplace and not just in the
military. The “Matternet” is a plan to use them to supply the needs
of remote areas with few roads in poor countries, leapfrogging poor
infrastructure as mobile phones leapfrogged the lack of landlines.
Once drones can refuel each other in the air, they should quickly
take over (for instance) searches of the ocean when planes or boats
are lost — so as to put fewer lives at risk.



The next step would be that cargo planes would fly without human
beings aboard. The sticking point will be air-traffic control’s
reluctance to sanction such planes landing at airports in built-up
areas. At the moment, drones and piloted aircraft are kept apart in
separate zones. If you live under a flight path it is comforting to
know that the planes overhead are piloted by people with every
incentive to land safely: with “skin in the game”. The existence of
a “ground pilot” who can take control of a plane from the ground,
as drone operators can do now, would be of little comfort to such
people, let alone to passengers on a plane.



But pilots’ wages and training costs are one of the highest
contributors to the cost of flying, after fuel, and if pilotless
planes can fly safely for years without passengers, objections to
them carrying passengers will gradually fade. An ordinary aircraft
is now regularly flying between Lancashire and Scotland
with nobody at the controls (though there is a crew
on board to take over if necessary). The offspring of a
seven-company consortium called ASTRAEA, it uses radar, radio and
visual sensors to detect and avoid hazards.



Are we approaching the era when it will be more reassuring to
know that there is not a human being in the cockpit than to know
that there is? We might find it comforting to know that the cockpit
was wholly inaccessible to terrorists and that the machine within
it had not spent the night drinking.



It is true, as the CAA has spotted, that we currently have an
uncertain mixture of people and machines flying planes, with a
danger that the former are getting out of practice and confused.
But since accident rates are low and falling, there is no evidence
that this partial automation has been a problem, or that going
further towards full automation would not help.



Perhaps robotic surgery holds a lesson. Justin Cobb, a
distinguished professor of orthopaedic surgery at Imperial College
London, tells me that his engineers build into his experimental
robots — which carve out, via keyholes, slots in your knee or hip
bones of just the right size and shape to fit the necessary
implants — what is little more than an illusion of control by the
surgeon. The surgeon is allowed to move the tool about, but only
within a certain boundary. Beyond that, the robot’s software
prevents the tool straying.
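
The same idea can be sketched in a few lines of code: the operator's command is accepted, but only after being clamped to a pre-defined safe envelope. This is a hypothetical illustration of the principle, not the actual control software used in surgical robots or aircraft:

# Hypothetical sketch of "control within limits": the human's input is
# honoured inside a safe envelope and clamped back whenever it strays beyond it.
SAFE_MIN, SAFE_MAX = -5.0, 5.0   # illustrative boundary, in millimetres

def constrained_move(requested_position):
    """Return the position the tool is actually allowed to move to."""
    return max(SAFE_MIN, min(SAFE_MAX, requested_position))

print(constrained_move(3.2))    # 3.2: within the envelope, obeyed as asked
print(constrained_move(12.0))   # 5.0: beyond the boundary, clamped back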



So an automated aeroplane might allow the pilot to play with the
joystick and the switches, but only within limits. Thus can the
pilot retain what is left of his dignity and the passenger indulge
what is left of his irrational fear of submitting his life to a
machine. Imagine a future hijacker or suicidal pilot finding the
controls of the plane refusing to obey orders. Like Hal in the
film 2001, but in a good way: “No, Dave, I can’t let
you crash this plane.”



So in practice, despite the cost, we will keep pilots around in
the cabin even if there is not much for them to do, and surgeons in
the operating theatre, farmers in the cabs of tractors, teachers in
the classroom, lawyers in the courts, and columnists on
newspapers.


November 26, 2014

The EU versus the UN: who makes the rules?

 



My column in the Times:



 



In today’s speech on the European Union, previewed in
this morning’s Times, Owen Paterson, the former
environment secretary, will make a surprising and telling
point.



It is that many of the rules handed down to British businesses
and consumers by Brussels have often (and increasingly) been in
turn handed down to it by higher powers. This means, he argues,
that we would have more influence outside the EU than within it. We
could rejoin some top tables.



One example is the set of rules about food safety: additives,
labelling, pesticide residues and so on. The food rules that
Britain has to implement under the EU’s single market are now made
by an organisation that sounds like either a Vatican secret society
or a Linnean name for a tapeworm: Codex Alimentarius. Boringly,
it’s actually a standard-setting commission, based in Rome.



Codex is a creature of the United Nations. Its rules are in
theory voluntary but since the EU turns Codex’s decisions into
single-market law, and since the World Trade Organisation (WTO)
judges disputes by Codex’s rules, Britain in effect is lumped with
what Codex decides. But it’s Brussels that represents us on many of
the key committees, so we have little chance to influence the rules
in advance.



Codex has two sister organisations, which deal with animal and
plant health. As environment secretary, Mr Paterson discovered on a
visit to New Zealand just how powerless other countries perceive us
to be. There was a particular new rule about a sheep disease that
the New Zealand government wanted to persuade one of these bodies
to amend. It had got Australia on side, and planned to enlist
Canada and America, but when asked by Mr Paterson if Britain —
Europe’s leading sheep producer — could help, the New Zealanders
replied: no point, you’re just part of the EU. He felt stung by the
implication of that remark.



In effect, if an organisation such as Codex changes its rules
about food labels, Brussels is powerless to do anything other than
follow suit. This goes much deeper than just a few veterinary and
food issues. In 1994 the EU adopted the world trade agreements, which
require all signatories to adopt international standards in
preference to their own.



Take another example. The rules followed by the banking industry
when assessing asset risk are decided not by the EU but by a
committee based in Switzerland. Then there’s the Financial
Stability Board, chaired by Mark Carney and based in Basel. It’s a
creature of the G20. It is supposed to set the standards for
financial regulation worldwide.



Britain’s car industry is vital to our economy. Yet the single
market standards of the EU for motor manufacturing are derived from
regulations produced by (take a deep breath) the World Forum for
the Harmonisation of Vehicle Regulations, a subsidiary of the
UN.



Ask yourself: is it likely that Britain, with its
disproportionate interest in fish, car manufacturing, banking and
sheep, will have seen these topics aired to our best advantage by
some suave suit from Malta or Lithuania acting on behalf of the
entire EU? Not a chance.



There are plenty of other supranational bodies on which we are
represented separately, and which we do not need to leave the EU to
join. There’s Nato, and the UN climate change framework, whose chief
(Christiana Figueres) says she wants to use it to achieve
“centralised transformation” of the world economy if she can get a
world treaty. So, to an increasing extent, the EU is just one of
the spider’s webs in which we are entangled — but it’s often the
only one that represents our interests.



At the weekend I looked up the latest review of the WTO’s
Committee on Technical Barriers to Trade Agreement (I’m a sad case,
I know, but it was raining and there was not much on the telly)
and, sure enough, it lists lots of comments it has received from
countries such as New Zealand, Malaysia, Japan, Switzerland, even
Cuba. Not a single EU country is mentioned because of course our
comments were relayed by the European Union.



In the past, “ministers had to travel to Brussels to make their
case, and to keep an eye on new laws”, Mr Paterson will say in his
speech, “but with the advance of globalisation we now need to be
represented in Geneva, Paris, Berne, Rome and elsewhere.”



No wonder Eurosceptics (including Nigel Farage) say we have less
international clout than Norway, which sits on all these
committees. It plays a big role in the Codex Alimentarius, hosting
a key committee about fish. Very few of these international
rule-setting bodies are based in Britain. If we left the EU, we
would at least get to be like Switzerland — a place favoured by UN
agencies to base themselves. There are jobs in polishing the shoes
and limos of UN-crats.



This is good news for those Europhiles who sound so touchingly
worried that they might lose the opportunities for racking up
room-service bills while on business in Brussels. They can relax,
and vote “out” in a referendum. The hotels in Switzerland are just
as good.



And conversely, the supranational world is not an entirely
comforting point for Eurosceptics to make. If we left the EU, we
would not find ourselves in some sunlit meadow where we could make
up any rules we wanted, as Ukip likes to imply. We would still be
just as subject to all these international standards and
intrusions if we wanted to trade with other countries. And although
we might get a bit more influence over rule-making in the areas
that matter to us, we would still be regularly outvoted.



We are often told to fear leaving the EU because it would lead
to “fax diplomacy”: learning about new laws without having had a
chance to comment on them first. (The MEP Daniel Hannan has
memorably expressed his puzzlement at the archaic way that
Europhiles express themselves: who still uses fax?) But Brussels is
also receiving such faxes. Leave the EU and we could be sending
some of the faxes to Brussels ourselves. And perhaps even hosting a
few of the fax machines.



More generally, the EU is increasingly a problem in the
multilateral, supranational world. The inexorable drift towards
co-ordinated world government is indeed happening, but the European
Union is looking more like an oxbow lake than the stream.
Let’s get back in the main channel.


November 18, 2014

Political institutions evolve slower than social ones

My Times column on the little-changed political
institutions of London:



Two hundred and ninety years ago a novelist, spy,
tradesman and bankrupt named Daniel Defoe began publishing his
account of A Tour Thro’ the Whole Island of Great Britain. A book out
this week by the distinguished sociologist WG Runciman imagines what
Defoe would make of the island if he were to take his tour again
today. His title gives away the conclusion: Very Different, But Much
the Same.



For all the astonishing changes that would boggle Defoe’s mind —
aeroplanes, toilets, motorways, telephones, cameras, pensions, the
internet, religious diversity, vaccines, working women, electricity
and vastly higher living standards especially for the poor — he
would be just as amazed at the things that have not changed.



There is still a hereditary monarch, a Church of England, a Tory
party, a bicameral parliament, one house of which is elected, the
other still with bishops and a few hereditary peers on its benches.
(Though not a Tory, Garry Runciman is a Northumbrian viscount, like
me, although he no longer sits in the Lords.) There is still a
bewigged judiciary presiding over trial by jury and sentencing
criminals to prisons, and a City full of bankers and speculators,
some of whom grow rich enough to buy country estates or horses to
run at Epsom, and to send their sons to Eton and Oxford.



Most of these enduring features are in London, a city that
dominates the country as much as it did in Defoe’s day, despite the
emergence in the 19th century of an industrial, urban powerhouse
astride the Pennines. It is astonishing that the industrial
revolution, and the vast expansion of the population that it
allowed, did so little to change the main shape and habits of
Britain’s political and cultural institutions.



Once he had got the hang of modern technology, transport and
clothing, Defoe would be quite at home in London. There are still
West End theatres showing Jacobean tragedies and restoration
comedies. Over all this a fairly small, London-based political and
financial elite presides with powers of patronage and a tendency to
periodic scandal that would be instantly familiar. Scotland is
again semi-attached; India has come and gone since his day. It is
all very Hanoverian.



What’s more, Defoe would still be able to boast of Britain’s
unusual nature, says Runciman. He would “have as much reason as
ever to remark on the distinctiveness of his own society’s
political, ideological and economic institutions”. For three
hundred years Britain has almost uniquely avoided slipping into
feudal, theocratic, despotic or military rule. It has remained an
untidy mixture of constrained monarchy, permeable oligarchy,
imperfect markets, and representative democracy.



Defoe lived towards the end of a period of civil war, regicide,
restoration, (“glorious”) foreign invasion, Jacobite civil war and
the imposition of a new dynasty. Yet in the centuries since, with
the exception of Culloden and the Blitz, there have been no battles
within the country, let alone civil wars. With hindsight, says
Runciman, it was never even likely that Britain would succumb to
revolution or dictatorship. The idea of freedom was always too
popular: freedom of movement, opinion, assembly, speech, contract,
trade and fair trial.



The early 19th-century prime minister Lord Liverpool constantly
fretted that a French-style revolution could happen here. The Duke
of Wellington thought the Great Reform Bill would start a
revolution. Opponents of widening the franchise in 1867 thought the
same. But compared with France in 1789, Russia in 1917 or Germany
in 1933, Britain’s power elite was too constrained and many-headed,
its peasantry and proletariat too free and empowered, and its civil
society too entrenched for an autocratic outcome to be
plausible.



If Labour in 1950 had gone to the country with a programme of
full communism, or the Conservatives in 1979 with a programme of
ultra-libertarianism, both would have lost by a mile. Gradualism
was made inevitable by elections, which force parties to respect
the cautious instincts of the people: observe how Ukip has suddenly
accepted the sacrosanct nature of the NHS in order to be
electable.



Runciman (like me) is a devotee of the theory of cultural
evolution, the notion that society changes by the gradual and
undirected emergence of new ways of doing things that persist by
competitive survival, rather than by grand design. In Darwinian
terms, some of Britain’s institutions are sociological coelacanths
— living fossils that have changed little while the world has
changed rapidly around them.



The Left generally sees this institutional persistence as
evidence that (in Runciman’s words) the “apparatus of government
has been the instrument of a self-serving ruling class determined
to retain its control of the means of coercion and persuasion”. The
Right sees it as evidence that Britain’s institutions “have
consistently allowed for the representation of conflicting
interests through a parliamentary assembly and delegation of power
to local authorities while containing the risk of instability
through judicious toleration of alternative opinions and lifestyles
and prudent oversight of a market in commodities and labour”.



Although Defoe might be pleasantly surprised by a decline in the
bossiness of the church and the excise man (gone are the law that
all must attend church and the Calico Acts, forbidding cotton
clothing, to protect wool and silk makers), he would undoubtedly be
astonished by the enormous growth of other officious
bureaucracies.



He would surely also be appalled at the degree to which we are
subjects of an alien and unelected European nomenklatura. If, as
Paul Johnson used to argue, Britain’s history can be seen as a
series of painful and prolonged disengagements from (and resistance
to) projects of European unification — by Caesars, popes, Bourbons,
Hapsburgs, Bonapartes, Hitlers, Stalins, Delorses — then it would
be reasonable to predict that cultural evolution will eventually
lead us out of the EU. But predicting evolution is very hard.



It is hard to think of another nation — only the Vatican or
perhaps the Netherlands spring to mind — where the main
institutions have changed so little since 1724. Everywhere else
they were remade in the 1780s, 1840s, 1950s or 1990s. On every
measure of science, technology and society, we are as modern as you
could wish. Yet we manage to be so within national institutions
that are little changed from the time of George I.


November 11, 2014

Ants, altruism and self-sacrifice

My Times column is on a disagreement between
Edward Wilson and Richard Dawkins about evolution:



 



I find it magnificent that a difference of opinion
about the origin of ants between two retired evolutionary
biologists, one in his eighties and one in his seventies, has made
the news. On television, the Harvard biologist EO Wilson called the Oxford biologist Richard Dawkins a
“journalist”, this being apparently the lowest of insults in the
world of science; it was taken as such.



I know and admire both men but having read the relevant papers I
think that on the substantive disagreement between them Dawkins is
right. Which is just as well, as I shall explain, or we would need
many more poppies for the Tower of London.



Before plunging (briefly) into the arithmetic of genetic
relatedness within ant colonies, let me first pose a simple
question: why do people care for their children? Raising children
is expensive, hard work and intermittently stressful, but most
people consider it rewarding in the end. What do they mean by
that?



Do they do it just to gratify themselves, selfishly seeking
these rewards, thus devaluing their generosity towards the
children? Hardly.



Surely it is more likely that people bear, raise and treasure
their children for the same reason that rabbits, blackbirds and
spiders care for their offspring — because they are descended from
individuals that cared for offspring. Throughout history those
people who found child rearing worth it, despite the effort, left
behind more descendants than those who did not.



I went fishing at the weekend. The salmon I was fortunate enough
to land (and release) is an extreme example. Although in good shape
when she left my hands, she will very likely die within the next
two months, exhausted by the effort and risk of reaching a stream
where she can lay her eggs. Her breeding instinct is the very
opposite of rewarding for herself. But it perpetuates her
genes.



This is a point that most critics of Richard Dawkins’s book
The Selfish Gene persist in missing: it is the
selfishness of genes that drives us to be selfless. The theory that
most creatures do things that help the survival of their genes
specifically explains and illuminates acts of genuine
generosity.



It is the very opposite of a theory that says we should be
selfish, though it does say that we are likely to be selective in
our generosity. (But we knew that.)



And here is where ants come in. Ants are not generally
altruistic. In fact they fight ants from other colonies to the
death and sometimes enslave ants of other species. Yet within a
colony, worker ants raise their sisters rather than their
daughters. Wilson thinks this is because the survival of the colony
is the main reward that drives their altruism: a theory called
“group selection”; Dawkins believes that the survival of the ants’
genes, shared by those sisters who will become future queens, is
the chief cause: a theory called “kin selection”. Cut through the
mathematics and insults and that’s the core disagreement.
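


A brief aside on the mathematics being cut through here. Kin
selection is conventionally summarised by Hamilton’s rule, under
which a gene for an altruistic act can spread whenever

\[ r\,b > c \]

where r is the genetic relatedness between the altruist and the
beneficiary, b is the benefit to the beneficiary and c is the cost
to the altruist. This is only a thumbnail of the standard formalism,
not a summary of either man’s full argument.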



Lots of good evidence supports Dawkins, or rather his late
colleague Bill Hamilton, who originated the theory of kin
selection. For instance, although the sister-rearing habit evolved
in termites and naked mole-rats as well, it appeared eight separate
times in ants, bees and wasps.



This group of insects has the peculiar trait that — because
males are produced from unfertilised eggs — females are more
similar to their sisters than to their daughters, so long as they share
the same father. And in all eight lineages, it appears it was already the habit in
ancestral species for queens to mate only once, ensuring this
genetic similarity.
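


To sketch the relatedness arithmetic, on the standard assumption of
a singly mated queen: because the father is haploid, a daughter
inherits his entire genome plus half of her mother’s, so two full
sisters share on average

\[ r = \tfrac{1}{2}\times 1 + \tfrac{1}{2}\times\tfrac{1}{2} = \tfrac{3}{4} \]

of their genes, whereas a mother shares only \( r = \tfrac{1}{2} \)
with her own daughter. In gene-centred terms, that is why a worker
can leave more copies of her genes by rearing sisters than by
rearing daughters.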



Wilson used to buy this argument, but now he says he has
“abandoned” the theory of the selfish gene for one based on the
selective survival of competing groups. That sometimes groups
compete, or that individuals need to be in groups to thrive, is not
in doubt. But does it happen enough for creatures to develop
genetic tendencies to put the success of the group first, before
their own survival?



Wilson likes to call human beings “eusocial”, a word
normally used for ants, bees and termites that live in colonies
where the queen does all the reproducing. But for all the
“groupishness” of people, there is very little evidence that we
seek to sacrifice our own opportunities to reproduce as
individuals, let alone that our groups themselves multiply. On the
contrary, breeding is the one thing we like to do for
ourselves.



And this is why the Wilson-Dawkins disagreement is of political
relevance. “Group selection” has always been portrayed as a more
politically correct idea, implying that there is an evolutionary
tendency to general altruism in people. Gene selection has
generally seemed to be more of a right-wing idea, in which
individuals are at the mercy of the harsh calculus of the
genes.



Actually, this folk understanding is about as misleading as it
can be. Society is not built on one-sided altruism but on mutually
beneficial co-operation.



Nearly all the kind things people do in the world are done in
the name of enlightened self-interest. Think of the people who sold
you coffee, drove your train, even wrote your newspaper today. They
were paid to do so but they did things for you (and you for them).
Likewise, gene selection clearly drives the evolution of a
co-operative instinct in the human breast, and not just towards
close kin.



It can even drive a tendency to defend fellow members of the
group if the survival of the group helps to perpetuate the genes.
But group selection is a theory of competition between groups, and
that is generally known by another name in human affairs. We call
it war. If group selection were to work properly, war would mean
the total annihilation of the enemy by the victorious group.



Richard Dawkins and EO Wilson were once on the same side,
writing influential books within a year of each other in the 1970s
to explain the evolution of behaviour.



Dawkins still admires Wilson but thinks he has fallen
into error. It is a bit like when Alfred Russel Wallace insisted,
in 1869, that “a superior intelligence has guided the development
of man in a definite direction, and for a special purpose”, and
Charles Darwin chided him in a letter: “I hope you have not
murdered too completely your own and my child”.


November 5, 2014

Greens take the moral low ground

My Times column:



A confession: I voted for the Green Party in 1979 – one of fewer
than 40,000 people in the whole country who did so. It was then
called the Ecology Party and I knew the local candidate in Oxford,
which is some excuse. But mainly I wanted to save the planet, and
thought the greater good should trump self-interest. I was
definitely on the moral high ground. Or was I? Hold that
thought.



The latest opinion polls show that the Green Party is doing to
the Liberal Democrats what UKIP is doing to the Conservatives, and
could even relegate the LibDems to fifth place in next year’s
general election in terms of vote share. Peter Kellner of YouGov
has analysed today’s typical Green voter and found
that she is almost a mirror image of the UKIP voter. Where UKIP
voters are older, maler, more working class, less educated and more
religious than the average voter, Green voters are younger,
femaler, posher, much better educated and less religious than the
average voter.



In Downton Abbey terms, Greens are a lady upstairs in the dining
room, kippers are a footman downstairs in the scullery. Indeed, my
experience of fanatical greens at conferences and anti-fracking
demos is that many are often very grand indeed, disproportionately
hailing (when male) from Eton, Stowe and Westminster, shopping
(especially when female) at the most expensive of organic shops,
and speaking (when of either sex) in the countiest of accents. (A
bit like me, in fact.)



Despite these social and economic advantages, eco-toffs put
their self-interest to one side and campaign selflessly for the
greater Gaian good, worry about the effect that climate change will
have on future generations and yearn for a more holistic version of
economic growth.



But is greenery really quite so selfless? Take climate change.
The “synthesis report” of the Intergovernmental
Panel on Climate Change warns of an increased “likelihood” of
severe, pervasive and irreversible impacts if emissions continue.
But when you cut through the spin, what the IPCC is actually saying
is that there is a range of possibilities, from no net harm at all
(scenario RCP2.6) through two middling scenarios to one where
gathering harm from mid century culminates in potentially dire
consequences by 2100 (scenario RCP8.5).



This latter scenario makes wildly unrealistic assumptions about
coal use, trade, methane emissions and other things; RCP2.6 is
equally unrealistic in the other direction. So let’s focus on the
two middle scenarios, known as RCP4.5 and RCP6. In these more
realistic projections, if you use the latest and best estimates of the climate’s
“sensitivity” to carbon dioxide (somewhat lower than the
out-of-date ones still used by the IPCC), the most probable outcome is that the world will be
respectively just 0.8 and 1.2 degrees Celsius warmer than today by
the last two decades of this century.



Here is David Rutledge on RCP8.5:



In the IPCC’s business-as-usual
scenario, Representative Concentration Pathway (RCP) 8.5, coal
accounts for half of future carbon-dioxide emissions through 2100,
and two-thirds of the emissions through 2500. The IPCC’s coal burn
is enormous, twice the world reserves by 2100, and seven times
reserves by 2500. Coal so dominates that it is not an exaggeration
to say that the IPCC and climate-change research programs depend on
this massive coal burn for their existence. Without the threat of
coal, the IPCC could close up shop and the research program funding
would drop to a small fraction of what is spent on research in
weather forecasting.



http://judithcurry.com/2014/04/22/coa...



Most of that warming will be at night, in winter and in northern
latitudes, so tropical daytime warming will be less. Again, on the
best evidence available, it is unlikely that this amount of
warming, especially if it is slow, will have done more harm than
good. The chances are, therefore, that climate change will not
cause significant harm in the lives of our children and
grandchildren.



The OECD’s economic models behind the two scenarios
project that the average person alive in 2100 will be earning an
astonishing four to seven times as much money – corrected for
inflation – as she does today. That’s a 300-600% increase in real
pay. This should enable posterity to buy quite a bit of protection
for itself and the planet against any climate change that does show
up. So we are being asked to make sacrifices today to prevent the
possibility of what may turn out to be pretty small harms to very
wealthy people in the future.



By contrast, the cost of climate policies is already falling
most heavily on today’s poor. Subsidies for renewable energy have
raised costs of heating and transport disproportionately for the
poor. Subsidies for biofuels have raised food prices by diverting
food into fuel, tipping millions into malnutrition and killing about 190,000 people a year. The
refusal of many rich countries to fund aid for coal-fired
electricity in Africa and Asia, preferring renewable projects instead (and
in passing I declare a financial interest in coal mining), leaves more than a billion people without
access to electricity and contributes to 3.5 million deaths a year
from indoor air pollution caused by cooking over open fires of wood
and dung.



Greens think these harms are a price worth paying to stop the
warming. They want (other) people to bear such sacrifices today so
that the people of 2100, who will be up to seven times as rich, do
not have to face the prospect of living in a world that is perhaps
0.8 to 1.2 degrees warmer. And this is the moral high ground?



It is not just climate change. The opposition to genetically
modified food is mostly a middle and upper class obsession, but the
people who would benefit from such foods are often poor people.
Golden rice, for example, could prevent the deaths of hundreds of
thousands of people a year from vitamin A deficiency, but has been
stymied for 15 years by opposition organized by western green
groups, especially Greenpeace. They are entitled to think that this
philanthropic project is a bad idea, but they are buying their
reassurance at the expense of the poor’s health.



Other examples are organic farming and renewable energy, both of
which require more land than the conventional alternatives. Most
conservationists now recognize that “sustainable intensification”
is a key ingredient of environmental protection – that is, using as
little land as possible to grow crops and make energy, so as to
spare more land for nature. Fortunately, this plan also means
cheaper food and cheaper energy so it helps the poor. By all means
go organic and use wind power if you insist, but don’t pretend
there is anything morally superior about it.



Just as UKIP’s rise could deliver some Conservative seats to
Labour, so the rise of the Green party could possibly deliver a few
LibDem and Labour seats to the Tories. That may be the most
selfless thing about it.

