Kindle Notes & Highlights
Read between July 21 and July 30, 2022
•Heat waves in the US are now no more common than they were in 1900, and the warmest temperatures in the US have not risen in the past fifty years.
•Humans have had no detectable impact on hurricanes over the past century.
•Greenland’s ice sheet isn’t shrinking any more rapidly today than it was eighty years ago.
•The net economic impact of human-induced climate change will be minimal through at least the end of this century.
Much of the public portrayal of climate science suffers from Feynman’s Wesson Oil problem—in an effort to persuade rather than inform, the information presented withholds either essential context or what doesn’t “fit.”
One of my students would not fare well if they produced a map like the Post’s without explaining which warming is due to a changing global climate and which isn’t.
While modelers base their subgrid assumptions upon both fundamental physical laws and observations of weather phenomena, there is still considerable judgment involved. And since different modelers will make different assumptions, results can vary widely among models. This is no minor detail, since ordinary fluctuations in the height and coverage of clouds can have as much of an impact on flows of sunlight and heat as do human influences. In fact, the greatest uncertainty in climate modeling stems from the treatment of clouds.
A paper laying out the details of one of the most esteemed models, that of Germany’s Max Planck Institute, tells of tuning a subgrid parameter (related to convection in the atmosphere) by a factor of ten because the originally chosen value resulted in twice as much warming as had been observed.9 Changing a subgrid parameter by a factor of ten from what you thought it was—that’s really dialing the knob.
Having better tools and information to work with should make the models more accurate and more in line with each other. That this has not happened is something to keep in mind whenever you read “Models predict . . .”
But another equally serious issue is also illustrated here: Figure 4.3 shows that the ensembles fail to reproduce the strong warming observed from 1910 to 1940. On average, the models give a warming rate over that period of about half what was actually observed. As the IPCC noted in measured and somewhat antiseptic language: “It remains difficult to quantify the contribution to this warming from internal variability, natural forcing and anthropogenic forcing, due to forcing and response uncertainties and incomplete observational coverage.”13 More bluntly, they’re saying that we’ve no idea what …
•“. . . low confidence regarding the sign of trend in the magnitude and/or frequency of floods on a global scale.”1
•“. . . low confidence in a global-scale observed trend in drought or dryness (lack of rainfall) since the middle of the 20th century . . .”2
•“. . . low confidence in trends in small-scale severe weather phenomena such as hail and thunderstorms . . .”3
•“. . . confidence in large scale changes in the intensity of extreme extratropical cyclones [storms] since 1900 is low.”
The bottom line is that the science says that most extreme weather events show no long-term trends that can be attributed to human influences on the climate.
In the US, which has the world’s most extensive and highest-quality weather data, record low temperatures have indeed become less common, but record daily high temperatures are no more frequent than they were a century ago.
A paper published in 2018 reinforces that last conclusion by analyzing more than thirty-three years of high-quality satellite observations covering the globe between 60° S and 60° N (everywhere except the polar regions): . . . there seems not to be any detectable and significant positive trends in the amount of global precipitation due to the now well-established increasing global temperature. While there are regional trends, there is no evidence of increase in precipitation at the global scale in response to the observed global warming.
Alas, it then spends about twice as long discussing the then most recent six-year California drought, which that state’s governor had declared over six months before the CSSR was released.26 As you can see from Figure 7.11, which shows the Palmer Drought Severity Index for California since 1901, that six-year drought was at its worst during 2014; by 2019, the coverage was of the “wet” winter. It is hard to justify a climate assessment analyzing any “trend” shorter than ten years, even if that makes its conclusions less newsworthy. On longer timescales, the state has moved toward drought since 2000 …
“Human influences” can take many forms. Forest management (How much fuel is allowed to accumulate? Are fires suppressed or allowed to burn? How much development is permitted in or near forests?) and human-caused ignition (nearly 85 percent of US wildland fires have a human cause) are among the contributors.34 While we may not be able to fully quantify, much less control, the many climate-related influences on wildfires, we have significant power to address these human factors. By making the conversation about wildfires only one of unavoidable doom due to “climate change,” we miss an …
In fact, the rate of rise between 1925 and 1940—a period almost as long as the eighteen-year satellite record then available—was almost the same as that recent satellite value, about 3 mm (0.12 inches) per year.
In contrast, assessment authors must judge the validity and importance of many diverse research papers, and then synthesize them into a set of high-level statements meant to inform non-experts. So an assessment report’s “story” really matters, as does the language used to tell it—especially for something as important as climate.
Anyone referring to a scientist with the pejoratives “denier” or “alarmist” is engaging in politics or propaganda.
Omitting numbers is also a red flag. Hearing that “sea level is rising” sounds alarming, but much less so when you’re told it’s been going up at less than 30 cm (one foot) per century for the past 150 years. When numbers are included, omission of uncertainty estimates is another thing to watch out for in non-expert discussions of climate science, as has been recognized by at least one prominent journalist.
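As a quick sanity check (simple unit arithmetic, not a figure from the book), the “less than 30 cm per century” rate can be put on the same millimeters-per-year scale as the ~3 mm per year rates quoted earlier:

```python
# Convert the quoted sea-level rate from cm/century to mm/year.
cm_per_century = 30
mm_per_year = cm_per_century * 10 / 100  # 10 mm per cm, 100 years per century
print(mm_per_year)  # 3.0 mm/yr -- the same order as the satellite-era rate
```

Seeing both numbers on one scale is exactly the kind of context the passage argues should accompany “sea level is rising.”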
According to the IPCC, just stabilizing human influences on the climate would require global annual per capita emissions of CO2 to fall to less than one ton by 2075, a level comparable to today’s emissions from such countries as Haiti, Yemen, and Malawi. For comparison, 2015 annual per capita emissions from the United States, Europe, and China were, respectively, about 17, 7, and 6 tons.
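To put those per capita figures on one scale (this is just arithmetic on the numbers quoted above, not an independent estimate), each 2015 value can be compared against the ~1 ton stabilization level:

```python
# 2015 per capita CO2 emissions cited in the text, tons per person per year.
per_capita_2015 = {"United States": 17, "Europe": 7, "China": 6}
target_2075 = 1  # IPCC stabilization level quoted above, tons per person per year

# Factor by which each 2015 figure exceeds the ~1 ton target.
ratios = {region: tons / target_2075 for region, tons in per_capita_2015.items()}
print(ratios)  # {'United States': 17.0, 'Europe': 7.0, 'China': 6.0}
```

In other words, the quoted target implies cutting US per capita emissions roughly seventeenfold.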
The economic slowdown caused by the COVID-19 pandemic demonstrates just how challenging it will be to reduce emissions rapidly. Global CO2 emissions during the first half of 2020 were down only 8.8 percent compared to the same period in 2019, with 40 percent of that reduction coming from surface transport and 22 percent from the electricity sector. And emissions rebounded promptly in many countries as restrictions eased.
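The sector shares quoted above can be decomposed into contributions to the overall drop (again, just arithmetic on the numbers given in the text):

```python
# Decompose the 8.8 percent first-half-2020 emissions drop by sector share.
total_drop_pct = 8.8          # global CO2 emissions drop vs. first half of 2019
share_surface_transport = 0.40  # fraction of the drop from surface transport
share_electricity = 0.22        # fraction of the drop from the electricity sector

transport_points = total_drop_pct * share_surface_transport  # ~3.5 points
electricity_points = total_drop_pct * share_electricity      # ~1.9 points
print(round(transport_points, 2), round(electricity_points, 2))
```

So even a near-shutdown of travel moved global emissions by only a few percentage points, which is the passage’s point about how hard rapid reductions are.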
Thus, while it is often invoked as an example, a “Manhattan Project” isn’t a very apt or useful way of thinking about energy change. The real Manhattan Project in the early 1940s produced a few specific atomic “gadgets” for a single customer (the US military); it did not aspire to transform a large system already embedded throughout society. It also didn’t have to compete with an existing capability, while we already have perfectly serviceable ways of providing electricity and fuels. The Manhattan Project was carried out in secret, so public opinion and acceptance wasn’t an issue (although it …