Joseph J. Romm's Blog, page 102

August 26, 2015

What Went Wrong To Make This Climate Treaty Add Carbon To The Atmosphere, Not Decrease It

A new study shows that a plan to decrease greenhouse gas emissions backfired after “criminal” activity in Russia and Ukraine flooded the carbon credit market, resulting in an estimated 600 million metric tons of additional emissions.


The study, published Monday in Nature, found that 80 percent of the projects certified under the United Nations Framework Convention on Climate Change’s Joint Implementation (JI) scheme, part of the Kyoto Protocol, did not actually reduce emissions. Projects in many cases would have happened anyway — with or without Kyoto — and some were even fake, Vladyslav Zhezherin, one of the report’s authors, told the Guardian.


The Kyoto Protocol was signed in 1997 and committed countries to emissions cuts. Under the program, countries that are party to the protocol can earn emissions reduction credits for projects that reduce carbon emissions or increase carbon sinks “additional to what would otherwise have occurred.”


In one example, operators of three chemical plants actually increased waste gas emissions, only to turn around and earn credits by reducing them again. “If you produced more greenhouse gases only to destroy them and generate more carbon credits, you would essentially be damaging the climate for profit,” said Lambert Schneider, a co-author of the study, which was put out by the Stockholm Environment Institute (SEI), an international nonprofit research organization.


But the system was broken from the beginning, said Anja Kollmuss, an author of the study and an associate at SEI. In simple terms, countries set emissions targets and then certified credits from projects — like wind farms or reforestation — that reduced emissions. The countries could retire the credits themselves or sell them to other countries or companies that needed to meet reduction goals.


But there were two big problems with the system, Kollmuss said. For one, there was no international oversight for certifying projects. Second, and perhaps more importantly, Russia and Ukraine had overall emissions targets that were greater than their emissions — resulting in billions of excess, valueless credits.


“They received literally billions of spare Kyoto Protocol allowances,” Kollmuss told ThinkProgress. The excess allowances led to projects being certified that would have happened anyway — such as projects that were started in 2002 and certified in 2012, Kollmuss said. “Not all the projects were bad, just the overwhelming majority of them,” she said.


As with any market, when supply overwhelms demand, prices collapse. Credit prices plunged from €13 (about $14.80) to less than €0.50 (about 57 cents), which hurt projects that had been financed on the assumption that the market would hold.


The study’s authors hope the analysis will help inform negotiators during the UNFCCC conference in Paris in December.


Carbon credit markets can work, Kollmuss said. In fact, nine New England and Mid-Atlantic states run an emissions credit program, the Regional Greenhouse Gas Initiative, that has reduced emissions and lowered electricity bills.


“Do we know how to design a market so it has integrity? Yes, we do,” Kollmuss said. But it comes down to political willingness, and will only work if all party countries have ambitious targets; the targets are calculated in multi-year terms; there are clear accounting rules; and there is international oversight, she said.


“What we know of this future climate treaty if it is passed in Paris, it is much more of a bottom-up treaty. Countries can decide themselves what they are going to do and how they are going to do it,” Kollmuss said.



Tags

Climate Change, United Nations

The post What Went Wrong To Make This Climate Treaty Add Carbon To The Atmosphere, Not Decrease It appeared first on ThinkProgress.

Published on August 26, 2015 08:02

Sit-In Staged At Kerry’s House To Dispute Little Known Pipeline Expansion To Be Bigger Than Keystone

On Tuesday morning, over 100 protesters gathered in front of Secretary of State John Kerry’s Georgetown home, urging him to stop a pipeline that would carry hundreds of thousands of barrels of tar sands crude per day from Canada into the United States.


But the pipeline in question wasn’t Keystone XL — it was the Alberta Clipper, an expansion project that would increase the capacity of an Enbridge-owned pipeline from 450,000 barrels of tar sands oil per day to over 800,000. Environmentalists have accused the State Department of allowing Enbridge — the Canadian company responsible for the largest inland oil spill in U.S. history, the Kalamazoo River Spill — to push forward with expanding the Alberta Clipper pipeline without undergoing the necessary regulatory process, including the presidential approval required for all cross-border pipelines.


Pipelines spill. It happens. Everyone knows that

“This expansion is going to be expanding this pipeline to 880,000 barrels of tar sands a day, whereas the Keystone pipeline is proposed to 830,000 barrels a day,” Kieran Williams, a protester and student at Kalamazoo College in Michigan, told ThinkProgress. “We think it’s absolutely absurd that there has been the environmental review and delay of the Keystone pipeline, but that Enbridge can continue this illegal expansion.”


Enbridge applied in 2012 for a presidential permit to expand the pipeline, which runs from Hardisty, Alberta, to Superior, Wisconsin. But obtaining a presidential permit from the State Department is a lengthy process — it involves parties at the local, state, and federal levels, and requires assessment of a pipeline’s environmental impact, among other things.


[Image: Under Enbridge’s new plan, oil from the Alberta Clipper (left) would be diverted through the new Line 3 border crossing (right). Credit: U.S. District Court Minnesota]



While awaiting a presidential permit for the Alberta Clipper expansion, Enbridge decided to build connections on either side of the border between two existing pipelines — Line 67, or the Alberta Clipper, and Line 3, an older cross-border pipeline. Because Line 3 was built in the 1960s under a different, vaguely worded presidential permit, Enbridge felt it could send more barrels of tar sands through its pipelines by capitalizing on unused capacity on Line 3, without having to apply for the same permit required for the Alberta Clipper. In August of 2014, the State Department approved Enbridge’s plan to move forward with certain elements of this connection scheme.


That decision garnered vehement criticism from environmentalists, who accused the State Department of “secretly approving” the pipeline project. In April, a coalition of tribal and environmental groups sued the State Department for allegedly allowing the construction of these pipeline connections — and the expansion of Line 3 — without conducting necessary environmental assessments.


Some of the groups involved with that lawsuit were present at Tuesday’s protest, which was organized by Energy Action Coalition in cooperation with several youth organizations throughout the Midwest. Participants held signs that featured quotes from Secretary Kerry speaking out on the Vietnam War, or pictures of Enbridge’s previous pipeline failures. The protest began with a rally at Georgetown’s Volta Park, followed by a march to Secretary Kerry’s home. There, a group of about twenty protesters climbed police barricades to sit in front of Kerry’s home, while others remained behind the barricades, chanting and singing. That group remained in front of the house for more than four hours — some with arms linked through foam cylinders to form a human pipeline — before being arrested by police. Those arrested were led away without handcuffs, then cited and released at the scene.


[Image: Protesters march from Volta Park to Secretary Kerry’s home Tuesday. Credit: Natasha Geiling]



To some protesters — many of whom vocally fought against approval of the Keystone XL pipeline — the lack of environmental review for the Alberta Clipper expansion project was baffling.


“Why should one pipeline get that treatment and another be ignored?” Kenny Bruno, campaign coordinator at Corporate Ethics International, told ThinkProgress. “Keystone XL would have been approved if we hadn’t screamed and yelled, so we have to scream and yell to force them to explain that decision and walk it back.”


The protest took place exactly one month after the five-year anniversary of the Kalamazoo River Oil Spill, which is still the largest inland oil spill in U.S. history. It was also the first major pipeline rupture involving diluted bitumen — the combination of tar sands bitumen and light natural gas condensate that allows the normally thick tar sands to flow through pipelines. When the pipeline ruptured, the light condensate in the diluted bitumen (also known as dilbit) vaporized, while the tar sands bitumen sank to the bottom of the river, making cleanup especially difficult. This May, Enbridge agreed to pay the state of Michigan $75 million for its role in the spill, on top of the $9.95 million it had already paid in previous settlements.


“The bitumen that spilled five years ago remains in the Kalamazoo even today,” Greta Herrin, a student from Kalamazoo, Michigan, told a crowd at a rally before the protest. “This is a corporation that has shown a complete disregard for people and the Earth. If any company should undergo layers of scrutiny, it’s Enbridge.”


John Kerry is such an incredible climate champion … it just doesn’t make sense.

Williams — who was born in Ann Arbor, Michigan — echoed Herrin’s sentiment.


“This company spilled one million gallons of toxic tar sands into the Kalamazoo River, took 17 hours to shut it down, and is still cleaning it up,” he said. “Enbridge will never be a responsible company and we shouldn’t be trusting them with this pipeline.”


Other protesters worried about the consequences that expanding Enbridge’s pipeline operations could have on climate change. In June, more than 100 scientists from the U.S. and Canada published a statement calling tar sands expansion “incompatible” with limiting climate change. A January study also argued that to avoid warming above 2°C, the majority of Canada’s tar sands would need to remain in the ground.


To Greg Mathews, a Minnesota native who attends college in Wisconsin, the protest was about envisioning a future beyond fossil fuels.


“Pipelines spill. It happens. Everyone knows that,” he told ThinkProgress. “I believe in, and I’m fighting for, a future where these tar sands will stay in the ground and we don’t have to take them out.”


Organizers could not confirm that Kerry was at home during the protest, and the State Department did not respond to ThinkProgress’ request for comment by the time of publication. But for Bruno, stopping the Alberta Clipper expansion would be a way for Kerry to preserve his climate legacy.


“John Kerry is such an incredible climate champion,” he said. “To allow this to happen — the dirtiest oil in the world — it just doesn’t make sense.”



Tags

Climate Change, Enbridge, State Department, Tar Sands

The post Sit-In Staged At Kerry’s House To Dispute Little Known Pipeline Expansion To Be Bigger Than Keystone appeared first on ThinkProgress.

Published on August 26, 2015 05:00

August 25, 2015

Climate Change Is Increasing ER Visits For Diseases And Injuries Unrelated To Heat

People of all ages — not just the elderly — are more at risk of death and emergency room visits as the earth warms, a recent study has found.


The study, published this month by researchers at Brown University and the Rhode Island Department of Health in the journal Environmental Health Perspectives, focused on the population of Rhode Island. Researchers found that it didn’t need to be especially hot for people to start visiting the ER in higher numbers: heat-related emergency room visits were 3.3 percent higher at 75 degrees Fahrenheit than at 65 degrees. As it got hotter, the jump was more acute: on days with highs of 85 degrees, ER visits jumped 23.9 percent compared to days with highs of 75 degrees. In addition, Rhode Island’s death rate increased by 4 percent on 85-degree days compared to 75-degree days.


“Our primary finding is that as temperatures increase, the number of emergency room visits and deaths increase,” Samantha Kingsley, a Brown University public health graduate student and lead author of the study, said in a statement. “But people were going to the hospital for heat-related reasons at temperatures below what we would typically consider extreme.”


And, according to the research, it wasn’t the elderly in Rhode Island who experienced the most emergency room visits. Instead, it was 18-to-64-year-olds. The study’s authors weren’t sure exactly why this group seemed most susceptible to heat, but they did offer a few ideas.


“Whether stronger associations in this age group reflect increased opportunities for exposure (eg: through increased outdoor recreational or occupational activities), less careful attention to heat warnings, or are simply a function of the relatively lower baseline rate of [emergency room] admissions in this age group remains unclear,” the report states.


Previous studies and accounts have also linked higher temperatures to increased hospital visits and deaths, but in heat waves, the elderly have often been most at risk. Seniors may not be able to leave their homes if they’re too warm, and if they have health problems — such as heart disease — their bodies may be less effective at circulating blood and keeping cool. This May, a heat wave in India killed about 2,000 people, many of whom were elderly.


The study warned that, if climate change continues to drive temperatures up, Rhode Island’s residents “would experience substantially higher morbidity and mortality.” If, by the end of this century, days in Rhode Island become 10 degrees warmer — a projection that’s on the high end of climate models — the summertime death rate in the state would increase by 1.5 percent, or about 80 additional deaths per summer. In addition, the ER visit rate would jump by 25 percent, or an increase of about 1,500 visits every summer. And since other states — not just Rhode Island — are expected to see higher temperatures with climate change, the study’s results could serve to make residents around the U.S. wary of high heat.
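
One way to sanity-check those projections is to back out the baselines they imply. The short Python sketch below uses only the percentages and absolute figures quoted above; the baseline counts it produces are rough inferences for illustration, not numbers reported by the study.

    # Rough back-of-envelope using only the figures quoted above.
    # The baselines are inferred from those figures, NOT taken from the study itself.
    extra_deaths = 80            # additional summer deaths under a 10-degree-warmer scenario
    death_rate_increase = 0.015  # the projected 1.5 percent increase in the summertime death rate
    baseline_deaths = extra_deaths / death_rate_increase    # roughly 5,300 summer deaths today

    extra_visits = 1500          # additional heat-related ER visits per summer
    visit_rate_increase = 0.25   # the projected 25 percent increase in the ER visit rate
    baseline_visits = extra_visits / visit_rate_increase    # roughly 6,000 visits per summer

    print(round(baseline_deaths), round(baseline_visits))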


Heat’s connection to hospital visits and deaths is well-established, but climate change has been linked to other health impacts too. Increased temperatures and wetter springs can help disease vectors, like mosquitoes, expand their range, meaning that more people are at risk of contracting disease. And increased temperatures can also exacerbate air pollution, making the air more dangerous to breathe, especially for people with asthma. In April, the White House announced a plan to tackle the health impacts of climate change, one that involved educating Americans about the climate-related risks to health and private and public investments in health-related projects.



Tags

Climate Change, Heat, Rhode Island

The post Climate Change Is Increasing ER Visits For Diseases And Injuries Unrelated To Heat appeared first on ThinkProgress.

Published on August 25, 2015 11:12


Obama Just Picked The Customer’s Side Against The Nevada Utility That’s Trying To Kill Rooftop Solar

Everything that is happening in the solar industry is happening in Nevada right now.


On the one hand, President Obama announced a slew of new incentives and financing mechanisms for renewable energy and efficiency Monday at the 8th annual Clean Energy Summit in Las Vegas, hosted by solar champion Sen. Harry Reid (D-NV).


But he also seemed to be taking the local utility to task during the speech. On Wednesday, Nevada’s public utility commission (PUC) will decide whether to adopt a proposal from the local utility and impose steep rate increases on solar customers — increases that industry insiders say would completely devastate the residential market in the state.


“We see the trend lines. We see where technology is taking us. We see where consumers want to go,” Obama said Monday. “That, let’s be honest, has some fossil fuel interests pretty nervous, to the point where they are trying to fight renewable energy.”


Nevada is — so far — a solar success story. The state is in first place in per capita solar jobs. Investment in solar quadrupled last year to more than half a billion dollars. There are more than 100 solar companies in Nevada, including six manufacturers. And the only reason solar customers’ electricity rates are even up for debate is that the installation cap under the old rate was hit last week, six months earlier than the utility estimated.


We see the trend lines. We see where technology is taking us. We see where consumers want to go.

Under an agreement hammered out earlier this year, public utility NV Energy’s net metering program — under which solar customers are paid market rate for the electricity they put back onto the grid — would be reconsidered by the end of the year, sometime before the 235-megawatt (MW) cap was hit. Now, solar installers are in limbo. The PUC could accept NV Energy’s proposal that would add fees and charges to solar customers’ bills.


“This proposal is a thousand pages. It’s incredibly complicated,” Chandler Sherman, a spokesperson for SolarCity, told ThinkProgress.


Sherman said the proposal includes nine new fees, taxes, and charges, which are often difficult for residential customers to understand. “They are really just designed to make it harder for people to go solar,” she said.


Among the additions is a nearly $14 per kilowatt (kW) demand charge. That means a customer’s peak demand during the month would be multiplied by $14 and added to the electricity bill, which currently reflects only the customer’s total usage. The surcharge would apply only to solar customers. These types of charges are difficult to estimate and are extremely rare for residential customers in the United States.
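
To see why a demand charge behaves so differently from the usage-based billing residential customers are used to, here is a minimal sketch of the arithmetic. The monthly usage, energy rate, and peak demand below are hypothetical, illustrative values, not NV Energy’s actual tariff; only the $14-per-kW figure comes from the proposal described above.

    # Hypothetical comparison of an energy-only bill with a bill that adds a demand charge.
    # All inputs are illustrative assumptions except the $14-per-kW demand charge.
    energy_used_kwh = 900      # total electricity consumed over the month (assumed)
    energy_rate = 0.11         # dollars per kWh (assumed)
    peak_demand_kw = 5.0       # highest draw at any point in the month (assumed)
    demand_charge_rate = 14.0  # dollars per kW of peak demand

    energy_portion = energy_used_kwh * energy_rate        # $99.00, driven by total usage
    demand_portion = peak_demand_kw * demand_charge_rate  # $70.00, driven by one brief peak

    print(f"energy: ${energy_portion:.2f}  demand: ${demand_portion:.2f}")

Because the demand portion is set by a single peak rather than by total consumption, a short spike in usage can add far more to the bill than the electricity it actually consumed.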


“If your teenage kid plugs his guitar into the garage, you’re screwed,” Will Craven, a spokesperson for the Alliance for Solar Choice, a rooftop solar industry advocacy group, told ThinkProgress.


Craven, who is also a spokesperson for SolarCity, would know. He was active in the fight against Salt River Project’s proposal to hit solar customers in Arizona with fees. That proposal, which passed last year, gutted the solar industry in SRP’s service area.


After the charges went through, SolarCity relocated 85 employees out of Arizona, as residential applications for solar fell 96 percent. “A lot of the state’s domestic companies were not able to survive,” Craven said. “You are not able to go solar in SRP’s territory anymore.”


If this happens in NV Energy’s territory, it will be a critical blow to Nevada’s solar industry. NV Energy serves 85 percent of Nevada’s population, 2.4 million customers.


On Monday night, Obama was bullish on solar. “It’s an American energy revolution,” Obama told the audience at the Clean Energy Summit.


He took the opportunity to announce a new financing option for residential solar — property assessed clean energy (PACE). PACE allows homeowners to borrow money for renewable energy investments and pay the loan off through an assessment added to their property tax bill. The solar industry has been working toward PACE for years, so this is good news for them. In addition, Obama announced another $1 billion in loan guarantees and a public-private partnership that will put solar power on more than 40 military bases across the country.


Sara Birmingham from the Solar Energy Industries Association (SEIA) applauded the president’s announcements, but cautioned that federal policies will only go so far without on-the-ground reinforcement.


“Anytime there is additional financing open to people who want to go solar, that is a great thing,” Birmingham told ThinkProgress. “But unless we have these underlying policies like net metering backing it up, it may be difficult to translate that into installations.”


For his part, Obama seemed to realize that the real solar fight is on the ground. NV Energy and its upcoming policy change were the “gorilla in the room” at the Clean Energy Summit, Craven said, and during his keynote address, Obama seemed to be speaking between the lines.


Utilities “are trying to undermine competition in the marketplace,” he said. “They are trying to fight renewable energy.”


The president did not mention NV Energy by name — instead, he highlighted several utilities around the country that have put pro-solar policies in place and are working to figure out how to transition the electricity industry from a simple provider into an integrated distribution system. But he did issue a tacit warning to utilities that don’t get on board.


“America always comes down on the side of the future,” he said.



Tags

Electricity, Nevada, NV Energy, Obama, Power, PUC, Solar, SolarCity, Utilities

The post Obama Just Picked The Customer’s Side Against The Nevada Utility That’s Trying To Kill Rooftop Solar appeared first on ThinkProgress.

Published on August 25, 2015 07:34

New Study Attributes Fewer Carbon Emissions To China. So Where Did They Go?

The pollution caused by China’s coal use gets a great deal of attention, and for good reason. It causes health problems in both China and America — helping to kill 4,000 Chinese people per day and traveling across the Pacific Ocean to increase smog levels in the western United States.


China burns four times as much coal as the United States does. Coal is, in large part, why China surpassed the United States as the biggest carbon polluter on the planet in 2006. In two years, despite the United States’ massive head start, China will surpass it in cumulative greenhouse gas emissions since 1990.


[Chart credit: Harvard University]



But a Harvard-led study released last week in Nature found that the carbon pollution caused by burning coal in China is actually 14 percent lower than originally thought.


Researchers found that from 2000 to 2012, total energy consumption was 10 percent higher than the official statistics. Meanwhile, emissions factors — the carbon content of the coal — for coal in China are actually 40 percent lower than what the Intergovernmental Panel on Climate Change (IPCC) assumed. They also found that Chinese emissions from cement production were actually 45 percent lower than thought.


“Altogether, our revised estimate of China’s CO2 emissions from fossil fuel combustion and cement production is 2.49 gigatonnes of carbon in 2013, which is 14 percent lower than the emissions reported by other prominent inventories,” the study’s abstract reads. That is hundreds of millions of metric tonnes of carbon (well over a billion tonnes of carbon dioxide) less than the world thought China was emitting.
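
The abstract reports that figure in tonnes of carbon rather than carbon dioxide, so a quick conversion helps put the revision in perspective. The sketch below uses only the 2.49-gigatonne figure and the 14 percent gap quoted above; the implied size of the older inventories is a back-of-envelope inference, not a number from the paper.

    # Convert the study's 2013 estimate from carbon to CO2 and size the downward revision.
    C_TO_CO2 = 44.0 / 12.0                  # mass ratio of CO2 to elemental carbon

    revised_gtc = 2.49                      # study estimate for 2013, gigatonnes of carbon
    revised_gtco2 = revised_gtc * C_TO_CO2  # about 9.1 Gt of CO2

    prior_gtc = revised_gtc / (1 - 0.14)    # implied by "14 percent lower": about 2.9 GtC
    gap_gtc = prior_gtc - revised_gtc       # about 0.4 GtC, i.e. ~400 million tonnes of carbon
    gap_gtco2 = gap_gtc * C_TO_CO2          # roughly 1.5 Gt of CO2

    print(round(revised_gtco2, 1), round(gap_gtc, 2), round(gap_gtco2, 1))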


[Chart credit: Nature]



The reason has to do with the way scientists had assumed coal was burned in China. The study found that the coal China uses as fuel actually contains less carbon than assumed, so burning it yields less carbon pollution. Also, the coal is burned less efficiently, yielding less energy and more ash waste. Because China, unlike the United States and many other countries, does not release official greenhouse gas emissions data, international organizations have to make broader assumptions for their estimates than normal.


This is one study, however, and some experts noted that it’s premature to draw conclusions from one specific article.


Kevin Trenberth, senior scientist at the National Center for Atmospheric Research, said the study was “unsatisfactory” because it does not address the glaring fact that carbon emissions continue to rise.


Instead of squaring the results with satellite measurements of CO2 emissions, they rely on uncertain and “surely incomplete” estimates, Trenberth told Climate Central.


Ranping Song, the team lead at the World Resources Institute’s China Climate Program, urged caution in using the Nature study to jump to conclusions.


“Experts in China are raising concerns with how these conclusions in the paper have been reached,” he told ThinkProgress. “A one-time sampling is not representative of the emissions of a coal mine or coal plant over time.” Specifically, Song noted that the emissions factors that the researchers compared “are actually not that different.”


As more global emissions come from developing countries, greenhouse gas emissions inventories have become less certain, according to Steven Davis, one of the authors of the study. It’s possible that China’s coal consumption statistics could be revised upward soon — this would offset some of the reductions found in the study.


“One study like this won’t have an impact on China’s climate policy,” WRI’s Song said. “It wouldn’t change the motives at play — you still witness air pollution. It’s absolutely real, both from ordinary people as well as people in high-level government. They are taking actions to reduce emissions.”


The study examined data up to 2013, so actions China has taken since then will not be reflected in the data — which is important, because China has done a lot since then. In 2014, Chinese coal consumption dropped for the first time this century. It has committed to peaking coal by 2020 or earlier, plateauing carbon emissions by or before 2030, and dramatically increasing its renewable energy production. By next year, the last of the coal-fired power plants in Beijing will close.


Still, assuming this study’s conclusions have merit in that China’s coal burning emits even somewhat less CO2 than originally assumed, this does not mean that the global atmospheric carbon dioxide levels are any lower than scientists thought. Either emissions are coming from somewhere else, or natural processes are a little less effective at sucking up atmospheric carbon dioxide than scientists thought.


“The global total emissions are the sum of country emissions, so it would decrease by the same amount,” said study author Davis. “But don’t confuse emissions with atmospheric CO2.”


Davis, an assistant professor at the University of California, Irvine, said it helps to think about this like a bathtub.


“Emissions are how much is coming out of the faucet, and atmospheric CO2 is the level of water in the tub,” he told ThinkProgress. “We’ve revised the emissions, but the levels are unchanged. What this means is that the drain — e.g., how much CO2 is being sucked up by growing plants — must be slower than we thought.”
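
Davis’s bathtub can be written as a simple mass balance: the observed growth of CO2 in the atmosphere equals emissions minus what the ocean and land sinks take up. The sketch below uses rounded, illustrative global numbers, not figures from the study, to show why revising emissions downward while the atmospheric record stays the same forces the inferred land “drain” to shrink by the same amount.

    # Illustrative global carbon budget in gigatonnes of carbon per year (rounded, for illustration).
    # The land sink is inferred as a residual: land = emissions - ocean - atmospheric growth.
    atmospheric_growth = 4.5  # measured directly from atmospheric CO2; unaffected by the revision
    ocean_sink = 2.5          # estimated independently of national inventories
    emissions_old = 10.0      # pre-revision global fossil + cement emissions (assumed round number)
    revision = 0.4            # downward revision to China's 2013 emissions (order of magnitude)

    land_sink_old = emissions_old - ocean_sink - atmospheric_growth               # 3.0
    land_sink_new = (emissions_old - revision) - ocean_sink - atmospheric_growth  # 2.6

    print(land_sink_old, land_sink_new)  # the inferred land sink shrinks by exactly the revision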


Changes in how the terrestrial carbon sink works, especially due to land-use change, “still have huge uncertainties, so those are fluxes we’d expect to take up the slack of this revision,” Davis said.


It’s unclear exactly which natural processes — most of which involve photosynthesis by plants on land and in the ocean — could be so different.


Oceans absorb almost a third of the carbon dioxide emissions worldwide. The water itself absorbs CO2, forming carbonic acid — the process of ocean acidification. Living phytoplankton also take up carbon dioxide, jump-starting the food chain. The bitter irony, according to a recent study in Nature Climate Change, is that more acidic oceans will shift the balance of these critical phytoplankton species, with some dying out altogether. If these trends mean the oceans absorb less CO2 than previously assumed, that could help square the atmospheric record with China’s smaller-than-expected carbon pollution.


It’s possible that worse-than-assumed deforestation could also be somewhat responsible for the extra emissions. A study released Monday found that if humans continue on their current path, the planet will lose an India-sized area of tropical forests by 2050. Total forest cover can be measured in large part by satellite photography, but there could be some unanticipated change in how these forests take up carbon dioxide.


While it will take more time and research to parse out exactly what is going on with the extra carbon emissions, this serves as a good reminder that carbon pollution anywhere on the planet affects the tenuous balance that has existed between carbon sinks and carbon sources. Take away a carbon sink, burn more carbon-rich fuels, and it gets stuck in the atmosphere, with truly global consequences.



Tags

Carbon Emissions, China, Coal, Deforestation

The post New Study Attributes Fewer Carbon Emissions To China. So Where Did They Go? appeared first on ThinkProgress.

Published on August 25, 2015 05:00

August 24, 2015

10 Years After Katrina, Will California’s Capital Be The Next New Orleans?

A 2011 New York Times Magazine story sounded the alarm: “Scientists consider Sacramento — which sits at the confluence of the Sacramento and American Rivers and near the Delta — the most flood-prone city in the nation.” The article went on to note that experts fear an earthquake or violent Pacific superstorm could destroy the city’s levees and spur a megaflood that could wreak untold damage on California’s capital region.


Post-mortem studies blamed the U.S. Army Corps of Engineers and its flawed flood control system for the cataclysmic damage to New Orleans in August 2005. In 2006, the corps’ chief publicly owned responsibility, acknowledging that the levees that were supposed to prevent flooding were improperly built and relied on old data: “This is the first time that the corps has had to stand up and say: ‘We’ve had a catastrophic failure.'”


In the decade since Katrina exposed serious flaws in the nation’s levee system, much has happened. A bipartisan Congressional committee investigated and released a report. New levees were built in Louisiana. Congress voted to create a National Levee Safety Program to mitigate the threat to other high-risk areas, and state, local, and federal efforts began in earnest to prevent Sacramento from becoming the next Katrina. But despite these significant steps, the U.S. Army Corps of Engineers calls the Sacramento area “among the most at-risk regions in America for catastrophic flooding.”


In many ways, Katrina proved to be a wake-up call, but much of the subsequent investment in the nation’s inadequate levee system has been driven by politics rather than science, has not received the funding required, and has been insufficiently forward-looking. And because it relies on data and standards that do not reflect climate change and the stronger storms that have come with it, things may be even worse than assumed.


A 64 Percent Chance

While the levee infrastructure is in grim shape in many parts of the country, multiple experts point to Sacramento and the unique geographic, seismologic, and economic factors that put it at even greater risk than the Gulf Coast.


“The levee situation was worse than New Orleans in Sacramento before Katrina. Now it’s of course worse,” said Robert Verchick, Gauthier-St. Martin Chair in Environmental Law at Loyola University New Orleans and a former EPA deputy associate administrator, who calls the aging flood-control system “a monstrous accident waiting to happen.”


“Ruptures of the levees would swamp Sacramento and places like San Francisco wouldn’t have water for weeks. That’s what keeps me up at night, thinking about those kinds of problems,” he said.


Tyler Stalker, a spokesman for the U.S. Army Corps of Engineers Sacramento District, agreed that the Sacramento area is uniquely at risk. “Two million people live here” at the “confluence of two major river systems,” he said. “Interstate 5 runs through the heart of Sacramento. Interstates 80 and 50 come right through Sacramento. These are major transportation corridors. And it’s the capital,” he added, with “a lot of water flowing through here, relying on a levee system that is quickly aging.”


To illustrate the challenge, Stalker pointed to the bathtub-like Natomas area to the north of the city of Sacramento, and the 41-mile levee system that surrounds it. “You can’t fix just one segment and expect that to work,” he explained. “If it fails anywhere, ultimately [a flood] will get in there and fill up that whole area, because of the way it’s composed.” Levees need to be addressed on a system-wide basis; each system is “only as good as its weakest link.”


[Image: November 2012 flood in Sacramento, CA. Credit: AP Photo/Rich Pedroncelli]



More than a decade ago, geologist Jeffrey Mount made a terrifying prediction: He estimated that there is a 64 percent chance of a disastrous levee failure in the Sacramento-San Joaquin River Delta (or California Delta) over the next 50 years. In an email, he told ThinkProgress that number, which “summed the risk of flood and earthquake failure” in the levee system, “was and is a conservative estimate.”
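
For context, a 64 percent chance over 50 years can be translated into an implied annual probability, if one assumes for illustration (Mount’s estimate does not require this) that the risk is roughly constant and independent from year to year:

    # Solve 1 - (1 - p)**50 = 0.64 for a constant annual failure probability p.
    fifty_year_risk = 0.64
    annual_risk = 1 - (1 - fifty_year_risk) ** (1 / 50)
    print(round(annual_risk, 3))  # about 0.02 -- roughly a 2 percent chance in any given year

That is roughly double the 1-percent-a-year risk that the standard “100-year flood” protection level, discussed below, is designed around.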


Mount, a senior fellow at the Public Policy Institute of California Water Policy Center and founding director of the Center for Watershed Sciences at the University of California, Davis, noted that there are two distinct flood risk areas: Metropolitan Sacramento, which “lies at the confluence of two flood-prone rivers, the Sacramento and the American,” and the California Delta, “the confluence of the Sacramento and San Joaquin Rivers and is the freshwater head of San Francisco Bay.”


“Rising sea level, increasing winter floods, and increasing seismic risk combine together to put this system of levees at risk of failure,” Mount warned, and “the status quo of the Delta cannot be sustained.” A 2013 study by the National Oceanic and Atmospheric Administration found that sea level rises spurred by climate change could make storm surges like those seen in Superstorm Sandy a regular occurrence.


While flooding might seem like less of a concern as California contends with its current drought, upcoming storms like the feared “Godzilla El Niño” could be even more of a threat than usual because dry ground is less able to absorb precipitation.


The Nation’s Failing Levee System

To understand the challenges in Sacramento, it helps to understand some of the history of levees and flood risk reduction. FEMA describes levees as “man-made structures, usually an earthen embankment, designed and constructed with sound engineering practices to contain, control or divert the flow of water in order to provide protection from temporary flooding.”


After the Great Mississippi Flood of 1927, Verchick explained, “Congress said it’s gonna be the job of the federal government” to protect against flooding on the “waters of the United States.” The responsibility for building federal levees was delegated to the U.S. Army Corps of Engineers, though states, localities, and private entities can also build their own levees.


Out of the roughly 14,500 miles of federal system levees, only about 2,800 miles — mostly along the Mississippi River — are owned and operated by the corps. The federal government generally pays 65 percent of the building costs and the states or localities pay the rest. After construction, Larry Larson of the Association of State Floodplain Managers notes, the corps “turns it over to a local sponsor, who is required to operate and maintain it.”


Fast forward to today and many of these and other levees are in a state of disrepair, putting millions of people and the nation’s economy at serious risk.


The corps’ National Levee Database — an inventory of all federal system levees and some state and local ones, plus the condition of each — was built thanks to the post-Katrina Water Resources Development Act of 2007. As of mid-August, it contained more than 2,500 levee systems — mostly federally constructed. Some 425 of those were found to be in unacceptable condition in their most recent routine inspections and over 1,000 more were in “minimally acceptable” condition (of the at least 47 levee systems in the database protecting Sacramento and San Joaquin County, 23 were found to be unacceptable in their last routine or periodic inspection).


Based on this data and other information, the American Society of Civil Engineers gave the nation’s levee system a grade of D-minus in its 2013 “Report Card for America’s Infrastructure.”


The Army Corps’ spokesman Stalker said that a “minimally acceptable” rating means only a minor deficiency that is unlikely to prevent a system from working in a flood, and that even an “unacceptable” rating does not necessarily mean a system will fail. “Unacceptable,” he explained, means that the corps “believe it could cause an issue during a storm event,” although it is also possible “it could work just fine.” The ratings are intended to help localities prioritize steps for risk reduction.


[Chart: National Levee Database systems rated “unacceptable” in their most recent routine inspection. Source: U.S. Army Corps of Engineers]

How Much Is Enough?

For a long time, Mount said, the urban Sacramento area had an inadequate levee system and was “the most at-risk” region in the country. With significant investment in infrastructure underway, he believes that “within the next ten years the area is likely to no longer be the most at-risk large city. Actually, not even close. It still will have considerable risk, but nothing like the other floodplain or coastal cities.”


Major floods in the Central Valley in 1986 “woke up the community,” said Sacramento Area Flood Control Agency (SAFCA) executive director, Richard Johnson. Another major flood in 1997, followed by the jarring images that came out of New Orleans in the wake of Katrina, were both reminders of what was at stake. “We do not want to be the next Katrina. And to the credit of the citizens of Sacramento, they’ve assessed themselves several times already and continue to give over 80 percent of the vote” to flood control assessments, Johnson said.


[Image: December 2014 flooding in Sacramento, CA. Credit: AP Photo/Rich Pedroncelli]



In 2006, then-Gov. Arnold Schwarzenegger (R) proposed — and the California voters approved — a pair of water bonds that included nearly $5 billion for flood risk reduction, with a significant focus on levee repair and maintenance. “We can spend money on flood protection or we can spend money on cleanup,” he predicted. With these state funds, the California Department of Water Resources says “many of the most urgent repairs have been completed or are near completion.”


The area has also been especially fortunate to have higher-than-average federal investment in levee projects, a fact Johnson attributes to the hard work of Sacramento’s Congresswoman Doris Matsui (D). “Flood control is her number one priority. She spends a huge amount of time, effort, and resources on making sure our projects are funded,” he said, adding that a “concerted effort at all levels of government” has helped keep levee construction and maintenance moving forward.


Frank Mansell, a natural hazards program specialist at FEMA in the region that includes Sacramento, noted that both the city of Sacramento and the surrounding county are taking a lot of positive steps toward floodplain management. “They are doing the right things and exceeding the right things,” he said, noting that both have been granted significant community discounts on federal flood insurance, under FEMA’s community rating system.


But all of this investment may not be enough.


Rising sea level, increasing winter floods, and increasing seismic risk combine together to put this system of levees at risk of failure … the status quo of the Delta cannot be sustained.

Even with the increased attention on its vulnerability, the California Delta is still at a particularly high risk, Mount, the geologist, explained. “More than 1,100 miles of aging levees of uncertain construction currently protect more than 60 ‘islands’ in the Delta,” he said. As the islands got farmed, the soils oxidized and the land elevation dropped — in many cases to 25 feet or more below sea level. “As the land lowered, the levees needed to be both wider and taller to meet the increasing hydraulic pressure. Most of these levees are privately owned and all have failed at some time in the past,” he said. While he calls the levee maintenance effort by the landowners and state “heroic,” he warns that “they are susceptible to failure from floods, extreme high tides, beavers and other rodents, and earthquakes.”


To do the necessary flood-risk reduction work for the larger region, the Army Corps’ Stalker said the corps is “proposing about $2.5 billion” worth of work. Though Congress has provided a “steady stream of funds” for the corps’ projects, with annual civil works budgets for the district between $141 million and $150 million for the past two years, he said that, even “assuming you get the funding you’re looking for, you’re looking at 20-plus years for construction, a long road.”


Nationally, levees are typically built to a “100-year flood” standard — in other words, providing protection against any flood up to the size that has a 1 percent chance of occurring in a given year — at least, in theory. This stems from the National Flood Insurance Program, which requires those in flood paths without at least that level of protection to carry federal flood insurance. California adopted an even tougher 200-year-flood standard for levees protecting urban areas.
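
Those “X-year flood” labels are easier to grasp as cumulative odds over a horizon people actually plan around, such as a 30-year mortgage. The sketch below assumes a constant, independent annual probability, which is precisely the historical-statistics assumption Verchick questions further down.

    # Chance of at least one "design flood" over a multi-year horizon,
    # assuming a constant, independent annual probability.
    def cumulative_risk(annual_prob, years=30):
        return 1 - (1 - annual_prob) ** years

    print(round(cumulative_risk(0.01), 2))   # 100-year standard: ~26 percent over 30 years
    print(round(cumulative_risk(0.005), 2))  # California's 200-year standard: ~14 percent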


The issue with these estimates, however, is that they do not account for future development and climate change. In one North Carolina watershed, the increased development that came with population growth meant “flood levels went up from anywhere from two to nine feet,” Larson recalled. Every time a parking lot is built over absorbent ground, more burden falls onto the levees.


Verchick added that the 100-year flood is calculated by looking back at history and doing statistical modelling based on the question “what is the frequency with which these monster events, abnormal events will occur?” But historical records only go back so far and, with climate change, he warned, “we are looking at a no-analog future. We used to be able to look at the past and extrapolate. But now we have a future that is going to be controlled by rules of its own.”


And current information Verchick has seen suggests that in 30 years, the once-in-every-500-years flood will be a once-in-every-100-years flood.


We are looking at a no-analog future. We used to be able to look at the past and extrapolate. But now we have a future that is going to be controlled by rules of its own.

FEMA is tasked with accrediting levees for the National Flood Insurance Program. The decision on whether a levee meets — or continues to meet — the 100-year-flood requirement is determined by a team of engineers like Robert Bezek. The agency’s flood mapping is based on a careful assessment of scientific data relating to rainfall, water flow, storm gauges, and comparable watersheds.


“It’s not voodoo science,” Bezek joked. But as climate change brings more severe weather, these assessments have the potential to be obsolete quickly. “We map our flood maps on current conditions. We don’t map future conditions,” he said, acknowledging that “[climate change] will have an impact; we don’t know how.”


This lack of consideration for future conditions was also evident in the post-Katrina rebuild in New Orleans.


After the levee system failure in 2005, Verchick believes the investment in New Orleans has improved flood mitigation there. “I’m not an engineer, but from talking to engineers, everything I know suggests the new levee system is a good system. There is no reason to doubt that it will provide the kind of protection that it was promised to provide.” But, even that may not be good enough. “I think there is a serious question about whether the standards that the Army Corps of Engineers was meant to follow were high enough,” he said, and the levees may be “well-built, but not protective enough.”


[Image: New levee wall in New Orleans, in 2012. Credit: AP Photo/Gerald Herbert]



Sandy Rosenthal, who co-founded the non-profit levees.org with her then-15-year-old son as both were evacuated from New Orleans, echoed these concerns. “In the context of haste and confusion, Congress gave more power, more authority, and $14 billion [to the Army Corps] to rebuild the system that failed. It’s like when your house falls to the ground, you say to your contractor, ‘This time, do it right! And do it really fast!'”


The rebuild still relied on unrealistic standards and a lack of foresight, she explained. To provide 500-year-level protection, she said, they’d have needed to build the new levees only two to 2.5 feet higher, and 1,000-year protection would have required only one more foot beyond that. But instead, they were rebuilt to the same height and just 100-year-level protection. “Politics and engineering are a lethal mix,” Rosenthal observed.


‘Fix The Roof When It’s Dry’

And while New Orleans and the Sacramento area and a few other cities have benefited from some flood prevention investments, many more are unprepared for floods or other potential disasters. “Everyone acknowledges we’ve got a levee problem and continues walking down the street,” said Gerald Galloway, a professor of engineering at the University of Maryland, College Park and a member of the American Society of Civil Engineers (ASCE). “It’s the [national] infrastructure problem, on steroids.”


Galloway noted that while teams are “working madly in Sacramento,” a better system is “not something you can put up overnight.” While “the corps and FEMA are trying to assess the challenges,” there are simply not enough budgeted funds to do so nationally.


[Graphic credit: Dylan Petrohilos]



Other parts of the country face the challenges of extreme weather events, but many of them do not have access to the same level of funding. “Unfortunately, all of this is coming about at a time when resources at all levels of government are as strained as they have ever been,” SAFCA’s Johnson said. Few other states or localities have voted explicitly to increase their own taxes to pay for flood protections — and the Army Corps of Engineers Civil Works budget is a very limited pie.


People say, ‘fix the roof when it’s dry,’ but we tend to fix the roof after the rainfall has gone through it.

While Washington has authorized action to address the nation’s levee system, it has not appropriated the necessary funds to make that happen. The Association of State Floodplain Managers’ Larson notes that the 2014 Water Resources Reform and Development Act created the National Levee Safety Program, but it has not yet gotten off the ground due to lack of funding: “They said we should develop standards nationally. The corps started that process, but haven’t been able to do anything with it, because Congress hasn’t funded it.”


“People say, ‘fix the roof when it’s dry,’ but we tend to fix the roof after the rainfall has gone through it,” Galloway added. “We got money for New Orleans, after the hurricane. We got money for Sandy after it occurred.” While Congress — hamstrung by spending caps and anti-tax pledges — has been unwilling to make the needed investment, the lack of investment in the needed ounces of prevention will ultimately cost the nation a lot more in pounds of cure.


“You’re gonna pay me now or you’re gonna pay me later,” Galloway said. “The bottom line is that ‘levee’ is not a four letter word. The levees on the Mississippi in 2011 did a lot for people and saved billions in damages. The 1927 flood literally crippled our nation.”


While Katrina’s death toll, the product of a delayed and insufficient evacuation effort, is not a typical characteristic of major floods, the astronomical cost associated with the resultant property damage is, Larson said. And the costs go beyond just the immediate damages, including long-term business costs, health costs, and mental health impacts. “We don’t really know how much floods cost us in the United States. We have these silly estimates that are so far off, it’s unbelievable.”


Larson noted one study found levee failure in the Sacramento area could cost ten times more than Katrina. “We need to get serious,” he concluded. “We get a four-to-one return on mitigation money. But we don’t invest — we spend almost all of our money on response and immediate recovery.”



Tags

California, Climate Change, Flood Insurance, Floods, Infrastructure

The post 10 Years After Katrina, Will California’s Capital Be The Next New Orleans? appeared first on ThinkProgress.

Published on August 24, 2015 13:08

10 Years After Katrina, Will Sacramento Be The Next New Orleans?

A 2011 New York Times Magazine story sounded the alarm: “Scientists consider Sacramento — which sits at the confluence of the Sacramento and American Rivers and near the Delta — the most flood-prone city in the nation.” The article went on to note that experts fear an earthquake or violent Pacific superstorm could destroy the city’s levees and spur a megaflood that could wreak untold damage on California’s capital region.


Post-mortem studies blamed the U.S. Army Corps of Engineers and its flawed flood control system for the cataclysmic damage to New Orleans in August 2005. In 2006, the corps’ chief publicly owned responsibility, acknowledging that the levees that were supposed to prevent flooding were improperly built and relied on old data: “This is the first time that the corps has had to stand up and say: ‘We’ve had a catastrophic failure.'”


In the decade since Katrina exposed serious flaws in the nation’s levee system, much has happened. A bipartisan Congressional committee investigated and released a report. New levees were built in Louisiana. Congress voted to create a National Levee Safety Program to mitigate the threat to other high-risk areas, and state, local, and federal efforts began in earnest to prevent Sacramento from becoming the next Katrina. But despite these significant steps, the U.S. Army Corps of Engineers calls the Sacramento area “among the most at-risk regions in America for catastrophic flooding.”


In many ways, Katrina proved to be a wake-up call but much of the subsequent investment in the nation’s inadequate levee system has been driven by politics rather than science, has not received the funding required, and has been insufficiently forward-looking. And by relying on data and standards that do not reflect climate change and the rise of stronger storms that has come with it, things may be even worse than assumed.


A 64 Percent Chance

While the levee infrastructure is in grim shape in many parts of the country, multiple experts point to Sacramento and the unique geographic, seismologic, and economic factors that put it at even greater risk than the Gulf Coast.


“The levee situation was worse than New Orleans in Sacramento before Katrina. Now it’s of course worse,” said Robert Verchick, Gauthier-St. Martin Chair in Environmental Law at Loyola University New Orleans and a former EPA deputy associate administrator, who calls the aging flood-control system “a monstrous accident waiting to happen.”


“Ruptures of the levees would swamp Sacramento and places like San Francisco wouldn’t have water for weeks. That’s what keeps me up at night, thinking about those kinds of problems,” he said.


Tyler Stalker, a spokesman for the U.S. Army Corps of Engineers Sacramento District, agreed that the Sacramento area is uniquely at risk. “Two million people live here” at the “confluence of two major river systems,” he said. “Interstate 5 runs through the heart of Sacramento. Interstates 80 and 50 come right through Sacramento. These are major transportation corridors. And it’s the capital,” he added, with “a lot of water flowing through here, relying on a levee system that is quickly aging.”


To illustrate the challenge, Stalker pointed to the bathtub-like Natomas area to the north of the city of Sacramento, and the 41-mile levee system that surrounds it. “You can’t fix just one segment and expect that to work,” he explained. “If it fails anywhere, ultimately [a flood] will get in there and fill up that whole area, because of the way it’s composed.” Things need to be addressed on a system-wide basis, each is “only as good as its weakest link.”


[image error]

November 2012 flood in Sacramento, CA


CREDIT: AP Photo/Rich Pedroncelli



More than a decade ago, geologist Jeffrey Mount made a terrifying prediction: He estimated that there is a 64 percent chance of a disastrous levee failure in the Sacramento-San Joaquin River Delta (or California Delta) over the next 50 years. In an email, he told ThinkProgress that number, which “summed the risk of flood and earthquake failure” in the levee system, “was and is a conservative estimate.”


Mount, a senior fellow at the Public Policy Institute of California Water Policy Center and founding director of the Center for Watershed Sciences at the University of California, Davis, noted that there are two distinct flood risk areas: Metropolitan Sacramento, which “lies at the confluence of two flood-prone rivers, the Sacramento and the American,” and the California Delta, which lies at “the confluence of the Sacramento and San Joaquin Rivers and is the freshwater head of San Francisco Bay.”


“Rising sea level, increasing winter floods, and increasing seismic risk combine together to put this system of levees at risk of failure,” Mount warned, and “the status quo of the Delta cannot be sustained.” A 2013 study by the National Oceanic and Atmospheric Administration found that sea level rises spurred by climate change could make storm surges like those seen in Superstorm Sandy a regular occurrence.


While flooding might seem like less of a concern as California contends with its current drought, upcoming storms like the feared “Godzilla El Niño” could be even more of a threat than usual because dry ground is less able to absorb precipitation.


The Nation’s Failing Levee System

To understand the challenges in Sacramento, it helps to understand some of the history of levees and flood risk reduction. FEMA describes levees as “man-made structures, usually an earthen embankment, designed and constructed with sound engineering practices to contain, control or divert the flow of water in order to provide protection from temporary flooding.”


After the Great Mississippi Flood of 1927, Verchick explained, “Congress said it’s gonna be the job of the federal government” to protect against flooding on the “waters of the United States.” The responsibility for building federal levees was delegated to the U.S. Army Corps of Engineers, though states, localities, and private entities can also build their own levees.


Out of the roughly 14,500 miles of federal system levees, only about 2,800 miles — mostly along the Mississippi River — are owned and operated by the corps. The federal government generally pays 65 percent of the building costs and the states or localities pay the rest. After construction, Larry Larson of the Association of State Floodplain Managers notes, the corps “turns it over to a local sponsor, who is required to operate and maintain it.”


Fast forward to today and many of these and other levees are in a state of disrepair, putting millions of people and the nation’s economy at serious risk.


The corps’ National Levee Database — an inventory of all federal system levees and some state and local ones, plus the condition of each — was built thanks to the post-Katrina Water Resources Development Act of 2007. As of mid-August, it contained more than 2,500 levee systems — mostly federally constructed. Some 425 of those were found to be in unacceptable condition in their most recent routine inspections and over 1,000 more were in “minimally acceptable” condition (of the at least 47 levee systems in the database protecting Sacramento and San Joaquin County, 23 were found to be unacceptable in their last routine or periodic inspection).


Based on this data and other information, the American Society of Civil Engineers gave the nation’s levee system a grade of D-minus in its 2013 “Report Card for America’s Infrastructure.”


The Army Corps’ spokesman Stalker said that a “minimally acceptable” rating means only a minor deficiency that is unlikely to prevent a system from working in a flood and that even an “unacceptable” rating does not necessarily mean a system will fail. “Unacceptable,” he explained, means that the corps “believe it could cause an issue during a storm event,” although it is also possible “it could work just fine.” The ratings are intended to help localities prioritize steps for risk reduction.


National Levee Database systems rated “unacceptable” in their most recent routine inspection




Source: U.S. Army Corps of Engineers

How Much Is Enough?

For a long time, Mount said, the urban Sacramento area had an inadequate levee system and was “the most at-risk” region in the country. With significant investment in infrastructure underway, he believes that “within the next ten years the area is likely to no longer be the most at-risk large city. Actually, not even close. It still will have considerable risk, but nothing like the other floodplain or coastal cities.”


Major floods in the Central Valley in 1986 “woke up the community,” said Sacramento Area Flood Control Agency (SAFCA) executive director Richard Johnson. Another major flood in 1997 and the jarring images that came out of New Orleans in the wake of Katrina were both reminders of what was at stake. “We do not want to be the next Katrina. And to the credit of the citizens of Sacramento, they’ve assessed themselves several times already and continue to give over 80 percent of the vote” to flood control assessments, Johnson said.



December 2014 flooding in Sacramento, CA


CREDIT: AP Photo/Rich Pedroncelli



In 2006, then-Gov. Arnold Schwarzenegger (R) proposed — and the California voters approved — a pair of water bonds that included nearly $5 billion for flood risk reduction, with a significant focus on levee repair and maintenance. “We can spend money on flood protection or we can spend money on cleanup,” he predicted. With these state funds, the California Department of Water Resources says “many of the most urgent repairs have been completed or are near completion.”


The area has also been especially fortunate to have higher-than-average federal investment in levee projects, a fact Johnson attributes to the hard work of Sacramento’s Congresswoman Doris Matsui (D). “Flood control is her number one priority. She spends a huge amount of time, effort, and resources on making sure our projects are funded,” he said, adding that a “concerted effort at all levels of government” has helped keep levee construction and maintenance moving forward.


Frank Mansell, a natural hazards program specialist at FEMA in the region that includes Sacramento, noted that both the city of Sacramento and the surrounding county are taking a lot of positive steps toward floodplain management. “They are doing the right things and exceeding the right things,” he said, noting that both have been granted significant community discounts on federal flood insurance, under FEMA’s community rating system.


But all of this investment may not be enough.



Even with the increased attention on its vulnerability, the California Delta is still at a particularly high risk, Mount, the geologist, explained. “More than 1,100 miles of aging levees of uncertain construction currently protect more than 60 ‘islands’ in the Delta,” he said. As the islands got farmed, the soils oxidized and the land elevation dropped — in many cases to 25 feet or more below sea level. “As the land lowered, the levees needed to be both wider and taller to meet the increasing hydraulic pressure. Most of these levees are privately owned and all have failed at some time in the past,” he said. While he calls the levee maintenance effort by the landowners and state “heroic,” he warns that “they are susceptible to failure from floods, extreme high tides, beavers and other rodents, and earthquakes.”


To do the necessary flood-risk reduction work for the larger region, the Army Corps’ Stalker said the corps is “proposing about $2.5 billion” worth of work. Though Congress has provided a “steady stream of funds” for the corps’ projects, with annual civil works budgets for the district between $141 million and $150 million for the past two years, he said that, even “assuming you get the funding you’re looking for, you’re looking at 20-plus years for construction, a long road.”
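Some rough arithmetic shows why the timeline stretches so long. The figures below are the ones Stalker cited; treating the district’s entire civil works budget as if it were dedicated to this one effort is an assumption made purely for illustration, so the result is a floor, not a forecast:

```python
# Rough arithmetic behind the "20-plus years" estimate: even if the
# district's entire recent civil works budget went to this one effort
# (it does not; it funds other projects too), the math gives a floor.
proposed_work = 2.5e9     # ~$2.5 billion of proposed flood-risk reduction work
annual_budget = 145e6     # midpoint of the recent $141M-$150M district budgets

years_at_full_funding = proposed_work / annual_budget
print(f"At least {years_at_full_funding:.0f} years")  # ~17, so 20-plus is plausible
```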


Nationally, levees are typically built to a “100-year flood” standard — in other words, providing protection from all but the floods that have just a 1 percent chance of striking in any given year — at least, in theory. This stems from the National Flood Insurance Program, which requires those in flood paths without at least that level of protection to carry federal flood insurance. California adopted an even tougher 200-year-flood standard for levees protecting urban areas.


The issue with these estimates, however, is that they do not account for future development and climate change. In one North Carolina watershed, the increased development that came with population growth meant “flood levels went up from anywhere from two to nine feet,” Larson recalled. Every time a parking lot is built over absorbent ground, more burden falls onto the levees.


Verchick added that the 100-year flood is calculated by looking back at history and doing statistical modeling based on the question “what is the frequency with which these monster events, abnormal events will occur?” But historical records only go back so far and, with climate change, he warned, “we are looking at a no-analog future. We used to be able to look at the past and extrapolate. But now we have a future that is going to be controlled by rules of its own.”


And current information Verchick has seen suggests that in 30 years, the once-in-every-500-years flood will be a once-in-every-100-years flood.
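As a back-of-the-envelope illustration of what those return periods mean in practice, the sketch below converts them into the chance of seeing at least one such flood over a 30-year span. It assumes a constant annual probability and independence between years, the very assumptions Verchick warns a no-analog future undermines:

```python
# Back-of-the-envelope sketch: converting flood return periods into the
# chance of at least one such flood over 30 years, assuming a constant
# annual exceedance probability and independence between years.
def cumulative_risk(return_period_years: float, horizon_years: int = 30) -> float:
    """Probability of at least one exceedance over the horizon."""
    annual_probability = 1 / return_period_years
    return 1 - (1 - annual_probability) ** horizon_years

print(f"100-year standard over 30 years:  {cumulative_risk(100):.0%}")  # ~26%
print(f"California 200-year standard:     {cumulative_risk(200):.0%}")  # ~14%
print(f"500-year event over 30 years:     {cumulative_risk(500):.0%}")  # ~6%
# If the 500-year flood effectively becomes a 100-year flood, the odds of
# seeing one over a 30-year span roughly quadruple, from about 6% to 26%.
```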



FEMA is tasked with accrediting levees for the National Flood Insurance Program. Whether a levee meets — or continues to meet — the 100-year-flood requirement is determined by a team of engineers like Robert Bezek. The agency’s flood mapping is based on a careful assessment of scientific data relating to rainfall, water flow, storm gauges, and comparable watersheds.


“It’s not voodoo science,” Bezek joked. But as climate change brings more severe weather, these assessments have the potential to be obsolete quickly. “We map our flood maps on current conditions. We don’t map future conditions,” he said, acknowledging that “[climate change] will have an impact; we don’t know how.”


This lack of consideration for future conditions was also evident in the post-Katrina rebuild in New Orleans.


After the levee system failure in 2005, Verchick believes the investment in New Orleans has improved flood mitigation there. “I’m not an engineer, but from talking to engineers, everything I know suggests the new levee system is a good system. There is no reason to doubt that it will provide the kind of protection that it was promised to provide.” But, even that may not be good enough. “I think there is a serious question about whether the standards that the Army Corps of Engineers was meant to follow were high enough,” he said, and the levees may be “well-built, but not protective enough.”



New levee wall in New Orleans, in 2012


CREDIT: AP Photo/Gerald Herbert



Sandy Rosenthal, who co-founded the non-profit levees.org with her then-15-year-old son as both were evacuated from New Orleans, echoed these concerns. “In the context of haste and confusion, Congress gave more power, more authority, and $14 billion [to the Army Corps] to rebuild the system that failed. It’s like when your house falls to the ground, you say to your contractor, ‘This time, do it right! And do it really fast!'”


The rebuild still relied on unrealistic standards and a lack of foresight, she explained. To provide 500-year-level protection, she said, the new levees would have needed to be only 2 to 2.5 feet higher, and 1,000-year protection would have required only one more foot beyond that. But instead, they were rebuilt to the same height and just 100-year-level protection. “Politics and engineering are a lethal mix,” Rosenthal observed.


‘Fix The Roof When It’s Dry’

And while New Orleans and the Sacramento area and a few other cities have benefited from some flood prevention investments, many more are unprepared for floods or other potential disasters. “Everyone acknowledges we’ve got a levee problem and continues walking down the street,” said Gerald Galloway, a professor of engineering at the University of Maryland, College Park and a member of the American Society of Civil Engineers (ASCE). “It’s the [national] infrastructure problem, on steroids.”


Galloway noted that while teams are “working madly in Sacramento,” a better system is “not something you can put up overnight.” While “the corps and FEMA are trying to assess the challenges,” there are simply not enough budgeted funds to do so nationally.



CREDIT: Dylan Petrohilos



Other parts of the country face the challenges of extreme weather events, but many of them do not have access to the same level of funding. “Unfortunately, all of this is coming about at a time when resources at all levels of government are as strained as they have ever been,” SAFCA’s Johnson said. Few other states or localities have voted explicitly to increase their own taxes to pay for flood protections — and the Army Corps of Engineers Civil Works budget is a very limited pie.



While Washington has authorized action to address the nation’s levee system, it has not appropriated the necessary funds to make that happen. The Association of State Floodplain Managers’ Larson notes that the 2014 Water Resources Reform and Development Act created the National Levee Safety Program, but it has not yet gotten off the ground due to lack of funding: “They said we should develop standards nationally. The corps started that process, but haven’t been able to do anything with it, because Congress hasn’t funded it.”


“People say, ‘fix the roof when it’s dry,’ but we tend to fix the roof after the rainfall has gone through it,” Galloway added. “We got money for New Orleans, after the hurricane. We got money for Sandy after it occurred.” While Congress — hamstrung by spending caps and anti-tax pledges — has been unwilling to make the needed investment, skimping on those ounces of prevention will ultimately cost the nation far more in pounds of cure.


“You’re gonna pay me now or you’re gonna pay me later,” Galloway said. “The bottom line is that ‘levee’ is not a four-letter word. The levees on the Mississippi in 2011 did a lot for people and saved billions in damages. The 1927 flood literally crippled our nation.”


While Katrina’s death toll, the product of a delayed and insufficient evacuation effort, is not a typical characteristic of major floods, the astronomical cost associated with the resultant property damage is, Larson said. And the costs go beyond just the immediate damages, including long-term business costs, health costs, and mental health impacts. “We don’t really know how much floods cost us in the United States. We have these silly estimates that are so far off, it’s unbelievable.”


Larson noted one study found levee failure in the Sacramento area could cost ten times more than Katrina. “We need to get serious,” he concluded. “We get a four-to-one return on mitigation money. But we don’t invest — we spend almost all of our money on response and immediate recovery.”



Tags

California, Climate Change, Flood Insurance, Floods, Infrastructure

The post 10 Years After Katrina, Will Sacramento Be The Next New Orleans? appeared first on ThinkProgress.

Published on August 24, 2015 13:08

Wildfires Are Pushing Western States To Their Limits And Pushing Toxic Smoke As Far As New Mexico

The 2015 wildfire season continues to rage throughout the West, as historic drought conditions and record temperatures have pushed many states’ fire suppression capacities to their breaking points.


As of Monday morning, at least 13 large fires burned across the central and eastern portion of Washington, while 11 burned across Oregon. All told, 65 major wildfires are currently burning across seven Western states. According to National Interagency Fire Center statistics, more than 27,000 firefighters are deployed across the country. To date, this year’s fire season has burned 7,487,737 total acres, more than any other season in the last 10 years.


“Nationally, the system is pretty tapped,” Rob Allen, the deputy incident commander for the fires around the Cascade Mountain resort town of Chelan, told the Associated Press last Wednesday. “Everything is being used right now, so competition for resources is fierce.”


Last week, for the first time since 2006, the National Interagency Fire Center mobilized 200 active-duty military troops to help control the fires that are spreading throughout the West. Along with active-duty soldiers, members of the National Guard and Air Force have already been called to help fight the fires. This weekend, dozens of firefighters from Australia and New Zealand were deployed to help fight blazes in Idaho, Washington, Montana, Oregon, and California. This isn’t the first time that firefighters have come from Australia to help fight U.S. fires — under an exchange program, Australian firefighters have come to the United States 11 times since 2000, according to the Straits Times.


The military and foreign firefighters will provide crucial manpower for Western firefighting teams that have all but exhausted their local resources. Last week, the Los Angeles Times published a story about Rick Anderson, a fire chief in Stevens County, Washington, who was forced to fight a fire with just 11 other firefighters and pickup trucks carrying 300-gallon water tanks. When Anderson called surrounding fire agencies to ask for reinforcements, he was told that none had extra manpower to spare — they were all busy fighting their own fires.


Anderson’s story is just one example of a fire season that has pushed local and federal fire agencies to the brink.


“It’s like the fire season gas pedal has been pushed to the floor in a really short period of time, and that’s stressed our resources,” Ken Frederick, a spokesman for the National Interagency Fire Center, told the Associated Press. “And that’s got us relying on help from resources we don’t normally use.”


The exceptionally dangerous season has also brought tragedy: last Wednesday, three firefighters died when their vehicle crashed and was caught by flames near a fire in north-central Washington. Four others were injured in the same incident, according to the New York Times.


On Friday, President Obama declared a federal state of emergency for Washington, which allows both the U.S. Department of Homeland Security and FEMA to coordinate relief efforts in order to help a state stretched thin from battling a surge of recent fires.


The wildfires have also impacted air quality throughout the Pacific Northwest and as far as New Mexico. Air quality throughout Washington is currently listed as “unhealthy,” and during a press conference on Monday, officials implored residents to stay indoors, keep doors closed, and even shut off air conditioning units that could cycle smoke particles into a home. Much of Oregon also had air quality levels deemed “unhealthy” throughout the weekend, though the western part of the state saw some improvement Monday morning as winds shifted.



Even in areas far from the active fires, air quality has fallen. A doctor at a clinic in Battle Mountain, Nevada, told the Guardian that he has seen a “striking increase” in the number of patients admitted for asthma, worsening lung disease, and even pink eye. Nevada’s air quality on Friday hovered between “moderate” and “very unhealthy.”



Tags

Climate Change, Wildfires

The post Wildfires Are Pushing Western States To Their Limits And Pushing Toxic Smoke As Far As New Mexico appeared first on ThinkProgress.

Published on August 24, 2015 11:58

Humans Are Set To Wipe An India-Sized Chunk Of Forest Off The Earth By 2050

By 2050, an area of forests the size of India is set to be wiped off the planet if humans continue on their current path of deforestation, according to a new report. That’s bad news for the creatures that depend on these forest ecosystems for survival, but it’s also bad news for the climate, as the loss of these forests will release more than 100 gigatons of carbon dioxide into the atmosphere.


The report, published Monday by the Center for Global Development (CGD), found that, without new policies aimed at cutting back on deforestation, 289 million hectares (about 1,115,840 square miles) of tropical forest will be cleared away. That chunk, the report states, is equal to one-seventh of the Earth’s total tropical forest area as of 2000. And, according to the report, the 169 gigatons of carbon dioxide that this deforestation would unleash is equal to one-sixth of the carbon budget humans can emit if they want to keep warming below 2°C — the level generally viewed as the maximum warming Earth can endure while still avoiding the most dangerous climate impacts (and even 2°C is seen by many experts as too high).
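For readers who want to check the arithmetic, here is a quick back-of-the-envelope conversion using standard unit factors; the roughly 1,000-gigaton carbon budget used at the end is an outside assumption, included only to show why 169 gigatons is described as about one-sixth of the budget, not a figure from the report:

```python
# Quick check of the report's headline figures using standard conversion
# factors. The ~1,000 Gt CO2 budget is an outside assumption used only to
# show why 169 Gt is described as roughly one-sixth of the budget.
hectares = 289e6                  # projected tropical forest loss by 2050
km2 = hectares * 0.01             # 1 hectare = 0.01 square kilometers
sq_miles = km2 * 0.386102         # 1 km2 ~= 0.386102 square miles

print(f"{sq_miles:,.0f} square miles")            # ~1,115,800, matching the report
print("vs. India's roughly 1,269,000 square miles")  # hence "India-sized"

budget_gt = 1000                  # assumed remaining budget for 2C, in Gt CO2
print(f"169 Gt is about {169 / budget_gt:.0%} of that budget")  # ~17%, one-sixth
```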


The study, unlike other recent studies on deforestation, projects that in a business-as-usual scenario, in which the world doesn’t make any effort to reduce deforestation, tropical deforestation will increase, rather than decrease. According to the study, tropical deforestation rates in such a scenario will likely climb steadily in the 2020s and 2030s and then speed up around 2040, “as areas of high forest cover in Latin America that are currently experiencing little deforestation come under greater threat.”


The study does point to one change in policy that would cut deforestation rates and help alleviate climate change: a price on carbon. According to the report, a price of $20 per ton of carbon would keep 41 gigatons of carbon dioxide from being emitted between 2016 and 2050, and a price of $50 per ton would keep 77 gigatons from being emitted.


“Our analysis corroborates the conclusions of previous studies that reducing tropical deforestation is a sizable and low-cost option for mitigating climate change,” the study’s authors write. “In contrast to previous studies, we project that the amount of emissions that can be avoided at low-cost by reducing tropical deforestation will increase rather than decrease in future decades.”


The study also noted that, if all tropical countries put in place anti-deforestation laws that were “as effective as those in the Brazilian Amazon post-2004,” then 60 gigatons of carbon dioxide would be kept out of the atmosphere. Brazil took action against deforestation in 2004 and 2008, and deforestation rates in the country have fallen from 27,000 square kilometers (about 10,424 square miles) in 2004 to 7,000 square kilometers (about 2,700 square miles) in 2010. According to the Climate Policy Initiative, this slowdown in deforestation rates helped keep about 2.7 billion tons of carbon dioxide in these forests and out of the atmosphere.


Forests can act as major carbon sinks, but for some forests, that role may be changing. A study from this year published in Nature documented the “long-term decline of the Amazon carbon sink,” which the study says could be occurring due to changes in climate. The study also points to increasing tree mortality rate — via deforestation — as another factor in the forests’ decreasing ability to store carbon.


Monday’s study noted that decreasing emissions from deforestation is a relatively cheap way for countries to reduce their overall emissions. Under a system in which wealthy countries paid tropical countries to keep their forests intact, those payments would be a cheaper way to fight climate change than some alternatives.


“Conserving tropical forests is a bargain,” CGD research fellow and report co-author Jonah Busch said in a statement. “Reducing emissions from tropical deforestation costs about a fifth as much as reducing emissions in the European Union.”


Other studies have warned of the danger the world is in if countries don’t curb rates of deforestation and forest degradation. A study published this week in Science warned that, without policy changes, the world’s forests will become increasingly broken into unconnected patches — a fragmentation that will endanger the species that live in the forests.


“I fear a global simplification of the world’s most complex forests,” Simon Lewis, lead author of the study and a tropical forest expert at the University of Leeds, said in a statement. “Deforestation, logging and road building all create fragmented patches of forest. However, as the climate rapidly changes the plants and animals living in the rainforest will need to move to continue to live within their ecological tolerances. How will they move? This is a recipe for the mass extinction of tropical forest species this century.”



Tags

Deforestation, Forest

The post Humans Are Set To Wipe An India-Sized Chunk Of Forest Off The Earth By 2050 appeared first on ThinkProgress.

Published on August 24, 2015 10:23
