Joseph J. Romm's Blog
August 27, 2015
California Farms Raking In Cash Despite Drought
California is an agricultural powerhouse, supplying the majority of the nation’s fruits, vegetables, and nuts.
California also happens to be in the midst of the fourth year of a historic drought — the most severe in recorded history — which is a problem for all that agricultural production, because growing crops — or raising livestock, or tending to nurseries — requires water, and large-scale agriculture, like the kind in California, requires a lot of water.
[Groundwater] is really helping us today, but it’s shifting the cost and the burden to others
Numerous models predicted that California’s agricultural sector would be hit hard by the drought. But emerging data shows that California’s farmers have been more resilient than expected — according to the first comprehensive analysis of the actual impacts of the drought published Wednesday by the Pacific Institute, California’s agricultural sector experienced record sales in 2013 and 2014.
That economic boom comes at a cost to the environment, however, as farmers drill deeper into the ground to tap groundwater, raising questions about who will eventually bear the costs of depleted groundwater resources and damaged infrastructure.
“We think that looking at drought impacts can provide important learning opportunities,” Heather Cooley, co-director of the Pacific Institute’s Water Program, told ThinkProgress. “It can help us prepare for times when water is constrained, and we think it’s important to better understand what the actual impacts are.”
To understand the actual impacts, researchers at the Pacific Institute — a California-based nonpartisan research institute — looked at emerging data from the USDA and the National Agricultural Statistics Service (NASS), as well as employment data from the Employment Development Department. Because the study relied on whatever data was publicly available at the time, it focused only on crop revenue — not livestock, dairy, nurseries, or greenhouses — and analyzed that data on a state-wide scale, not at the county level.
Cooley admits that those restrictions limit the completeness of the picture — especially in counties that have seen a lot of farmers letting their fields go fallow, which can significantly impact availability of farm worker jobs in those areas. A recent study from UC Davis found that more than 21,000 people have lost jobs due to the drought, and the majority of those job losses come from the agricultural sector.
Still, Cooley said that analyzing statewide data still provides an important large-scale look at how California’s agricultural sector is faring through years of water scarcity.
“This is really the first assessment to look at the actual data on the impacts,” she said. “This is a piece of the puzzle. It’s an important piece, an estimated $33 billion part of the agricultural economy.”
CREDIT: Pacific Institute
According to USDA and NASS data, 2014 was the year with the lowest harvested acreage for field crops in California over the past 15 years. But despite losing acreage, California farmers, on a statewide level, haven’t been losing money — both 2013 and 2014 were record years for revenue, with 2013 reaching a record high of $34 billion.
There are a few reasons for the surge in revenue despite a drop in available surface water and crop acreage. The first is that for the past decade or so, California farmers have increasingly been making the switch from low-value crops — like alfalfa, cotton, and rice — to high-value crops, like almonds. Although the almond has been the source of much environmental rancor — an illustration of California agriculture’s water lust, according to some — it’s actually a pretty smart crop for farmers to grow, in part because it’s worth so much more than the water that goes into it. The same cannot be said for alfalfa, much of which is grown for livestock and dairy feed or shipped overseas.
Farmers have also been adopting better efficiency measures when it comes to water use, like switching from flood irrigation to drip irrigation. According to Cooley, in 2010 — the most recent year for which there is data — drip irrigation was just beginning to overtake flood irrigation across the state. Today, she wagers, most farmers are probably using drip irrigation. Switching to drip irrigation can help farmers use more precise amounts of water for their crops, as opposed to simply flooding the field, which both demands more water upfront and comes with greater risk of agricultural runoff.
But switching from flood to drip isn’t necessarily a panacea. For starters, flood irrigation requires more water initially, but it also allows unused water to flow back into rivers and streams and recharge underground aquifers. And farmers who switch to drip irrigation might actually end up using more water than farmers who stick with flood irrigation, simply because drip irrigation is so much more efficient — efficient enough to boost yields, encouraging farmers to grow more crops and, in total, use more water.
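This rebound effect is easier to see with numbers. The figures below are purely hypothetical, chosen only to illustrate the arithmetic, not drawn from the Pacific Institute analysis:

```python
# Illustrative sketch of the irrigation "rebound" effect described above.
# All numbers are hypothetical, chosen only to show the arithmetic.

flood_water_per_acre = 4.0   # acre-feet applied per acre under flood irrigation
drip_water_per_acre = 3.0    # drip applies less water per acre...
flood_acres = 100
drip_acres = 140             # ...but higher efficiency and yields encourage planting more acres

flood_total = flood_water_per_acre * flood_acres   # 400 acre-feet
drip_total = drip_water_per_acre * drip_acres      # 420 acre-feet

# Per-acre use falls, yet total water use rises.
print(flood_total, drip_total)
```

The point of the sketch is that efficiency gains at the field level don't automatically translate into water savings at the basin level if they also expand production.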
The San Joaquin Valley is sinking faster than ever before — 13 inches in just eight months
Another big reason California farmers have been able to weather the drought so successfully? Even though surface water — the kind of water pumped in from reservoirs and rivers — has been scarce, many have been able to make up for shortages by pumping groundwater — natural underground aquifers that fill with rainwater over centuries and millennia. Right now, California doesn’t actively regulate the use of groundwater; though the state recently passed a bill to regulate the extraction of groundwater, its effects won’t be felt until 2040. In an average year, the state gets about 30 to 40 percent of its water supply from groundwater. In 2014, according to a UC Davis report, farmers replaced some 75 percent of their surface water resources with groundwater.
“To some degree, it’s okay to rely on groundwater during a drought; that’s one of its major advantages,” Cooley said. “The problem is we are pumping very heavily; we are massively over-drafting groundwater in parts of the state.”
Groundwater overdraft can lead to a number of problems, Cooley said, because it prioritizes the short-term over the long-term. The aquifers that California is tapping into in order to bolster its water supply took thousands of years to fill — and as surface aquifers have become depleted by years of drought and groundwater pumping, farmers and water districts are drilling deeper and deeper in order to reach water. That has compromised the integrity of California’s land. Last week, NASA released a study showing that the San Joaquin Valley is sinking faster than ever before — 13 inches in just eight months. That rapid sinking can cause roads to buckle or bridges to crack. And as pumping groundwater dries and damages the soil, those aquifers may never be able to hold as much water as they once did.
“This is really helping us today, but it’s shifting the cost and the burden to others,” Cooley said. “There are disadvantaged communities in the Central Valley whose wells are going dry. There are going to need to be investments made in order to repair this infrastructure, and while better groundwater management is on the horizon, it’s not for another 25 years.”
Cooley said that while she was surprised to see how well farmers had done, considering how dire the water situation in the state has been, it’s important to think about maintaining healthy groundwater supplies now, so that those resources aren’t completely gone by the time another drought rolls around.
“Thinking about the future of California water, the climate models suggest it’s going to be more variable — we’re going to have some wetter years and some drier years, but they all agree it’s going to be warmer,” Cooley said. “We need to be working more quickly to improve groundwater management in California, because we will undoubtedly have another drought in [the next] 25 years.”
Tags
Agriculture, California, Climate Change, Drought
The post California Farms Raking In Cash Despite Drought appeared first on ThinkProgress.
New Orleans’ Greatest Threat Is Climate Change Plus The ‘Loop Current’ Plus A Future Katrina
Future Katrinas will become more and more devastating to New Orleans and the entire Gulf of Mexico. If we don’t tackle climate change ASAP, it is hard to see how New Orleans could survive the century.
While most stories making this point tend to focus on sea level rise, I’m going to look at the role of the “Loop Current” — and why Hurricanes Katrina and Gustav weren’t as strong, and hence as devastating, at landfall as they could have been.
The key point: All things being equal, if a storm taking the same track as Katrina (or Gustav) occurred in 2050, then, rather than weakening before making landfall, it would probably strengthen considerably, creating far more havoc. To understand why, let’s first answer the question: How did Katrina turn into a powerful Category Five hurricane so rapidly?
The National Climatic Data Center 2006 report on Katrina begins its explanation by noting that the sea surface temperatures (SSTs) in the Gulf of Mexico during the last week in August 2005 “were one to two degrees Celsius above normal, and the warm temperatures extended to a considerable depth through the upper ocean layer.”
The report continues, “Also, Katrina crossed the ‘loop current’ (belt of even warmer water), during which time explosive intensification occurred. The temperature of the ocean surface is a critical element in the formation and strength of hurricanes.” The Loop Current is “a large eddy of 200-foot-deep warmer water that often breaks off from the Gulf Stream and floats around the Gulf.”
One of the ways hurricanes are weakened is the upwelling of colder, deeper water caused by the hurricane’s own violent churning action. But if the deeper water is also warm, the upwelling doesn’t weaken the hurricane and can even continue to intensify it. When a Gulf hurricane passes over the Loop Current or one of its eddies, it spins up. When it moves off the Loop, it spins down.
Katrina doubled in size as it passed over the Loop Current. And it “evolved from a Category 3 hurricane to a Category 5 hurricane in just nine hours by converting heat from the Loop Current into energy,” as one analysis explained. Once it left the Loop and passed over cooler water on its way to landfall, it dropped back to a Cat 3.
You can see all that in this CU-Boulder graph of Katrina’s maximum wind speed as it goes on and off the Loop (where sea surface height is a proxy for sea surface temperature):
Katrina rapidly intensifies as it traverses the Loop current, then weakens as it crosses a cool pool of water on the way to landfall.
What precisely would happen if a hurricane was ever able to ride warm Gulf water all the way to landfall? We have some idea because that appears to have happened once in the relatively recent past:
An example of how deep warm water, including the Loop Current, can allow a hurricane to strengthen, if other conditions are also favorable, is Hurricane Camille, which made landfall on the Mississippi Gulf Coast in August of 1969. Camille formed in the deep warm waters of the Caribbean, which enabled it to rapidly intensify into a Category 3 hurricane in one day. It rounded the western tip of Cuba, and its path took it directly over the Loop Current, all the way north towards the coast, during which time the rapid intensification continued. Camille became a Category 5 hurricane, with an intensity rarely seen, and extremely high winds that were maintained until landfall (190 mph / 305 km/h sustained winds were estimated to have occurred in a very small area to the right of the eye).
That of course was pure happenstance — bad luck. By the time of Katrina, global warming was certainly one part of the reason the waters of the Gulf in August 2005 were warmer than normal — though transiting the Loop Current for so long was still happenstance.
But by mid-century, the whole Gulf of Mexico in the summer is going to be much, much warmer, thanks not to happenstance but to human emissions. Global warming heats both the sea surface and the deep water, thus creating ideal conditions for a hurricane to survive and thrive in its long journey from tropical depression to Category Four or Five superstorm.
Here’s one simulation of the region in 2050 by Oak Ridge National Laboratory, assuming no serious effort is made to reverse current emissions trends.
This suggests that future Katrinas will spin up into even stronger storms and be much less likely to weaken before they hit the shore. I wrote about this in my 2006 book, Hell and High Water. At the same time, the inland United States will heat up at an even faster rate, so the Mississippi River will not be such a cool stream of water pouring into the Gulf. Also, as the sea level rises, the protective outer delta of the Mississippi River will continue to disappear and storm surges will penetrate deeper inland. Hurricanes weaken rapidly over land. Even one foot of shallow delta water can dramatically reduce this weakening effect, allowing hurricanes to reach deeper inland with their destructive force.
As of the end of August 2015, no major hurricane (Category Three or stronger) has made U.S. landfall in over nine years, which is a record. Two things are worth noting about that. First, Sandy was not a major hurricane at landfall; indeed, it was close to not even being a Category 1. Yet this warming-worsened storm was the second costliest storm in U.S. history (after Katrina) — and the “largest hurricane in Atlantic history measured by diameter of gale force winds (1,040 miles).” So a storm doesn’t need to be a Category Three to cause devastation.
Second, hurricane researchers who have studied the matter have concluded the dry spell in major landfalling hurricanes is mostly a matter of luck. “Lucky break kept major hurricanes offshore since 2005,” is how the American Geophysical Union summed up one 2015 study it published. “It seems to be an accident of geography, random good luck,” explained lead author, NASA’s Timothy Hall, in NASA’s news release. “The last nine hurricane seasons were not weak — storms just didn’t hit the U.S.” And Colorado State University meteorologist Phil Klotzbach, who was not part of the study, said: “I think that there has been a significant ‘luck’ component involved.”
If you want an idea of just how lucky New Orleans has been, consider Hurricane Gustav, “the second most destructive hurricane of the 2008 Atlantic hurricane season,” which caused some $6.6 billion in damages and killed over 150 people. It peaked in intensity at Category Four but “dropped just below the Category 3 threshold to Category 2 by landfall.”
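For readers keeping track of the category labels in this piece: the Saffir-Simpson scale bins hurricanes by maximum sustained wind speed. A minimal lookup, using the scale's standard thresholds in miles per hour, looks like this:

```python
def saffir_simpson_category(sustained_mph: float) -> int:
    """Map maximum sustained wind (mph) to a Saffir-Simpson category.

    Returns 0 for winds below hurricane strength (under 74 mph).
    """
    for floor, category in [(157, 5), (130, 4), (111, 3), (96, 2), (74, 1)]:
        if sustained_mph >= floor:
            return category
    return 0

print(saffir_simpson_category(190))  # Camille's estimated landfall winds -> 5
print(saffir_simpson_category(100))  # a Category 2, roughly Gustav at landfall
```

A "major" hurricane, as used in this article, is anything returning 3 or higher.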
Here are some plots of the Gulf’s sea surface temperature (SST) as Gustav would have seen it (via Jeff Masters’ WunderBlog):
Tropical Cyclone Heat Potential (TCHP) for August 28, 2008. Values of TCHP greater than 80 are commonly associated with rapid intensification of hurricanes. The forecast points from the NHC [National Hurricane Center] 5 am Saturday forecast are overlaid. Gustav is currently crossing over a portion of the Loop Current with an extremely high TCHP value of 120. However, Gustav will then cross over a cold eddy, and will miss crossing the warm Loop Current eddy that broke off in July.
Masters added, “Note that this forecast is old, and the newer forecasts bring Gustav much closer to New Orleans.” The same is true of this chart:
Forecast track and sea surface temperature response to the passage of Gustav, as simulated by the GFDL model at 8 am EDT Saturday 8/30/08. Passage of Gustav over the relatively shallow depth of warm water near the coast will allow Gustav to upwell large amounts of cold water from the depths. This will chill the surface waters down by up to 5°C (9°F).
So yes, New Orleans got lucky: the fact that Gustav wasn’t a major hurricane at landfall was just happenstance.
Now imagine the same storm track in the year 2050. A Gustav of 2050 probably doesn’t weaken before landfall, and perhaps even strengthens more. We could easily be talking about a Gustav in 2050 that is a Category Four or even Category Five at landfall, rather than just a strong Category Two.
Katrina was able to ride the Gulf Loop Current and Eddy Vortex closer to the coast than Gustav, but it still smashed into colder water. Again, in 2050, that weakening is going to be a lot less likely to occur, and again, we could see a Category Five at landfall.
One issue remains. Clearly global warming means warmer surface water and warmer deep water. All things being equal, that means future hurricanes that travel the same path are going to stay stronger longer and possibly even intensify where earlier hurricanes had weakened.
What we don’t know for certain is whether, in fact, all things will be equal. Perhaps global warming will create other conditions that change storm paths or weaken hurricanes (by, say, increasing wind shear). The recent literature, however, suggests that Category Four and Five hurricanes have become more common — and will continue to do so as long as we keep warming the Earth and its oceans. A 2013 study found:
The response to a 1°C warming is consistently an increase [in Katrina-level storm surges] by a factor of 2–7…. This increase does not include the additional increasing surge threat from sea level rise.
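One way to read that range is through return periods: an event that becomes k times more frequent has its expected recurrence interval divided by k. The 100-year baseline below is a hypothetical illustration, not a figure from the study:

```python
# Translate the study's "factor of 2-7" frequency increase into return periods.
# The 100-year baseline is hypothetical, chosen only for illustration.
baseline_return_period_years = 100

for factor in (2, 7):
    new_period = baseline_return_period_years / factor
    print(f"{factor}x more frequent -> roughly a {new_period:.0f}-year event")
```

In other words, under this assumed baseline, a surge once expected about once a century would be expected every 14 to 50 years per degree of warming.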
Moreover, serious global warming has been going on for a few decades now, and just in the past ten years we’ve had two major hurricanes ride straight up into the New Orleans area, three if you count Rita. So it makes little sense to hope or plan that future global warming will mean significantly fewer major hurricanes that end up on a path towards the Louisiana coast.
I emailed MIT climatologist Kerry Emanuel, one of the world’s leading experts on hurricanes and how climate change affects them. He is author of a 2013 study, “Downscaling CMIP5 climate models shows increased tropical cyclone activity over the 21st century,” which found that human-caused global warming is projected to intensify future hurricanes. He explained that when he looks at “tropical cyclones in the Gulf of Mexico using seven CMIP5 models, following the technique described in” that paper (for the business as usual case), “I find substantial increases in hurricane risk during the 21st century in six of the seven models.”
We are stuck with a fair amount of warming over the next few decades no matter what we do. But if we don’t reverse emissions trends ASAP, then Category Four and Five storms smashing into the Gulf coast seem likely to become rather common in the second half of this century. Business as usual warming takes us to a hot world where the Gulf is one very warm tub of water.
And that future of supercharged hurricanes will be doubly untenable in the business as usual case, as experts now say we may be looking at seas 4 to 6 feet (or more) higher by 2100, with sea level rise as much as one foot per decade after that.
Preserving the habitability of the Gulf and South Atlantic Coasts this century can only plausibly be achieved if we reverse U.S. and global emissions trends sharply and quickly.
Tags
Climate Change, Hurricane Katrina
The post New Orleans’ Greatest Threat Is Climate Change Plus The ‘Loop Current’ Plus A Future Katrina appeared first on ThinkProgress.
Jindal Writes Letter To Obama Telling Him Not To Talk About Climate During Katrina Anniversary Visit
President Obama is heading to New Orleans Thursday to mark the tenth anniversary of Hurricane Katrina. But there’s one subject Louisiana Gov. Bobby Jindal hopes the president won’t broach in his remarks to city residents: climate change.
Jindal, who’s also running for president on the Republican ticket, sent a letter to Obama Wednesday urging the president not to mention climate change during his trip to Louisiana.
“Although I understand that your emphasis in New Orleans will – rightly – be on economic development, the temptation to stray into climate change politics should be resisted,” Jindal states in the letter. “While you and others may be of the opinion that we can legislate away hurricanes with higher taxes, business regulations and EPA power grabs, that is not a view shared by many Louisianians. I would ask you to respect this important time of remembrance by not inserting the divisive political agenda of liberal environmental activism.”
This week, Jindal writes in the letter, should be solely about mourning those lost in the storm and celebrating the recovery New Orleans has made so far.
But Obama had a different plan for his time in New Orleans. The trip is the second stop in his 11-day climate tour, a trip that started at the Clean Energy Summit in Las Vegas and will end in Alaska. Along the way, Obama is planning on speaking about the country’s need to invest in renewable energy, protect coastal communities, and be part of a global effort to address climate change.
In New Orleans, Obama is planning to discuss the city’s “extraordinary resilience” in the face of the storm, and is also going to address the government’s failings when it came to serving the people of New Orleans. But climate change — and the need to protect communities like New Orleans from future disasters — is also expected to come up.
“One thing that the president will certainly talk about in New Orleans tomorrow is the need for the federal government, and in communities all across the country, to make the kinds of investments in resilience so that our communities can better withstand stronger tornadoes, more violent hurricanes, more widespread wildfires, those kinds of things,” White House Press Secretary Josh Earnest said.
The White House hasn’t yet responded to Jindal’s letter. But the governor, who’s trailing behind his fellow Republican presidential contenders in opinion polls, was clear in his letter that the president shouldn’t stray far from the topic of Louisiana and its response to the storm.
“A lecture on climate change would do nothing to improve upon what we are already doing. Quite the opposite; it would distract from the losses we have suffered, diminish the restoration efforts we have made, and overshadow the miracle that has been the Louisiana comeback,” Jindal writes. “Partisan politics from Washington, D.C. are unwelcome in Louisiana at the best of times. This week it would be met with nothing but derision.”
Jindal’s long been skeptical of government efforts to address climate change and wary of the scientific consensus that humans are the main cause of climate change. Jindal said in 2014 that though he does think humans play a role in climate change, “the real question is how much.” He’s also called climate change a “Trojan horse” used by the federal government to increase regulation.
Despite Jindal’s views of climate change and how it relates to Katrina, however, climate change is already contributing to extreme weather events around the world. And as oceans heat up, the warmer waters are expected to intensify hurricanes.
Tags
Bobby Jindal, Climate Change, Hurricane Katrina
The post Jindal Writes Letter To Obama Telling Him Not To Talk About Climate During Katrina Anniversary Visit appeared first on ThinkProgress.
New Orleans Parade To Highlight How Climate Change And Inequality Intersect
The tenth anniversary of Hurricane Katrina will — as in years past — be commemorated by a second line parade to honor the lives lost in the storm and remember that many are still struggling to rebuild. But this year, activists and New Orleans residents want to make the parade the biggest the city’s ever seen, and they want to connect the disaster to the outsized threat poor communities and communities of color still face from pollution and climate change.
We know from Hurricane Katrina that it is the poor and people of color who get left behind
Rev. Lennox Yearwood, president and CEO of the Hip Hop Caucus, a group that seeks to organize young people around political causes, is helping highlight the issues of climate and environmental justice in discussions about Katrina. The second line — a parade that, in New Orleans, traditionally follows a main parade — will commemorate the tenth anniversary of Katrina on August 29 and will feature environmental leaders such as 350.org’s Bill McKibben and Sierra Club’s Michael Brune.
After the second line, Yearwood is setting out on the People’s Climate Music bus tour, which aims to spread the message of climate justice — the idea that the poor and communities of color are particularly at risk from climate change — to cities around the country. Yearwood said that the discussion of environmental justice — the recognition that pollution and energy development often disproportionately affect the poor and communities of color — has been wrapped up in discussions of Hurricane Katrina for several years. Louisiana is home to “cancer alley,” a corridor dotted with industrial plants that has a high rate of cancer, and it was also impacted by the BP oil spill. So Louisiana residents know about the impacts caused by environmental pollution.
But this year, Yearwood wants to make the link specifically to climate change, to “draw attention to the injustice inherent in the fossil fuel economy” and call on U.S. lawmakers to embrace clean energy.
“We know from Hurricane Katrina that it is the poor and people of color who get left behind,” he told ThinkProgress. “Climate change is a life and death issue, not only for New Orleans but for communities across the U.S.”
Climate change — and the pollution that causes it — doesn’t affect all regions or communities equally. Globally, climate change, along with the sea level rise, extreme weather, and unpredictable precipitation that come with it, is expected to hit poor countries the hardest. Overall, poor people are at higher risk from climate impacts for various reasons: they might have a manual labor job that keeps them working outdoors for long hours of the day, putting them at higher risk of heat-related complications as temperatures rise. Or they may not have the money for air conditioning, or the resources to relocate when a major storm comes through town. In the U.S., non-white Americans on average breathe more polluted air than white Americans — according to one study, people of color in the country breathe air with 38 percent more nitrogen dioxide than white people.
This disparity is what Yearwood wants to highlight. He said the anniversary of Katrina is the perfect time to do so — research has shown that African Americans were more heavily impacted by Hurricane Katrina, and many Louisiana residents said after the storm that they thought race was an issue in the response to the disaster.
“Decisions made centuries ago exerted their influence in the lives and deaths of victims of Hurricane Katrina,” a 2008 study from the Joint Center for Political and Economic Studies concludes. “A mind-numbing parade of zoning and land-use choices, highway and seaway budgets, and social and political desensitization helped to bring this nation to the flooded rooftops of the Lower Ninth Ward.”
CREDIT: JOINT CENTER FOR POLITICAL AND ECONOMIC STUDIES
Now, especially as the Black Lives Matter movement gains steam, climate and environmental justice issues should be at the forefront of people’s minds, Yearwood said.
“The epitome of black lives matter is Katrina,” Yearwood said. The outsized impact that Katrina had on African Americans didn’t happen in the time of Twitter or hashtags, he said, but it still resonates with the racial justice movement today.
Aaron Mair, president of the Sierra Club, agrees. He said on a press call Tuesday that, though it’s often said that nature doesn’t discriminate, it’s true that communities of color are often on the front lines of climate change.
“Yes it is seen in the police, but it’s also seen in climate,” he said.
That’s why, Yearwood said, the People’s Climate Music bus tour’s first stop will be Ferguson, Missouri — the city where, in 2014, 18-year-old Michael Brown was shot and killed by a white police officer, an event that spurred protests in the city and throughout the country. The tour will also be hitting other cities with histories of racial unrest, including Baltimore and Chicago.
Along with bringing attention to the climate justice movement, Yearwood is using the tour to call on world leaders to adopt a strong climate agreement during the U.N.’s Paris climate talks this winter, and on U.S. officials to uphold the president’s Clean Power Plan and implement pro-climate policies. But most of all, he wants the tour to bring more people of all races into the climate movement — an attempt to diversify a movement that’s historically been largely white.
“We want to expand the frame and popularize the climate movement using culture,” he said.
Tags
Black Lives Matter, Environmental Justice, Hurricane Katrina
The post New Orleans Parade To Highlight How Climate Change And Inequality Intersect appeared first on ThinkProgress.
August 26, 2015
New Jersey Settles For $8.6 Billion Less Than It Said It Needs To Clean Up Exxon Pollution
A New Jersey Superior Court judge on Monday approved a $225 million settlement between Exxon and the state, a figure $8.68 billion less than the $8.9 billion the state had originally requested.
Environmentalists are calling foul on the agreement, which they say falls dramatically short of the amount needed to clean up and restore 1,500 polluted acres of wetlands and surrounding natural environment in northern New Jersey, where Exxon ran petrochemical operations for decades. Exxon was found responsible for the pollution in 2008.
“It’s certainly really disappointing and a little hard to understand from the outside,” Margaret Brown, an attorney with the Natural Resources Defense Council (NRDC) told ThinkProgress. Brown said that up until the closing arguments last November, the state was saying that it needed $2.5 billion to clean up the site and was asking for $8.9 billion in costs for clean up and restoration. NRDC and a group of environmental advocates applied to be named as intervenors in the case in June, but the judge turned them down.
“Once Exxon and the state agreed to the settlement, there was no one arguing the other side,” Brown said.
Calling it a “landmark settlement,” the acting attorney general’s office released a statement saying the decision “represents the single largest environmental settlement with a corporate defendant in state history and, following resolution of any appeals, ends more than a decade of aggressive and costly litigation and negotiations by the state spanning multiple administrations.”
The environmental degradation in the area is severe. According to court documents, wetlands in the area were “mostly covered with a tar of petroleum products or filled with other hazardous constituents and debris,” and there were 45 acres described as “sludge lagoons.” A state report found that “many of these dredge fill areas still look and smell like petroleum waste dumps,” the New York Times reported. “Spilled materials from pipeline ruptures, tank failures or overflows, and explosions have resulted in widespread groundwater, soil and sediment contamination,” the report continues.
The final settlement was a long time in the making. The state first filed its lawsuit in 2004, for contamination that happened over decades. That contamination includes pollution that occurred before 1976, when the state passed the so-called Spill Act, which holds polluters accountable. In a Frequently Asked Questions document provided by the attorney general’s office, the state said Exxon argued “vigorously” that discharges prior to the passage of the Spill Act should not be covered. The state suggested that continuing to pursue an award from the courts could result in the court agreeing that Exxon did not have to cover those damages.
According to the New York Times, the Christie administration petitioned the court twice to hold off on issuing an award, citing ongoing settlement talks. In February, the two litigants submitted the settlement.
In his decision, Judge Michael Hogan wrote, “For seven years, Exxon repeatedly responded to the State’s olive branches with only token offers. An aggressive trial strategy is often the only way to bring reluctant parties to the table, and the state employed this tactic with success. The February 2015 agreement was not made on a whim, but was the end product of lengthy negotiations and zealous advocacy at trial.”
Only $50 million of the settlement is set aside for natural resources, Brown said. Another $50 million will go to outside counsel, and the rest will fall to New Jersey’s general fund.
“To us it’s just simply unacceptable to accept such a small amount, and it’s just really a loss for the people of New Jersey and the environment,” she said.
Tags
Chris ChristieExxonNew Jersey
The post New Jersey Settles For $8.6 Billion Less Than It Said It Needs To Clean Up Exxon Pollution appeared first on ThinkProgress.
New Jersey Is Letting Exxon Pay $225 Million For $8.9 Billion Worth Of Pollution
A state Superior Court judge on Monday approved a settlement between Exxon and New Jersey that was $8.68 billion less than the $8.9 billion the state had originally requested.
Environmentalists are crying foul over the agreement, which they say falls dramatically short of the amount needed to clean up and restore 1,500 polluted acres of wetlands and surrounding natural environment in northern New Jersey, where Exxon ran petrochemical operations for decades. Exxon was found responsible for the pollution in 2008.
Tags
Chris ChristieExxonNew Jersey
The post New Jersey Is Letting Exxon Pay $225 Million For $8.9 Billion Worth Of Pollution appeared first on ThinkProgress.
Let’s See What Happens When This Group Of Scientists Retests Studies That Contradict Climate Science
The scientific consensus behind man-made global warming is overwhelming: multiple studies have noted a 97 percent consensus among climate scientists that the Earth is warming and human activities are primarily responsible. Scientists are as sure that global warming is real — and driven by human activity — as they are that smoking cigarettes leads to lung cancer.
But what if all of those scientists are wrong? What if the tiny sliver of scientists that don’t believe global warming is happening, or that human activities are causing it — that two to three percent of climate contrarians — are right?
That’s the hypothetical question that a new study, authored by Rasmus Benestad, Dana Nuccitelli, Stephan Lewandowsky, Katharine Hayhoe, Hans Olav Hygen, Rob van Dorland, and John Cook, sought to answer. Published last week in the journal Theoretical and Applied Climatology, the study examined 38 recent examples of contrarian climate research — published research disputing that climate change is happening or that human activity is primarily responsible — and tried to replicate the results of those studies. The studies weren’t selected randomly: according to lead author Rasmus Benestad, they were highly visible contrarian studies that had all arrived at a different conclusion than consensus climate studies. The question the researchers wanted to answer was: why?
“Our selection suited this purpose as it would be harder to spot flaws in papers following the mainstream ideas. The chance of finding errors among the outliers is higher than from more mainstream papers,” Benestad wrote at RealClimate. “Our hypothesis was that the chosen contrarian paper was valid, and our approach was to try to falsify this hypothesis by repeating the work with a critical eye.”
It didn’t go well for the contrarian studies.
The most common mistake shared by the contrarian studies was cherry picking, in which studies ignored data or contextual information that did not support the study’s ultimate conclusions. In a piece for the Guardian, study co-author Dana Nuccitelli cited one particular contrarian study that supported the idea that moon and solar cycles affect the Earth’s climate. When the group tried to replicate that study’s findings for the paper, they found that the study’s model only worked for the particular 4,000-year stretch of data that the study looked at.
“However, for the 6,000 years’ worth of earlier data they threw out, their model couldn’t reproduce the temperature changes,” Nuccitelli wrote. “The authors argued that their model could be used to forecast future climate changes, but there’s no reason to trust a model forecast if it can’t accurately reproduce the past.”
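To make that failure mode concrete, here is a toy numerical sketch — synthetic data invented for illustration, not the study’s actual record. A cycle model is fitted only to the most recent 4,000 “years” of a 10,000-year series, then checked against the 6,000 years that were left out; it explains the chosen window but not the rest.

```python
import math
import random

random.seed(0)
P = 1000.0                # illustrative cycle period in "years" (not from the study)
w = 2 * math.pi / P

# Synthetic 10,000-year "temperature" record: the most recent 4,000 years
# happen to follow a cycle; the earlier 6,000 years are just noise.
years = list(range(10000))
temps = [(math.sin(w * t) + random.gauss(0, 0.1)) if t >= 6000
         else random.gauss(0, 0.5)
         for t in years]

def fit_cycle(ts, ys):
    """Least-squares fit of y = a*sin(w*t) + b*cos(w*t) (linear in a, b)."""
    sss = sum(math.sin(w * t) ** 2 for t in ts)
    scc = sum(math.cos(w * t) ** 2 for t in ts)
    ssc = sum(math.sin(w * t) * math.cos(w * t) for t in ts)
    sy_s = sum(y * math.sin(w * t) for t, y in zip(ts, ys))
    sy_c = sum(y * math.cos(w * t) for t, y in zip(ts, ys))
    det = sss * scc - ssc ** 2
    return (sy_s * scc - sy_c * ssc) / det, (sy_c * sss - sy_s * ssc) / det

def rmse(ts, ys, a, b):
    return math.sqrt(sum((y - (a * math.sin(w * t) + b * math.cos(w * t))) ** 2
                         for t, y in zip(ts, ys)) / len(ts))

# "Cherry-picked" fit: calibrate only on the recent 4,000 years.
rt, ry = zip(*[(t, y) for t, y in zip(years, temps) if t >= 6000])
ot, oy = zip(*[(t, y) for t, y in zip(years, temps) if t < 6000])
a, b = fit_cycle(rt, ry)

in_err = rmse(rt, ry, a, b)    # small: the model matches the chosen window
out_err = rmse(ot, oy, a, b)   # large: it fails on the data left out
print(f"in-sample RMSE: {in_err:.2f}, out-of-sample RMSE: {out_err:.2f}")
```

The gap between the two errors is the point Nuccitelli makes: a model that cannot reproduce the withheld past gives no reason to trust its forecasts.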
The researchers also found that a number of the contrarian studies simply ignored the laws of physics. For example, in 2007 and 2010 papers, Ferenc Miskolczi argued that the greenhouse effect had become saturated, a theory that had been disproved in the early 1900s.
“As we note in the supplementary material to our paper, Miskolczi left out some important known physics in order to revive this century-old myth,” Nuccitelli wrote.
In other cases, the authors found, researchers would include extra parameters not based in the laws of physics to make a model fit their conclusion.
“Good modeling will constrain the possible values of the parameters being used so that they reflect known physics, but bad ‘curve fitting’ doesn’t limit itself to physical realities,” Nuccitelli said.
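A toy sketch of that same pitfall, with made-up numbers: an interpolating polynomial has enough free parameters to pass exactly through every noisy observation (zero residual), yet its forecast outside the fitted range is wild compared with a parameter-constrained stand-in model (here, simply the sample mean).

```python
import random

random.seed(1)

xs = list(range(8))
ys = [random.gauss(0, 0.2) for _ in xs]   # "observations": noise around zero

def lagrange(x, xs, ys):
    """Evaluate the degree-7 polynomial that passes exactly through all 8 points."""
    total = 0.0
    for i, (xi, yi) in enumerate(zip(xs, ys)):
        term = yi
        for j, xj in enumerate(xs):
            if j != i:
                term *= (x - xj) / (xi - xj)
        total += term
    return total

# A constrained model (the mean) versus the unconstrained curve fit,
# both asked to "forecast" well outside the observed range.
constrained = sum(ys) / len(ys)
curve_forecast = lagrange(12.0, xs, ys)
print("unconstrained curve-fit forecast:", curve_forecast)
print("constrained forecast:", constrained)
```

The unconstrained fit’s extra parameters buy a perfect match to the data at the cost of forecasts that swing far outside anything physically plausible — Nuccitelli’s “bad curve fitting” in miniature.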
The authors note that these errors aren’t necessarily only found in contrarian papers, and they aren’t necessarily malicious. In their discussion, they offer a suite of possible explanations for the mistakes. Many authors of the contrarian studies were relatively new to climate science, and therefore may have been unaware of important context or data. Many of the papers were also published in journals with audiences that don’t necessarily seek out climate science, and therefore peer review might have been lacking. And some of the researchers had published similar studies, all omitting important information.
These same errors and oversights, the authors allow, could be present in consensus climate studies. But those errors don’t contribute to a gap between public understanding and scientific consensus on the issue, the researchers argued. The mistakes also seemed to be particularly prevalent in contrarian studies, Nuccitelli wrote.
In the end, the researchers stressed the overall importance of reproducibility in science, both for consensus views and contrarian ones.
“Science is never settled, and both the scientific consensus and alternative hypotheses should be subject to ongoing questioning, especially in the presence of new evidence and insights,” the study concluded. “True and universal answers should, in principle, be replicated independently, especially if they have been published in the peer-reviewed scientific literature.”
Tags
Climate ChangeClimate Science
The post Let’s See What Happens When This Group Of Scientists Retests Studies That Contradict Climate Science appeared first on ThinkProgress.
After 4 States Approved A Big Utility Merger, DC Shocked Everyone By Denying It Over Clean Energy
In a move that shocked both industry observers and grassroots clean energy advocates, the Public Service Commission of Washington, D.C., unanimously rejected a proposed merger between Exelon and Pepco. Together, they would have created the nation’s largest utility.
The commission wrote in its official summary, released Tuesday, that Exelon and Pepco “have not met their burden of persuading this Commission that the Proposed Merger is in the public interest.”
Why? The summary listed several points, but a central conflict was over how renewable energy would fare.
“We are also concerned about the inherent conflict of interest that might inhibit our local distribution company from moving forward to embrace a cleaner and greener environment,” the Commission wrote in its summary.
“Therefore, the application is denied,” the summary concluded.
The merger, valued at $6.4 billion, would have resulted in Chicago-based, nuclear-heavy utility giant Exelon buying Pepco Holdings, which provides power to Washington, D.C. and parts of Virginia, Maryland, Delaware, and New Jersey.
Exelon and Pepco’s joint statement said they were “disappointed” with the decision, believing “it fails to recognize the benefits of the merger to the District of Columbia and its residents and businesses.” The utilities made the case that a merger would benefit customers and enhance reliability.
“We will review our options with respect to this decision and will respond once that process is complete,” the statement concluded.
“The public policy of the District is that the local electric company should focus solely on providing safe, reliable and affordable distribution service to District residences, businesses and institutions,” Betty Ann Kane, the commission chairman, said in her statement announcing the decision. “The evidence in the record is that sale and change in control proposed in the merger would move us in the opposite direction.”
Exelon and Pepco have 30 days to ask the Public Service Commission (PSC) to reconsider the denial, and it is likely they will do so, according to Gabe Elsner, executive director of the Energy & Policy Institute in Washington, D.C.
“Exelon has a long history of using the company’s political influence to restrict renewable energy policies”
The normally boring debate about a utility merger became a pitched battle between residents concerned about electricity rates and renewable energy advocates on one side, and the two utilities and their allies on the other. Nuclear plants provide most of Exelon’s power generation, followed by natural gas, oil, hydro, wind, and coal. Yet it is the utility’s past fights against state renewable energy policies that caused so many people to oppose the idea of Exelon purchasing Pepco.
“Exelon has a long history of using the company’s political influence to restrict renewable energy policies,” Elsner told ThinkProgress. “If the D.C. PSC had approved the merger, Exelon would have been empowered to continue its anti-renewable campaign, but the PSC’s rejection of the merger could help ensure that these two states’ renewable energy policies remain in place and continue to support the growth of renewable energy industry in D.C. and Maryland.”
Exelon fought hard against the renewal of the Wind Production Tax Credit at the national level because it would affect the bottom line of its nuclear plants.
Local renewable advocates saw this decision as a line in the sand.
“Utilities that work to impede people’s desire to produce their own clean energy do so at their peril,” said Anya Schoolman, founder and executive director of the nonprofit Community Power Network, who played a central role in the fight to halt the merger and called the PSC’s decision “a great victory for renewable energy.” Her fight for neighborhood solar power access began several years ago, when she and several neighbors found the process of installing solar panels too complicated; she eventually founded an advocacy organization called D.C. Solar United Neighborhoods and was honored at the White House last year as a “Champion of Change.”
“D.C. Solar United Neighborhoods fought hard because of Exelon’s well documented opposition to locally owned renewable energy, their opposition to wind, and their hypocritical track record to oppose policies that support renewable energy in the name of the ‘free market’ while asking for handout after handout for their nuclear business,” she told ThinkProgress. “Fundamentally, we argued, and the D.C. PSC agreed, that there was a conflict of interest between Exelon’s generation business and D.C.’s goals to build a modern, equitable, renewable, clean grid.”
“Utilities that work to impede people’s desire to produce their own clean energy do so at their peril”
The PSC seemed fully aware of the importance of the decision, and how much attention it would receive.
“This proceeding has generated more interest and more active participation by parties and interested persons than any other proceeding in the Commission’s more than a century of operations,” the PSC’s summary stated. “As we have been reminded throughout this process, this decision is one of the most significant decisions that the Commission will ever make.”
D.C.’s laws and statutes obligate the Commission to determine whether the deal was in the public interest — and unlike in some states, one of those factors is “conservation of natural resources and preservation of environmental quality.” That factor figured directly in the PSC’s ultimate decision.
“In this proceeding,” the summary stated, “the Commission must decide who will control the District’s only local electric distribution company at a time when our city’s leadership, at the urging of many residents, has mandated that the District must pursue a cleaner and greener future that includes more renewable energy resources and more distributed generation and at a time when the electric industry is undergoing significant transformation.”
Because Pepco is an energy distribution company and Exelon receives most of its revenue from energy generation, the PSC noted that the merger could create a conflict of interest with adverse consequences for D.C. ratepayers.
“What happened today is good for the long-term health of wind and solar.”
Despite generating no power of its own, Pepco is not exactly a national leader in renewable energy either. A recent study found that the distribution-focused utility was the worst in the country at connecting its residents to solar power. Advocates worried that Pepco becoming a subsidiary of a much larger utility that gets most of its revenue from generation would complicate efforts to generate more power from distributed renewable energy.
Pepco and Exelon have both made the case that the merger would have brought lower electricity rates to the District, but the PSC found little benefit for ratepayers, especially without Pepco representation on Exelon’s executive committee.
District officials, despite several council members’ close relationships with Pepco, had largely opposed the merger. That includes Mayor Muriel Bowser: earlier this year, after the D.C. Attorney General’s office argued in a letter that the benefits of the merger would flow from D.C. ratepayers to shareholders in the Chicago-based company, Bowser said the letter spoke for the District.
The long march to Tuesday’s rejection had seemed almost certain to end in approval, after PSCs in Virginia, New Jersey, Delaware, and finally Maryland signed off on the merger.
“It was a fascinating process, that’s for sure,” Mike Tidwell, director of the Chesapeake Climate Action Network, told ThinkProgress. Tidwell said that Exelon wanted the merger “to prop up its nuclear fleet on the backs of low-carbon generation in the Mid-Atlantic.”
The Maryland Public Service Commission was supposed to be the wrench that stopped the otherwise smoothly running gears of the merger process. Martin O’Malley, while still in office as governor, helped lead the fight against the merger; Tidwell noted that “O’Malley basically said ‘hell no.'” Even after his chosen successor lost to Republican Gov. Larry Hogan, the new administration defied expectations that it would clear the way for the merger, with Hogan staying fairly neutral. The opposition of the Maryland Attorney General, the PSC’s own staff, and environmental groups all suggested that Maryland might actually reject the merger.
Instead, the Maryland PSC voted 3-2 to approve it in May.
“The two commissioners who opposed it wrote a scathing letter to the majority, saying verbatim ‘we find the majority’s decision incomprehensible,'” Tidwell said. But he noted that that narrow decision may have helped to set the stage for D.C. to reject it.
“Most of us were shocked when they voted unanimously” to deny the merger…
Consumer and renewable energy advocates in D.C. did not have high hopes, however, given the clout utilities wield and the revolving-door ties between local officials and Pepco. The decision to deny the merger application shocked the domestic electricity sector, clean energy advocates, and the stakeholders alike.
“Most people felt the best shot of stopping this merger was in Maryland, not D.C., so most of us were shocked when they voted unanimously” to deny the merger, Tidwell said.
“Exelon is likely as stunned to lose today as we were stunned when Exelon won in Maryland,” Tidwell said Tuesday. Exelon and Pepco’s stock prices dropped immediately after the decision, a year and a half after they spiked when the market learned of the proposed merger. “Exelon is a sick and ailing company because they’re overly burdened by a nuclear fleet,” Tidwell said, and it sought the merger as a source of revenue. Other companies may want to buy Pepco; advocates hope that any eventual buyer supports distributed renewable energy adoption and energy efficiency efforts.
Going forward, Exelon and Pepco have a few options, and none of them seem likely to result in a desired merger. They have 30 days to petition the D.C. PSC for a rehearing, but that is unlikely to succeed because the commission made clear that the denial had solid legal footing. They could appeal to the D.C. Court of Appeals, but courts are often reluctant to overturn such decisions. Exelon could also go back to the drawing board and restructure the merger to exclude Pepco’s D.C. territory, but that would be hard to do: the companies would have to seek a new round of approvals from all the other states Pepco serves, and Exelon shareholders would have to approve a new deal, which would likely be less favorable to Exelon.
Local advocates tried to engage Exelon earlier in the process to negotiate better terms for ratepayers and renewable energy access, but Tidwell said that “Exelon was not interested in any kind of substantive dialogue with the opposing entities.” They took one meeting and then barely followed up.
Tidwell said Tuesday that “what happened today is good for the long-term health of wind and solar.”
Tags
DCExelonPepcoRenewable EnergyUtilitiesWashington
The post After 4 States Approved A Big Utility Merger, DC Shocked Everyone By Denying It Over Clean Energy appeared first on ThinkProgress.
The Inside Story Of How D.C. Blew Up A $6.8 Billion Utility Merger Over Renewable Energy
Tags
DC, Exelon, Pepco, Renewable Energy, Utilities, Washington
The post The Inside Story Of How D.C. Blew Up A $6.8 Billion Utility Merger Over Renewable Energy appeared first on ThinkProgress.
Russia Committed To Emission Reducing Treaty But Used Deceptive Practices To Increase Them
A plan to decrease greenhouse gas emissions backfired after what some say was “criminal” activity in Russia and Ukraine flooded the carbon credit market, resulting in some 600 million metric tons of additional emissions.
A new study, published Monday in Nature, found that 80 percent of the projects certified under the United Nations Framework Convention on Climate Change’s Joint Implementation (JI) scheme, part of the Kyoto Protocol, did not actually reduce emissions. Projects in many cases would have happened anyway — with or without Kyoto — and some were even fake, Vladyslav Zhezherin, one of the report’s authors, told the Guardian.
The Kyoto Protocol was signed in 1997 and committed countries to emissions cuts. Under the program, countries that are party to the protocol can issue emissions reduction credits for projects that reduce carbon emissions or increase carbon sinks “additional to what would otherwise have occurred.”
In one example, operators of three chemical plants actually increased waste gas emissions, only to turn around and earn credits by reducing them again. “If you produced more greenhouse gases only to destroy them and generate more carbon credits, you would essentially be damaging the climate for profit,” said Lambert Schneider, a co-author of the study, which was put out by the Stockholm Environment Institute (SEI), an international nonprofit research organization.
But the system was broken from the beginning, said Anja Kollmuss, an author of the study and an associate at SEI. In simple terms, countries set emissions targets and then certified credits from projects — like wind farms or reforestation — that reduced emissions. The countries could buy the credits themselves (retire them) or sell them to other countries or companies that needed to meet reduction goals.
But there were two big problems with the system, Kollmuss said. For one, there was no international oversight for certifying projects. Second, and perhaps more importantly, Russia and Ukraine had overall emissions targets that were greater than their emissions — resulting in billions of excess, valueless credits.
“They received literally billions of spare Kyoto Protocol allowances,” Kollmuss told ThinkProgress. The excess allowances led to projects being certified that would have happened anyway — such as projects that were started in 2002 and certified in 2012, Kollmuss said. “Not all the projects were bad, just the overwhelming majority of them,” she said.
As with any market, when supply overwhelms demand, prices collapse. Suddenly, credits went from €13 (about $14.80) to less than €0.50 (about 57 cents), which hurt projects that had been financed on the assumption that the market would hold.
The study’s authors hope the analysis will help inform negotiators during the UNFCCC conference in Paris in December.
Carbon credit markets can work, Kollmuss said. (In fact, nine New England and Mid-Atlantic states participate in the Regional Greenhouse Gas Initiative, an emissions credit program that has reduced emissions and lowered electricity bills.)
“Do we know how to design a market so it has integrity? Yes, we do,” Kollmuss said. But it comes down to political willingness, and will only work if all party countries have ambitious targets; the targets are calculated in multi-year terms; there are clear accounting rules; and there is international oversight, she said.
“What we know of this future climate treaty if it is passed in Paris, it is much more of a bottom-up treaty. Countries can decide themselves what they are going to do and how they are going to do it,” Kollmuss said.
Tags
Climate Change, United Nations
The post Russia Committed To Emission Reducing Treaty But Used Deceptive Practices To Increase Them appeared first on ThinkProgress.