Joseph J. Romm's Blog, page 158

April 10, 2015

The State Department ‘Secretly Approved’ Two Pipeline Projects, Lawsuit Alleges


CREDIT: shutterstock



Tribal and environmental groups are suing the State Department for allegedly “secretly” approving two pipeline projects last year, approvals that the groups say violated federal environmental laws.


The lawsuit was filed last year by Minnesota’s White Earth Nation tribe along with environmental groups including the Indigenous Environmental Network, the Sierra Club, and the Center for Biological Diversity, and this week the groups filed a motion for summary judgment in Minnesota federal court. In it, the groups claim that in 2014, the State Department “short-circuited” the approval process for the expansion of Enbridge’s Line 67 — also known as the Alberta Clipper. They also claim the department approved the construction of a new pipeline that would carry tar sands oil from Alberta, Canada, to Superior, Wisconsin, without the necessary public input.


According to the summary judgment motion, the State Department allowed the new pipeline to be built under “an existing permit for another pipeline known as Line 3.” Doug Hayes, staff attorney for the Sierra Club, told ThinkProgress that while waiting on the State Department to complete an environmental analysis of the Alberta Clipper expansion, Enbridge found a way to replace parts of Line 3 so that it could transport tar sands at a higher volume while the analysis was underway.



Oil from the Alberta Clipper (left) would be diverted through the new Line 3 border crossing (right), the lawsuit alleges.


CREDIT: U.S. District Court Minnesota



“In effect, what the State Department has done is allow the Alberta Clipper expansion to go forward in the interim, but they’ve also allowed this new higher capacity line at the border under the guise of Line 3 maintenance,” he said, referring to the stronger pipe that he says Enbridge installed in place of Line 3 at the Canadian border. “Clearly it’s not just maintenance of a pipeline — it’s something different.”


In approving those projects, the State Department violated the National Environmental Policy Act (NEPA) and the National Historic Preservation Act (NHPA), the groups allege.


“The State Department’s hasty and uninformed decision-making increases the risk of harm to plaintiffs’ members’ health, as well as to their property, recreational, aesthetic, cultural, spiritual, and economic interests,” the lawsuit reads. “Because of its failure to conduct any NEPA or NHPA analysis before approving these projects, the State Department lacks the information it needs to effectively mitigate the projects’ environmental risks.”


This is worrying for the Sierra Club — which is currently running a campaign and petition against what it calls Enbridge’s “illegal scheme” — and other environmental groups in part because it means more high-carbon tar sands oil will be shipped across the border. Enbridge is looking to nearly double the capacity of the Alberta Clipper pipeline, from 450,000 barrels per day to 800,000 bpd — almost as much, the Sierra Club points out, as Keystone XL would carry.


The Sierra Club isn’t supportive of the expansion in the first place, but this alleged attempt to subvert the State Department’s process makes the issue more concerning for the group.


“We’re asking for the State Department to stick to their word, stick to the process, and analyze the impacts of this expansion before they allow it to go forward,” Hayes said.


The State Department, for its part, denies the allegations.


“The State Department made no such decision to approve construction and operation of a new border-crossing crude oil pipeline or a significant increase in the capacity of an existing cross-border crude oil pipeline,” the department states in its answer to the lawsuit.


The department also confirmed that it’s putting together a Supplemental Environmental Impact Statement and “conducting associated public outreach consistent with the National Environmental Policy Act” as part of its review of Enbridge’s Line 67 application.


Hayes said he thought the allegations against Enbridge were an example of the oil industry’s secretive nature. He noted that this isn’t the first time that a government agency or pipeline company has been accused of fast-tracking pipelines or avoiding full environmental review. In 2013, the Sierra Club filed a lawsuit involving Enbridge’s now-completed Flanagan South pipeline, which accused the Army Corps of Engineers of treating each of Flanagan’s water crossings as a separate project — instead of looking at the whole 589-mile pipeline — so that the pipeline could qualify for expedited approval.


“The oil industry and pipeline companies are going to greater and greater lengths to cut the public out of the process,” he said. “I think this just is the latest scheme they’ve come up with and it’s entirely designed to avoid the public environmental review that’s required under NEPA.”


Published on April 10, 2015 09:10

In The Midst Of Toxic Oil Spill, Vancouver Announces It Will Go 100 Percent Renewable


Vancouver, B.C., a land of confused environmental news.


CREDIT: Shutterstock



There’s some mixed news coming out of Vancouver, Canada this week. On the one hand, the city announced at an international sustainability summit that it would commit to using 100 percent renewable energy to power its electricity, transportation, heating and air conditioning within 20 years. On the other hand, Vancouver is also dealing with a fuel spill in the waters of English Bay that is washing up on beaches and threatening wildlife.


On March 26, Vancouver’s city council voted unanimously to approve Mayor Gregor Robertson’s motion calling for a long-term commitment to deriving all of the city’s energy from renewable sources. At the ICLEI World Congress 2015 this week in Seoul, South Korea, the city went a step further, committing to reaching that goal of 100 percent renewable energy for electricity, transportation, heating, and air conditioning by 2030 or 2035.


Right now, Vancouver gets 32 percent of its energy — that includes electricity, transportation, heating, and cooling — from renewable sources, so the goal is ambitious, but not impossible. According to the Guardian, Vancouver could get all of its electricity from renewables within a few years, but transportation, heating, and cooling may prove more difficult.


The city’s cars, buses, and trucks are still largely powered by gas and diesel fuel — apart from a fleet of electric trolleys — but the city is already taking measures to reduce the amount of fossil fuels used for transportation: encouraging residents to make more trips by bike, public transportation, or on foot, and aiming to reduce the average distance driven by residents by 20 percent from 2007 levels.


“Cities around the world must show continued leadership to meet the urgent challenge of climate change, and the most impactful change we can make is a shift toward 100% of our energy being derived from renewable sources,” Robertson said in a statement after his motion passed. Vancouver joins cities like San Francisco, Copenhagen, and Sydney, which have also pledged to work toward 100 percent renewable energy.


Win some, lose some: while the city was making its announcement in South Korea, toxic fuel was spreading across Vancouver’s English Bay, washing up on many of the city’s beaches.


It appears fuel may be leaking from one of the cargo ships anchored in English Bay. Still unconfirmed @News1130radio pic.twitter.com/g8P1ehP0UY


— Chad Dey (@chad_dey) April 9, 2015



According to CBC News, the Canadian Coast Guard was notified about the spill at 5 p.m. PT on Wednesday, but underestimated its size. On Thursday morning, when it became apparent that the spill was larger than initially thought, cleanup crews were deployed. But the oil — thought to be fuel from an unidentified freighter — had already reached several beaches, and Vancouver residents were warned to stay away from the shore on both sides of the bay.


.@inthehouse7 is at the site of the #VanFuelSpill and will have more tonight on @APTNNews pic.twitter.com/z9fGmScwM5


— Dennis Ward (@DennisWardNews) April 9, 2015



The spill prompted concern for wildlife, especially killer whales, which occasionally appear in the area.


“First and foremost, we’re going to be looking for marine mammals on the water,” Peter Ross, who runs the Vancouver Aquarium’s Marine Mammal Rescue Center’s program on ocean pollution research, told CBC News. According to Ross, some 25 species, including fish and seabirds, could be at risk due to the spill.


English Bay today. English Bay a month ago. #VanFuelSpill @VancityBuzz #Vancouver pic.twitter.com/ol8KPLgzbS


— Paul Tomkinson (@paultomkinson) April 9, 2015



As of late Thursday, officials had not identified the composition of the oily, black material, though cleanup crews were treating the spill as a worst-case scenario, assuming the material was either bunker fuel or raw crude until test results came back. The Coast Guard estimates that around 3,000 liters — about 792 gallons — of the material spilled into the bay, an amount that it deems “not massive by spill standards” but enough to get the city’s attention, according to HuffPost British Columbia.


The spill also sparked concern over the Northern Gateway Pipeline, which would stretch from the Alberta tar sands to British Columbia and whose oil would be shipped to overseas refineries via tankers. Opponents of the project worry that the pipeline would require large oil tankers to increasingly traffic British Columbia’s inland waters, increasing the chance of an oil spill along the area’s ecologically sensitive coastline.


Published on April 10, 2015 08:48

L.A.’s New City Plan Will Make You Want To Move There


Palm trees at the Magic Hour in Los Angeles, California.


CREDIT: flickr/ Chris Goldberg



Los Angeles is a city of a million intersections, and a new sustainability plan intends to cross some of the biggest ones in an effort to transform the city into a breezy renewable metropolis rather than an overheated desert island in the coming decades.


On Wednesday, L.A. mayor Eric Garcetti released an ambitious plan that puts environmental, economic, and equality issues front and center in helping determine the trajectory of the city, which plans to add another half-million residents by 2035. The plan comes at a pivotal moment for the state and the city, as a four-year drought prepares to settle in for the summer months. Governor Jerry Brown just announced statewide water restrictions for the first time and the state just had its hottest 12-month stretch on record.


“Our first ever sustainability plan details actions we must take in the coming months and years to secure a future for L.A. that is environmentally healthy, economically prosperous and equitable in opportunity for us all,” Mayor Garcetti said. “My back to basics approach is about making sure our city has the strong foundation it needs to soar to new heights.”


A few of the plan’s highlights include: becoming “the first big city in the nation to achieve zero waste” by 2025, fully divesting from coal-powered electricity by 2025, reducing greenhouse gas emissions by 80 percent below 1990 levels by 2050, having zero smog days by 2025, and making it so that 50 percent of all trips taken by city residents are by bike, foot, or public transportation by 2035. The plan also makes commitments to reduce energy use in all buildings by 30 percent by 2035.


According to Jonathan Parfrey, executive director of the L.A.-based Climate Resolve and a former commissioner at the L.A. Department of Water and Power, the plan is a powerful first step.


“This is just the beginning,” he told ThinkProgress. “The hard stuff isn’t the planning, it’s the implementation.”


Parfrey, who helped plan parts of the effort, said he is especially pleased with the 80 percent greenhouse gas reduction goal on the mitigation side and with the proposal to deal with the urban heat island effect on the climate resiliency side. The plan calls for a reduction of the urban heat island effect differential — the difference between the temperature of the city and the surrounding area — by 1.7°F by 2025 and 3°F by 2035.


The heat island effect, in which urban areas are noticeably warmer than their surroundings, can make large cities 1.8°F to 5.4°F warmer than surrounding areas during the day, and as much as 22°F warmer at night, according to the EPA. This effect happens when buildings, roads, and other developments replace formerly open land and greenery, causing surfaces that were once permeable and moist to become impermeable and dry, and to warm up.


According to Parfrey, 20 percent of L.A. is covered in rooftops and 40 percent in pavement of some form. Changing the reflective capacity of these areas and adding more greenspace will play a big role in reducing the heat island effect. Parfrey and other city officials have already been pushing for these changes. In December 2013, the Los Angeles City Council unanimously passed a building code update requiring all new and refurbished homes to have cool roofs — which use sunlight-reflecting materials — making L.A. the first major city to require such a measure.


Cool roofs “can be more than 50°F cooler on the surface of the roof during a hot summer day and can cool the interiors of buildings by several degrees Fahrenheit, reducing chances of heat-related injuries or deaths,” according to Climate Resolve.


The city’s new sustainability plan calls for 10,000 of these cool roofs to be in place by 2017.


“These are innovations that we haven’t seen in any other sustainability plan,” said Parfrey.


The full plan spans 108 pages, covering everything from reducing potable water use by 10 percent in city parks to ensuring that 50 percent of the city’s light-duty vehicle purchases are electric vehicles by 2025. With the drought in full swing and no reason to believe that prayers for rain will bring lasting results, the city is hoping to reduce overall municipal water use by 25 percent by 2025 and 30 percent by 2030.


L.A. has some of the highest levels of income inequality in the country, and the plan attempts to combine the fight against climate change with the one against poverty. By 2017, the city will begin “constructing 17,000 new units of housing within 1,500 feet of transit.” There are also targets to reduce the number of rent-burdened households, fight asthma, limit food deserts, and distribute cap-and-trade funds.


“This plan puts L.A. at the very heart of sustainability efforts in cities across the country,” said Parfrey. “The environmental community wants it to be a great success.”


Published on April 10, 2015 06:59

There’s A 60 Percent Chance El Niño Could Last All Year


California’s January-March temperature since 1895. An arrow points to the state’s amazing recent warmth. (NOAA via Climate Central)



The National Oceanic and Atmospheric Administration (NOAA) is predicting a 60 percent chance that the El Niño it declared in March will continue all year. An El Niño is a weather pattern “characterized by unusually warm ocean temperatures in the Equatorial Pacific.”


Robust El Niños are associated with extreme weather around the globe. They also generally lead to global temperature records, as the short-term El Niño warming adds to the underlying long-term global warming trend. El Niños are typically California drought-breakers, but as the top graph shows, that hasn’t been the case so far.


As I discussed last week, some climatologists believe that we may be witnessing the start of the long-awaited jump in global temperatures — a jump that could be as much as 0.5°F. It already appears likely that March will be hot enough to set yet another global record for the hottest 12 months on record (April 2014 through March 2015), as well as a global record for the hottest start to a year (January through March).


NOAA released its “consensus probabilistic forecast” of the El Niño Southern Oscillation (ENSO) for the rest of this year, from its Climate Prediction Center (CPC) and Columbia University’s International Research Institute (IRI) for Climate and Society. Note that the ENSO state — El Niño, neutral, or La Niña — is generally based on the sea surface temperature (SST) anomaly in the NINO3.4 region of the Equatorial Pacific.
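For readers who want to see how that anomaly maps to a phase in practice, here is a minimal sketch in Python using the conventional ±0.5°C threshold on the NINO3.4 anomaly. It is an illustration only: NOAA's operational Oceanic Niño Index applies a three-month running mean and a persistence requirement that this simplified function ignores.

```python
def enso_phase(nino34_anomaly_c: float) -> str:
    """Classify ENSO phase from a NINO3.4 sea surface temperature anomaly (degrees C).

    Simplified illustration: the conventional threshold is +/-0.5 C.
    NOAA's operational index also applies a three-month running mean
    and a persistence requirement, which are omitted here.
    """
    if nino34_anomaly_c >= 0.5:
        return "El Nino"
    if nino34_anomaly_c <= -0.5:
        return "La Nina"
    return "neutral"

# Example anomalies; values reported in early 2015 hovered around +0.5 to +0.6 C.
for anomaly in (0.6, 0.2, -0.7):
    print(f"{anomaly:+.1f} C -> {enso_phase(anomaly)}")
```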



NOAA concludes there’s about a “70% chance that El Niño will continue through Northern Hemisphere summer 2015, and a greater than 60% chance it will last through autumn.”



I asked climatologist Kevin Trenberth of the National Center for Atmospheric Research what a prolonged El Niño would mean for the planet and California. He told me it “goes along with the jump” in global temperatures he has said is imminent.


California is a more complicated situation for a few reasons. Trenberth notes that the famous song “It never rains in Southern California” is mostly accurate “at least from May through October – and Governor Jerry Brown has every right to be concerned.” If a full-blown El Niño develops over the year, “then the prospects go up a LOT for rains in California” during the state’s traditional rainy season, late October through March, which has not been very rainy at all in the last few years.


Climate expert Professor John Abraham also noted that a full-blown El Niño “just might bring some relief from this unbelievable drought.” But he was quick to note that “an El Niño will also break more temperature records and may make 2015 the hottest year ever. It will also bring extreme weather to other parts of the planet so this El Niño is a double-edged sword.”


In fact, on Monday, NOAA’s Climate Prediction Center released this “seasonal outlook” for U.S. temperatures April to June. It combines “the effects of long-term trends, soil moisture, and, when appropriate, ENSO”:


[NOAA Climate Prediction Center seasonal temperature outlook map, April to June 2015]


If that forecast comes true, then California and the West are due for more blistering temperatures, which would likely make the drought even worse in the near term.


Finally, most of Brazil is suffering through a devastating drought, too. São Paulo “is suffering its worst drought in almost a century.” Dr. Trenberth writes, “These water crises both in CA and Brazil (and perhaps we can add Australia) are excellent analogs for the sort of problems I expect to become more common with global warming. Water is likely to be the biggest pressure point on society.”


Indeed, many recent studies have projected that large parts of the world including much of the Southwest, Great Plains, Brazilian Amazon, and Australia will turn into near permanent dust bowls if the world doesn’t slash carbon pollution soon. The time to act is now.


Published on April 10, 2015 05:00

April 9, 2015

Scientists Have Found A New Way To Save The World’s Coral Reefs, And It’s Pretty Fishy

[image error]

Redfin butterflyfish and coral reef.


CREDIT: Tim McClanahan/WCS



The secret to a healthy coral reef is a healthy population of fish, a new study has found.


The study, published this week in Nature, looked at the fish biomass — the total mass of all fish species in a reef — of 832 reefs around the world, and used data on the reefs’ health to estimate the levels of fish biomass needed to sustain that health.


The researchers found that reefs with no fishing had about 1,000 kilograms (2,204 pounds) of fish biomass per hectare (2.47 acres), and that to avoid a total collapse of ecosystem health, reefs needed to stay above a minimum of 100 kg (about 220 lbs) of fish biomass per hectare. To keep their ecosystems healthy and be able to sustain fishing, reefs needed to maintain at least 500 kg (1,102 lbs) of fish biomass per hectare.
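To make those thresholds and unit conversions concrete, here is a minimal sketch in Python. The cutoffs are the figures reported from the study above; the conversion factors are standard, and the classification labels are my own shorthand rather than the study’s terminology.

```python
KG_PER_LB = 0.453592          # kilograms per pound
HECTARES_PER_ACRE = 0.404686  # hectares per acre

def lbs_per_acre(kg_per_hectare: float) -> float:
    """Convert fish biomass from kg per hectare to pounds per acre."""
    return kg_per_hectare / KG_PER_LB * HECTARES_PER_ACRE

def reef_status(biomass_kg_per_ha: float) -> str:
    """Classify a reef against the thresholds reported above:
    ~1,000 kg/ha on unfished reefs, ~500 kg/ha to sustain fishing,
    and ~100 kg/ha below which ecosystem health collapses."""
    if biomass_kg_per_ha >= 1000:
        return "comparable to unfished reefs"
    if biomass_kg_per_ha >= 500:
        return "healthy enough to sustain fishing"
    if biomass_kg_per_ha >= 100:
        return "degraded but above the collapse threshold"
    return "at risk of ecosystem collapse"

for biomass in (1000, 500, 250, 80):
    print(f"{biomass} kg/ha ({lbs_per_acre(biomass):.0f} lbs/acre): {reef_status(biomass)}")
```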


Unfortunately, according to the study, most of the world’s coral reefs aren’t maintaining this biomass level. The researchers found that 83 percent of the coral reefs studied fell below the 500 kg per hectare (1,102 lbs per 2.47 acres) benchmark. But this doesn’t mean those reefs can’t still recover, said Aaron MacNeil, lead author of the study and senior research scientist for the Australian Institute of Marine Science.


MacNeil told ThinkProgress in an email that, because of the wide range of regulatory measures that can be put in place to regulate fishing methods or limit the amount or type of fish taken from a region, even poorer countries can take steps to improve their reef health.


“Our study is important because it shows that there are a range of management options available, including gear restrictions, limits on the species that can be caught, and caps on who can access the fishery,” he said. “Ultimately the most effective regulations are those that will be complied with and it is up to local people (and governments) to figure out what those are.”


MacNeil said that he has seen effective reef management in places like coastal Kenya, a region that’s “heavily dependent” on its reefs. Kenya has established six national marine reserves in an attempt to protect some of the reefs that surround the nation, and on its southern coast it has also banned fishing with beach seines, large nets that tend to capture the majority of the fish in an area rather than targeting one or two species.


Other places have also seen success in reef recovery. Cabo Pulmo, a marine protected area on the coast of Mexico’s Baja Peninsula, was established in 1995 after a long history of overfishing caused a noticeable change in the reef’s fish population. Since the park was created and the fishing ban was put in place, fish biomass in the reef has increased by more than 460 percent, and the reef’s population of large fish has also increased.


But even though the benefits of fishing restrictions can be seen in reefs within a decade or so, the study said heavily fished reefs whose fish stocks have been severely depleted would need about 59 years under fishing restrictions to recover completely. Reefs subject to average fishing pressure would need about 35 years.


Different species of fish play different ecosystem roles in reefs: Parrotfish and urchins, for instance, are key reef players because they eat algae off coral. That algae, if left to its own devices, can smother coral, so studies have shown that protecting parrotfish and other grazers is key to reef health.


But other types of fish also have major roles to play, MacNeil explained. Fish that feed on the reef’s dead organic material help clear out space for more coral to grow, and predators help keep the system balanced by controlling the population of smaller fish.


MacNeil said that one of the most exciting findings of the study was that these varying ecosystem functions can be protected “by such a diverse range of fisheries regulations.”


“This really got us excited because it means that people have many more options than just marine protected areas when it comes to managing their reef fisheries,” he said.


Despite the study’s confirmation that more fish leads to a healthier reef, and that there are many steps regions can take to protect their reefs, the authors write that marine reserves and fishing regulations aren’t enough to combat the threats reefs face from climate change and ocean acidification.


“Addressing the coral reef crisis ultimately demands long-term, international action on global-scale issues such as ocean warming and acidification,” the study reads.


Still, better management will likely help reefs be more resilient to these challenges in the future. A report last year found that coral reefs protected from stressors like pollution and overfishing can bounce back from warming-induced events like bleaching.


Published on April 09, 2015 09:20

Reports: Japan Will Promise To Reduce Carbon Emissions 20 Percent By 2030


In this photo taken Thursday, April 17, 2014, Kazuhiro Onuki, right, and his wife, Michiko, wearing white protective gear and filtered masks, walk along the coast damaged by the 2011 tsunami against a backdrop of the Fukushima Dai-ni Nuclear Power Plant.


CREDIT: AP Photo/Shizuo Kambayashi



One of the world’s largest emitters of greenhouse gases is reportedly planning to reduce its carbon output by 20 percent in the next 15 years.


On Thursday, Reuters reported that Japan is planning to announce the 20 percent reduction in greenhouse gas emissions as its contribution to international negotiations on slowing human-caused climate change, which are scheduled to take place in Paris later this year. Japan is currently the world’s fifth-biggest emitter of carbon dioxide.


As of now, though, it’s unclear whether Japan’s reported pledge is actually significant, because it’s similarly unclear which year’s emissions the promised reductions will be measured against. Kyodo News reported that the government would cut emissions 20 percent from 2005 levels, a year when carbon emissions were relatively low because Japan was still heavily reliant on nuclear power. However, the leading business daily Nikkei reported the reduction would be from 2013 levels, which, following the country’s nuclear shutdown, were the second-highest in the country’s history. Both reports cited unnamed sources.


Right now Kyodo News and Nikkei are the only two outlets claiming direct knowledge of the plan, so we’ll update this post if we hear anything more definitive.


If Japan’s reductions are indeed based off 2013 levels, a 20 percent reduction would be minimal. Compared to carbon levels in 1990 — the year when the country had its lowest recorded emissions — it would represent only an 11 percent reduction. And that’s not good enough, according to Jennifer Morgan, the global director of the climate program at the World Resources Institute.
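The arithmetic behind that re-baselining is straightforward. Here is a minimal sketch in Python; the emissions figures are approximate, publicly reported totals for Japan (roughly 1,270 MtCO2e in 1990 and 1,408 MtCO2e in 2013), used for illustration rather than as the exact numbers behind the 11 percent estimate.

```python
def rebased_cut(cut_vs_base_a: float, emissions_base_a: float,
                emissions_base_b: float) -> float:
    """Express a cut defined against baseline year A as a cut against baseline year B."""
    target = emissions_base_a * (1 - cut_vs_base_a)  # absolute emissions target
    return 1 - target / emissions_base_b

# Approximate Japanese greenhouse gas totals (MtCO2e); illustrative values only.
JAPAN_2013 = 1408
JAPAN_1990 = 1270

cut_vs_1990 = rebased_cut(0.20, JAPAN_2013, JAPAN_1990)
print(f"A 20% cut from 2013 levels is roughly a {cut_vs_1990:.0%} cut from 1990 levels")
```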


“If this is the Japanese offer to the world, it is clearly out of step with recent developments in the U.S., China, and Europe where countries are shifting away from coal,” Morgan said in an e-mailed statement. “Japan should use this chance to reach their potential and join the ranks of India, Germany and China in the clean energy race.”


Either way, it is somewhat surprising that Japan would make any carbon reduction pledge considering the significant level of uncertainty facing its energy policies. Following the country’s Fukushima nuclear disaster in 2011, Japan shut down all of its working nuclear reactors and switched to more carbon-spewing fossil fuels to fill the energy production void. At the time, Japan was getting about 30 percent of its power from nuclear, and planned to increase that to 40 percent by 2017.


[image error]

Japan severely decreased its use of low-carbon nuclear energy following the Fukushima disaster in 2011, and replaced generation with carbon-intensive coal, oil, and gas.


CREDIT: EIA.gov



Now, tougher safety standards have been imposed on nuclear power, and Japan is considering bringing it back into the mix. According to a Reuters report last week, the country’s Liberal Democratic Party approved a proposal to bring nuclear back up to 20 percent of the country’s energy mix. That proposal goes against Japanese public opinion — according to one poll, 59 percent of Japanese voters oppose restarting nuclear capacity, and only 28 percent support it. In any case, the proposal still has to be approved by Japan’s Prime Minister, Shinzo Abe.


Despite those uncertainties, Japan’s reportedly imminent pledge would make it the last of the world’s top carbon emitters to give some indication of what it will do at the international climate negotiations later this year.


The United States — the world’s second-largest carbon emitter and the largest major emitter on a per-person basis — has pledged to cut its emissions 26 to 28 percent below 2005 levels by 2025. The world’s biggest emitter, China, has promised via a deal with the U.S. to get 20 percent of its energy from non-fossil-fuel sources by 2030, and to peak its overall carbon dioxide emissions that same year.


In a plan submitted to the United Nations earlier this month, Russia said it could cut greenhouse gas emissions by up to 30 percent below its 1990 levels by 2030. The European Union has said it will cut its emissions by 40 percent from 1990 levels by 2030. And India, while it has not developed any concrete emissions reduction goals, has agreed to “cooperate closely” with the United States for a “successful and ambitious” agreement at the Paris climate talks at the end of the year.



Update


This article has been updated to include comments from the World Resources Institute.




Published on April 09, 2015 09:00

U.S. Power Sector In 2015: More Renewable Energy, Less Carbon Emissions


2015 will bring big changes for the U.S. power sector.


CREDIT: Shutterstock



A new report from Bloomberg New Energy Finance (BNEF) has some good news for anyone who supports a greener American energy sector: 2015 will be a “transformative year” for U.S. power, as more natural gas and renewable energy combine with coal plant retirements to push U.S. power sector emissions to a 20-year low.


“This should prove to be a watershed year for the ‘de-carbonization’ of the US power sector, with record volumes of coal-fired capacity to be shuttered, renewables capacity to be built, and natural gas to be consumed,” BNEF said in a press release, concluding that these three factors will combine to drive carbon emissions from the power sector to their lowest levels since 1994.


First, 2015 is expected to be a record-breaking year for the installation of renewable energy, with around 18 new gigawatts (GW) of power coming online from solar and wind. The previous record, set in 2012, was 17.1 GW, and most of that came from wind plants built ahead of tax credit expirations.


This year will be different, because experts think we’ll see an equal mix of solar and wind projects installed across the country. For years, wind has been leading solar, but solar has been exploding in growth recently thanks to the falling price of solar panels.


According to the report, solar will reach record installations in three areas: utility-scale installations (like the mega-projects in California), residential rooftops, and non-residential roof space. This year and 2016 are important years for solar — especially for utility-scale solar — because the federal Investment Tax Credit, which offers a dollar-for-dollar reduction in federal income tax for those who invest in solar projects, is set to fall from 30 percent to 10 percent in 2017, a policy change that will make large-scale solar projects less appealing for investors.


But it’s not just a record number of renewables that BNEF thinks will drive power sector emissions to a 20-year low — it’s also the decline of coal, the most carbon-intensive fuel to burn.


“The US coal fleet is entering an unprecedented period of retirements,” the report states, “as the industry faces a three-pronged assault from low gas prices, an aging fleet, and stringent environmental compliance.”


Around 7 percent of U.S. coal plants are expected to be retired in 2015. Low gas prices and new EPA standards limiting the amount of mercury, acid gases, and toxic metals that can be emitted from coal plants encourage utilities to take old coal plants offline, rather than spend money trying to retrofit them.


As coal plants close, more utilities will switch to burning natural gas, a less carbon-intensive (though still not completely green) power source. Natural gas use for power generation previously peaked in 2012, at about 25 billion cubic feet burned per day. According to the BNEF report, 2015 is poised to eclipse — or at least tie — that record. This will be spurred not only by the closing of coal plants, but also by the low cost of natural gas, which could help it undercut the cost of coal-fired electricity even in places where coal plants remain open.
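To see why that fuel switching matters for emissions, here is a minimal sketch in Python. The carbon intensities are rough, commonly cited approximations (on the order of 1.0 tonne of CO2 per MWh for coal and 0.4 for combined-cycle natural gas), and the generation mixes are hypothetical, not BNEF's projections.

```python
# Rough CO2 intensity of generation, in tonnes of CO2 per MWh.
# Commonly cited approximations, not BNEF's inputs.
INTENSITY = {"coal": 1.0, "gas": 0.4, "wind_solar": 0.0, "nuclear": 0.0}

def power_sector_co2(generation_mwh: dict) -> float:
    """Total CO2 in tonnes for a generation mix given as MWh by fuel."""
    return sum(mwh * INTENSITY[fuel] for fuel, mwh in generation_mwh.items())

# Hypothetical annual mixes (MWh) before and after retiring coal plants
# and adding gas and renewables; purely illustrative.
before = {"coal": 1_600_000, "gas": 1_000_000, "wind_solar": 200_000, "nuclear": 800_000}
after = {"coal": 1_300_000, "gas": 1_200_000, "wind_solar": 400_000, "nuclear": 800_000}

b, a = power_sector_co2(before), power_sector_co2(after)
print(f"CO2 falls from {b:,.0f} to {a:,.0f} tonnes ({(b - a) / b:.0%} lower)")
```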


Combined, these factors will push power sector emissions to 15.4 percent below their 2005 level, the BNEF report says. But it also warns that this might be the largest decrease we see in a while, because such a large portion of the U.S. coal fleet is unlikely to retire in a single year again, and key renewable tax credits are set to expire or take significant cuts in 2016. And, as the Washington Post’s Chris Mooney points out, even a record-breaking year is still just a single year, not a long-term trend.


“In the grand scheme, we still will be getting more power from coal than from natural gas in 2015, more power from natural gas than from nuclear, more power from nuclear than from renewables,” he writes.


Still, a 15.4 percent drop in the power sector from 2005 is welcome news for the U.S., which has pledged to cut its total greenhouse gas emissions — across the power, agricultural, transportation, industrial, and residential sectors — 26 to 28 percent from its 2005 baseline by 2025.


Published on April 09, 2015 08:40

Scientists Link Pennsylvania’s Fracking Boom To Increased Radioactive Gas In Homes


This Jan. 17, 2013 file photo shows a fracking site in New Milford, Pa.


CREDIT: AP Photo/Richard Drew



The amount of radioactive material in Pennsylvania homes has increased alongside the state’s fracking boom, according to a new study published Thursday in the journal Environmental Health Perspectives.


Researchers from the Johns Hopkins Bloomberg School of Public Health asserted that levels of radon — an odorless, carcinogenic, radioactive gas — have been on the rise in Pennsylvania homes since 2004, around the same time the state’s Department of Environmental Protection (DEP) began rapidly increasing the number of permits it issued for unconventional gas drilling. There had been no similar increases in indoor radon concentrations prior to 2004, the study said.


Radon is the second-leading cause of lung cancer in the world after smoking, according to the Environmental Protection Agency. Approximately 40 percent of Pennsylvania homes are believed to have radon levels above the recommended limits, according to the state DEP.
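For reference, the EPA’s recommended action level for indoor radon is 4 picocuries per liter (pCi/L). The kind of calculation behind a figure like that 40 percent estimate is simply the share of test results at or above that level, as in this minimal sketch in Python; the sample readings are made up for illustration.

```python
EPA_ACTION_LEVEL_PCI_L = 4.0  # EPA's recommended action level for indoor radon

def share_above_action_level(readings_pci_l):
    """Fraction of indoor radon test results at or above the EPA action level."""
    if not readings_pci_l:
        return 0.0
    exceedances = sum(1 for r in readings_pci_l if r >= EPA_ACTION_LEVEL_PCI_L)
    return exceedances / len(readings_pci_l)

# Made-up sample of basement test results (pCi/L), for illustration only.
sample = [1.2, 3.8, 6.5, 9.1, 2.0, 4.4, 0.7, 12.3, 3.1, 5.0]
print(f"{share_above_action_level(sample):.0%} of sampled homes meet or exceed 4 pCi/L")
```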


The researchers were careful to note that the side-by-side increases in radon and fracking only represented a correlation, and should not be taken as a declaration that fracking directly caused an increased radon presence in homes. However, study leader Brian Schwartz told ThinkProgress that it’s possible fracking could be the cause, and noted that more research is needed on the subject.


“We’re not convinced this industry is playing a role [in increased radon levels],” he said, noting that Pennsylvania had a radon problem long before fracking started. “All we’re saying is these findings provide no reassurance that the industry is not playing a role.”


To make their claims, the Johns Hopkins researchers obtained data on nearly 2 million indoor radon tests conducted in the state between 1987 and 2013. Using some of that data, they associated radon levels with the homes’ respective geologies and water sources, along with the season the measurements were taken, fracking activity in the area, and the weather at the time.


What they found, along with average overall radon increases since 2004, was that buildings using well water had a 21 percent higher concentration of radon than those using municipal water. In addition, buildings located in townships had a 39 percent higher concentration of radon than buildings located in cities. Townships are more likely to house fracking wells than cities, the study noted.


The research is not the first to raise concerns about radioactivity exposure from Pennsylvania’s fracking operations. The state’s Marcellus Shale — the underground rock formation from which the gas is extracted — contains a lot of Naturally Occurring Radioactive Material, or NORM. Specifically, the Marcellus Shale has 20 times more NORM than typical shale formations, according to the Institute for Energy & Environmental Research.


Because of this, there’s evidence that radioactive elements have made their way into the environment, particularly through wastewater. Fracking uses a lot of water to blast underground shale rock, and the produced water that’s left over from Marcellus Shale drilling operations contains on average nearly 500 times as much radon as the federal drinking water limit, according to the Johns Hopkins study. That water is generally injected back underground in disposal wells or stored in pits, where it can leach into the environment.


After analyzing the radon data, the study found a “statistically significant association” between how close houses are to Marcellus Shale fracking wells and how much radon is concentrated on the houses’ first floors in the summertime. This, the study said, “suggests a pathway through outdoor ambient air, but does not rule out the possibility of radon moving from the basement to the first floor.”


However, the researchers did note that their study had limitations. For one, they had “no information on radon-resistant construction, construction year, types of remediation completed, type of heating and cooking systems, quantity of natural gas and water used in the building, degree of sealing of the building for energy efficiency, soil type near the building, [or] wind speed and direction.” They also noted that it’s possible the increases occurred because normal levels of radon in the atmosphere are being more effectively trapped in buildings by better wall sealing.


Still, neither better-sealed construction nor fracking can be definitively established as the cause, prompting the researchers to call for more analysis of how and why radon could be affecting human health in Pennsylvania.


“Radon exposure represents a major environmental health risk, and in addition to future studies to understand the impact of drilling on radon levels, there is continuing need for a radon program in Pennsylvania to track and evaluate radon concentrations and to encourage testing and remediation,” the study said.


Published on April 09, 2015 05:16

Republicans Are Saying Environmentalists Caused California’s Drought. Here’s Why They’re Wrong


Californian artist Ross Dickinson dramatized his home state’s eternal confrontation of nature and man by exaggerating the steep slopes of the hills and the harsh contrast between the dry red wilderness and the green cultivated land (1934).


CREDIT: flickr/ Cliff



Any way you cut it, California is in the midst of a dire drought — one that has been exacerbated by climate change. As the drought grows in impact and exposure — with statewide mandatory restrictions imposed for the first time last week — some would rather attribute its severity to a lack of water infrastructure than to the lack of rain. They would rather blame small fish than a changing climate and a growing population.


This straw man argument is not only disingenuous, it is also irresponsible. And it could set back the efforts of those focused on meeting the challenges of the state’s water stresses and climate impacts — like Governor Jerry Brown (D) and many in the state legislature — further harming all stakeholders, from Central Valley farmers to coastal residents.


Example A: Carly Fiorina, former Hewlett Packard CEO, failed 2010 GOP nominee for U.S. Senate, and friend of the fossil fuel industry. On Monday, Fiorina, who is considering a presidential bid, told Glenn Beck that the California drought is a “man-made disaster.” And by man-made she means it has been caused by “liberal environmentalists” who have prevented the state from building the appropriate reservoirs and other water infrastructure.


“In California, fish and frogs and flies are really important,” she said. ” … California is a classic case of liberals being willing to sacrifice other people’s lives and livelihoods at the altar of their ideology.”


In an interview with MSNBC that same day, Fiorina placed her comments within the context of climate change, saying that whatever California does to address climate change “won’t make a bit of difference.”


“A single state, or single nation, acting alone can make no difference at all, that’s what the scientists say,” she said. “We’re disabling our own economy and not having any impact at all on climate change.”


This is not the first time Fiorina has lambasted the state’s efforts to address the drought or climate change as exercises in economic ruin, nor is she the first one to have made the misguided argument. The House Natural Resources Committee, chaired by Rob Bishop (R-UT), holds the viewpoint that this “man-made drought” is responsible for fallowing hundreds of thousands of acres of fertile farmland in California and that much of the state’s farmland is in “danger of becoming a dust bowl unless immediate action is taken to change policies that put the needs of fish above the livelihood of people.”


Last January, Rep. Devin Nunes (R-CA) said his area of the state had been decimated by drought due to politicians using “water as a weapon” by cutting off supply to farmers in order to better protect liberal voters.


“The elites that live in Hollywood and in San Francisco and along the coast support these radical environmental policies that cut off the infrastructure that’s been built and then Jerry Brown and others run around saying ‘oh gosh, we have to do something about this, it’s the drought and global warming,'” he said. “No, that’s nonsense you morons. It’s because you shut all the infrastructure off.”


Nunes is attacking, in part, policies surrounding the Delta smelt, a species that has been listed as threatened since 1993 under the federal Endangered Species Act and is approaching extinction. A 2008 decision by the Fish and Wildlife Service to safeguard the fish restricted the amount of water that can be pumped from the Sacramento-San Joaquin Delta and sent south to agricultural interests and water districts.


Both Nunes and Fiorina lamented the amount of water they say the state wastes each year on ecological flows, with Fiorina claiming that 70 percent of California’s rainfall “washes out to sea” every year.


Andrew Fahlund, deputy director of the California Water Foundation, disagrees with that number and with most everything Fiorina said.


“Thinking that building more reservoirs will get you out of a drought is like assuming that opening more checking accounts when you’ve lost your income will help you pay your bills,” he told ThinkProgress.


According to Fahlund, only 50 percent of water in California flows to the coast. Fahlund said that according to the Bureau of Reclamation’s own numbers, building the reservoirs that Fiorina is referring to would have resulted in a net increase of only one percent in the state’s water supplies.


“And by this year, the fourth year of a drought, that water would have been used up just like the water in most of the rest of the state’s reservoirs,” he said. Fahlund said the real reason the state hasn’t invested in more dams or pipelines is that no one wants to pay for them, most of all not taxpayers.


“Study after study shows that the three projects most cited by advocates of new infrastructure don’t pass any sort of cost-benefit test,” he said.


Most of the recent investments in water infrastructure have gone unnoticed, according to Fahlund, who said that these advances have come in the form of efficiency measures. Results of these efforts include Los Angeles using the same amount of water it used in the 1970s but with a much larger population and “agriculture growing increasingly productive and lucrative despite using the same amount of water.”


Future efficiency projects could include things like upgrading urban water infrastructure to prevent massive leaks, such as the one that occurred at the University of California, Los Angeles last year when a pipe ruptured, spewing 20 million gallons of water into the street.


Jay Famiglietti, the senior water scientist at the NASA Jet Propulsion Laboratory at the California Institute of Technology, also invoked harsh terms in response to Fiorina’s statements.


“There is zero truth to any argument that attempts to characterize the current California drought as man-made,” he told ThinkProgress via email. “All you need to do is look up at the mountains and realize that there is no snow, look at the reservoirs and see that they are nearly empty, and look at last January to see that it was the driest on record. A lack of infrastructure is not the issue when there is nothing to put in it.”


According to Famiglietti, suggesting that allocating water for environmental flows is a waste shows a lack of appreciation for the many benefits of these flows.


“Stemming environmental flows could do irreparable damage to the ecosystems that sustain us, and would be counterproductive at best,” he said. For instance, if not enough water flows into the California Delta, salt water from San Francisco Bay could intrude further into it, making it more saline and lowering the quality of the water used for drinking and agricultural purposes.


Famiglietti went on to say that while the lack of water is already “right before our eyes and is undeniable,” “climate change will create a new class of water ‘haves’ and ‘have-nots,’” and that we need to begin preparing now for the challenges and complexities this will present.


This dynamic can already be seen playing out in Brown’s recent executive order for local water agencies to cut usage 25 percent from 2013 levels — the first time statewide mandatory restrictions have been imposed. While Fiorina and other right-wingers make noise about the state’s preference for urban elites and ecological flows, the executive order exempts agricultural users from the mandatory cuts.


Agricultural water use accounts for around four-fifths of the state’s human water use. The executive order addresses this impact mainly in the form of increased enforcement against “illegal diversions and waste and unreasonable use of water.” Statewide, average water use is roughly 50 percent environmental, 40 percent agricultural, and 10 percent urban, according to the Public Policy Institute of California (PPIC), figures consistent with agriculture’s four-fifths share of human use (40 ÷ (40 + 10) = 80 percent). More than half of California’s environmental water use occurs in rivers along the state’s north coast — sources that, according to PPIC, “are largely isolated from major agricultural and urban areas and cannot be used for other purposes.”


Brown defended his treatment of the state’s agriculture industry by saying that California’s farms are “providing most of the fruits and vegetables of America” and that cutting off water allocations would displace hundreds of thousands of people.


Jay Lund, director of the Center for Watershed Sciences at the University of California, Davis, added to the chorus, saying that building more water infrastructure would not solve the crisis. According to Lund, the most impactful new storage projects being considered by the state would add only 5 to 15 percent to its storage capacity.


In conducting a study on California’s potential for future water storage, Lund found that the limitation “stems primarily from a lack of streamflow to reliably fill larger amounts of storage space.”


In one sense at least, Lund agrees with Fiorina about the human causes of the drought.


“I suppose in a way all droughts are man-made, in the sense that without human water demands, we wouldn’t usually consider these conditions to be a drought,” he told ThinkProgress.


Even if you don’t live in California, this likely includes your demands too.


Published on April 09, 2015 05:00

April 8, 2015

Tesla Trumps Toyota: The Seven Reasons Hydrogen Fuel Cell Cars Are Stalled


Toyota Mirai fuel cell vehicle on display during the 2015 Detroit Auto Show.


CREDIT: Ed Aldridge / Shutterstock.com



For reasons that mostly defy logic, the otherwise shrewd car company, Toyota, is placing a large bet on hydrogen fuel cell cars, starting with the Mirai. At the same time, it has backed away from its partnership with Tesla to build an all-electric vehicle.


Toyota is going to lose this bet. There is little reason to believe hydrogen fuel-cell vehicles (FCVs) will ever beat electric vehicles in the car market. There is even less reason to believe they will ever be a cost-effective carbon-reducing strategy (as EVs already are close to being) as I discussed in Part 1, Part 2, and Part 3.


To be clear, this isn’t FCVs versus Tesla. This is FCVs versus every other car, including electric vehicles (EVs). The series has the “Tesla Trumps Toyota” headline since it was inspired by an article titled “Toyota Bets Against Tesla With New Hydrogen Car,” which also discussed their broken partnership.


I believe we now know enough about EVs, the global car market, the world’s growing response to the threat of unrestricted carbon pollution, and the steady advances being made in battery technology to know with confidence that EVs will be a major player in the global car market in the coming years. That is true even if Tesla doesn’t succeed. And it has been clear for a decade, at least, that if electric cars can succeed in the marketplace that leaves very little room for hydrogen cars, as the links above make clear.


As an aside, FCV advocates have often responded by saying, well, EVs may be better for cars, but FCVs are better for bigger vehicles like minivans, SUVs and light trucks. That is essentially the point both Honda and Toyota made in their response to several serious questions about hydrogen cars posed last fall by “Green Car Reports.” The point is entirely moot until FCVs actually solve all seven problems they face, some of which get bigger for bigger vehicles. It also bears noting that Toyota’s first FCV is a sedan!


In any case, after looking into hydrogen cars for the umpteenth time in my career, I seriously doubt they hold any prospect for either marketplace success — or contributing to the climate solution — for decades (if ever). They simply have too many barriers to success as a mass-market alternative fuel vehicle car. Indeed, they have every barrier there is!


I was first briefed on advances in transportation fuel cells within days of my arrival at the Department of Energy in mid-1993. One of DOE’s national labs, Los Alamos, had recently figured out how to reduce the amount of platinum in the best fuel cells for vehicles, proton exchange membrane (PEM) fuel cells (as had researchers elsewhere). This did not make them affordable in cars — we are still a long way from that — but a time when they might be had become imaginable. I (and others at DOE) quickly began pushing for increases in the budget for both hydrogen research and fuel cell research. Then, in the mid-1990s, when I helped oversee the hydrogen and fuel cell and alternative vehicle programs at DOE’s Office of Energy Efficiency and Renewable Energy, I worked to keep the budgets up even as the Gingrich Congress tried to slash all of DOE’s clean tech funding.


With that funding — and partnerships with the big U.S. automakers — advances were made, slowly. But the FCV research did not pan out as expected — some key technologies proved impractical and others remained stubbornly expensive.


Even so, in 2003 President George W. Bush announced in the State of the Union that he was calling on the nation’s scientists and engineers to work on FCVs “so that the first car driven by a child born today could be powered by hydrogen and pollution free.” That set off another massive increase of spending by the federal government and investments by private companies in hydrogen and fuel cells.


I began researching what was to be a hydrogen primer. But as I read the literature, talked to the experts in and out of government, and did my own analysis, my views on both the green-ness of hydrogen cars and their practicality changed. It became increasingly clear that hydrogen cars were a very difficult proposition. My book, “The Hype About Hydrogen: Fact and Fiction in the Race to Save the Climate,” came out in 2004, just as the National Academy of Sciences came out with a study that was also sobering (as did the American Physical Society).


My conclusion in 2004 was that “hydrogen vehicles are unlikely to achieve even a 5% market penetration by 2030.” And that in turn meant hydrogen fuel cell cars were not going to be a major contributor to addressing climate change for a very long time.


What has changed since then? Less than Toyota and other FCV advocates would have you believe. A 2013 study by independent research and advisory firm Lux Research concluded even more pessimistically that despite billions in research and development spent in the past decade, “The dream of a hydrogen economy envisioned for decades by politicians, economists, and environmentalists is no nearer, with hydrogen fuel cells turning a modest $3 billion market of about 5.9 GW in 2030.” The lead author explains, “High capital costs and the low costs of incumbents provide a nearly insurmountable barrier to adoption, except in niche applications.”


To understand why this is true, you need to understand why, until very recently, alternative fuel vehicles (AFVs) of all kinds haven’t had much success. A significant literature emerged to explain that lack of success by AFVs — as I discussed in my book and a 2005 journal article, “The car and fuel of the future.”


There have historically been seven major (interrelated) barriers to AFV success in the U.S. market:


1. High first cost for vehicle: Can the AFV be built at an affordable price for consumers? Can that affordable AFV be built profitably?


2. On-board fuel storage issues (i.e. limited range): Can enough alternative fuel be stored onboard to give the car the kind of range consumers expect — without compromising passenger or cargo space? Can the AFV be refueled fast enough to satisfy consumer expectations?


3. Safety and liability concerns: Is the alternative fuel safe, something typical users can easily handle without special training?


4. High fueling cost (compared to gasoline): Is the alternative fuel’s cost (per mile) similar to (or cheaper than) gasoline? If not, how much more expensive is it to use?


5. Limited fuel stations (the chicken and egg problem): On the one hand, who will build and buy the AFVs in large quantity if a broad fueling infrastructure is not in place to service them? On the other, who will build that fueling infrastructure — taking the risk of a massive stranded investment — before a large quantity of AFVs are built and bought, that is, before these particular AFVs have been proven to be winners in the marketplace?


6. Improvements in the competition: If the AFV still needs years of improvement to be a viable car, are the competitors — including fuel-efficient gasoline cars — likely to improve as much or more during this time? In short, is it likely competitors will still be superior vehicles in 2020 or 2030?


7. Problems delivering cost-effective emissions reductions: Is the low-emission or emission-free version of the alternative fuel affordable? Are fueling stations for that version of the fuel affordable and practical?


Every AFV introduced in the past three decades has suffered from at least three of those problems. Besides the tough competition (like the Prius), EVs have suffered most from #1 (high first cost) and #2 (limited range and slow speed of recharging). But major progress is being made in both areas.


FCVs suffer from all of them — and still do! It is very safe to say that FCVs are the most difficult and expensive kind of alternative fuel vehicle imaginable. While R&D into FCVs remains worthwhile, massive investment for near-term deployment makes no sense until multiple R&D breakthroughs have occurred. They are literally the last alternative fuel vehicle you would make such investments in — and only after all the others failed.


As an aside, if you think FCVs have solved #2, the onboard storage issue, they have not — even though this is considered their big advantage over electric vehicles. In fact, they are probably a breakthrough away from doing so, as Ford Motor Company has acknowledged. Infrastructure (#5) remains the most intractable barrier for FCVs. It is far less of a problem for EVs (as I noted here).


For governments and climate hawks, problem #7 may be the most important. As of today, it remains entirely possible that hydrogen fuel cell cars will never solve the problem of delivering cost-effective emissions reductions in the transportation sector — a problem EVs do not have. I discussed that in Part 1 and Part 2.


But the United States, Japan, and other countries — and many automakers — continue to misallocate funds toward near-term deployment of deeply flawed hydrogen fuel cell vehicles. Because of that, and because after 25 years of dawdling on climate action we lack the time to keep making such multi-billion dollar mistakes, I will discuss the 7 barriers FCVs still face today in more detail in subsequent posts. I will also discuss how EVs have been tearing down the few remaining barriers to their marketplace success.


NOTE: Nothing I write here should be taken as a recommendation for or against investing in Tesla (or Toyota or any company, for that matter).


Published on April 08, 2015 13:01
