Helen H. Moore's Blog, page 111

April 9, 2018

Tropical rainforests may be near a tipping point beyond our control


(Credit: AP/Achmad Ibrahim)




The planet is losing an estimated 80,000 acres of tropical rainforest every day. And while that amount of deforestation — more than twice the area of Costa Rica each year — seems like it could result only from a colossal bulldozer plowing across an entire continent of forestland, the reality on the ground is anything but uniform.


In the Amazon, Congo, and Indonesia, the three regions that are home to nearly all of the world’s tropical rainforests, the human motivations behind and methods of deforestation are entirely distinct. In South America the most significant driver of forest loss is the need to clear land for industrial-scale agriculture and ranching, so huge swaths of forest are burned into oblivion by human-set wildfires. In Southeast Asia, on the other hand, the high price of timber in the global market makes clear-cutting a lucrative venture. In Africa, deforestation lacks this industrial scale, but is more haphazard as small farmers clear land in piecemeal efforts to plant subsistence crops.


The net result is that the rainforests of today’s post-industrial world are more like millions of tiny, isolated patches of forest than the massive stretches of jungle that blanketed the tropics for millennia. The ramifications echo far beyond sentimental conservation — these forest fragments collectively emit 31 percent more greenhouse gases to the atmosphere than intact rainforests, even after accounting for emissions from deforestation. In addition, numerous plants and animals that call the tropical rainforests home, and that inspire pharmaceuticals for human medicine, have struggled to adapt to patchwork forests, and so face extinction.


These forest patches in the Amazon, Congo, and Indonesia were created in different ways, so it seems inevitable that the complex forest mosaics left behind by deforestation are unique across the regions. Yet a new study, published in Nature last month, found that despite the differences in where, why, and how rainforests are destroyed, from a mathematical perspective the resulting mosaics look essentially the same. And, ominously, the math suggests that the world’s rainforests are approaching a point beyond which their collapse would become self-sustaining.


The key to this warning is a mathematical concept called percolation theory. Percolation theory may not sound familiar — it lacks the fame of the Big Bang theory or evolution — but like gravity, it seems to apply ubiquitously in the natural world. At its simplest, the theory seeks to describe how structures that repeat themselves in nature, like ice crystals in near-freezing water, cliques in a social network, or clouds across the sky, behave when subjected to an outside change. The equations derived from percolation theory seem to fit nearly everywhere scientists look: they’ve been applied to phenomena as disparate as earthquakes, epidemics, and forest fires.


But there’s a common thread: all of these phenomena have a so-called critical point, a point beyond which behavior transforms from moving slowly and steadily to moving all at once. Take disease, for instance. When only a few people are sick, the infection can spread — but only slowly, a few people at a time. However, percolation theory states that once the number of people infected surpasses the critical point, the disease is almost certain to rapidly infect the rest of the population. In the case of deforestation, the critical point represents the point beyond which forests will shift from human-caused destruction to self-collapse, as forest fragments shrink and die due to the inherent instability of small forest fragments.
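To make the critical-point idea concrete, here is a toy site-percolation simulation in Python. This is a hypothetical sketch for illustration only — not the study’s actual model, and the function name is my own: occupy each cell of a grid with probability p, then measure how much of the occupied area belongs to the single largest connected cluster. Below the square lattice’s critical point (about 0.593), clusters stay tiny; above it, one cluster suddenly dominates.

```python
import random
from collections import deque

def largest_cluster_fraction(n, p, seed=0):
    """Occupy each site of an n x n grid with probability p, then return
    the size of the largest connected cluster as a fraction of all
    occupied sites (4-neighbor connectivity)."""
    rng = random.Random(seed)  # seeded for reproducibility
    grid = [[rng.random() < p for _ in range(n)] for _ in range(n)]
    seen = [[False] * n for _ in range(n)]
    occupied = sum(sum(row) for row in grid)
    if occupied == 0:
        return 0.0
    best = 0
    for i in range(n):
        for j in range(n):
            if grid[i][j] and not seen[i][j]:
                # breadth-first search to measure this cluster
                size, queue = 0, deque([(i, j)])
                seen[i][j] = True
                while queue:
                    x, y = queue.popleft()
                    size += 1
                    for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        nx, ny = x + dx, y + dy
                        if 0 <= nx < n and 0 <= ny < n and grid[nx][ny] and not seen[nx][ny]:
                            seen[nx][ny] = True
                            queue.append((nx, ny))
                best = max(best, size)
    return best / occupied

# Sweep p across the critical point (~0.593) and watch the jump.
for p in (0.3, 0.5, 0.75):
    print(p, round(largest_cluster_fraction(100, p), 3))
```

Running the sweep shows the sudden transition the theory predicts: the largest-cluster fraction is negligible at p = 0.3 and overwhelming at p = 0.75.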


Now, it appears that the same equations fit the fragmented remains of the planet’s great rainforests. And that we’re getting close to the critical point.


This unexpected finding came about when researchers, led by mathematician Franziska Taubert, began combing through satellite images of the world’s entire remaining stock of tropical rainforest. But rather than simply count up the total area of intact forest that remains across millions of individual fragments, they counted the fragments themselves: how many there were and how much area each covered. While they expected to come away with different equations describing these rainforest fragments for each continent, they found instead that, across the world, the proportion of tiny fragments to large fragments was the same. Even more interesting, that proportion was exactly the ratio predicted by percolation theory.


They followed the thread, developing a computer simulation of deforestation based on the equations derived from percolation theory and on current rates of rainforest loss. The simulation results exactly matched the patterns Taubert and her team observed in the satellite data. The advantage of the simulation, though, was that it could go one step further — it could identify the level of deforestation that would push the world’s rainforests past the critical point. The result: at current rates of deforestation, tropical rainforests are only several years’ worth of deforestation away from smashing through the critical point.


To get a sense of what happens once deforestation surpasses the critical point, the research team played out their simulations in the Amazon basin. Just as in the case of an epidemic, the simulations predicted that the scale of deforestation would increase exponentially once the forest began to collapse on itself. By 2050, without an end to human deforestation, the overall area of the Amazon rainforest was predicted to drop by half. Even in a scenario in which reforestation was undertaken, but without concerted efforts to end deforestation, the rainforest fared little better.


That’s a bleak forecast considering how essential massive rainforests are to the planet as we know it. A 2015 study suggested that losing the world’s tropical rainforests could not only cause nearly a degree of atmospheric warming, but could also dramatically alter rainfall patterns around the world. Without rainforests to serve as an atmospheric sponge around the equator, places as distant from the tropics as the United States’ corn belt could see drought become commonplace. Closer to where the forests once stood, fires will run rampant as the land dries up. Tropical soils, without trees to turn over nutrients, will turn toxic to agriculture.


However, Taubert’s team identified one scenario that offered a glimmer of hope. When the simulations assume that deforestation is phased out over the next decade and replaced with reforestation programs that increase each year, the Amazon would lose only one-quarter of its area over the next 30 years.


And while that still represents a staggering amount of lost rainforest, this outcome nonetheless demonstrates the immense power of conservation efforts to direct the future of the world’s rainforests. In that vein, this research and percolation theory itself are warnings, rather than death knells for tropical rainforests. Without action, both grassroots and government-driven, the time will come when rainforest fragments begin to collapse on themselves and it is too late to undo the damage. But that time has not yet passed.



Published on April 09, 2018 00:59

Here’s why you’ll never see “mom” listed in my bio


(Credit: Getty)


Thought Catalog


Now that I’m pregnant, I’ve been thinking a lot about what it will be like to be called “mom” — not just by my kid, but by everyone else.


I’m excited to become a mother! My pregnancy could not have been more planned, and raising a child is a challenge I’m psyched to take on. Just not at the expense of my sense of self.


The truth is, I don’t want motherhood to eclipse the rest of my identity — all the pieces of myself I’ve spent the last 35 years building — and I don’t intend to let it. To avoid the fate of typecasting, one simple measure I plan to take is avoiding mention of motherhood in any bio I draft.


Why? Because I fear our cultural tendency to reduce women to the role of mother. I see this in the “mommy wars,” which treat personal parenting choices as the seeds of moral dilemmas and cause for tedious debates. I see this in the way strangers feel entirely comfortable addressing a woman accompanied by a child as “mom” without knowing a thing about her. I see this in the way Instagram commenters admonished Chrissy Teigen for going on a date night with husband John Legend “too soon” after the birth of their daughter. In the way Irina Shayk was chastised for posting a bikini shot a month after childbirth rather than a photo of her baby. In the way Rachel Finch was lambasted for admitting that she leaves her kid with her parents on weekends so she and her husband can enjoy some kid-free quality time.


What the fuck is wrong with us? Why do we feel so comfortable casting judgment upon mothers? I want no part in any of that!


Some people will read this and automatically accuse me of making a mistake. If you’re not ready to put everything else aside, you shouldn’t have a baby! I can hear the naysayers chant. Parenthood demands constant sacrifice! This selfish bitch is going to fuck up her kid if she doesn’t see the light!


On one count, my detractors would be right: I am selfish.


But I don’t think that’s such a bad thing. In fact, I’m pretty sure selfishness is central to the human condition. We spend most of our time imprisoned by our own minds and our individual sets of experiences — thinking thoughts, entertaining fantasies, and nurturing concerns that can never be shared, if only because there’s not enough time to express our every whim. We’re biologically programmed to look out for our own well-being. To do the best we can to survive as the self-piloted ships we are, navigating this big, wide, weird world. Of course, we’re also programmed to look out for our progeny, but to do so properly don’t you have to look out for yourself? Perhaps a reasonable degree of selfishness positions you to be an even better parent.


Don’t get me wrong: I am delighted by the prospect of bringing a new life into the world. I am thrilled to experience the special brand of love that blossoms between mother and child, and I expect to make endless compromises as I adjust to the life-changing milestone that is parenthood.


But I refuse to become entirely selfless as I embark on this whole motherhood journey. And I don’t want to be thought of as a mom foremost in anyone’s mind, including my own. Instead, I’d like to be characterized by the many things I’ve worked towards, plus motherhood.


So you will never see “mom” listed in my bio.


Sure, being a mom will soon become one of my defining traits, and I don’t plan to hide it. I will continue to celebrate my pregnancy and motherhood as I see fit, with the occasional related article or social media post. But I’m uninterested in being seen as a mom above all else. By self-identifying as a mother within the few sentences one gets to draft a brief bio, I worry that I would invite others to think of me primarily in that context.


Arguably, motherhood is a life-changing experience worthy of biographical annotation—far more so than graduating from a particular college, or establishing oneself in a specific industry. I can see why so many mothers mention their parental status in their bios. The funny thing is, I rarely see men do this.


Is it a coincidence that “daddy shaming” isn’t really a thing?



Published on April 09, 2018 00:58

April 8, 2018

American broadcasting has always been closely intertwined with American politics


Sinclair Broadcast Group, Inc.'s headquarters (Credit: AP/Steve Ruark)


Local television viewers around the United States were recently alerted to a “troubling trend” that’s “extremely dangerous to democracy.”


Sinclair Broadcast Group, one of America’s dominant television station owners, commanded its anchors to deliver a scripted commentary, warning audiences about “one sided news stories plaguing our country” and media outlets that publish “fake stories … that just aren’t true.”


This might sound like a media literacy lesson, offered in the public interest. But the invocation of “biased and false news” so closely echoes charges from the Trump administration that many observers cried foul.


Sinclair’s record of broadcasting news content favorable to the Trump administration, including mandated program segments such as the “Terrorism Alert Desk,” and “Bottom Line with Boris,” with former Trump administration official Boris Epshteyn, provides additional evidence of partisan bias.


So, is it time, as some commentators are suggesting, to restore the Fairness Doctrine, which used to require broadcasters “to present controversial issues of public importance and to do so in a manner that was fair and balanced”? That policy, adopted by the Federal Communications Commission in 1949, was repealed in 1987. It supposedly sustained responsible political debate on the nation’s airwaves until its disappearance during the Reagan administration.


I would argue that nostalgic calls for the restoration of a golden age of civil political discussion on America’s airwaves mistake what actually happened in those decades.


Airtime for Nazis, socialists, communists


Politics and broadcasting have been consistently intertwined in American history. As I have found in my own research, the commercial broadcasting community (including advertisers) has consistently aligned news content and commentary in ways favorable to the White House.


But such episodes are often conveniently forgotten.


As Mitchell Stephens’ new biography of journalist Lowell Thomas recounts, and as numerous earlier scholars detailed, U.S. broadcast journalism originated more as subjective and biased commentary than as reportage.


The vast majority of 1930s radio “news” was politically slanted analysis by veteran journalists like Thomas, H.V. Kaltenborn and Boake Carter. Kaltenborn, for example, was notable for his anti-union commentaries.


The uncertain nature of early broadcast regulation, combined with pressure from organized interest groups and politicians, made the exact parameters of political speech on American radio ambiguous in the 1930s.

So the networks lent their microphones to a wide range of views, from quasi-fascists like Father Charles Coughlin (the “Radio Priest”) to homespun socialists like Huey Long and union leaders like the American Federation of Labor’s William Green. As Douglas Craig, David Goodman and numerous other scholars have pointed out, political broadcasting in the 1930s was vibrant, fertile and diverse to an extent unmatched to this day.


For example: In 1936, both CBS and NBC aired Nazi propaganda from the Berlin Olympic Games. They also broadcast live from the Communist Party of the United States of America nominating convention. Programs like “University of Chicago Roundtable,” and “America’s Town Meeting of the Air” aired provocative political discussion that engaged and educated American audiences by exposing them to diverse viewpoints.


Airwaves rein themselves in


But as war neared, U.S. political broadcasting narrowed its range.


The Roosevelt administration began to carefully police the airwaves. CBS’ highly rated news commentator, Boake Carter, had often criticized President Roosevelt’s policies. But when he applauded the Anschluss, Germany’s annexation of Austria in 1938, and expressed admiration for Nazi policies, the White House acted.


As media historian David Culbert revealed, Roosevelt’s adviser Stephen T. Early secretly contacted CBS and Carter’s sponsor, General Foods, to silence Carter. Despite high ratings and a popular following, Carter’s CBS contract was not renewed. Within weeks he was gone.


Broadcasting’s self-censorship under government pressure expanded at the start of World War II. Circumscribing critical analysis and channeling commentary to the political center pleased advertisers and politicians.


With the assistance of such broadcasting pioneers as Edward R. Murrow, subjective radio news commentary morphed into the type of observational reporting now identified as broadcast journalism.


The most famous example of this shift occurred in 1943. That year Cecil Brown, CBS’s top-rated news analyst and author of the best-selling “Suez to Singapore,” dared to criticize the war effort he witnessed on the American homefront. Brown was fired, and his dismissal proved a warning to every other broadcast commentator.


Not everyone was happy with the neutering of news and opinion on American airwaves. In response to the Brown firing, FCC Chair James Lawrence Fly criticized what he considered corporate censorship.


“It’s a little strange,” Fly told the press, “to reach the conclusion that all Americans are to enjoy free speech except radio commentators.”


But removing partisan politics from broadcast journalism increased advertising revenue and proved remarkably lucrative for U.S. broadcasters during World War II.


With the lesson learned, and with the support of the advertising community, America’s broadcasters aimed to address only the “vital center” of American politics in the postwar years.


Still, politics persisted


It would, however, be a mistake to believe that the Fairness Doctrine silenced fractious political discourse on the American airwaves.


Throughout the decades that the Fairness Doctrine remained official policy, controversial political broadcasts aired regularly on American television and radio. There was Joe Pyne, whose show at its zenith in the 1960s attracted a reported 10 million viewers. Pyne insulted the hippies, Klansmen and civil rights activists he invited to his studio. Though the show is recalled today more for its outrageousness, it was a political show and Pyne propagated a conservative, law-and-order, patriotic message.


Then there’s Bob Grant, who broadcast a popular radio show in New York City throughout the 1970s. Grant’s “arch disdain for liberals, prominent black people, welfare recipients, feminists, gay people, and anyone who disagreed with him,” wrote The New York Times, “was familiar to his listeners.”


Nationally syndicated programs like “Donohue” offered liberal perspectives, and even the “CBS Evening News” brought back commentary, with veteran journalist Eric Sevareid providing perspective on the daily news each weeknight.


I’m not equating the well-reasoned, often brilliant political commentary offered by Eric Sevareid to Sinclair Broadcast Group’s transparent political advocacy. Sevareid reached a much larger percentage of the American populace than all the Sinclair newscasts combined, and he was therefore far more influential.


But to express surprise that Sinclair now shapes news content and commentary to be more hospitable to political advertising, and more supportive of the current administration, ignores the fact that political commentary has always sold well in the American commercial system.


I believe Sinclair’s management has identified an underutilized segment of the local TV news advertising market — the pro-Trump segment — as the 2018 midterm elections approach. The broadcaster is now shaping its news products to more effectively appeal to the audience for the political advertisements it seeks to sell this fall.


This economic interest closely aligns with Sinclair’s current political and regulatory imperatives. It makes the propagating of biased news content even more effective from Sinclair’s perspective.


Sinclair clearly hopes that the political consultants who purchase campaign ads, and the federal regulators who must approve its planned purchase of Tribune Broadcasting’s 42 stations, will appreciate its recent media literacy efforts.


Michael J. Socolow, Associate professor, communication and journalism, University of Maine



Published on April 08, 2018 20:00

Tariffs aren’t the best way to protect U.S. steelworkers. Global solidarity is


(Credit: AP Photo/Julio Cortez)


The enthusiasm with which the AFL-CIO and United Steelworkers (USW) greeted Trump’s announcement of a global tariff on steel and aluminum imports raises significant questions about the U.S. labor movement’s commitment to international solidarity.


The USW has a strong record of internationalism. Not only does the USW represent workers in Canada, like many U.S. unions, but it has long supported Los Mineros — one of only a small handful of militant, independent trade unions in Mexico — and has discussed the possibility of a merger.


The USW was also the first U.S.-based trade union to make the jump to Europe, in 2008, forming a transatlantic organization with UNITE, the largest trade union in Britain and Ireland. And through international campaigns in collaboration with global union federations (GUFs) like IndustriALL, which bring together trade unions from around the world, the USW has built strong relationships from Germany to Brazil.


Trump’s tariffs initially targeted all of these countries — and yet the USW and AFL-CIO embraced the plan (though the USW did call to omit Canada). Their global allies were not pleased. The Canadian union Unifor issued a strongly worded statement arguing that the AFL-CIO’s position sold out the Canadian members of its affiliate unions. UNITE and Germany’s IG Metall issued anti-tariff statements as well. Brazil’s major trade union federations mounted a significant show of opposition with a joint statement and street protests.


From the perspective of workers in the Global South, Trump’s steel tariffs reflected the actions of a powerful, wealthy country seeking to maintain its wealth and power at the expense of poor countries. Blue-collar workers in the Global North, meanwhile, who often enjoy wages and benefits far superior to those in the United States, believe the United States is already undercutting their markets with its anti-union environment.


Eventually, Trump exempted Argentina, Australia, Brazil, Canada, Mexico, South Korea and the member countries of the European Union. But the USW’s initial reaction may have undermined decades of global coalition-building work that is far more essential than tariffs in the long term.


The USW provides vital support to trade unionists working under incredibly adverse conditions, directly supporting allies in Mexico, Liberia and Colombia, and participating in GUF campaigns to pressure multinationals to sign global agreements on labor standards.


But the USW’s international work also benefits its U.S. members. During the Ravenswood aluminum plant dispute in the 1990s and subsequent disputes with Bridgestone/Firestone and Gerdau Ameristeel, trade unionists from Europe to Latin America to Asia pressured these multinational corporations to help win strong agreements for the USW. In recent years, the USW has effectively used global agreements negotiated by IndustriALL to help resolve domestic contract disputes.


Protectionist policies undermine this tradition of solidarity by falsely pushing a narrative that “fair competition” will raise labor standards. In fact, markets have never been truly open. The United States has always practiced selective protectionism — for example, of the corn industry, which led to the widespread immiseration of Mexican corn farmers after NAFTA — not to mention myriad other forms of control exerted over countries in the Global South, from withholding development aid to loan conditionalities.


Tariffs will do nothing to improve labor rights or working conditions for workers in China, and may perversely result in a greater squeeze on labor as exporters look to cut costs.


And if foreign workers were to be laid off or squeezed as a result of the tariffs, how likely will they be to stand in solidarity with us in the future?


Trade unions should think carefully about opportunistically reverting to nationalism when political openings arise. As the United Electrical union put it, “American workers need … a trade and industrial policy that is based on international cooperation, respect for workers’ rights and environmental sustainability — one that raises living standards for workers across industries and across borders through investment in infrastructure, jobs and social programs.”



Published on April 08, 2018 19:30

Here’s why I’m 100 percent comfortable being a working mom


(Credit: Getty/g-stockstudio)


Thought Catalog


When my baby’s nanny told me that I’m the only new mom she’s EVER known who didn’t cry on day one back at the office following maternity leave, part of me felt a tinge of pride. But a MUCH bigger part of me secretly wondered if I loved my daughter enough.


Walking to the subway on day two as a working mom, I pictured a black heart emoji pinned to my person, lingering above my head at all times as I went about the business of achieving some semblance of work-life balance.


Certainly, something must be wrong with me if I was able to skip the crying portion of returning to work after giving birth. Why the fuck didn’t I break down as I bid my little love bug good-bye? Why wasn’t I moved to tears by her clueless coos in response to my explanation that mama would be back in about 10 hours? Why didn’t it disturb me that this would be the longest stretch we’d EVER spent apart? That I would have to pump in place of feeding her from my breast for the next several hours? That I wouldn’t know how many times she’d pooped until the nanny told me later on? That I wouldn’t know if she’d finally figured out how to suck her thumb unless I received a text telling me as much?


The truth is, three months into motherhood, I was already eager to reclaim a slice of my former life.


In fact, my decision to return to work involved less than zero internal torment. While the time I’d spent nurturing my daughter around the clock during her first few weeks of life was filled with countless treasured memories, if anything, maternity leave confirmed that being a stay-at-home mom was not the right path for me.


As my official start date approached, I grew more and more excited about the prospect of an 8 to 10 hour stretch five days a week to do the work that fulfills me. Also exciting? The idea of engaging regularly with other potty-trained humans fluent in English, peeing without cradling a baby simultaneously, and feeding myself whenever the hell I pleased! I knew that my daughter was in good hands with the nanny I’d hired after interviewing a slew of candidates. And I knew that working — and maintaining a sliver of my pre-baby identity — was the best possible choice for my mental wellbeing.


Of course I miss my little girl at certain points throughout the workday. I long to hold her and to stare at her smiley, toothless face at least hourly. But I definitely haven’t experienced anything close to emotional trauma while away from her, and I haven’t shed a single tear.


And guess what? That’s okay!


I am not a black heart emoji simply because I happen to relish my time away from home. Without a doubt, I love my job AND my baby. I’m the fucking pink heart with the gold sparkles dancing around it, even if I have to keep reminding myself that there’s no “right” way to be a mom.



Published on April 08, 2018 19:30

Don’t fear germs — at least not too much


(Credit: Associated Press)


Throughout my childhood, my role models in life warned me about bacteria and germs. “Wash your hands so you don’t get any germs” and “Don’t touch that — it’s covered in bacteria” were some of the phrases I took to heart. I was on my way to being a full-blown germaphobe. And it’s not surprising; a we-must-kill-all-bacteria attitude pervades our antibacterial-filled society.


But times are changing. Our long-held fear of microbes now collides with an opposing force: the enthusiasm for the microbiome, which has shown that microbes can actually improve our health. There is a much more complicated interdependent relationship between people and our resident microbes than rampant hand-washing regimes would suggest.


My time in graduate school coincided with the upswing of microbiome research. There, my relationship with bacteria changed as I began to study an organism more nuanced than we once thought.


This organism happened to be the bacterium Helicobacter pylori. Nearly half of the world’s population carries H. pylori in their stomachs. Though most famous for its role in causing stomach ulcers, stomach cancer, and gastritis, H. pylori causes these diseases in only a small fraction of the individuals that live with it. Only 10 to 20 percent of infected individuals develop stomach ulcers, and only 1 to 2 percent develop gastric cancer or B cell MALT lymphoma, a type of cancer that affects immune cells called lymphocytes. You and I may carry H. pylori in our stomachs and not even know it.


The longer I worked with this organism, the more I unraveled about its long evolutionary history. Most diseases caused by Helicobacter species are host specific, meaning that a given species of Helicobacter will cause disease only in one host species. This points to a close co-evolution of the bacterium with its host. H. pylori is estimated to have co-evolved with humans for approximately the past 100,000 years. Two years ago, H. pylori was sequenced from Oetzi the Iceman, a 5,300-year-old mummy found in the Alps. Because of their high prevalence in humans, these microbes that follow us around can even be used to track human migration over thousands of years.


Since we have shared our stomachs with H. pylori over so many generations, could we consider H. pylori part of our microbiome? One of the biggest proponents of this argument is Martin Blaser, a microbiology professor at New York University School of Medicine. As a prominent researcher in the H. pylori and Campylobacter jejuni (a close relative of H. pylori) field, Blaser recounts his shifting view on H. pylori in his book Missing Microbes.


“By the mid-1990s, I began to change my mind,” he wrote. H. pylori may seem like a bacterial villain provoking ulcers and stomach cancer in a small percentage of the population. But, he writes, “evidence was beginning to suggest that H. pylori is a member of our normal gut flora and plays a critical role in our health.”


Many researchers have observed a decline in H. pylori colonization in some areas of the globe, likely due to increased antibiotic use and improved sanitation. But the disappearance of H. pylori also corresponds to an increase in asthma and gastroesophageal reflux disease (GERD). While the exact mechanisms that explain these correlations are unknown, persistent H. pylori has far-reaching effects on our immune and endocrine systems that could alter our predisposition toward these diseases.


H. pylori is so much more than its reputation as a cause of disease. Its relationship with humans is more complicated than we once thought and is a perfect example of how microbes are neither purely “good” nor “bad.” As Ed Yong writes in I Contain Multitudes, “these terms belong in children’s stories. They are ill-suited for describing the messy, fractious, contextual relationships of the natural world.”


H. pylori is far from the only microbe that walks the line between these two labels. Can’t we just be OK with knowing that sometimes we get along and sometimes we don’t?



Published on April 08, 2018 18:00

Why prime numbers still fascinate mathematicians, 2,300 years later

magnet_numbers

(Credit: dubassy via iStock)


On March 20, American-Canadian mathematician Robert Langlands received the Abel Prize, celebrating lifetime achievement in mathematics. Langlands’ research demonstrated how concepts from geometry, algebra and analysis could be brought together by a common link to prime numbers.


When the King of Norway presents the award to Langlands in May, he will honor the latest in a 2,300-year effort to understand prime numbers, arguably the biggest and oldest data set in mathematics.


As a mathematician devoted to this “Langlands program,” I’m fascinated by the history of prime numbers and how recent advances tease out their secrets. Why have they captivated mathematicians for millennia?


How to find primes


To study primes, mathematicians strain whole numbers through one virtual mesh after another until only primes remain. This sieving process produced tables of millions of primes in the 1800s. It allows today’s computers to find billions of primes in less than a second. But the core idea of the sieve has not changed in over 2,000 years.


“A prime number is that which is measured by the unit alone,” mathematician Euclid wrote in 300 B.C. This means that prime numbers can’t be evenly divided by any smaller number except 1. By convention, mathematicians don’t count 1 itself as a prime number.


Euclid proved the infinitude of primes — they go on forever — but history suggests it was Eratosthenes who gave us the sieve to quickly list the primes.


Here’s the idea of the sieve. First, filter out multiples of 2, then 3, then 5, then 7 — the first four primes. If you do this with all numbers from 2 to 100, only prime numbers will remain.


With eight filtering steps, one can isolate the primes up to 400. With 168 filtering steps, one can isolate the primes up to 1 million. That’s the power of the sieve of Eratosthenes.
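The filtering procedure translates directly into code. Here is a minimal Python sketch of the sieve (an illustration of the idea, not an optimized implementation):

```python
def sieve(limit):
    """Sieve of Eratosthenes: return all primes up to and including limit."""
    is_prime = [True] * (limit + 1)
    is_prime[0] = is_prime[1] = False  # 0 and 1 are not prime by convention
    for p in range(2, int(limit ** 0.5) + 1):
        if is_prime[p]:
            # Cross out every multiple of p, starting at p*p;
            # smaller multiples were already removed by smaller primes.
            for multiple in range(p * p, limit + 1, p):
                is_prime[multiple] = False
    return [n for n, flag in enumerate(is_prime) if flag]

print(sieve(100))            # the 25 primes up to 100
print(len(sieve(1_000_000))) # 78,498 primes below one million
```

Note that only primes up to the square root of the limit are needed as filters, which is why the 168 primes below 1,000 suffice to isolate every prime up to 1 million.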


Tables and tables


An early figure in tabulating primes is John Pell, an English mathematician who dedicated himself to creating tables of useful numbers. He was motivated to solve ancient arithmetic problems of Diophantos, but also by a personal quest to organize mathematical truths. Thanks to his efforts, the primes up to 100,000 were widely circulated by the early 1700s. By 1800, independent projects had tabulated the primes up to 1 million.


To automate the tedious sieving steps, a German mathematician named Carl Friedrich Hindenburg used adjustable sliders to stamp out multiples across a whole page of a table at once. Another low-tech but effective approach used stencils to locate the multiples. By the mid-1800s, mathematician Jakob Kulik had embarked on an ambitious project to find all the primes up to 100 million.


This “big data” of the 1800s might have served only as a reference table, if Carl Friedrich Gauss hadn’t decided to analyze the primes for their own sake. Armed with a list of primes up to 3 million, Gauss began counting them, one “chiliad,” or group of 1,000, at a time. He counted the primes up to 1,000, then the primes between 1,000 and 2,000, then between 2,000 and 3,000 and so on.


Gauss discovered that, as he counted higher, the primes gradually become less frequent according to an “inverse-log” law. Gauss’s law doesn’t show exactly how many primes there are, but it gives a pretty good estimate. For example, his law predicts 72 primes between 1,000,000 and 1,001,000. The correct count is 75 primes, about a 4 percent error.
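The article’s example is easy to check against an actual count. A quick Python sketch, using a sieve for the counting (Gauss, of course, worked from hand-built tables):

```python
import math

def prime_flags(limit):
    """Boolean sieve: flags[n] is True exactly when n is prime."""
    flags = [True] * (limit + 1)
    flags[0] = flags[1] = False
    for p in range(2, int(limit ** 0.5) + 1):
        if flags[p]:
            for m in range(p * p, limit + 1, p):
                flags[m] = False
    return flags

# Gauss's inverse-log law: near x, roughly 1 in ln(x) integers is prime,
# so a block of 1,000 numbers near x should hold about 1000 / ln(x) primes.
x = 1_000_000
estimate = 1000 / math.log(x)          # about 72.4
flags = prime_flags(x + 1000)
actual = sum(flags[x + 1 : x + 1001])  # primes between 1,000,000 and 1,001,000
print(f"estimated {estimate:.1f} primes, counted {actual}")
```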


A century after Gauss’ first explorations, his law was proved in the “prime number theorem.” The percent error approaches zero at bigger and bigger ranges of primes. The Riemann hypothesis, a million-dollar prize problem today, also describes how accurate Gauss’ estimate really is.


The prime number theorem and Riemann hypothesis get the attention and the money, but both followed up on earlier, less glamorous data analysis.


Modern prime mysteries


Today, our data sets come from computer programs rather than hand-cut stencils, but mathematicians are still finding new patterns in primes.


Except for 2 and 5, all prime numbers end in the digit 1, 3, 7 or 9. In the 1800s, it was proven that these possible last digits are equally frequent. In other words, if you look at the primes up to a million, about 25 percent end in 1, 25 percent end in 3, 25 percent end in 7, and 25 percent end in 9.


A few years ago, Stanford number theorists Robert Lemke Oliver and Kannan Soundararajan were caught off guard by quirks in the final digits of primes. An experiment looked at the last digit of a prime, as well as the last digit of the very next prime. For example, the next prime after 23 is 29: One sees a 3 and then a 9 in their last digits. Does one see 3 then 9 more often than 3 then 7, among the last digits of primes?


Number theorists expected some variation, but what they found far exceeded expectations. Primes are separated by different gaps; for example, 23 is six numbers away from 29. But 3-then-9 primes like 23 and 29 are far more common than 7-then-3 primes, even though both come from a gap of six.
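Both patterns — the even split of last digits and the bias among consecutive primes — can be tallied in a few lines of Python. This is a rough sketch over the primes up to 1 million; the published analysis covers far larger ranges:

```python
from collections import Counter

def sieve(limit):
    """Return all primes up to and including limit."""
    flags = [True] * (limit + 1)
    flags[0] = flags[1] = False
    for p in range(2, int(limit ** 0.5) + 1):
        if flags[p]:
            for m in range(p * p, limit + 1, p):
                flags[m] = False
    return [n for n, f in enumerate(flags) if f]

primes = [p for p in sieve(1_000_000) if p > 5]  # drop 2 and 5
digits = Counter(p % 10 for p in primes)         # each digit: close to 25%
pairs = Counter((a % 10, b % 10) for a, b in zip(primes, primes[1:]))

total = sum(pairs.values())
repeats = sum(c for (d1, d2), c in pairs.items() if d1 == d2)
print({d: round(100 * c / len(primes), 1) for d, c in sorted(digits.items())})
print(f"consecutive primes repeating a last digit: {100 * repeats / total:.1f}%")
```

If successive primes behaved like independent coin flips over the four digits, a prime would repeat its predecessor’s last digit about 25 percent of the time; the tally comes in well below that, which is the bias Lemke Oliver and Soundararajan set out to explain.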


Mathematicians soon found a plausible explanation. But, when it comes to the study of successive primes, mathematicians are (mostly) limited to data analysis and persuasion. Proofs — mathematicians’ gold standard for explaining why things are true — seem decades away.


Martin H. Weissman, Associate Professor of Mathematics, University of California, Santa Cruz



Published on April 08, 2018 17:00

Growing up with racists: How should I remember them now?

Sunset

(Credit: Getty Images)


None of us players spoke up or left the field in protest of the metaphorical lynching. Our high school varsity team was preparing to play a team with a black and very talented halfback, and our varsity coach had designated one of his runners to play the role of the enemy. From the opposite end of the field, where I was practicing with the junior varsity, I heard him—his nickname was Boomer—over the theatrical snarling of the varsity. “Get the n***er! Get the n***er!”


Boomer was a churchgoer. He was the only one of my high school teachers to contact me after my father died during my freshman year of college; he drove to my home, sat with my family at the kitchen table, and shared gently his sympathy. You can find his actual name on the wall of fame at my old high school.


It might be ungrateful of me to recall my old coach as I have, but when you’ve loved as many bigots as I have, knowing how to remember them can seem as hard as that dusty and cracked football field where he deafened us boys. Maybe I should let bygones be bygones. Some claim that Faulkner was mistaken and the past really is past, racism in contemporary America little more than a rusty whip handle unearthed at the site of a Mississippi plantation. I’ve heard that the election of our first African American president was irrefutable evidence that racism in the United States has been reduced to a group of feeble old men peering watery-eyed through holes in soiled and tattered white sheets. I’ve heard from white people that fear of racism is as irrational as fear of ghosts. It is hoped they learned otherwise when white supremacists, young and old, men and women, many openly armed, marched and rioted in Charlottesville in August 2017. I hope so, but I doubt it.


On my way home from work on election day in 2008, I stopped for a beer. The Irish bartender glanced at my Obama shirt and told a joke to the guy on the stool next to me. “Did you hear Obama is ahead?”


“No. Is he?”


“Don’t worry — that will change when the white people get off work and vote.”


I asked the same guy, loudly enough for the bartender to hear, “Do you know if they serve seven-course Irish dinners here?”


“A what?”


“You know — a six-pack and potato.”


My wife is mostly Irish and I’m partly, but my retort by slur was bigotry all the same. I had added to the stupid hate sputtering like old grease on the grill in that establishment where a patron could scribble whatever he desired on a dollar bill before the bartender tacked it to the wall above the bottles of whiskey. Where George Washington gushed, “I like Boobies!” What’s more, since that election day, I’ve bought beer at that business where I heard the racist joke, and it wouldn’t be impolite of you to ask why. In my neck of the woods, that bar is one of the few with Guinness on tap, and I am a weak man, but the answer is also that some of my fellow Americans drink elbow to elbow there and — for me — climbing up on one of those stools can be like going home again.


The first racist joke I heard as a child was told by a neighbor boy who heard it from his father. In my backyard, the boy asked, “What did God say when he made the second n***er?” I still hear the birdy, quavering voice of my neighbor, who walked to church with me on Sunday mornings, as he finishes the joke by assuming the Word of the Lord. In the punch line, God does not remind us that He created all people in His image, let alone demand an end to laughter at hatred. There on the green grass of my childhood, He says, “Oops, burnt another one.”


Although I’ve allowed myself to forget, surely I laughed: I was already fluent in the tongues of bigotry, though I never used the slur “dago” in the presence of my best friend, who was Italian.


After he led us in prayer, thanking Our Father for supper, my own father made occasional ethnic slurs while telling stories about his day at the power plant or commenting on some news he’d heard on the radio while driving home. Usually the slurs were uttered as if he were reporting the weather, but he was not so casual when race riots erupted in nearby Buffalo. He feared that the violence would spread to Pendleton, home to merely a few black families.


We once ventured 20 miles from home into Buffalo’s inner city to cheer the Buffalo Bills, the blue-collar defending champs of the upstart American Football League. My father parked the car on the small, yellowed yard of a house on mostly boarded-up Jefferson Avenue, paid the owner a two-dollar fee, and marched us to the game among an influx of pale humanity watched—predatorily, I imagined—by blacks sitting on front steps and porches, whole families bemused at the sight of so many whites staring straight ahead with silly terror in their eyes as they hurried up the avenue of false promises. Ticket scalpers and hot dog vendors hawked at busy intersections, and when we reached crumbling War Memorial Stadium, or the Old Rockpile, as it was called in western New York, my father said, for the second time that afternoon, “We’ll be lucky if our car isn’t stripped when we get back.”


Somehow my father and the rest of us whites worrying toward the stadium had come to the backward conclusion that blacks had a history of harming whites. He and I had given little thought to what it felt like for the two blacks who attended my school or the few who labored at the power plant, but now we feared being in the minority. Inside the decaying but thick walls of the stadium, things would be made right again: the coaches and quarterbacks and security guards would be white like most of the fans. Even a boy could sense that football was the way America worked: a hierarchy of owner and directors and coaches and stars right on down to the wounded, grunting, and anonymous offensive linemen on whose wide shoulder pads every touchdown rested. And yet even a boy could sense that our nation had two working classes: one inside and one outside the gates.


Until my grandfather took a new job in the power plant he had helped build, all of the Phillips men were disposable iron workers. In three separate accidents, my great-grandfather and two of his sons died on construction jobs. My grandfather broke two ribs and bruised a lung in another. My maternal grandfather broke a leg on a road construction job; two other kin survived crushing injuries on logging jobs; another lost two and one-half of his fingers in a machine shop. Nearly every iron worker in the family had a damaged back before he reached retirement age, and they were among the lucky ones. When their bodies were broken or lifeless, industry purchased new bodies. Helplessly, my father knew this. On a sidewalk in the small city of Lockport, several miles from our home in Pendleton, we once passed a stranger in a grandiose suit and glittering watch and gleaming shoes. Dad spit on the concrete and muttered, “You son of a bitch.”


My father, his killed grandfather, and two killed uncles did put food on the table while they lived. They could have been limited to starvation wages or sent to the endless unemployment line; and weren’t they forever reminded? Aren’t we all, we who have jobs? On some level they must have sensed that the well-to-do in America had twisted the word “black” into a definition of those who are perceived as inferior—and that their own skin pigment was no guarantee they would always be perceived as white. When my great-grandfather emigrated, he carried with him a Northern Irish and Protestant heirloom of anti-Catholic bigotry. Three generations of Phillipses lived in an Irish neighborhood of South Buffalo, and on their way home from public school my father and uncles and their Protestant pals fought Catholic boys who were on their way home from parochial school. The Protestant Irish thought of the Catholic Irish as black. Both thought of Italians as black. The Protestant and Catholic Irish, together with the Italians, thought of African Americans as black as black could be. My grandfather referred to Catholics as “cat-lickers,” though he married one who agreed to give up her faith. Before I met the woman I would marry, who has kept her faith, I had a vague suspicion that Catholics had tails and horns, a fear she has mostly dispelled.


In his book “How the Irish Became White,” historian Noel Ignatiev could be referring to my kin when he notes of his depiction of oppressed eighteenth- and nineteenth-century Irish Americans, “I hope I have shown that they were as radical in spirit as anyone in their circumstances might be, but that their radical impulses were betrayed by their decision to sign aboard the hunt for the white whale,” which, he adds, “in the end did not fetch them much in our Nantucket market.”


During the hike to the Old Rockpile, Dad bought us lunch at a hamburger stand. On the sidewalk, he counted his change and realized that the black cashier had accidentally handed him a twenty-dollar bill rather than a five; he got back in line, corrected the mistake, and explained to me, “They would have taken it out of her pay.” It was a warm day in autumn, and as usual he was wearing a dark shirt that hid the coal dust, the blackness flushing from his pores as he perspired.


My mother never used the racial epithets that were second nature to other adults in my family and neighborhood. I like to think she was too intelligent to be bigoted — as if bigotry is caused by stupidity, an assumption of mine which probably goes to show that I’m not nearly as smart as I like to suppose. She had graduated first in her high school class but didn’t attend college, as she explained it to me when I was a teenager, “because back then college was just for rich girls who wanted to find richer husbands.” She grew up with Native Americans. Her father’s small, swampy farm edged within a half mile of the Tonawanda Indian Reservation, where, until he died in his eighties, one of her uncles lived with a Native American woman in a cabin with no indoor toilet. My mother’s younger sister married a man from the reservation, and although my grandparents loved their half-Indian grandchildren, their complaints about “lazy Indians” were sometimes slung at their gainfully employed son-in-law, and they felt certain that “them Indians must have took it” whenever a possession disappeared from the farm. Until my maternal grandfather landed a job on a state road crew when he was in his forties, they were poor, but my grandparents could always visit the reservation to witness destitute poverty, to be assured that though they couldn’t afford to buy more than a few pair of underwear for each of their daughters, they were white.


I was spending a weekend 18 miles east of my home, on my maternal grandparents’ farm near Akron, New York, when Charley Moses — the brother of Millie Moses, who was my grandmother’s closest friend — killed himself in his and Millie’s log cabin on the reservation. Millie telephoned my grandmother minutes after the rifle blast. Over the phone, my grandmother asked, “Was he drunk?” I begged them to take me along, but my grandparents ordered me to stay behind as they hurried out to their old American Motors sedan.


Early the next morning they returned to the reservation to clean Millie’s parlor, and I went fishing in the muddy creek that shaped the sinuous east and north boundaries of the farm. I returned to the yard hours later dragging a stringer of gasping and flopping bullheads and rock bass, tormented by a cloud of mosquitoes, and encountered my grandmother kneeling on the grass with her hands plunged in a pail of soapy, pink water. I asked what she was doing, and she replied, “Trying to get brains off these curtains.” She held up a curtain, and said, “Whoever would have thought Charlie Moses had so much brains?”


We danced to James Brown and Aretha Franklin and perhaps the sensual celebration shook us awake to the images and calls of truth arisen. By then it was 1970 and some of us paid attention when our American history teacher taught about slavery, the KKK, and racial segregation, and when he asked, “How come you don’t see anyone except white kids in this class?” Some of us were appalled by the old news footage of police assaulting peaceful civil rights protesters with truncheons, torrents of water, snarling dogs, and Southern law, and were stirred by the brave, truthful poetry of Reverend King, though by then he had been assassinated by a white supremacist. When the school board banned Eldridge Cleaver’s “Soul on Ice” from the library, a small group of us protested, not because we admired the author’s murderous, misogynistic rage but because, we argued, the school was supposed to be educating us, and Cleaver was of the American reality.


Of course, none of us walked off the football field in protest: other players might have been granted our positions.


Abe Lincoln and Stephen Douglas we weren’t, but my father and I entered into a series of debates involving racial issues. At first we disagreed about the banning of “Soul on Ice,” but as in all serious discussions involving race in America, we soon found it necessary to abolish boundaries and time, to visit George Wallace as well as Eldridge Cleaver, South Boston as well as Birmingham, and Africa as well as Harlem. He never argued overtly that blacks were genetically inferior, but my father was opposed to court-ordered integration of schools and affirmative action and believed that African Americans had accumulated more rights and opportunities than had whites. My mother, who knew her socially defined and confined place, listened in silence to our debates, which began during supper and lasted for hours. He thought about our disagreements while at work and I at school, and each of us charged into the new evening armed with arguments we believed to be fresh and potent. Dad actually asked a black worker at the power plant for his opinion on the Black Panthers, and reported to me triumphantly, “He told me they’re all crazy.”


We debated for three or four evenings in a row and then, weary from arguments that seemed to be going nowhere but circular, gave it a rest. We mostly avoided each other until he came to me after two days of quiet and said, “You know, all the black and white stuff we talked about, some of it you were right. You still got a lot to learn in life, but some of it you were right.”


I nodded and looked away, embarrassed and proud like a son who has realized that for once his father has not let him win at basketball, that he has actually beaten his flawed hero. Which only goes to show that my father was right about one thing: even though he never again used a racial epithet in my presence, I still had a lot to learn about hate and love.


He was slowly dying. Men seldom develop cancer of the prostate until at least age 50, but some studies have reported that welders have an earlier and higher incidence. He had been diagnosed with prostate cancer at age 40, and because it had already spread into his bones where it was inoperable, a surgeon had removed my father’s testicles to deprive the tumors of hormonal fuel. He continued to limp into the power plant to support his family. On the days when he was in too much pain to work despite the drugs, his fellow welders did his jobs and hid him in a storage room so the big bosses wouldn’t know to fire him. He eventually found it impossible to climb the stairs to the second-floor time clock, and took an early retirement, which lasted several months.


Even days before his death, he still was unable to wear a white shirt.


Excerpted with permission from “Love and Hate in the Heartland: Dispatches from Forgotten America” by Mark Phillips. Copyright 2018 by Skyhorse Publishing, Inc.  Available for purchase on Amazon, Barnes & Noble, and Indiebound.



Published on April 08, 2018 16:30

Emission impossible? No signs of coal revival in store

Coal

(Credit: AP Photo/Matthew Brown, File)


FairWarning

Despite the Trump Administration’s ardent support of coal over renewable energy, the percentage of U.S. electricity from renewable sources continued its gradual rise in 2017.


Wind, solar and hydroelectric energy accounted for 16 percent of power production during President Trump’s first year in office, up from 13 percent in 2016 and nearly double the level when Barack Obama became president in 2009, according to a Natural Resources Defense Council analysis.


Coal, which generated more than half of U.S. electricity in 2000, produced 30 percent last year, about the same as the year before. According to government projections, by 2035 renewable sources will outstrip coal in electric power generation.


“We’ve seen the administration take a pretty anti-clean energy approach,” said NRDC energy analyst Amanda Levin. Even so, she said, “we continue to see growth in wind and solar due to the economics of clean energy as well as state and corporate interest in investing in and promoting clean energy.”


In his election campaign and since taking office, Trump has called for a coal comeback. “We have ended the war on American energy – and we have ended the war on beautiful, clean coal,” he declared in late January in his State of the Union speech. A week earlier, Trump carried through with a threat to impose 30 percent tariffs on solar cell imports. The administration also has started moving to repeal the U.S. Clean Power Plan, which is designed to cut carbon dioxide emitted by power generators.


Analysts say the impact of those moves will be softened by market forces and state government actions. “A lot of the reductions in emissions that we’ve seen over the last five years or so are due to state policies, but most important is the cheap price of natural gas and the rapidly falling cost of renewables,” said Raymond Kopp, an energy and climate specialist at Resources for the Future, a Washington-based think tank.


Twenty-nine states have adopted standards requiring utilities to meet targets for increased reliance on renewable fuels. California, for example, plans to get at least 33 percent of its power from renewable sources by 2020, 40 percent by 2024 and 50 percent by 2030.


Moreover, the nation’s coal-powered electric plants are aging and increasingly being shuttered. According to the U.S. Energy Information Administration, the number of operating coal plants declined from 616 in 2006 to 381 in 2016. In the first year of the Trump administration, nearly 30 more of the remaining coal plants were retired.


In a sign of the times, employment in the solar industry now dwarfs employment in the coal industry. Coal industry jobs actually increased by about 1,000 nationwide in 2017, to roughly 52,000, according to federal Mine Safety and Health Administration data. Meanwhile, despite a small dip in 2017, solar jobs totaled 250,271, according to The Solar Foundation, a nonprofit that promotes solar technologies.


The NRDC said the U.S. should generate at least 80 percent of its electricity from renewable resources by 2050 to achieve the goals of the Paris climate accord, but the latest government forecast projects that the nation will be only at the 40 percent level by then. Last year, Trump announced plans to pull the U.S. out of the global agreement, which is intended to combat global warming by cutting greenhouse gas emissions.



Published on April 08, 2018 16:29

April 7, 2018

Stronger fuel standards make sense, even when gas prices are low

New Jersey Gas Tax

(Credit: AP)


It’s official: The Trump administration is reversing steps its predecessor had taken to curb gasoline and diesel consumption through stricter car pollution and fuel economy standards.


Rather than heed growing concerns about climate change, EPA Administrator Scott Pruitt has formally moved to nix the Obama administration’s carefully written rules. In 2012, the EPA set standards that aimed to halve the global warming pollution from new cars and light trucks by 2025. It made those tailpipe limits in coordination with the Department of Transportation’s separate fuel-economy standards, which targeted a near doubling of new vehicle miles-per-gallon over the same time frame.


Embracing auto lobbyists’ rhetoric, Pruitt declared in a press release that the existing policy “didn’t comport with reality, and set the standards too high.”


As a scholar who has researched automotive technology gains in recent decades, I believe this move is not justified by either a lack of technical know-how or the decline in prices at the pump since the government issued its standards six years ago. My own research has long shown that engineering advances offer many ways to ramp up fuel economy and cut tailpipe emissions without making vehicles too costly.



Industry pressure


This backward step was no surprise. The day after Donald Trump’s election victory, auto industry lobbyists dusted off their anti-regulatory scripts and pleaded their case to the incoming White House.


Automakers pressed for weaker standards, arguing that the necessary technologies cost too much, would not be ready on time, and ran counter to consumer trends.


However, I believe that the EPA had more than adequately addressed these concerns in the draft report it issued in July 2016. That analysis built on the earlier technology and economic assessments that justified the landmark standards established four years earlier.


Cheaper gas


One of the Trump administration’s main arguments is that gas costs less now than it did when the government established these standards six years ago. While that affordability clearly benefits consumers, it is no reason to slow down pollution-cutting progress.


Lower fuel prices do trim the immediate economic benefits of stronger standards. But the main rationale for the EPA’s standards is reducing the emissions that cause global warming, and that’s unchanged. Moreover, maintaining stricter standards would protect Americans from very real risks to their wallets if and when oil prices soar again.


Fuel prices nosedived in mid-2014, contrary to industry expectations. After ticking up slowly for the past two years, they remain about US$1.25 lower than when the standards were first conceived.


When it issued the standards in 2012, the EPA estimated that any added upfront vehicle costs would be paid back by fuel savings within three and a half years on average. I estimate that today’s lower fuel prices would push the payback period to about five years. That’s still a good deal given that the average vehicle stays on the road for nearly 12 years.
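The payback arithmetic is simple: divide the added upfront cost by the annual fuel savings at a given gas price. A rough sketch follows; every number in it is a hypothetical illustration chosen to mirror the payback periods cited here, not an actual EPA modeling input:

```python
# All figures are hypothetical illustrations, not EPA modeling inputs.
upfront_cost = 1800.0      # assumed added technology cost per vehicle ($)
miles_per_year = 12_000    # assumed annual driving
mpg_old, mpg_new = 25, 35  # assumed fuel economy before and after

# Gallons saved per year by the more efficient vehicle
gallons_saved = miles_per_year / mpg_old - miles_per_year / mpg_new

def payback_years(gas_price):
    """Years for annual fuel savings to repay the added upfront cost."""
    return upfront_cost / (gallons_saved * gas_price)

# A $1.25-per-gallon price drop stretches the payback period:
print(f"at $3.75/gal: {payback_years(3.75):.1f} years")
print(f"at $2.50/gal: {payback_years(2.50):.1f} years")
```

With these assumed inputs, the payback stretches from three and a half years to a little over five, yet both periods remain well inside the nearly 12 years the average vehicle stays on the road.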



Lower gasoline prices do make consumers less eager to seek cars and trucks that get more miles per gallon. And it’s quite true that many top-selling models are gas-thirsty trucks like the Ford F-150, Dodge Ram and Chevrolet Silverado.



But the auto industry had the EPA build enough flexibility into the rules to accommodate such shifts in taste. Trucks, for example, are already held to weaker standards than cars, and the standards are automatically relaxed as any given type of vehicle gets larger. For that reason, the regulatory relief that automakers are seeking is not really needed.


What’s more, engineers are forging ahead with an array of technological enhancements designed to make even the largest pickup trucks more fuel-efficient, such as engines that automatically idle at stoplights, auto bodies made from lighter and stronger kinds of steel and other engineering advances.


Striking a balance


The conundrum the EPA faces is a classic one of how to balance consumer concerns about health and safety with their everyday wish to drive appealing vehicles that meet their needs and fit their budget.


The Trump administration seems to believe that automakers’ natural desire to cater to short-term market trends outweighs long-term concerns about the environment. That brings us back to the real reason the country needs strong and steady clean car rules. History shows that regulations work, and the fact that car manufacturers have clearly improved safety and cut emissions even while vehicle sales reached new highs contradicts the industry’s argument that regulations force them to build cars that customers don’t want to buy.


Automotive technology is always advancing, and at any given point in time it can be used to either reduce emissions or offer more horsepower, added capacity, or any number of other amenities such as heated seats. And the list goes on. There’s nothing wrong with these features, but they don’t justify doing less to curtail global warming.


The history of automotive policymaking shows that consumers can in fact have cars that are both nicer and cleaner. The first car I owned – a 1967 Ford Custom 500 with a stick shift that got less than 20 miles per gallon – was quite spartan by today’s standards. It was no match for my family’s current car, a 2012 Toyota Camry hybrid that gets over 35 miles per gallon and is far more powerful, safer and comfortable while still quite affordable.


Without regulations, that Camry would no doubt be at least as nice in terms of creature comforts, but it would not be as safe, efficient and clean. That is why I’m certain that carefully crafted regulations do strike the right balance, in spite of automakers’ complaints to the contrary.




In short, the Trump administration’s main justification for weakening the standards – less-than-expected consumer interest in efficiency due to lower gas prices – is actually the reason why the nation needs more stringent standards in the first place.


John DeCicco, Research Professor, University of Michigan





Published on April 07, 2018 18:00