Helen H. Moore's Blog, page 905

January 2, 2016

Nixon’s “environmental bandwagon”: Richard Nixon signed the landmark Clean Air Act of 1970 — but not because he had any great concern about the environment

For polluters, America in 1970 was still something of a Wild West. A number of federal, state and municipal laws aimed at improving air quality were already on the books, but few were enforced, and pollution from the nation’s ever-growing stock of motor vehicles, power plants and factories remained uncontrolled in much of the country. A passage from the Ralph Nader Study Group’s "Vanishing Air," published in May 1970, vividly illustrates the extent to which dirty air was a fact of life for city dwellers of the period:
The New Yorker almost always senses a slight discomfort in breathing, especially in midtown; he knows that his cleaning bills are higher than they would be in the country; he periodically runs his handkerchief across his face and notes the fine black soot that has fallen on him; and he often feels the air pressing against him with almost as much weight as the bodies in the crowds he weaves through daily.
New York’s problems with air quality were hardly unique. In an October 1969 letter to the Senate Subcommittee on Air and Water Pollution, a resident of St. Louis expressed similar sentiments about the sheer pervasiveness of pollution in her community:
What really made me take the time to write this letter was the realization that I had begun to take the haze and various odors for granted. Close the doors and windows and they’ll be less noticeable[. I]t is very disturbing to think I’ve become used to the burning-rubber smell in the evening and the slightly sour smell in the morning. What does air smell like?
And air pollution’s costs went far beyond sour smells and dirty handkerchiefs, as a series of deadly “inversions” both here and abroad had made dramatically clear beginning in the late 1940s. Typically, the air at higher altitudes is cooler than that below. This is because the surface of the earth absorbs sunlight and radiates heat, warming the air closest to the ground. That warm surface air then cools as it rises higher into the atmosphere. But in certain weather conditions, this temperature pattern can be flipped. When a warm front moves in above a cooler mass of air, it acts as a sort of lid, preventing the surface air (and any pollutants it contains) from rising into the atmosphere and dispersing. The longer an inversion lasts, the worse surface air quality gets, as more and more pollution becomes trapped beneath the lid.

In 1948, Donora, an industrial town in southwest Pennsylvania, made national news after suffering a five-day inversion that killed 20 people and sickened 6,000 others, more than 40 percent of the town’s residents. Four years later came London’s “Great Smog of 1952,” another lengthy inversion that was estimated to have caused the premature deaths of at least 4,000 Londoners. And in 1966, an inversion in New York City killed an estimated 168 people over the course of a week.

Furthermore, though acute incidents drew the biggest headlines, scientists of the time were also beginning to understand that even “normal” levels of pollution had serious, long-term health consequences. In December 1970, the acting Surgeon General, in testimony before the House of Representatives, cited “abundant scientific evidence that exposure to polluted air is associated with the occurrence and worsening of chronic respiratory diseases, such as emphysema, bronchitis, asthma, and even lung cancer.”

And yet, while 1970 was not an ideal time for the actual environment, it was something of a golden age for environmentalists. On Jan. 1, 1970, President Richard Nixon signed the National Environmental Policy Act into law. Six months later, he proposed the creation of the Environmental Protection Agency, which opened for business in early December. And on Dec. 31, he capped off the year by signing the Clean Air Act, which was then—and still is—our nation’s most important environmental law. At the signing ceremony, Nixon speculated—correctly—that 1970 would later “be known as the year of the beginning, in which we really began to move on the problems of clean air and clean water and open spaces for the future generations of America.”

Why did all of these green stars align in 1970? For one thing, public interest in environmental issues was growing rapidly. According to nationwide Opinion Research Corporation polls, only 28 percent of the U.S. population considered air pollution a somewhat or very serious problem in 1965; by 1970, that figure had risen to 69 percent. J. Clarence Davies, a Princeton political scientist who would go on to serve as a senior staffer for the White House Council on Environmental Quality, speculated at the time that heightened concern about environmental degradation was an inevitable product of America’s post-World War II economic boom:
The massive growth in production and in the availability of resources which has characterized the U.S. economy in the past two decades affects the problem of pollution in several ways. The increase in production has contributed to an intensification of the degree of actual pollution; the increase in the standard of living has permitted people the comparative luxury of being able to be concerned about this; and the availability of ample public and private resources has given the society sufficient funds and skilled manpower to provide the potential for dealing with the problem.
In other words, getting rich had come at great cost to the nation’s air and waterways, but, as a result of the nation’s new affluence, Americans were both more inclined to care about improving the quality of their environment and better equipped to succeed in that effort.

The most memorable demonstration of environmental protection’s newfound political salience came in April 1970 when Sen. Gaylord Nelson, a Wisconsin Democrat, and Rep. Pete McCloskey, a California Republican, co-chaired the first Earth Day, a “national teach-in on the environment.” The event drew more than 20 million participants all over the country. The same month, "CBS Evening News" anchor Walter Cronkite, who would soon become known as the “most trusted man in America,” began to include an environment-focused story in each night’s broadcast under the provocative heading “Can the World Be Saved?”

But even with strong public support for legislative action, the Clean Air Act likely wouldn’t have passed when it did and in the form it did if not for the somewhat unexpected advocacy of President Nixon. According to William Ruckelshaus, whom Nixon appointed as the first EPA administrator, the president did not share the public’s concern for the environment. He thought the environmental movement, along with the antiwar activism of the period, “reflected weaknesses of the American character.” Nixon was, however, very concerned with staying president, and his expected opponent for the 1972 race was Sen. Edmund Muskie, a Maine Democrat whose chairmanship of the Senate Subcommittee on Air and Water Pollution had earned him the nickname “Mr. Clean.” Nixon had no interest in conceding the green mantle to Muskie, so he set out to look even more protective of the environment than his liberal rival. Or, as the Nader report more colorfully explained, “The environmental bandwagon is the cheapest ride in town. . . . President Nixon paid his fare and jumped aboard.”

In his State of the Union address that January, President Nixon vowed to present Congress with the “most comprehensive and costly program of pollution control in America’s history.” (Apparently, in 1970, the costliness of a regulatory program was considered a selling point.) The following month, Nixon made good on his pledge with a special address to Congress that outlined a “37-point program, embracing 23 major legislative proposals and 14 new measures being taken by administrative action or Executive Order” aimed at addressing a variety of environmental concerns, including water pollution, air pollution, solid waste management and parklands. Among those 23 legislative proposals was a set of amendments to the nation’s existing air pollution laws that would eventually become the Clean Air Act of 1970.

Nixon’s air pollution bill was unquestionably stronger than one that Muskie himself had put forward the previous December, which included only modest improvements to the largely ineffectual Air Quality Act of 1967. But rather than admit defeat, Muskie doubled down. During the summer of 1970, his Senate subcommittee developed a bill that followed the same general structure as the president’s proposal but was much more ambitious: The deadlines were tighter, and the standards were both more stringent and more enforceable. Most controversially, the new bill included a requirement that auto manufacturers cut pollution from new motor vehicles by 90 percent in only six years. In September, the Senate passed Muskie’s bill, 73–0.
The House had already passed a bill the previous June that more closely tracked Nixon’s, so a conference committee was charged with reconciling the two versions of the new law. Despite heavy lobbying by the auto industry and pressure from the White House to take a less aggressive approach, particularly with regard to the auto standards, the committee hewed much closer to the Senate version, and the new Clean Air Act was passed by both chambers of Congress in a voice vote on Dec. 18, 1970. When President Nixon signed the bill into law on Dec. 31, he invited a number of the Act’s congressional architects to the signing ceremony. Sen. Muskie was not among them.

* * *

More than four decades after its passage, the Clean Air Act remains a reliable source of controversy. Every major rulemaking under the Act has prompted intense lobbying and litigation, and the policies of the Obama administration have been no exception. Most recently, 45 of the 50 states have taken sides in an ongoing lawsuit over the EPA’s Clean Power Plan, which, for the first time in the Clean Air Act’s history, requires older power plants to reduce their emissions of the greenhouse gas carbon dioxide. Ultimately, the suit will determine whether the EPA can meaningfully address the greatest environmental threat of our time: global climate change.

But whatever the future may hold, on the occasion of the Act’s 45th birthday, it’s important to acknowledge how much good it has already managed to do, despite relentless attempts by recalcitrant polluters to undermine its implementation. Between 1970 and 2013, total emissions of six of the most common air pollutants—including those that form soot and smog—fell 68 percent, even as GDP more than tripled, energy consumption rose by 44 percent, and the U.S. population grew by 54 percent. These reductions have yielded enormous dividends for the American public. A 1990 EPA study estimated that the Clean Air Act prevented 205,000 premature deaths between 1970 and 1990. Significant amendments made to the Act in 1990—under another Republican president, George H.W. Bush—have generated even larger benefits. A 2011 EPA study concluded that the 1990 amendments had prevented 160,000 premature deaths in 2010 alone and estimated that the number of lives saved annually would climb to 230,000 by 2020.

In short, it’s been a good 45 years. Here’s to the next 45.

Richard L. Revesz is the Lawrence King Professor of Law and Dean Emeritus at New York University School of Law, where Jack Lienke is a Senior Attorney at the Institute for Policy Integrity. They are the co-authors of "Struggling for Air: Power Plants and the 'War on Coal'" (Oxford University Press 2016), from which this excerpt is adapted.

Published on January 02, 2016 11:00

I am the Fox News atheist: “Some call me a militant atheist. Others call me a dick. I am neither”

Once again, abherrant [sic] groups like you and the homosexuals are complaining that you don’t have a golden cup and seat at the table. Listen, in America, you are allowed to exist without persecution. Any other right that you request is another step towards the hedonistic values that contributed to the destruction of Sodom and Gomorrah. The closer America gets as a society to honoring your seat at the table is yet another step towards the decline of western civilization, and deep down inside you know it. You are deviants.

—An e-mail to American Atheists, September 2012

An Atheist loves himself and his fellow man instead of a god. An Atheist thinks that heaven is something for which we should work for now—here on earth—for all men together to enjoy. An Atheist knows that a hospital should be built instead of a church. An Atheist knows that a deed must be done instead of a prayer said. An Atheist strives for involvement in life and not escape into death. He wants disease conquered, poverty vanquished, war eliminated.

—Madalyn Murray O’Hair

I wish I were wrong. I wish all the good guys went to heaven, the bad guys somehow justly paid for their crimes, and everyone (especially my daughter) lived forever. Indeed, not a single person on earth wants people to live forever more than I do. I also wish the World Trade Center still stood, hunger was eradicated, and that my mother had lived long enough to see this book published. But wishing doesn’t make it true. We live in the real world, and I make my living telling the truth.

I’m that atheist guy—the guy with the atheist shirt and/or jacket and/or baseball cap in the airport or at a street fair. I’m not talking about subtle clothing that someone might pass by and not notice, either. I wear the word atheist in big letters. I cannot be missed, quite by design. I talk to people who ask about what my clothes say (which happens more and more these days), answer their questions as honestly and clearly as possible, and sometimes even have on-the-spot debates. I’m a walking, talking atheist billboard. I love being “that guy” (you should try it).

I’m also “that atheist guy on TV,” often Fox News, who espouses such horrible concepts as religious equality and separation of religion and government (same thing), usually to a talking head or audience that simply does not understand the concept and tries to trip me up rather than actually consider that I might be right.

Some call me a militant atheist. Others call me a dick. I am neither. A militant atheist, like a militant Muslim, Jew, or Christian, would be someone who uses, threatens, or promotes violence; there is nothing violent in anything I do or endorse. A dick would be someone who makes people angry for the sake of making people angry. I have much better reasons. I promote no harm, violence, or vandalism, opting instead to fight for equality of all people through truth and honesty. I think that makes me That Pretty Good Guy Who Gets Called a Dick So Often He Gets Angry and Writes a Book About It. (My publisher thought such a title might be a bit too long for the cover, though.)

I’ve been a walking, talking atheist billboard for nearly twenty years. I have had many conversations with strangers about atheism, as you would expect, and as I intend. In the early years, most of my on-the-street conversations were confrontational and somewhat less than civil.
I’ve received more than my fair share of the stereotypical “You’re going to hell” comments, usually delivered in a hit-and-run style, whereby people deliver their “good news” that I will be spending eternity being tortured and then run away before I can continue the conversation. However, in more recent years, I’ve begun receiving more and more positive comments, and the negativity I once received—in every airport, in every city—has fallen silent. Indeed, the hostility that I used to encounter nearly every time I stepped out the door never occurs anymore. Why? Because we are winning.

In this enduring battle for freedom from the Lie of God (a phrase I use to describe all deities and the lies, empty promises, and threats that surround them), we are finally winning, mainly because, thanks to the Internet, we are finally capable of trying. We, the atheists of America, are on the cusp of achieving equality. We have no money compared to religion. We have no power compared to religion. Yet our numbers are growing while theirs are shrinking, because it’s not just about money or power, but about truth. Truth beats money and power, and when it comes to religion, atheists have a monopoly on truth.

Atheists like me are what I call conclusionary. We have concluded that gods are myths because we’ve seen sufficient evidence and heard or read sufficient arguments to convince us there are no gods. But this is not stubbornness. If any god, anywhere, were proven real even once, I would convert, quit my job, and donate all the proceeds of this book to that correct god’s religion. Of course, that has never happened, and there is no reason to expect that it will. I am convinced this mentality holds true for almost all atheists everywhere—if there were any proof of any god, there would be practically no atheists anywhere. I search for truth, not just confirmation of preferred belief.

But religion is not just incorrect, it is malevolent. It ruins lives, splits families, and justifies hatred and bigotry, all while claiming to be the source of morality. People die and suffer needlessly because of religion; such a waste. As the late Christopher Hitchens said, “Religion poisons everything,” and that seems almost literal when we are talking about the minds it infects. It makes good people do bad things while thinking they are doing good—effectively turning good people into bad people, at least sometimes. Religion deserves to die.

Some (too many?) people call me a dick because I challenge the absurd notion that religion deserves respect by default. But religion is wrong for demanding respect simply for being, and even more wrong for demanding never to be questioned. Indeed, it is my duty as an American, as an atheist, and as a nice person to do what I can to take religion down—not by force, not by law, but by truth. And the truth is quite simple: all religions are lies, and all believers are victims.

From "FIGHTING GOD: An Atheist Manifesto For a Religious World" by David Silverman. Copyright © 2015 by David Silverman and reprinted by permission of Thomas Dunne Books.

Published on January 02, 2016 09:00

Close U.S. ally Saudi Arabia kicks off 2016 by beheading 47 people in one day, including prominent Shia cleric

Saudi Arabia kicked off the new year with mass executions. The Saudi monarchy beheaded 47 people in one day, across 13 cities, leading politicians, human rights experts, and journalists on social media to compare the close U.S. ally to the extremist group ISIS.

Among those killed was Nimr al-Nimr, a prominent cleric in the Shia minority religious group. Al-Nimr was arrested and tortured for his role leading the peaceful democratic uprising against the western-backed Saudi absolute monarchy in 2011. Numerous publications, including the leading British newspaper The Guardian, mistakenly referred to al-Nimr as an Iranian cleric. This is false; al-Nimr was Saudi.

Protests erupted across the Middle East Saturday in response to the mass killings. Protesters filled the streets in neighboring Bahrain, condemning the executions, and were met with tear gas and police repression. Bahrain, which has a Shia majority, had its own peaceful uprising against the monarchy in 2011, which was violently crushed by a Saudi invasion, backed by the U.S.

Al-Nimr's brother, Mohammed al-Nimr, publicly insisted that the pro-democracy movement in the Saudi Shia community will continue, the Washington Post reported. "Wrong, misled, and mistaken [are] those who think that the killing will keep us from our rightful demands," he tweeted.

Ali Mohammed al-Nimr, nephew of the prominent Shia cleric, was himself arrested in 2012, at age 17, for attending a peaceful pro-democracy protest. The teen was allegedly tortured, before the Saudi monarchy ordered him to be beheaded and crucified. It is unclear whether he was included in the group of 47 people killed. Two other peaceful pro-democracy activists who were arrested when they were teens were also set to be executed by the Saudi monarchy. In November, Saudi Arabia sentenced a Palestinian poet to death for renouncing Islam and criticizing the Saudi royal family.

Saudi Arabia is one of the most repressive regimes on the planet. It is an authoritarian, theocratic monarchy that bases its laws on an extreme interpretation of Sharia (Islamic law). Its Wahhabi (Sunni extremist) ideology and frequent use of beheadings have led it to be compared to ISIS. Algerian journalist Kamel Daoud described Saudi Arabia as "an ISIS that has made it" in a November op-ed in The New York Times.

Comparisons of the Saudi monarchy to ISIS flooded social media in response to the mass killings, alongside accusations of western hypocrisy and double standards.

https://twitter.com/tparsi/status/683...
https://twitter.com/MsJulieLenarz/sta...
https://twitter.com/MsJulieLenarz/sta...
https://twitter.com/RanaHarbi/status/...
https://twitter.com/arjunsethi81/stat...
https://twitter.com/MaxAbrahms/status...

The Saudi monarchy executes hundreds of people a year. In 2015, it executed 158 people, largely by beheading. Every four days, Saudi Arabia executes someone for drugs, even while its princes are caught with tons of drugs at the airport.

Scholars have described Saudi Arabia as "the fountainhead" of Sunni Islamic radicalism. Former U.S. Sen. Bob Graham called extremist groups like ISIS and al-Qaeda "a product of Saudi ideals, Saudi money, and Saudi organizational support." U.S. government cables leaked by WikiLeaks show officials like Hillary Clinton admitting that al-Qaeda and other Salafi groups are supported by Saudi businessmen and members of the royal family.

Despite its brutality and support for violent extremist groups, the U.S. considers Saudi Arabia a "close ally."
In September, the State Department said it "welcomed" the news that the Saudi regime would head a U.N. human rights panel, noting "We're close allies." The State Department also refers to the Saudi monarchy as "a strong partner in regional security and counterterrorism efforts, providing military, diplomatic, and financial cooperation." President Obama's administration has done more than $100 billion in arms deals with the Saudi regime in the past five years.

Saudi Arabia has been a close U.S. ally since the early 20th century, when it was discovered to have what were believed to be the world's largest oil reserves (the largest oil reserves are now known to be in Venezuela, but Saudi Arabia comes in a close second). In 1945, just before he died, President Franklin Delano Roosevelt famously met with Ibn Saud, the first king and founder of Saudi Arabia, and promised the absolute monarch the U.S. would support his regime in return for oil.

Published on January 02, 2016 09:00

January 1, 2016

The high school musical that might have saved me: Watching a high school production of “A Chorus Line” in middle age showed me how much I missed

I didn’t see "A Chorus Line" until I was 45.

In 1975, the year Michael Bennett’s brilliant spectacle of dance and confession first appeared on Broadway, I was 9 years old and just becoming aware that I was different from other boys. That year, after I’d expressed an interest in acting, my mother enrolled me in a drama school in Sacramento, incidentally the same school that a young Molly Ringwald attended. But after going to just one class, a well-meaning friend of mine who had heard of my interest from his mother but whose vocabulary was still developing asked me if I intended to become an actress. I quit that very day, instinctively frightened that I had already revealed too much about what I liked and who I was. Acting, I thought I had learned, was for girls and sissies, and I would no longer have anything to do with it.

In the years that followed, the social geography of my conservative, semi-rural California high school confirmed my suspicions. The theatre building was on the far north side of campus, perhaps purposefully isolated from the rest of the school. Every year, when hand-painted posters appeared advertising serious dramas or raucous musicals, a hum of activity surrounded the building. But I scrupulously kept my distance. To be found anywhere near it, I thought, was to be immediately and irredeemably implicated in all its queerness.

In between shows, however, as I went from one class to another, I occasionally observed thin boys with poofy hair and plump, colorfully dressed girls passing in and out of the building’s large metal doors, sometimes singing softly, more often all too wary of the ugly name-calling and contemptuous stares that were their daily crosses to bear. These especially unpopular people seemed to me neither heroic nor pathetic but simply stupid because they willingly, even willfully, brought such hateful attention upon themselves. At my school, they were either ridiculed as “drama fags” or, worse, ignored as irrelevant. The boys, especially, were subject to the loneliest kind of irrelevance. Except as the butt of a football player’s violent joke, it seemed to me, they didn’t exist. Thank God I wasn’t one of them, I told myself.

Things have changed a lot since the 1980s. America has grown more tolerant of gay people, high school musicals have become serious business, and after years of therapy and a lot of soul-searching, I’ve managed to become a reasonably happy adult who both loves the theatre and, miraculously, himself.

To be sure, my knowledge of Broadway is still woefully inadequate for a gay man, owing largely to my misspent youth; some things just can’t be repaired. A decade ago, however, I wrote a play that was produced at the New York Fringe Festival, and a few years ago I even acted in a community theater production of The Tempest, playing an especially effeminate and cowardly Trinculo to rave reviews. On both glorious occasions, a long-dead ember within me flamed anew.

So when a good friend and his partner, both musicians, invited my partner and me to a high school production of "A Chorus Line," I jumped at the chance. As we crowded into the enormous, expensively appointed theater, I felt a little self-conscious that the four of us – middle-aged men without wives or children – didn’t belong. But as we jostled past proud and eager parents and nearly tripped over their rambunctious younger children, some of whom were clearly aspiring performers themselves, I was also aware that I had, 30 years later, entered the high school theater building—and survived.

When the lights went up, and the dancers began dancing, I found myself utterly astonished by what I saw and heard. These teenagers were not only supremely talented performers but also confident, unembarrassed ones. They perfectly inhabited the gifted, devoted, anxious dancers they were portraying, not merely because they were these things themselves, but because they were that good.

Who were these extraordinary kids? And how did they get this way? Before the show, I’d been told by the conductor that many of them had years of training and serious professional ambitions; but their self-assurance was something that could only be taught by supportive families, teachers and friends, by a world that welcomed and cherished them. I gazed admiringly into their eyes to discover what they might be feeling as they kicked and turned across the stage. Not all of them were physically beautiful. Some of the boys were short or skinny and some of the girls were chunky or a little plain. But they danced and sang like they were the most attractive and popular kids in school. And the roar of the audience made it clear that they were.

The racy lyrics of "A Chorus Line" never would have passed muster with the teachers and parents at my high school. Bullying, bigotry and callous disregard for the suffering of misfits were widely tolerated, but smoking, foul language, and overt sexuality were not. Indeed, the musical’s mature themes and frank talk of “tits and ass” even made me wonder for a moment just how age-appropriate it was, if these students really understood what they were saying. But the way they said it—with their voices and their bodies—told me that they did understand, that they knew not only what the adult material meant but also what it meant to them.

When the character of Cassie admits that her failed solo career has brought her back to the Broadway chorus line; when Greg sadly recalls the day he realized he was gay; when Val exults in her surgically enhanced breasts; when Sheila reflects on dancing as an escape from a broken family—as I listened to each of these very grown-up revelations delivered with such emotional clarity, I felt certain that, at least on some level, these kids already knew what it had taken me over four decades to learn: that life and love are treacherous and splendid things at the same time.

This was made tearfully, joyously clear to me when the character of Paul, alone onstage, confesses to the director, Zach, the story of his growing up gay in New York City, becoming a drag performer, and one day being discovered by his incredulous Puerto Rican parents. It was one thing to play gay in high school—one very heroic thing to do—but quite another to perform such fragile vulnerability and painfully honest strength, and to do it so flawlessly.

The young man playing Paul may or may not have been gay himself, but he looked every bit like one of the boys I knew in high school who lingered around the theater building, preferred the company of girls, and withstood endless abuse from his peers. I remembered with shame and regret the day I watched from a distance as a tough, swaggering jock put him in a chokehold and dragged him effortlessly across the quad to the delight of the lunch crowd. Onstage, Paul had a similarly soft, round, good-looking, melancholy face, and a husky physique. But his dancing was light and strong, his voice was clear and bright, and his demeanor credibly revealed a history of struggle.

As he told his story, the audience was silent, and the four of us wept quietly. I don’t know exactly why my partner and two friends wept, though I can imagine the reasons. I wept for Paul because his story was sad, because the actor was brave, and because this high school was—incredibly!—celebrating his bravery. I never thought I’d see this in a million years.

But I also wept because what the actor playing Paul brought to life onstage was something I had missed at his age, something I’d been unable or afraid to claim for myself— something that had not been possible, or at least, had not seemed possible at the time, which amounts to the same. What if I’d gone to such a school and had such support when I was a boy? Or what if I’d had the courage to go into the theater building, to withstand the abuse, to get up onstage and risk being called an actress or a sissy or a drama fag? Or what if I’d been wrong about my high school, wrong about the building into which I never dared enter, or wrong about the people who did? What had I missed? What had I forgone?

Sitting in the audience that night, full of awe, humility, and not a little envy, I understood that my life as I’d lived it wasn’t at all inevitable, though it often seemed so, that the man I’d become was never preordained but rather the aggregated result of particular circumstances and choices. Seeing so palpably that things might have been otherwise—other paths I might have taken, other identities I might have assumed—filled me with both profound regret and unmistakable satisfaction. Since that evening, I have frequently mourned the bold performer and unflinching devourer of experience I never became; but I’ve also come to believe that of all the possible lives I might have lived, this one was pretty damn good and, if I’m lucky and perhaps a little daring, might get even better.

One of my favorite songs in "A Chorus Line," now that I have seen it, occurs just after Paul injures himself and is carried off stage. As the other dancers consider how precarious their careers are—and how under-paid and under-appreciated—they finally declare that the risks have been worth it. I’d vaguely heard of “What I Did for Love” but always assumed it was about some sacrifice made in the name of romance, about the “sweetness and the sorrow” involved in devoting oneself to another person. But as I listened to these wonderful kids sing, it suddenly occurred to me that it was about the love of performance, the love of art:

“The gift was ours to borrow

It's as if we always knew

And I won't forget what I did for love

What I did for love.”

And as I wiped away another current of tears, I realized that the love of which they sang was also the love they had for themselves, which may just be courage by another name.


Published on January 01, 2016 16:00

The lost pop genius of Scott Miller: “Whatever makes you fall in love with a pop song … he did that”

In mid-December, Omnivore Records announced a reissue of Game Theory's 1987 double LP, "Lolita Nation." To say the news was long-awaited is an understatement: The album has been out of print and nearly impossible to find on CD for many years (unless you want to shell out big bucks on eBay), and it's considered by many to be the finest work produced by songwriter Scott Miller. Although known as a power-pop touchstone, the Mitch Easter-produced "Lolita Nation" is actually far denser and more complicated than that tag might imply: The collection leapfrogs through genres and sounds—theatrical synthrock, psych-torqued pop, swaggering jangle-rock, British Invasion exuberance, soul-jazz moodpieces—with fluid grace and ease.

One of the more bittersweet aspects of the "Lolita Nation" reissue is that Miller isn't here to see it, as he took his own life in 2013. His unexpected death stunned fans, and spurred an outpouring of grief and admiration from musicians, journalists and fans alike. One such admirer was Boston-based journalist Brett Milano, who recently published a compelling, comprehensive biography of the musician, "Don't All Thank Me at Once: The Lost Pop Genius of Scott Miller."

Milano—a long-time fan of Game Theory and Miller's other band, the Loud Family—dates his fandom to hearing the former band's song "Here It Is Tomorrow" on the radio. "It was like, 'Oh my God, the way [Miller] hit you over the head with that hook,'" Milano recalls. "He keyed into a real exhilaration which at that time I was not feeling from enough music. He just made me remember that if somebody turns a hook a certain way, it can change the entire quality of your life for a few moments."

In "Don't All Thank Me at Once," Milano amplifies and explains those moments of exuberance via detailed analysis of the Game Theory and the Loud Family catalog, courtesy of interviews with band members and others around the group at the time. He weaves these tales into the story of Miller's life—how he grew from a teenage music fanatic into someone who, in the last decade, had nearly left music behind after becoming a father. The book also delves into the fact that Miller was in the early stages of working on a new Game Theory record when he passed, which would've been the first LP from that group since 1988. (This album is being completed by a slew of talented musicians and released under the name "I Love You All" in 2016.)

Milano started thinking about writing "Don't All Thank Me at Once" after Miller died. "A lot of us that knew him and knew his work were affected by that, and thought it was a bit of a crime that he had never really been recognized," he says. "The quality of his work was way out of sync with the amount of recognition he got." Milano decided to put together a proposal for the 33 1/3 series of books on "Lolita Nation," which he knew was going to be reissued. That ultimately wasn't successful, but he says he received enough encouragement. "What the [33 1/3 series] editor finally said was, 'This is a little too non-commercial for us, but I love this proposal,' which I took as my cue to go a little further with it," Milano says. "I decided, 'Let's try and make this a real proper biography, and try to get a handle on who Scott was and how that played into the music he made. Let's get into his entire body of work.' A bunch of interviews later, that's where it wound up."
Milano spoke with Salon about the genesis and execution of "Don't All Thank Me at Once," as well as Miller's music and legacy.

What unique challenges did the book pose for you?

Well, it was my first foray into telling a personal story, not just talking about somebody's music. And in this case, you had somebody that isn't with us anymore. So I had to get as much of a handle on what his life had been as I could, which means that not just talking to people that he played with. I tried to talk to people that were important and intimate in his life, which wound up including his mother, who is 93, who I wasn't sure I'd ever get a chance to talk to, and his wife, his widow, who obviously was still quite bereaved. Her being willing to open up about things was a big help. When you're writing a book about someone who isn't there anymore, the emotions are on the surface. I was really impressed how many people were willing to talk about him. More than a couple of people became very teary when I interviewed them. His memory was very fresh to them. If Scott didn't feel that people really cared about him deeply, he should've heard some of the interviews I did.

You hate to hear that stuff now. It's kind of like with the reissue campaign—the "Lolita Nation" trailer came out this week. It's so exciting, and it's such a shame it took him passing to get all this stuff in motion.

That's the horrible, sad part. But on the other hand, it didn't have to go that way. One thing you can see in the book is he was planning to make another record, which would've come out under the name Game Theory. The deal to reissue the albums—it had not come to fruition in his lifetime, but it was already on the table anyway. So I think the end of the story could've been quite different, if he could've finished the record he was intending to make and have that come out around the same time as the "Lolita" reissue. I don't know if he would've ever gotten quite the stardom he deserved, but he certainly would've been a little more enshrined in the ranks of cult heroes who need to be paid attention to.

But the book wasn't sad. It was humorous at times, and it brought to life why his music was so compelling and smart.

To me, he wasn't "insert the name of somebody whose music is invariably depressing." He wasn't one of those people. I find a whole lot of joy in his music, and a whole lot of life-affirming stuff. My personal favorite album is "Plants and Birds and Rocks and Things," the first Loud Family album, which does include some incredibly despairing songs. But on the other hand, there's some songs that are completely exuberant and full of life. That's what Scott was all about. His music was really about absorbing your emotions and your psyche and your intellect into everything that there is to be found. Which I think is important. There's a whole mythology when a musician passes away before he should or before his time—people tend to focus on the melancholy or the warning signs. But I listen back to so much of his music now, and it's like, "Why wasn't this a hit? This is wonderful." It's so upbeat; it's so smart.

It's complicated and dense, in a really wonderful way.

Exactly. And it was never complicated and dense in a way that people wouldn't get. And I don't really think you can say, "Oh yeah, makes sense that people wouldn't listen to this, it's so out there." It never really was. Probably the biggest stumbling block people ever had with him is that he had a higher voice.
But certainly by the time you get to the Loud Family, he was singing in a much different register, and it was a classic model pop voice. It was not one of those voices that would not have sounded good on the radio.

And now, there's dozens of indie rockers who sing in a similar high register!

I know! [Laughs.] You hear a record like "Plants and Birds and Rocks and Things" and it's like, "Where were all the Weezer fans when this record came out?" They would've heard the same qualities in this.

You were a huge Scott Miller fan for many years. What did you learn about him as a musician or a person from doing this book?

There were some things I took that were true about him already. I actually worked with him somewhat, because I was at the Alias label when he was submitting "Plants and Birds," and he would send track lists to us before the songs were even recorded and may not have even been written. Fully conceptualized track lists, ideas for what the cover was going to look like. I definitely got it that he was one of those people that comes up with fully formed, incredibly complex ideas in his head. The whole idea of him being a brainiac was pretty much true. And I also figured out that he was pretty emotionally complex. It turned out to be even moreso than I had imagined, because even his closest friends did not really fathom what was going on in his head. Even his family really didn't have an inkling of what was going on with him. I mean, why he took his life still puzzles and disturbs a whole lot of people. He was really good at holding things close to his chest, which is tragic. But I did not quite realize how advanced he was in terms of that.

Throughout the book, there is a thread that he is enigmatic, and you don't really know him. But it's almost like people didn't realize how much they didn't know him until he passed.

That stood out to me when you read some of the quotes people gave, and their observations. He was famously self-deprecating, and he was self-deprecating in a really funny way, particularly about his lack of success. But I don't think people realized that maybe there was a lot of actual pain behind that self-deprecation. I do think he felt the rejection, I think, that his records weren't doing better than they were.

You wish you had another day to ask him.

I still have Facebook messages from him in my inbox. And one time, I was reading over the online column that he did, "Ask Scott," and I was looking at his answer to a question, and I thought, "Well, this is really nice he framed it this way." And then I saw that I had actually asked the question. [Laughs.] Here's Scott here and now, guiding me along, answering a question for this book. It's like, "Wow."

As much of a mastermind as he is, it bore out to me that the music was really a product of a lot of other people as well. It brought some of the life back to that music to reconnect with people like [Game Theory drummer] Gil Ray and [Game Theory guitarist/vocalist] Donette Thayer, [Loud Family guitarist] Zachary Smith and all the people that were collaborators and parts of his vision in those different bands.

And the book was as much an analysis of the music as it was about Scott.

That's why we care about this guy, really, is that he was so good at doing this. We can't quite understand what made him so good at doing this, but we can get a sense of how he approached it. He wrote a lot of his lyrics in a notebook on the train on the way to work, because he didn't drive.
[It's like] "Okay, I've got an hour before I have to check in at the office; here comes a pop masterpiece."

When you started doing interviews with everyone in his life, did the book take any unexpected turns? Did your approach or the structure change at all depending on the answers and information you took in?

I was presented with some material I had to treat a bit sensitively. He did have a first marriage that fell apart. And then there was the issue of his death and how to present that, which I knew were going to be the hardest parts of the book. I knew that there would be a lot to balance in terms of my responsibilities as a biographer versus being a bit sensitive to the people involved. Those were the biggest challenges, having to absorb all the information around that and figure out what exactly to do with it. The way that Scott actually died, the specifics of that, I was asked not to present that. And because the information isn't actually public anywhere, I thought that the family was justified in having the call on that one.

His band members, fans and friends have been really protective of his family and legacy. People aren't being salacious about it; they've been really supportive. I think that speaks to the quality of the character of the people he had around him, and the character of his fans.

It really does. It says that he reached a lot of the people he wanted to reach, people that would think a little more deeply about the music they were listening to.

What makes Scott Miller compelling as a person and as a musician and as a book subject?

What makes him compelling as a musician is that his stuff did have all these allusions and all this complexity to it, and yet it was Beatles-level catchy. You hear these songs—whatever makes you fall in love with a pop song and have this glorious hook ringing inside of your head for days and days, he did that. What makes him interesting as a person is he was a real anomaly in terms of who made pop music. He wasn't into sex and drugs particularly. He was monogamous in his relationships. He had this ability on the road to go to events where the entire band was partying, drag a mattress in the middle of the floor, and fall asleep. He was somebody that was really purposeful and really single-minded about wanting to create music. He didn't get into it for any lifestyle reasons. I think people relate to him, because he truly was a romantic. When he fell in love, he fell in love really deeply. He got really hurt by his relationships, and he got really exhilarated when the relationships worked out. [And] the template for a pop song is capturing that kind of thing. That's who he was. In a lot of ways, he was the kind of person that his music appealed to—people that were a bit more sensitive than the norm, and that had a bit more hunger for musical experiences than most people did.

When people read the book, what kind of takeaways do you want them to get from it?

"Oh my God, why didn't I listen to this guy? I'd better start right now!" [Laughs.] People have said they've cried after reading the book, which I didn't really intend that. I think for some people it was more of a healing thing to read it. It was to write it, for sure.

Published on January 01, 2016 16:00

5 excellent crowd-pleasing movies you might have missed in 2015

The holiday season means getting together with family members who may not share your political views, and watching a movie together is one way to get the bickering to stop. As it happens, that’s my normal: My everyday viewing household consists of a large white Republican whose preferred genre in all things is “war”; and a vegetarian, philosophically minded teenage boy who plays Dungeons & Dragons and rows varsity crew. It’s tricky finding a film that we all enjoy, but I’m the most difficult to please. A cinematic narcoleptic, I routinely fall asleep at the movies. My personal rating system is based on how far I’m into the story before I start to doze off. I fell asleep during every installment of the “Iron Man,” “Hunger Games,” and the “Avengers” franchises. By contrast, I stayed awake all the way through both “Star Trek” reboots with Chris Pine as Kirk, unexpectedly loved “Guardians of the Galaxy,” and mostly stayed awake for the James Bond films with Daniel Craig in the lead. (The exception was “Quantum of Solace,” which put me soundly to sleep by the opera scene.)

So while “Mad Max: Fury Road” is on a lot of top 10 lists of 2015 at No. 1, and caps the Rotten Tomatoes list as the top reviewed film of the year, I fell asleep through that one too, though the original “Mad Max” with young Mel is a film I still occasionally rewatch. Yell at me for Not Understanding the Themes or being a silly liberal [unprintable], but here is the thing: the hawkish man and hippie boy in my household agree. They were underwhelmed and kind of bored by “Fury Road.” And by “Star Wars: The Force Awakens,” too. Too many special effects and not enough original story.

Given the disparities between what all three of us loved and what makes the “Best of” and “Top 10” lists, there seems to be a critical gap between what gets aggressively marketed, and what average viewers end up recommending to their friends. I get that there are serious, arty films engineered for critical raves, and then there are the popcorn summer films designed purely to entertain. But “thought-provoking” and “compulsively watchable” don’t have to be mutually exclusive goals.

One film that my household loves is “Snowpiercer.” Technically, it’s a foreign film because it was made in Korea, but the dialogue is mostly in American English, and Captain America and Dame Maggie Smith play the leads, so it scans as a Hollywood film. The point being that I’ve made the men in my household watch indie and foreign-ish films for so long they no longer protest, because they’ve turned out to be engrossing and memorable. (And if they also happen to have Strong Female Protagonists and, hey, maybe a lot of brown people with speaking roles, I don't have a problem with this.) Here are the five films/DVDs of 2015 that made the bipartisan No-Snooze list.

“Spare Parts,” 2015. Underdog stories are always compelling, but this one resonates because it’s based on a true story. Held together with a shoestring budget and looking too much like an afterschool special, the film partakes of the same earnest, MacGyverish ethos of its heroes—a quartet of impoverished, undocumented high school kids who end up competing in an underwater robotics competition against MIT college students. Inform your Trump-loving uncle that Marisa Tomei is one of the leads and don’t tell him anything else. By the end, I’ll bet Uncle Trump will be rooting for those “illegals” to win.

“Cop Car,” 2015. For fans of the TV series “Fargo,” this tense, anti-nostalgia film will help fill the void.
Starring Kevin Bacon, this film gives us a crooked white cop, a pair of boys with too much time on their hands, and an empty rural landscape, and then it puts loaded weapons—a car and a gun—in the hands of children who have no idea what they’re doing with them. “Cop Car” has the feel of a potential classic that didn’t quite reach greatness, but given how politically charged the topics of police corruption and gun control currently are, it’s noteworthy that this film explores both subjects without pushing an agenda, and lets the bleak landscape serve as a metaphor for the flat-lined condition of a dying American dream.

“Nightcrawler,” 2014/DVD 2015. None of us expected to enjoy this one, and we ended up talking about it for days. A neo-noir thriller starring Jake Gyllenhaal as a sociopathic freelance cameraman who starts manipulating crime scenes for better ratings, “Nightcrawler” is a mesmerizing critique of the news industry as well as a subversive tale of How To Succeed at Business by Really, Really Trying. This film will please conservatives, who will find the lamestream media indicted here, as well as liberals who condemn predatory capitalism for having no boundaries, and no morals. The teenager liked it because it was honest about the difficulties of getting a decent job in a crap economy, while de-glamorizing the media.

“The Homesman,” 2014/DVD 2015. Billed as a feminist revisionist western, it’s got Hilary Swank playing a self-sufficient landowner who takes on the task of returning three madwomen to civilization. The film takes risks, and it falls more than once, but its mistakes were really interesting, and the man and the boy didn’t care that it fell apart by the end. Not for the faint of heart, because the violence operates at a human level, and for that reason is truly disturbing. Translation: Men like this film because it refuses to romanticize the difficulties of surviving on the frontier, and women like it because it takes emotional pain seriously.

“What We Do in the Shadows,” released in the US in 2015. I am a fan of vampire cheese and the boy loves “Flight of the Conchords,” so the two of us went in expecting to adore “What We Do in the Shadows,” a New Zealand indie starring “Conchords’” Jemaine Clement. However, we also figured the Stan Smith in our household would hate it, because he enjoys neither vampires nor comedies. Well, we were wrong. Clever, satirical—biting, even!—and very funny, this mockumentary is the film to watch when you want to laugh at everything, including political correctness.

Published on January 01, 2016 15:59

The 9 best film performances of 2015 — that won’t get nominated for anything

Every Oscar race features a number of sure things, safe bets and also-rans—the folks who might have gained traction in some of the awards guilds and smaller ceremonies but won’t hear their names read when the Academy Awards nominations are called out in January. This year, that’s likely to be folks like Kristen Stewart, who was recognized by the New York Film Critics Circle and France’s prestigious César Awards for her great work in “Clouds of Sils Maria,” or SAG nominee Michael Shannon (“99 Homes”)—still a long shot in this year’s race. But what about the also-also-rans: the folks who have no realistic shot to get nominated for an Academy Award for their work—or even much else? They might not give Oscars for “best Bulgarian accent” or “best bearded sexy dance,” but that doesn’t mean these nine actors weren’t giving it their all in 2015.

1. Charlize Theron in “Mad Max: Fury Road”
For all the praise heaped on “Mad Max: Fury Road”—Rotten Tomatoes’ best reviewed film of the year—it’s a damn shame that Charlize Theron’s own superlative work isn’t likely to translate to much awards traction. In “Fury Road,” Theron proved not just that she’s one of the best actors of her generation but that she’s one of our great unsung action heroes. Theron’s one-armed Imperator Furiosa is as instantly iconic and unforgettable as the film she’s in, the 21st century’s answer to Ellen Ripley.

2. Oscar Isaac in “Ex Machina”
While Oscar Isaac’s co-star—the amazing Swedish actress Alicia Vikander—is stealing all the buzz for Alex Garland’s sci-fi thriller, Isaac’s work here shouldn’t be ignored. Isaac lends a simmering brute force to the film as an alcoholic, unpredictable tech CEO who invites an employee (Domhnall Gleeson) to partake in a controlled experiment at his private compound. “Ex Machina” proved Isaac a formidable leading man to watch, after stellar breakout turns in “Drive,” “Inside Llewyn Davis,” and “The Two Faces of January.” You can catch him right now in a little movie called “Star Wars: The Force Awakens.”

3. Rose Byrne in “Spy”
I love Melissa McCarthy. You love Melissa McCarthy. Everyone loves Melissa McCarthy. But I mean it as no impertinence when I say that her “Spy” co-star, Rose Byrne, made me laugh harder than anyone else this year. After “Bridesmaids” and “Neighbors,” Paul Feig’s Bond send-up further proved that Byrne has unbelievable comic timing, playing the opulently coarse daughter of a Bulgarian mobster. Rose Byrne is a mile-a-minute put-down machine in this movie, like Triumph the Insult Comic Dog in leopard-print spandex. I could watch Byrne and McCarthy trade barbs for the rest of my life.

4. Cynthia Nixon in “James White”
Did you know that Cynthia Nixon is just a little gold man away from being an EGOT winner? The former “Sex and the City” actress has won awards for her work on that show, as well as her acclaimed role in “Rabbit Hole” on Broadway and the spoken-word companion album to the documentary “An Inconvenient Truth” (which is a thing, apparently). In “James White,” Nixon makes a strong case to be let into the club. In a scene that’s easily the most powerful and heartbreaking of the year, Nixon’s character, who is dying of cancer, listens as her troubled son (Christopher Abbott of “Girls”) envisions a happy ending for them in Paris. She moves for work, and he finds love. Life, however, isn’t always what we dream.

5. Tessa Thompson in “Creed”
While Michael B.
Jordan proves himself a bona fide star and Sylvester Stallone mounts his Oscar comeback, I’d like to see some of that love spread to their co-star, the terrific Tessa Thompson. You might recognize Thompson from “Veronica Mars”—where she played Veronica’s frenemy, Jackie—but 2014 was a breakthrough year for her, with roles in “Selma” and “Dear White People.” In “Creed,” Tessa Thompson plays Bianca, Adonis’ neighbor and later girlfriend, but she’s more than just boxing ring candy. She gives another soulful, layered performance as lived-in as the movie’s Philadelphia setting, and playing an aspiring singer, she proves herself no slouch behind a mic.

6. Joel Edgerton in “The Gift”
Joel Edgerton gained traction early in this year’s Oscar race for his work in “Black Mass,” but his work in “The Gift” proved the true revelation. Edgerton pulls double duty, both directing and starring in this twisty thriller about a man (Jason Bateman), the kid he used to bully in school (Edgerton) and his concerned wife (Rebecca Hall)—who doesn’t know about his past. Instead of the conventional nice-people-stalked-by-deranged-lunatic narrative, this European-tinged nailbiter recalls the best work of Austria’s Michael Haneke. “The Gift” asks difficult questions about the comforts of suburban conformity and its costs, forcing us to ask who really is the bad guy here: the stalker or the man who has abused him for decades?

7. Benicio Del Toro in “Sicario”
There’s a reason that Lionsgate is already discussing a spin-off film based around Benicio Del Toro. In Denis Villeneuve's acclaimed drug war thriller, “Sicario,” Del Toro reminds audiences why he remains one of cinema’s most venerable character actors. He’s a government operative working both sides of the border, whose behavior is perpetually confusing to his co-worker (Emily Blunt). Del Toro’s Alejandro is a cipher for the menace of the drug war itself, and his intense and stoic performance never gives too much away; he’s spellbinding to watch. Blunt is technically the lead, but given that the film’s title is “hitman” in Spanish, there’s never any question about whose movie this is.

8. Parker Posey in “Irrational Man”
The biggest issue with late-period Woody Allen movies—other than their writer/director’s problematic personal history—is that Allen thinks in themes, not characters. In “Irrational Man,” Joaquin Phoenix’s Abe feels less like a college professor than a soapbox for Woody Allen’s own conflicted philosophies. For all the Woodman’s contrivances, though, Parker Posey’s Rita feels like the closest he’s come to an actual person in years. Posey has a knack for playing characters with a hint of darkness (see: her recent stint on “Louie”), and her character—a married colleague who falls in love with Abe—is perpetually frazzled and conflicted but completely absorbing. When she’s onscreen, she brings out the best in Woody Allen, and if he’s at all smart, Allen’s next movie should be the Parker Posey vehicle we deserved ages ago.

9. Julie Walters in “Brooklyn”
John Crowley’s “Brooklyn” features an embarrassment of great, understated performances from its talented ensemble.
What makes “Brooklyn” one of the finest films of the year is that it feels so wonderfully alive and rich with experience, as if you could take up a room in Walters’ boarding house and live there. I would in a second. Also Check Out: 1) Geza Rohrig in “Son of Saul,” 2) Jason Segel in “The End of the Tour,” 3) Blythe Danner in “I’ll See You in My Dreams,” 4) Elizabeth Banks in “Love and Mercy,” and 5) Sarah Paulson in “Carol.”Every Oscar race features a number of sure things, safe bets and also-rans—the folks who might have gained traction in some of the awards guilds and smaller ceremonies but won’t hear their names read when the Academy Awards nominations are called out in January. This year, that’s likely to be folks like Kristen Stewart, who was recognized by the New York Film Critics Circle and France’s prestigious César Awards for her great work in “Clouds of Sils Maria,” or SAG nominee Michael Shannon (“99 Homes”)—still a long shot in this year’s race. But what about the also-also-rans: the folks who have no realistic shot to get nominated for an Academy Award for their work—or even much else? They might not give Oscars for “best Bulgarian accent” or “best bearded sexy dance,” but that doesn’t mean these nine actors weren’t giving it their all in 2015. 1. Charlize Theron in “Mad Max: Fury Road” For all the praise heaped onMad Max: Fury Road”—Rotten Tomatoes’ best reviewed film of the year—it’s a damn shame that Charlize Theron’s own superlative work isn’t likely to translate to much awards traction. In “Fury Road,” Theron proved not just that she’s one of the best actors of her generation but that she’s one of our great unsung action heroes. Theron’s one-armed Imperator Furiosa is as instantly iconic and unforgettable as the film she’s in, the 21st century’s answer to Ellen Ripley. 2. Oscar Isaac in “Ex Machina” While Oscar Isaac’s co-star—the amazing Swedish actress Alicia Vikander—is stealing all the buzz for Alex Garland’s sci-fi thriller, Isaac’s work here shouldn’t be ignored. Isaac lends a simmering brute force to the film as an alcoholic, unpredictable tech CEO who invites an employee (Domhnall Gleeson) to partake in a controlled experiment at his private compound. “Ex Machina” proved Isaac a formidable leading man to watch, after stellar breakout turns in “Drive,” “Inside Llewyn Davis,” and “The Two Faces of January.” You can catch him right now in a little movie called “Star Wars: The Force Awakens.” 3. Rose Byrne in “Spy” I love Melissa McCarthy. You love Melissa McCarthy. Everyone loves Melissa McCarthy. But I mean it as no impertinence when I say that her “Spy” co-star, Rose Byrne, made me laugh harder than anyone else this year. After “Bridesmaids” and “Neighbors,” Paul Feig’s Bond send-up further proved that Byrne has unbelievable comic timing, playing the opulently coarse daughter of a Bulgarian mobster. Rose Byrne is a mile-a-minute put-down machine in this movie, like Triumph the Insult Comic Dog in leopard-print spandex. I could watch Byrne and McCarthy trade barbs for the rest of my life. 4. Cynthia Nixon in “James White” Did you know that Cynthia Nixon is just a little gold man away from being an EGOT winner? The former “Sex and the City” actress has won awards for her work on that show, as well as her acclaimed role in “Rabbit Hole” on Broadway and the spoken-word companion album to the documentary “An Inconvenient Truth” (which is a thing, apparently). In “James White,” Nixon makes a strong case to be let into the club. 
When she’s onscreen, she brings out the best in Woody Allen, and if he’s at all smart, Allen’s next movie should be the Parker Posey vehicle we deserved ages ago. 9. Julie Walters in “Brooklyn” John Crowley’s “Brooklyn” features an embarrassment of great, understated performances from its talented ensemble. Saoirse Ronan, who gives the performance of her young career as an Irish immigrant in 1950s New York, is my personal favorite in this year’s Best Actress Oscar race, but the film’s humane touch extends to nearly every character. The always great Julie Walters (“Billy Elliott,” “Mamma Mia”) particularly shines as Eilis’ (Ronan) boisterously spirited landlady. What makes “Brooklyn” one of the finest films of the year is that it feels so wonderfully alive and rich with experience, as if you could take up a room in Walters’ boarding house and live there. I would in a second. Also Check Out: 1) Geza Rohrig in “Son of Saul,” 2) Jason Segel in “The End of the Tour,” 3) Blythe Danner in “I’ll See You in My Dreams,” 4) Elizabeth Banks in “Love and Mercy,” and 5) Sarah Paulson in “Carol.”

Published on January 01, 2016 15:00

The Chinese home I never knew

Narratively

"It's a totally brand new city. I don’t recognize anything,” my mom says, gazing wide-eyed out of the decades-old tram and into the chaotic streets of Hong Kong. “I never used to take the ding ding. It’s so slow,” she complains as the tram operator rings the bell twice to alert pedestrians nearby — emitting a characteristic sound that gives the tram its Cantonese nickname. “But actually it’s kind of nice. You can take your time to see the scenery around you.”

It’s been 33 years since she moved from Hong Kong to Toronto and four months since I did the opposite. This is how I became my mother’s tour guide in her own hometown.

The crowded streets that are familiar to me are now unrecognizable to her. I guide her through the modern subway system, which was constructed after her departure, and advise her on the best bus routes to take when she meets up with her old school friends. We walk together through Central, the city’s financial district, among 70-story skyscrapers and modern steel structures that my mother has never seen before. I introduce her to some of the great local restaurants I’ve discovered; she uses her Cantonese fluency to order dishes that aren’t on the English menu.

Together, we are the ultimate Hongkonger.

My mom comments on how much my Cantonese has improved while we devour her expertly ordered meal. This food that I’m enjoying and the language I’m eagerly practicing are in stark contrast to the attitudes of my youth.

I attribute my childhood apathy towards my mother’s culture to the whitewashed Toronto suburb I grew up in, where I was one of three Chinese kids in an elementary school of 300. I wanted nothing more than to be just like my white Canadian peers with their Wonder Bread sandwiches and after-school ballet classes. Instead I was forced into weekly piano lessons, which I loathed, and fed traditional Chinese meals of steamed meat and stir-fried vegetables over heaping bowls of rice that I thought were bland.

My mother put me in Chinese school on the weekends to learn her native tongue, but replacing my Saturday morning cartoons with rote repetition of stroke order and Chinese characters instilled in me an aversion to the language. Whenever she spoke to me in Cantonese, I’d understand perfectly but reply in English.

There’s a Cantonese slang phrase to describe people like me: jook sing. The term originates from the knot in a stalk of bamboo that prevents water from flowing from one end to the other. “You look at bamboo. It looks like it should be all hollow, but actually there are parts where it’s closed inside,” my mom explains. “That’s like you. You look like you’re Chinese but you’re not really Chinese because you do not understand the language and the culture.” It’s meant to be derogatory, and it works. My mother and relatives would tease me, my sister and my cousins for being jook sing when we spoke our accented Cantonese or used our chopsticks incorrectly.

Witnessing her children stuck between two cultures must have been frustrating for my mother too. While she moved to Canada to live and raise kids in a western environment, with western values and beliefs, she didn’t know what the norms were. She had only her own childhood experience to draw from. Growing up in poverty, she adopted an eat-or-be-eaten attitude as the second eldest of six siblings. She witnessed how her father’s strong work ethic brought him from low-level garment seamster to factory foreman and, later, top-level manager. My mother was taught by strict Catholic nuns at an all-girls high school, which undoubtedly influenced her own parenting style. The chaotic city life and incredible academic, societal and economic pressures of Hong Kong are what eventually pushed my mom out of the city.

Between the time my mom left in 1976 and the handover of Hong Kong from the British to the Chinese, a slew of Hongkongers fled to the west. I went to high school in a city that was popular with recent Hong Kong immigrants and their children, who became my classmates. I befriended many but I could never really relate, instead seeking distance from them and my mother’s heritage. When I registered for university and renewed my passport I omitted my Chinese middle name on the application forms, thinking it might make me seem less Chinese and more Canadian.

It wasn’t until I was removed from this environment that I recognized the gap in my identity. In my third year of university, I spent a year abroad, studying in Copenhagen and traveling through Europe. I visited parts of the world where the concepts of immigration and visible minorities were virtually unheard of. I once met a group of young Croatian girls at a bar in Dubrovnik’s Old City, celebrating a friend’s eighteenth birthday. They fawned over my foreign appearance, a rarity in Eastern Europe. “China! You’re from China?” one of the girls asked in a tone that wasn’t racist but inquisitively earnest.

“No,” I said, feeling, for the first time ever, ashamed in my eventual reply. “Actually, I’ve never been.”

* * *

Growing up with my mother and among other Chinese immigrants I never once imagined that I might live in Hong Kong one day. But then I graduated from university on the brink of a recession with no job prospects at home. Hong Kong was a natural destination. I easily found work with an English teaching agency and was trained with 60 other expatriates, mostly white and from the UK.

Skills that I snubbed in my adolescence now gave me an edge above the typical new arrival to the city. Over heavily accented Cantonese I negotiated with my landlord for cheaper rent. I enjoyed blending in with the crowd, slipping into clothing stores and shopping for groceries unnoticed, until my accented Cantonese outed me as a foreigner. While my new colleagues fumbled with their chopsticks at group lunches, skeptically eyeing the steaming bamboo baskets of curried cuttlefish, shrimp dumplings and pillowy white buns with mystery fillings, I found comfort in these dishes that were more familiar to me than the strangers I was seated with.

These strangers did eventually become my friends. But over time, as my social networks expanded past the English teachers I first arrived with, I met more and more expatriates like myself — the children of Hong Kong immigrants who were born, raised and educated in the west but settled in Hong Kong for work. While these were the very people I was eager to distance myself from in my youth, in our parents’ homeland I discovered a camaraderie that was missing before. While no one tracks the number of western-born people of Chinese descent living in Hong Kong, data from a 2013 census analyzed by Pacific Prime Hong Kong shows that there are an estimated 301,000 expatriates living in Hong Kong, comprising 4.24 percent of the city’s population. This is a big jump from just 252,000 expatriates living there in 2009.

Hong Kong’s western-born Chinese definitely aren’t considered locals, but we aren’t quite expats either. We navigate the city as in-betweeners, with our varying knowledge of the local customs and norms, the language and food. Yet we retain our western values, among them a sense of individuality and freedom that would be considered a rebellious trait of a traditional family-centric Hongkonger.

Working as an English teacher, I found that my in-between status hindered rather than helped. The schools where I taught English requested an immersive experience for their students, which meant native English speakers. The supervisor of my second teaching job, Pamela Bovitz, instructed me to pretend that I didn’t understand a word of Cantonese.

“She knows what we’re saying!” the kids would say in their native tongue, aware that I was one of them. I feigned confusion and fought the urge to reply back in Cantonese. At the kindergarten where I worked, I was hired to teach these young students English. But through the playground games and classroom conversations I overheard, the students inadvertently helped me improve my second language too. Every morning I listened to them sing their school’s anthem in Cantonese and one day I sang it for my mother. In between fits of laughter, she helped me translate the song and understand the lyrics.

Of the seven English teachers at the kindergarten, two others besides me were Chinese. Bovitz, who has been responsible for hiring native English teachers for the past decade, told me that she gets plenty of resumes from western-born Chinese applicants — more so in the past few years.

After completing my teaching contract at the kindergarten, I was eager for a change but not yet ready to leave Hong Kong. So I enrolled in the University of Hong Kong’s Master of Journalism program in 2010, where I met Stephanie Chan. She moved to Hong Kong from her native Toronto in 2007 to teach middle school at the Canadian International School of Hong Kong. She had trained as a teacher in Canada but looked abroad for work after graduation. “A lot of my friends in teachers’ college were scrambling to find jobs,” Chan recalls. “Many couldn't get permanent teaching jobs in the districts that they wanted. A lot of them just ended up being supply teachers for a couple of years.” Like me, Chan knew very little about Hong Kong, having visited only once before she moved. Her parents were still children when they immigrated to Canada, and they still live in Toronto as my mother does.

Chan is now teaching media studies and journalism at the same Hong Kong international school. Her husband Ricky is also from Canada and of Chinese descent. He was equally drawn to Hong Kong for employment, where he found work with an American firm.

Being not quite local but not quite foreign has been a positive experience for Chan. “I’ve been able to enjoy the Chinese culture while also maintaining the expat identity,” she says. “Hong Kong is such an easy place for a foreigner to live. And even easier as a foreigner who speaks Chinese.”

While Iris Lam was born in Hong Kong, she immigrated to Canada with her family in the late 1980s when she was just a kid. She had the opportunity to return when her employer, Shangri-La Hotels & Resorts, promoted her through a transfer to the company’s headquarters in Hong Kong in 2011. Lam worked there for two years and was then transferred to Shanghai. She returned to Hong Kong in 2014 to work for the Mandarin Oriental Hotel Group.

Compared to Shanghai and Kuala Lumpur, where Lam has also worked, she says that her status as a western-raised Chinese person isn’t as notable in Hong Kong. “I find local Hong Kong people are amongst the most accustomed and accepting of the concept of overseas Chinese,” she explains. “I haven’t found any stigma or special fascination with the way locals interact with me. In fact, I think I’m regarded as somewhat common within Hong Kong society.”

Second-generation reverse migration isn’t exclusive to Hong Kong. I met Sandy Oh through a mutual friend in Hong Kong as she was stopping over on her way to Seoul. Oh’s mother and aunt left South Korea for New York City in the mid 1970s, where they both found better-paying nursing jobs. Oh moved to Seoul in 2008, and like me, easily found work teaching English. Later she became a textbook writer and developed teaching curricula for a private academy. Oh was inspired by her experience working with upper-class teens to pursue a PhD studying Korean international high school students and global education. She has since returned to Seoul three times and is currently conducting long-term fieldwork in the city.

While in Seoul, Oh has had the opportunity to reconnect with middle and high school friends from her native New York. “There are lots of Korean-Americans who come back for stints like me,” she says. Her peers are working in law, finance, research or opening restaurants, but most are English teachers at the same academies where Oh started out. “It’s common knowledge that a lot of people like me — highly educated, grew up in America — come back to Korea these days because even we can’t find jobs.”

* * *

In 2009, my mom retired from a fruitful 35-year nursing career in Canada. With time on her hands and a desire to see how her hometown had changed (and to see me, of course), she celebrated with an extended seven-week holiday in Hong Kong. Her last visit to the city was in 1982.

Over the years, my mother has never been very sentimental about her upbringing. She’s told me bits and pieces, but being in her hometown and seeing certain things for the first time in nearly three decades triggered a flood of memories from her past: a department store where her father used to take the family shopping on the weekends, a spicy Shanghainese noodle dish that she’d always eat after school with her classmates, or the rickety old trams with their characteristic ding ding.

One afternoon during my mother’s stay, I took her to the Hong Kong Museum of History. Part of the museum’s permanent exhibition includes a reconstruction of a room in a post-war tenement building, meant to recreate the rough conditions that Hong Kong’s increasingly poor population lived in at the time.

My mom’s eyes lit up when we turned the corner and the exhibit came into view. “This looks just like our first apartment! I used to sleep on the top bunk with your dai kau fu and yi yi,” my mom said, pointing to a small bunk bed tucked into the corner. “We had to sleep this way,” she added, gesturing her arm width-wise across the bed, slicing her hand down three times to represent the three children who slept there. I imagined my mother there, sleeping in a bed with her siblings like sardines, in a room so iconic in its squalor that it’s now preserved in a history museum.

“The bottom bunk was where your po po and kau fu would sleep,” she continued. “And your gong gong would sleep on a cot.” The bathroom and kitchen were shared with another family of six. My mother’s family lived there for five years before my gong gong, my mother’s father, earned enough to support the growing family’s move to better accommodations and their eventual emigration and establishment in Canada.

On her first trip back, my mom kept busy touring the city with me and catching up with old high school friends she hadn’t seen in decades. Her retired life now allows her the freedom to fly back to Hong Kong for a few months every year. She’s living comfortably, thanks to a sizeable pension plan that she wouldn’t have attained had she worked in Hong Kong, along with savings from her shrewd frugality — a stereotypical trait of Chinese immigrants, which I’m now grateful for having adopted.

My mom carries her western habits with her to Hong Kong when she visits. When the temperature dips in the tropical city, down jackets, hats and mitts come out. But my mother, hardened after countless Canadian winters, keeps her shorts and tees on. When interacting with a waiter or a shopkeeper, she finds herself initiating conversation in English before reverting back to Cantonese. Her friends comment on her sweet tooth, as she’s gotten used to ending every meal with a dessert — a western norm that’s less common in Chinese cuisine. “They call me fan gwai po because I’m acting like a white woman.”

My mom knows they’re teasing, just like how she used to call me jook sing when I fumbled with my Cantonese as a kid. But I know my mom has some jook sing in her too. Those knots developed in her as she found her way in Toronto, adopting Canadian habits and merging them with her Chinese upbringing.

I’m back in Toronto now after four years in Hong Kong. While I thought about what life would be like if I were to stay in Hong Kong permanently, my heart always was, and still is, in Toronto. Life was easier abroad — socially and career-wise. Ironically, I’ve struggled to make friends and find my footing in my hometown, but I’m establishing a fruitful freelance writing career and a good group of friends, old and new. Spending time with my extended family in Toronto is a definite benefit of being back home. I feel proud to converse in my improved Cantonese with the aunts and uncles who used to mock me, and to cook Chinese dishes for my Canadian friends. While I’m grateful for the linguistic and culinary skills I’ve picked up, what I value most from my time away is the appreciation I discovered for my in-between identity. I used to be ashamed of the place that I occupied between two cultures, but now I’m grateful for these knots because I know they’re what tie me together.


Published on January 01, 2016 14:00

After spending 2 years in solitude, I know something about loneliness. So why is New Year’s still so rough?

More than half of Americans over the age of 16 are single, myself included, and New Year’s can be rough on us. We tend to fear the cozily smug parties of the dating and the doting, to flinch at the mass kissing migration at midnight, and, should we muster the self-respect or self-loathing simply to stay home, we have to block our ears against the siren call of doubly shaming romantic comedies—shame on us for watching them, shame on us that no soul mate is racing down a snowy city street or through a crowded airport to declare his or her abiding, charmingly articulated love.

But 4,000 years ago, in its earliest recorded incarnation, New Year’s was likely easier on singles. For the ancient Babylonians of Mesopotamia, New Year’s was a spring holiday, begun on the first full moon following the vernal equinox, and comprised 11 days of religious and political rituals. Parades, a quieter and less pneumatic version of the Macy’s Thanksgiving Day parade, featured statues of the gods, and rites reenacted their victory over the forces of darkness. On the political front, one ritual demanded that the king be stripped of his crown, and slapped and dragged by his ears; if he cried, legend had it, the sky god, Marduk, would be pleased and would extend the king’s rule. The Babylonians’ New Year’s, like ours today, celebrated the passage of time, but it celebrated communal time rather than individual time—time as determined by the interrelatedness of astronomy, agriculture, religion and politics, not time as determined by personal resolutions kept or abandoned, nor by goals, romantic or otherwise, achieved or fallen short of.

Though it may be difficult to appreciate, the one vestige of ancient communal time we still celebrate on New Year’s is the astronomical. Granted, the holiday no longer corresponds to a particular astronomical event like the vernal equinox. (In 46 BC, Julius Caesar, with an unlikely combination of poetry and authority, moved the holiday to Jan. 1, in honor of the two-headed god Janus, who looks both forward and backward.) But what we are still celebrating on New Year’s — beneath the current cultural overlay of holiday sales, top 10 lists, self-improvement resolutions, and a kind of collective couples’ anniversary — is simply another circuit of the earth around the sun. Of course, the Babylonians didn’t recognize a year’s passage as such, but the gist is the same. We’ve made it through another year. The planet is still here. And, thank Marduk, we’re still on it.

Which brings me back to the cultural phenomenon of New Year’s loneliness. No holiday is particularly easy if you’re lacking for love, but the others, at least, come with socially inclusive buffers: religious ceremonies, historical pomp and circumstance, and, for better or worse, family traditions. But New Year’s has become our most existential holiday, one that now marks nothing but the yearly anniversary of a yearly anniversary, nothing but the passage of time itself. So perhaps New Year’s loneliness isn’t just a result of watching too many romantic comedies, or not watching too many couples kissing as the ball drops. Perhaps it’s a glimpse of something existential in scope, something not merely socially constructed.

Here’s where I pause to flash my loneliness credentials, to establish that what I will say next comes from experience. From the fall of 1999 to the late spring of 2001, I lived in solitude in a ramshackle house in the woods of northern Vermont, at the dead end of an unmaintained dirt road. I had no computer, no television and no cellphone. I often went weeks without seeing or speaking to another human being. At dinner, my jaw would be sore as I ate, because my jaw muscles hadn’t been used to speak. Every two weeks or so, if the road was passable, I’d make the drive into the nearby town to buy groceries, saying little more than “thank you” to the cashier, which would feel like a heart-to-heart.

My reasons for living this way aren’t pertinent here, but suffice to say I had a chance to observe my loneliness in slow motion, to sink down through its usually blurry taxonomy, to feel the particular texture of each of its constituent layers. What was most intense initially, but lasted only a week or two, was the loneliness of missing out. Awakened that first fall by the rain hammering on the tarpaper roof, I’d wonder: Where are my friends right now? What have I done in turning my back on them? What memories are being made without me?

This, really, was just a hermit’s version of FOMO (fear of missing out). Indeed, FOMO is so widely recognized (and felt) that it has been used to sell all manner of things, from a search engine in Australia to an app to watch hockey games in the U.S. It’s a loneliness of social anxiety, heightened at New Year’s, one likely rooted in the survival instinct not to be separated from the group; indeed, the fear is less of missing out than of being excluded, even if, as in my case, you’ve chosen to exclude yourself.

The second level of loneliness I encountered, one that deepened as the snow kept falling that first winter, was the fear that I was missing out on my own life, that my own life, somehow, was going on somewhere without me. I was 25 at the time, and imagined this other life like a chalk outline of a crime scene, a certain space of air that was moving through my hometown, and still in a relationship with the woman who’d broken my heart. It was a kind of envy of my former life, a life that no longer seemed possible.

This level of loneliness often strikes singles on New Year’s, too — especially if we’re fresh on the heels of a major breakup or divorce. Whether or not the split is ultimately for the better, we’re stuck with the loneliness not just of the love we’ve lost but also of the narrative we’ve lost, that reliable architecture for understanding our own lives. As New Year’s tends to be the holiday for rereading (aloud or to ourselves) that narrative, singles can experience the disorienting loneliness of feeling that we’re no longer a part of a story at all, at least not one we can be proud of.

By the end of my first winter in the woods, though, I no longer worried about what my friends were doing, no longer feared my life was happening somewhere else. My narrative might have been a strange one, but it was mine — and it offered its own consolations, its own share of beauty. Snowshoeing up into the trees after a nor’easter, the morning penitent and windless, the sky newly blue. Shoveling the roof at night under a full moon, the meadow luminous, as though lit from within. And it was beauty, ultimately, that brought me to the deepest level of loneliness. Which was the impulse, the need, really, to share this beauty — with its joy, its sadness and its mystery — with someone else.

That’s the part of New Year’s loneliness that strikes deepest, I think, the part that corresponds to the essence of the holiday itself. Earth has made another glorious journey around the sun. And we’re still on it. What are we to do with being human?
We’re alone, like everyone else on the planet, with our particular perceptions, with the parts of ourselves we understand and those we don’t. Time is passing. Which is a hard and joyous thing to face communally, like the ancient Babylonians, but a hard and sad thing to face alone. Perhaps New Year’s, more than any other holiday, is a helpful reminder, “Earth’s the right place,” as that curmudgeon Robert Frost wrote, “for love.”

Howard Axelrod is the author of "The Point of Vanishing: A Memoir of Two Years in Solitude."

Published on January 01, 2016 12:30

Best TV of 2015: Our critic’s picks for the 10 best shows of the year

10. “Inside Amy Schumer,” season three. Showrunners: Amy Schumer and Jessi Klein. (Comedy Central) This was the year this little Comedy Central sketch show became must-see TV, thanks to creator and star Amy Schumer’s facility for exposing the hypocrisies of life as a woman in this world. Some of the show’s imaginings, like “Last Fuckable Day,” have become part of our lexicon. Some, like the sketch where Schumer plays a character with three buttholes, are just hilarious. And the season’s third episode, “12 Angry Men Inside Amy Schumer,” is one of the most brilliant episodes of the year. In terms of sheer virality alone, “Inside Amy Schumer” earns a spot on this list.

9. “Mr. Robot,” season one. Showrunner: Sam Esmail. (USA) “Mr. Robot” was beset by an obscure title and a marginal network, but that couldn’t stop the science-fiction/cyberpunk show from finding sudden popularity this summer. That was due mostly to the incredible assurance and totality of vision presented by creator Sam Esmail, who ended up creating a kind of “Fight Club” for 2015, complete with an awful real-world correlation on the day its finale was supposed to air. Ambitious and addictive, “Mr. Robot” is a unique and well-crafted journey into corporate structures and cybersecurity, and easily one of the most enjoyable binge-watches of the year.

8. “Fresh Off the Boat,” seasons one and two. Showrunner: Nahnatchka Khan. (ABC) It’s hard to overstate the ambition of this unassuming little family sitcom, which tells the story of a Taiwanese-American family that moves to suburban Orlando in 1995. What makes the sitcom so brilliant is how prickly and complicated the Huang family is, headed by Jessica (Constance Wu) and Louis (Randall Park). They’re not cookie-cutter, but they’re not entirely idiosyncratic, either. And at its core is a lot of pain—the immigrant experience, the struggle to make ends meet with a new business, and that actual thing called structural racism, which always finds a new way to rear its ugly head. The sitcom finds ways to transmute the pain into humor, and in some ways, its failures are as compelling as its successes; the show’s roughest edges still invite warmth and intimacy. Plus, an embrace of ’90s nostalgia that includes Ol’ Dirty Bastard, Mr. T and Air Jordans.

7. “Penny Dreadful,” season two. Showrunner: John Logan. (Showtime) I’m a sucker for ambitious worlds, and John Logan’s demimonde in “Penny Dreadful” always finds a way to exceed expectations. In addition to swirling fact, fantasy, history and myth together in one heady cocktail, “Penny Dreadful” has incredible performances—from Eva Green, Harry Treadaway, and Billie Piper, especially. Above all, it feels like you’ve stepped into the Victorian Gothic mindset, complete with doubts, superstitions and a pure terror of eternal damnation.

6. “Mad Men,” season seven. Showrunner: Matthew Weiner. (AMC) There is obviously a place on every list for “Mad Men,” a show that changed television when it debuted just seven seasons ago. The split seventh season didn’t do the story or the show any favors, but it’s hard to quarrel with episodes like “Time & Life,” which displayed the show’s powers at their considerable height. I have my quibbles with the finale, but I can at least cop to finding it the most thought-provoking of the year. I’m not sure if I’ve completely decided what it means, but I am more than happy to be left pondering such an odd but indelible vision.

5. “The Leftovers,” season two. Showrunner: Damon Lindelof. (HBO) Last year, if you’d told me that “The Leftovers” was going to make my list, I would have laughed all the way to January. And yet here we are, with Damon Lindelof’s post-“Lost” existential fantasy turning out a second season that made the first season retroactively much better. “The Leftovers” works with dreams, spirituality and suffering to create a show that addresses, somehow, the fundamental horrors and mysteries of being a human being. Season two zeroed in on a single strange evening of events in a place colloquially called “Miracle,” and it proved to be enough to blow the show’s potential wide open.

4. “Black-ish,” seasons one and two. Showrunner: Kenya Barris. (ABC) When people ask me for television recommendations, the show I near-universally recommend is “Black-ish.” Of the shows that focus on families of color, it’s “Black-ish” that finds the most elegant balance between humor, identity and universal constants. The Johnsons are an incredible and almost elastic family, able to withstand a lot of pressure without losing their shape. That elasticity gives them the ability to walk into (or push each other into) all kinds of situations—from gun control to heart failure to church, depending on the order of the day.

3. “Show Me a Hero,” miniseries. Written by David Simon and William F. Zorzi, based on the book by Lisa Belkin. Directed by Paul Haggis. (HBO) “Show me a hero, and I’ll write you a tragedy,” F. Scott Fitzgerald wrote. This HBO miniseries tells the story of Yonkers mayor Nick Wasicsko, who unexpectedly became the spokesperson for a national conversation on desegregating American neighborhoods. In the capable hands of Simon and Zorzi (who adapted Belkin’s book), a bureaucratic nightmare is turned into human drama, as a neighborhood struggles to grow up a little.

2. “BoJack Horseman,” season two. Showrunner: Raphael Bob-Waksberg. (Netflix) Watching “BoJack Horseman” is an experience quite unlike any other. This completely bizarre animated series took a while, in its first season, to find its voice; by season two, though, it became a surreal meditation on Hollywood, depression and aging, told through the mouth, literally, of a talking horse (voiced by Will Arnett). Perhaps because it’s so disarmingly quirky, the emotional impact of “BoJack Horseman” hit me like a ton of bricks—the show manages to say something profound about humanity by making a lot of very silly jokes about animals.

1. “Transparent,” season two. Showrunner: Jill Soloway. (Amazon Studios) In addition to straight-up changing the national conversation on transgender awareness, Jill Soloway’s “Transparent” has made a strong attempt to invent the female gaze and, in the process, created the most affecting season of television I’ve watched this year. It’s some kind of magic. And my guess is that Soloway, and “Transparent,” are nowhere close to done dazzling us.

Published on January 01, 2016 11:00