Lily Salter's Blog, page 979

October 17, 2015

It’s far worse than it sounds: Climate change is making our winters shorter

If winter comes… spring’s going to be closer-than-usual behind. New research shows that as a result of rising temperatures caused by global climate change, the first leaves and buds of spring will begin arriving at least three weeks ahead of time in the United States. Researchers at the University of Wisconsin-Madison examined the variations and trends in the onset of spring across the Northern Hemisphere’s temperate regions and calculated that the onset of spring plant growth will shift by a median of three weeks earlier over the next century. Their findings were published in the journal Environmental Research Letters yesterday. The prospect of shorter winters probably seems like great news for those who have suffered through several long and freezing cold spells recently, but it’s definitely bad tidings for many spring plants and for the animals that rely on them for sustenance. The change will have a far-reaching impact on the growing season of plants and could, as a result, affect our food security as well. In the natural world, plant and animal life cycles are adapted to the seasons and to resource availability. For instance, some migratory birds cope with winter conditions, when food is scarce, by migrating to a warmer climate. Others, like bears and squirrels, do the same by hibernating. Both migration and hibernation are sensitive to seasonal changes. So when the climate goes out of whack — or, as scientists would say, when there’s a “phenological mismatch” — it means trouble. (There have already been reports about warmer weather driving bears out of hibernation early this year.) “Our projections show that winter will be shorter — which sounds great for those of us in Wisconsin,” Andrew Allstadt, one of the authors of the study, said in a statement. “But long-distance migratory birds, for example, time their migration based on day length in their winter range.
They may arrive in their breeding ground to find that the plant resources that they require are already gone.” The researchers observed that the Pacific Northwest and mountainous regions of the western US would see the biggest shift — they estimate spring will arrive nearly a month early in these regions by 2100. The shift will be smallest in the southern regions, where spring already arrives early. The researchers also predicted that there would be fewer “false springs” — when freezing temperatures return unexpectedly after spring plant growth has begun — across most of the country, except in certain parts of the western Great Plains, which are projected to see an increase in this particular weather phenomenon. “This is important, as false springs can damage plant production cycles in natural and agricultural systems,” Allstadt said. “In some cases, an entire crop can be lost.” The researchers now plan to expand their inquiry to extreme weather events such as droughts and heat waves. “We are particularly interested in how these affect bird populations in wildlife refuges,” Allstadt said.


The Pope wanted Hitler dead: The secret story of the Vatican’s war to kill the Nazi despot

At six in the morning on Sunday, 12 March, a procession snaked toward the bronze doors of St. Peter’s. Swiss Guards led the line, followed by barefoot friars with belts of rope. Pius took his place at the end, borne on a portable throne. Ostrich plumes stirred silently to either side, like quotation marks. Pius entered the basilica to a blare of silver trumpets and a burst of applause. Through pillars of incense he blessed the faces. At the High Altar, attendants placed on his shoulders a wool strip interwoven with black crosses. Outside, police pushed back the crowds. People climbed onto ledges and balanced on chimneys, straining to see the palace balcony. At noon Pius emerged. The cardinal deacon stood alongside him. Onto Pacelli’s dark head he lowered a crown of pearls, shaped like a beehive. “Receive this Tiara,” he said, “and know that you are the father of kings, the ruler of the world.” Germany’s ambassador to the Holy See, Diego von Bergen, reportedly said of the ceremony: “Very moving and beautiful, but it will be the last.” * As Pacelli was crowned, Hitler attended a state ceremony in Berlin. In a Memorial Day speech at the State Opera House, Grand Admiral Erich Raeder said, “Wherever we gain a foothold, we will maintain it! Wherever a gap appears, we will bridge it! . . . Germany strikes swiftly and strongly!” Hitler reviewed honor guards, then placed a wreath at a memorial to the Unknown Soldier. That same day, he issued orders for his soldiers to occupy Czechoslovakia. On 15 March the German army entered Prague. Through snow and mist, on ice-bound roads, Hitler followed in his three-axle Mercedes, its bulletproof windows up. Himmler’s gang of 800 SS officers hunted undesirables.
A papal agent cabled Rome, with “details obtained confidentially,” reporting the arrests of all who “had spoken and written against the Third Reich and its Führer.” Soon 487 Czech and Slovak Jesuits landed in prison camps, where it was “a common sight,” one witness said, “to see a priest dressed in rags, exhausted, pulling a car and behind him a youth in SA [Storm Troop] uniform, whip in hand.” Hitler’s seizure of Czechoslovakia put Europe in crisis. He had scrapped his pledge to respect Czech integrity, made at Munich six months before, which British Prime Minister Neville Chamberlain said guaranteed “peace in our time.” Now London condemned “Germany’s attempt to obtain world domination, which it was in the interest of all countries to resist.” Poland’s government, facing a German ultimatum over the disputed Danzig corridor, mobilized its troops. On 18 March, the papal agent in Warsaw reported “a state of tension” between the Reich and Poland “which could have the most serious consequences.” Another intelligence report reaching the Vatican called the situation “desperately grave.” Perhaps no pope in nearly a millennium had taken power amid such general fear. The scene paralleled that in 1073, when Charlemagne’s old empire imploded, and Europe needed only a spark to burn. “Even the election of the pope stood in the shadow of the Swastika,” Nazi labor leader Robert Ley boasted. “I am sure they spoke of nothing else than how to find a candidate for the chair of St. Peter who was more or less up to dealing with Adolf Hitler.” * The political crisis had in fact produced a political pope. Amid the gathering storm, the cardinals had elected the candidate most skilled in statecraft, in the quickest conclave in four centuries. His long career in the papal foreign service made Pacelli the dean of Church diplomats. 
He had hunted on horseback with Prussian generals, endured at dinner parties the rants of exiled kings, faced down armed revolutionaries with just his jeweled cross. As cardinal secretary of state, he had discreetly aligned with friendly states, and won Catholic rights from hostile ones. Useful to every government, a lackey to none, he impressed one German diplomat as “a politician to the high extreme.” Politics were in Pacelli’s blood. His grandfather had been interior minister of the Papal States, a belt of territories bigger than Denmark, which popes had ruled since the Dark Ages. Believing that these lands kept popes politically independent, the Pacellis fought to preserve them against Italian nationalists. The Pacellis lost. By 1870 the pope ruled only Vatican City, a diamond-shaped kingdom the size of a golf course. Born in Rome six years later, raised in the shadow of St. Peter’s, Eugenio Pacelli inherited a highly political sense of mission. As an altar boy, he prayed for the Papal States; in school essays, he protested secular claims; and as pope, he saw politics as religion by other means. Some thought his priestly mixing in politics a contradiction. Pacelli contained many contradictions. He visited more countries, and spoke more languages, than any previous pope—yet remained a homebody, who lived with his mother until he was forty-one. Eager to meet children, unafraid to deal with dictators, he was timid with bishops and priests. He led one of the planet’s most public lives, and one of its loneliest. He was familiar to billions, but his best friend was a goldfinch. He was open with strangers, pensive with friends. His aides could not see into his soul. To some he did not seem “a human being with impulses, emotions, passions”—but others recalled him weeping over the fate of the Jews. One observer found him “pathetic and tremendous,” another “despotic and insecure.” Half of him, it seemed, was always counteracting the other half. 
A dual devotion to piety and politics clove him deeply. No one could call him a mere Machiavellian, a Medici-pope: he said Mass daily, communed with God for hours, reported visions of Jesus and Mary. Visitors remarked on his saintly appearance; one called him “a man like a ray of light.” Yet those who thought Pius not of this world were mistaken. Hyperspirituality, a withdrawal into the sphere of the purely religious, found no favor with him. A US intelligence officer in Rome noted how much time Pius devoted to politics and how closely he supervised all aspects of the Vatican Foreign Office. While writing an encyclical on the Mystical Body of Christ, he was also assessing the likely strategic impact of atomic weapons. He judged them a “useful means of defense.” Even some who liked Pacelli disliked his concern with worldly power. “One is tempted to say that attention to the political is too much,” wrote Jacques Maritain, the French postwar ambassador to the Vatican, “considering the essential role of the Church.” The Church’s essential role, after all, was to save souls. But in practice, the spiritual purpose entailed a temporal one: the achievement of political conditions under which souls could be saved. Priests must baptize, say Mass, and consecrate marriages without interference from the state. A fear of state power structured Church thought: the Caesars had killed Peter and Paul, and Jesus. The pope therefore did not have one role, but two. He had to render to God what was God’s, and keep Caesar at bay. Every pope was in part a politician; some led armies. The papacy Pacelli inherited was as bipolar as he was. He merely encompassed, in compressed form, the existential problem of the Church: how to be a spiritual institution in a physical and highly political world. It was a problem that could not be solved, only managed. 
And if it was a dilemma which had caused twenty centuries of war between Church and state, climaxing just as Pacelli became pope, it was also a quandary that would, on his watch, put Catholicism in conflict with itself. For the tectonic pull of opposing tensions, of spiritual and temporal imperatives, opened a fissure in the foundations of the Church that could not be closed. Ideally, a pope’s spiritual function ought not clash with his political one. But if and when it did, which should take precedence? That was always a difficult question—but never more difficult than during the bloodiest years in history, when Pius the Twelfth would have to choose his answer. * On 1 September 1939, Pius awoke at around 6:00 a.m. at his summer residence, Castel Gandolfo, a medieval fortress straddling a dormant volcano. His housekeeper, Sister Pascalina, had just released his canaries from their cages when the bedside telephone rang. Answering in his usual manner, “E’qui Pacelli” (“Pacelli here”), he heard the trembling voice of Cardinal Luigi Maglione, relaying intelligence from the papal nuncio in Berlin: fifteen minutes earlier, the German Wehrmacht had surged into Poland. At first Pius carried on normally, papally. He shuffled to his private chapel and bent in prayer. Then, after a cold shower and an electric shave, he celebrated Mass, attended by Bavarian nuns. But at breakfast, Sister Pascalina recalled, he probed his rolls and coffee warily, “as if opening a stack of bills in the mail.” He ate little for the next six years. By war’s end, although he stood six feet tall, he would weigh only 125 pounds. 
His nerves frayed from moral and political burdens, he would remind Pascalina of a “famished robin or an overdriven horse.” With the sigh of a great sadness, his undersecretary of state, Domenico Tardini, reflected: “This man, who was peace-loving by temperament, education, and conviction, was to have what might be called a pontificate of war.” In war the Vatican tried to stay neutral. Because the pope represented Catholics in all nations, he had to appear unbiased. Taking sides would compel some Catholics to betray their country, and others their faith. But Poland was special. For centuries, the Poles had been a Catholic bulwark between Protestant Prussia and Orthodox Russia. Pius would recognize the exiled Polish government, not the Nazi protectorate. “Neutrality” described his official stance, not his real one. As he told France’s ambassador when Warsaw fell: “You know which side my sympathies lie. But I cannot say so.” As news of Poland’s agony spread, however, Pius felt compelled to speak. By October, the Vatican had received reports of Jews shot in synagogues and buried in ditches. The Nazis, moreover, were targeting Polish Catholics as well. They would eventually murder perhaps 2.4 million Catholic Poles in “nonmilitary killing operations.” The persecution of Polish Gentiles fell far short of the industrialized genocide visited on Europe’s Jews. But it had near-genocidal traits and prepared the way for what followed. On 20 October Pius issued a public statement. His encyclical Summi Pontificatus, known in English as Darkness over the Earth, began by denouncing attacks on Judaism. 
“Who among ‘the Soldiers of Christ’ does not feel himself incited to a more determined resistance, as he perceives Christ’s enemies wantonly break the Tables of God’s Commandments to substitute other tables and other standards stripped of the ethical content of the Revelation on Sinai?” Even at the cost of “torments or martyrdom,” he wrote, one “must confront such wickedness by saying: ‘Non licet; it is not allowed!’” Pius then stressed the “unity of the human race.” Underscoring that this unity refuted racism, he said he would consecrate bishops of twelve ethnicities in the Vatican crypt. He clinched the point by insisting that “the spirit, the teaching and the work of the Church can never be other than that which the Apostle of the Gentiles preached: ‘there is neither Gentile nor Jew.’” The world judged the work an attack on Nazi Germany. “Pope Condemns Dictators, Treaty Violators, Racism,” the New York Times announced in a front-page banner headline. “The unqualified condemnation which Pope Pius XII heaped on totalitarian, racist and materialistic theories of government in his encyclical Summi Pontificatus caused a profound stir,” the Jewish Telegraphic Agency reported. “Although it had been expected that the Pope would attack ideologies hostile to the Catholic Church, few observers had expected so outspoken a document.” Pius even pledged to speak out again, if necessary. “We owe no greater debt to Our office and to Our time than to testify to the truth,” he wrote. “In the fulfillment of this, Our duty, we shall not let Ourselves be influenced by earthly considerations.” It was a valiant vow, and a vain one. He would not use the word “Jew” in public again until 1945. Allied and Jewish press agencies still hailed him as anti-Nazi during the war. But in time, his silence strained Catholic-Jewish relations, and reduced the moral credibility of the faith. 
Debated into the next century, the causes and meaning of that silence would become the principal enigma in both the biography of Pius and the history of the modern Church. Judging Pius by what he did not say, one could only damn him. With images of piles of skeletal corpses before his eyes; with women and young children compelled, by torture, to kill each other; with millions of innocents caged like criminals, butchered like cattle, and burned like trash—he should have spoken out. He had this duty, not only as pontiff, but as a person. After his first encyclical, he did reissue general distinctions between race-hatred and Christian love. Yet with the ethical coin of the Church, Pius proved frugal; toward what he privately termed “Satanic forces,” he showed public moderation; where no conscience could stay neutral, the Church seemed to be. During the world’s greatest moral crisis, its greatest moral leader seemed at a loss for words. But the Vatican did not work by words alone. By 20 October, when Pius put his name to Summi Pontificatus, he was enmeshed in a war behind the war. Those who later explored the maze of his policies, without a clue to his secret actions, wondered why he seemed so hostile toward Nazism, and then fell so silent. But when his secret acts are mapped, and made to overlay his public words, a stark correlation emerges. The last day during the war when Pius publicly said the word “Jew” is also, in fact, the first day history can document his choice to help kill Adolf Hitler.

Excerpted from "Church of Spies: The Pope's Secret War Against Hitler" by Mark Riebling. Published by Basic Books, a division of the Perseus Book Group. Copyright 2015 by Mark Riebling. Reprinted with permission of the publisher. All rights reserved.
At six in the morning on Sunday, 12 March, a procession snaked toward the bronze doors of St. Peter’s. Swiss Guards led the line, followed by barefoot friars with belts of rope. Pius took his place at the end, borne on a portable throne. Ostrich plumes stirred silently to either side, like quotation marks. Pius entered the basilica to a blare of silver trumpets and a burst of applause. Through pillars of incense he blessed the faces. At the High Altar, attendants placed on his shoulders a wool strip interwoven with black crosses. Outside, police pushed back the crowds. People climbed onto ledges and balanced on chimneys, straining to see the palace balcony. At noon Pius emerged. The cardinal deacon stood alongside him. Onto Pacelli’s dark head he lowered a crown of pearls, shaped like a beehive. “Receive this Tiara,” he said, “and know that you are the father of kings, the ruler of the world.” Germany’s ambassador to the Holy See, Diego von Bergen, reportedly said of the ceremony: “Very moving and beautiful, but it will be the last.”

*

As Pacelli was crowned, Hitler attended a state ceremony in Berlin. In a Memorial Day speech at the State Opera House, Grand Admiral Erich Raeder said, “Wherever we gain a foothold, we will maintain it! Wherever a gap appears, we will bridge it! . . . Germany strikes swiftly and strongly!” Hitler reviewed honor guards, then placed a wreath at a memorial to the Unknown Soldier.
That same day, he issued orders for his soldiers to occupy Czechoslovakia. On 15 March the German army entered Prague. Through snow and mist, on ice-bound roads, Hitler followed in his three-axle Mercedes, its bulletproof windows up. Himmler’s gang of 800 SS officers hunted undesirables. A papal agent cabled Rome, with “details obtained confidentially,” reporting the arrests of all who “had spoken and written against the Third Reich and its Führer.” Soon 487 Czech and Slovak Jesuits landed in prison camps, where it was “a common sight,” one witness said, “to see a priest dressed in rags, exhausted, pulling a car and behind him a youth in SA [Storm Troop] uniform, whip in hand.” Hitler’s seizure of Czechoslovakia put Europe in crisis. He had scrapped his pledge to respect Czech integrity, made at Munich six months before, which British Prime Minister Neville Chamberlain said guaranteed “peace in our time.” Now London condemned “Germany’s attempt to obtain world domination, which it was in the interest of all countries to resist.” Poland’s government, facing a German ultimatum over the disputed Danzig corridor, mobilized its troops. On 18 March, the papal agent in Warsaw reported “a state of tension” between the Reich and Poland “which could have the most serious consequences.” Another intelligence report reaching the Vatican called the situation “desperately grave.” Perhaps no pope in nearly a millennium had taken power amid such general fear. The scene paralleled that in 1073, when Charlemagne’s old empire imploded, and Europe needed only a spark to burn. “Even the election of the pope stood in the shadow of the Swastika,” Nazi labor leader Robert Ley boasted. “I am sure they spoke of nothing else than how to find a candidate for the chair of St. Peter who was more or less up to dealing with Adolf Hitler.”

*

The political crisis had in fact produced a political pope.
Amid the gathering storm, the cardinals had elected the candidate most skilled in statecraft, in the quickest conclave in four centuries. His long career in the papal foreign service made Pacelli the dean of Church diplomats. He had hunted on horseback with Prussian generals, endured at dinner parties the rants of exiled kings, faced down armed revolutionaries with just his jeweled cross. As cardinal secretary of state, he had discreetly aligned with friendly states, and won Catholic rights from hostile ones. Useful to every government, a lackey to none, he impressed one German diplomat as “a politician to the high extreme.” Politics were in Pacelli’s blood. His grandfather had been interior minister of the Papal States, a belt of territories bigger than Denmark, which popes had ruled since the Dark Ages. Believing that these lands kept popes politically independent, the Pacellis fought to preserve them against Italian nationalists. The Pacellis lost. By 1870 the pope ruled only Vatican City, a diamond-shaped kingdom the size of a golf course. Born in Rome six years later, raised in the shadow of St. Peter’s, Eugenio Pacelli inherited a highly political sense of mission. As an altar boy, he prayed for the Papal States; in school essays, he protested secular claims; and as pope, he saw politics as religion by other means. Some thought his priestly mixing in politics a contradiction. Pacelli contained many contradictions. He visited more countries, and spoke more languages, than any previous pope—yet remained a homebody, who lived with his mother until he was forty-one. Eager to meet children, unafraid to deal with dictators, he was timid with bishops and priests. He led one of the planet’s most public lives, and one of its loneliest. He was familiar to billions, but his best friend was a goldfinch. He was open with strangers, pensive with friends. His aides could not see into his soul. 
To some he did not seem “a human being with impulses, emotions, passions”—but others recalled him weeping over the fate of the Jews. One observer found him “pathetic and tremendous,” another “despotic and insecure.” Half of him, it seemed, was always counteracting the other half. A dual devotion to piety and politics clove him deeply. No one could call him a mere Machiavellian, a Medici-pope: he said Mass daily, communed with God for hours, reported visions of Jesus and Mary. Visitors remarked on his saintly appearance; one called him “a man like a ray of light.” Yet those who thought Pius not of this world were mistaken. Hyperspirituality, a withdrawal into the sphere of the purely religious, found no favor with him. A US intelligence officer in Rome noted how much time Pius devoted to politics and how closely he supervised all aspects of the Vatican Foreign Office. While writing an encyclical on the Mystical Body of Christ, he was also assessing the likely strategic impact of atomic weapons. He judged them a “useful means of defense.” Even some who liked Pacelli disliked his concern with worldly power. “One is tempted to say that attention to the political is too much,” wrote Jacques Maritain, the French postwar ambassador to the Vatican, “considering the essential role of the Church.” The Church’s essential role, after all, was to save souls. But in practice, the spiritual purpose entailed a temporal one: the achievement of political conditions under which souls could be saved. Priests must baptize, say Mass, and consecrate marriages without interference from the state. A fear of state power structured Church thought: the Caesars had killed Peter and Paul, and Jesus. The pope therefore did not have one role, but two. He had to render to God what was God’s, and keep Caesar at bay. Every pope was in part a politician; some led armies. The papacy Pacelli inherited was as bipolar as he was. 
He merely encompassed, in compressed form, the existential problem of the Church: how to be a spiritual institution in a physical and highly political world. It was a problem that could not be solved, only managed. And if it was a dilemma which had caused twenty centuries of war between Church and state, climaxing just as Pacelli became pope, it was also a quandary that would, on his watch, put Catholicism in conflict with itself. For the tectonic pull of opposing tensions, of spiritual and temporal imperatives, opened a fissure in the foundations of the Church that could not be closed. Ideally, a pope’s spiritual function ought not clash with his political one. But if and when it did, which should take precedence? That was always a difficult question—but never more difficult than during the bloodiest years in history, when Pius the Twelfth would have to choose his answer. * On 1 September 1939, Pius awoke at around 6:00 a.m. at his summer residence, Castel Gandolfo, a medieval fortress straddling a dormant volcano. His housekeeper, Sister Pascalina, had just released his canaries from their cages when the bedside telephone rang. Answering in his usual manner, “E’qui Pacelli” (“Pacelli here”), he heard the trembling voice of Cardinal Luigi Maglione, relaying intelligence from the papal nuncio in Berlin: fifteen minutes earlier, the German Wehrmacht had surged into Poland. At first Pius carried on normally, papally. He shuffled to his private chapel and bent in prayer. Then, after a cold shower and an electric shave, he celebrated Mass, attended by Bavarian nuns. But at breakfast, Sister Pascalina recalled, he probed his rolls and coffee warily, “as if opening a stack of bills in the mail.” He ate little for the next six years. By war’s end, although he stood six feet tall, he would weigh only 125 pounds. 
His nerves frayed from moral and political burdens, he would remind Pascalina of a “famished robin or an overdriven horse.” With the sigh of a great sadness, his undersecretary of state, Domenico Tardini, reflected: “This man, who was peace-loving by temperament, education, and conviction, was to have what might be called a pontificate of war.” In war the Vatican tried to stay neutral. Because the pope represented Catholics in all nations, he had to appear unbiased. Taking sides would compel some Catholics to betray their country, and others their faith. But Poland was special. For centuries, the Poles had been a Catholic bulwark between Protestant Prussia and Orthodox Russia. Pius would recognize the exiled Polish government, not the Nazi protectorate. “Neutrality” described his official stance, not his real one. As he told France’s ambassador when Warsaw fell: “You know on which side my sympathies lie. But I cannot say so.” As news of Poland’s agony spread, however, Pius felt compelled to speak.
“Who among ‘the Soldiers of Christ’ does not feel himself incited to a more determined resistance, as he perceives Christ’s enemies wantonly break the Tables of God’s Commandments to substitute other tables and other standards stripped of the ethical content of the Revelation on Sinai?” Even at the cost of “torments or martyrdom,” he wrote, one “must confront such wickedness by saying: ‘Non licet; it is not allowed!’” Pius then stressed the “unity of the human race.” Underscoring that this unity refuted racism, he said he would consecrate bishops of twelve ethnicities in the Vatican crypt. He clinched the point by insisting that “the spirit, the teaching and the work of the Church can never be other than that which the Apostle of the Gentiles preached: ‘there is neither Gentile nor Jew.’” The world judged the work an attack on Nazi Germany. “Pope Condemns Dictators, Treaty Violators, Racism,” the New York Times announced in a front-page banner headline. “The unqualified condemnation which Pope Pius XII heaped on totalitarian, racist and materialistic theories of government in his encyclical Summi Pontificatus caused a profound stir,” the Jewish Telegraphic Agency reported. “Although it had been expected that the Pope would attack ideologies hostile to the Catholic Church, few observers had expected so outspoken a document.” Pius even pledged to speak out again, if necessary. “We owe no greater debt to Our office and to Our time than to testify to the truth,” he wrote. “In the fulfillment of this, Our duty, we shall not let Ourselves be influenced by earthly considerations.” It was a valiant vow, and a vain one. He would not use the word “Jew” in public again until 1945. Allied and Jewish press agencies still hailed him as anti-Nazi during the war. But in time, his silence strained Catholic-Jewish relations, and reduced the moral credibility of the faith. 
Debated into the next century, the causes and meaning of that silence would become the principal enigma in both the biography of Pius and the history of the modern Church. Judging Pius by what he did not say, one could only damn him. With images of piles of skeletal corpses before his eyes; with women and young children compelled, by torture, to kill each other; with millions of innocents caged like criminals, butchered like cattle, and burned like trash—he should have spoken out. He had this duty, not only as pontiff, but as a person. After his first encyclical, he did reissue general distinctions between race-hatred and Christian love. Yet with the ethical coin of the Church, Pius proved frugal; toward what he privately termed “Satanic forces,” he showed public moderation; where no conscience could stay neutral, the Church seemed to be. During the world’s greatest moral crisis, its greatest moral leader seemed at a loss for words. But the Vatican did not work by words alone. By 20 October, when Pius put his name to Summi Pontificatus, he was enmeshed in a war behind the war. Those who later explored the maze of his policies, without a clue to his secret actions, wondered why he seemed so hostile toward Nazism, and then fell so silent. But when his secret acts are mapped, and made to overlay his public words, a stark correlation emerges. The last day during the war when Pius publicly said the word “Jew” is also, in fact, the first day history can document his choice to help kill Adolf Hitler. Excerpted from "Church of Spies: The Pope's Secret War Against Hitler" by Mark Riebling. Published by Basic Books, a division of the Perseus Book Group. Copyright 2015 by Mark Riebling. Reprinted with permission of the publisher. All rights reserved.

Published on October 17, 2015 11:00

5 things that could change American minds about socialism

Global Post: If Bernie Sanders were to win the Democratic presidential nomination, his chances of actually making it to the White House would be somewhere between zero and nothing.

That, at least, is the view of some political observers. One of the reasons for their pessimism is Sanders’ political ideology: He’s a self-described "Democratic Socialist."

And the S-word frightens a lot of Americans.

A Pew Research Center survey conducted in December 2011, shortly after the Occupy Wall Street protests, which highlighted the growing wealth gap between the rich and the poor, found half of all Americans still had a positive view of capitalism, while 60 percent had a negative perception of socialism.

“Socialism is a far more divisive word (than capitalism), with wide differences of opinion along racial, generational, socioeconomic and political lines,” Pew said.

“Fully nine-in-ten conservative Republicans (90 percent) view socialism negatively, while nearly six-in-ten liberal Democrats (59 percent) react positively. Low-income Americans are twice as likely as higher-income Americans to offer a positive assessment of socialism (43 percent among those with incomes under $30,000, 22 percent among those earning $75,000 or more).”

A Gallup survey this summer found similar anti-socialist views among American voters, half of whom said they wouldn't vote for a socialist candidate.

It's not hard to see why this is. For many Americans the word "socialism" still carries the associations with authoritarianism that it acquired during the Cold War. That explains why some opponents of Obama's Affordable Care Act were calling it the same thing Ronald Reagan called Medicare in 1961: "socialized medicine." Combine those negative Cold War associations with the fact that a significant portion of the American electorate wants to shrink government, limit spending, and cut taxes, and you realize that Bernie Sanders has his work cut out for him if he's going to proudly wave the socialist flag.

One thing Sanders has on his side: social welfare policies enacted overseas in nations that consistently rank more highly than the United States in terms of happiness and prosperity. If Sanders can convince American voters that this is what he's talking about when he talks about "socialism," maybe he'll have a shot.

Here are five things other countries do that could change American minds about socialism.

Free baby stuff

Since the 1930s, the Finnish government has been issuing pregnant women with a cardboard box filled with the sort of stuff they will need when their baby is born: clothes, nappies, toys, sheets, blankets and a mattress. Babies often end up sleeping in the same box, which has been credited with the country's low infant mortality rate of two deaths per 1,000 live births in 2014 — one-third of the rate in the United States.

More than a year of paid parental leave

Sweden has, hands down, one of the best parental leave systems in the world. Parents are allowed to take 480 days, or 16 months, of paid leave to look after their children — biological or adopted. The leave only expires when the child is eight years old. If that wasn’t generous enough, the Swedish government wants to force new dads to take a minimum of three months paid leave.

Generous unemployment benefits

No job? No worries, at least if you are in Denmark and worked 52 weeks in the previous three years. Unemployed Danes are entitled to 90 percent of their average earnings. Despite the extremely generous allowance, the Danish unemployment rate was a seasonally adjusted 6.3 percent in August, one of the lowest in the European Union.

Free healthcare for everyone

Planning to have a baby? Go to France. The country’s universal health care system has long been lauded as one of the best in the world. It uses public and private funding to cover pretty much everyone, including the unemployed and undocumented immigrants applying for residency.

And it’s particularly generous to expecting mothers, as American Claire Lundberg found out when she got pregnant while living in Paris.

“From the sixth month of pregnancy to 11 days after a child’s birth, the government covers a woman’s medical expenses in full,” Lundberg wrote in Slate.

“… had I managed to book a bed in one of the public wards (of a hospital), my birth would have been completely free, paid for entirely by the government’s Assurance Maladie.”

Long holidays (that are paid)

Austrians get a lot of time off every year. A 2013 study by the Center for Economic and Policy Research found Austrians receive 38 statutory paid holiday and vacation days a year, topping a list of 21 rich countries that included 16 European nations, Australia, Canada, Japan, New Zealand and the United States, which ranked last with zero days.


Published on October 17, 2015 10:00

Rathergate and the dark magic of 2004: When the GOP learned how to subvert truth and alter political reality

Most of us, I would assume, have eagerly erased all memories of the dismal 2004 presidential campaign between George W. Bush and John Kerry, and for good reasons. It was the post-9/11 election, with each candidate trying to out-pander and out-patriot the other: Remember Kerry snapping off a salute to the Democratic convention and announcing that he was “reporting for duty”? I hope not. Bush was re-elected -- if we accept the fiction that he had ever been elected in the first place -- even though his war in Iraq went badly south that year and the neocon dream of a Pax Americana across the Middle East was clearly falling apart. Indeed, 2004 gave us the only presidential election in the last six in which the Republican candidate won the popular vote. That campaign lacked the drama of the “hanging chad” stolen election of 2000 or the post-crash Obama landslide of 2008 (although that one also feels like a generation ago). In keeping with the supposed mood of shared trauma and national comity, the two candidates affected not to dislike each other, and expressed their policy differences only in the context of their boundless and unquestioned love for America. But Mary Mapes has not forgotten that election, and neither has Dan Rather. Painful as all of this is to recall, there are reasons we should not forget it either. Mapes was the CBS News producer who put together Rather’s infamous September 2004 report on "60 Minutes" about George W. Bush’s dubious service record in the Texas Air National Guard, a glamorous flyboy unit that played host in the late ’60s and early ’70s to numerous scions of prominent Lone Star State families who were less than eager to serve in Vietnam. 
After the story became the focus of a furious Internet-based right-wing counterattack over the disputed authenticity of two documents, which were not essential to the overall case, Rather was forced to resign as anchor of the "CBS Evening News" and Mapes was fired, along with several other executives and producers. (She never worked in TV journalism again.) When paired with the third-party “Swift Boat ads” attacking John Kerry’s record of actual service in Vietnam (and his actual war injuries), Rathergate announced the advent of a new form of Republican political black magic, which could distort reality so severely that truth became lies and lies became truth. James Vanderbilt’s tendentious docudrama “Truth,” which stars Cate Blanchett as Mapes and Robert Redford as Rather, is not a particularly good movie. It’s full of plucky newsroom banter, colorful Texas coots and craven corporate suits yelling at people. Blanchett uses essentially the same upper-crust American accent that won her an Oscar in “Blue Jasmine” to play Mapes, who grew up in modest circumstances in the Pacific Northwest. Redford, who has never played anything except the “Robert Redford role,” doesn’t look or sound anything like Dan Rather. But along the way, “Truth” makes the point that when right-wing operatives (in this case Karl Rove, Bush’s Gollum-like strategist) discovered that they could topple America’s most venerated newsman over sloppy reporting shortcuts in a story whose basic facts were not in dispute, then or now, a Rubicon had been crossed. Whether Rathergate actually won that election for Bush is debatable; in the great Democratic middle-road milquetoast tradition of Al Gore and Michael Dukakis, the dithery, soporific Kerry campaign did everything it could to ensure defeat. But that whole episode clearly boosted Republican morale and made Rove and his minions feel that they were in control of the narrative and had the power to transform any possible negative into a positive. 
Rathergate was certainly not the first counterfactual counterattack strategy in American political history, and was not quite the birth of the right-wing blogosphere (which had surfaced, in embryonic form, during the Clinton years), but it marked a crucial evolution on both fronts. From 2004 flowed many blessings, at least for those who sought to control political discourse and popular perception and in so doing alter the nature of reality. Suddenly all things were possible, and America became an endless episode of “The X-Files,” brought to you by faceless men in nice suits who are eager to warn you about the ravages of government and the duplicity of the “liberal media.” Every delirious conspiracy theory previously promulgated by retired pharmacists who hand out mimeographed screeds in the mall now became a conduit for sowing widespread bewilderment and disempowerment. Climate-change denialism and birth-certificate trutherism. Death panels. Benghazi and Planned Parenthood. Every single aspect of what was once Tea Party ideology and has now become Republican orthodoxy, especially the upside-down economics in which high taxes (which are at historic lows) and social spending (which has been slashed) are reducing Americans to poverty and misery, rather than, let’s just say, a pointless $4 trillion war or the intensifying concentration of wealth at the very top of the pyramid. In watching “Truth,” I remembered why I hadn’t paid much attention to Rather’s original story or the ensuing scandal at the time. 
My older brother starved himself down to 130 pounds (at 6’3”) in order to fail his medical exam during the Vietnam era; some of his friends fled to Canada or faked mental illness or told the draft examiners they were gay when they weren’t. Mapes and her investigative team dug up a decent amount of documentary and circumstantial evidence that wasn’t especially explosive but tended to support the rumors that had long swirled around Bush’s National Guard years. Bush or his dad, who was then a congressman representing an affluent Houston district, had apparently pulled strings to get him into the unit in the first place, and the young lieutenant had received all sorts of preferential treatment: He skipped a physical exam that might have included a drug test, got a yearlong transfer to an Alabama Guard unit but never showed up there (with no consequences), and finally got an early discharge so he could attend Harvard Business School. News flash: Rich kids have it easy. But Mapes and Rather couldn’t get any of Bush’s superior officers or Guard comrades to go on camera and testify that the future president had been a youthful hell-raiser who had essentially ditched his military commitment but was seen as politically untouchable. That was hardly surprising, but it was also what lured them into Karl Rove’s alternate universe -- and perhaps, according to some theories of the case, into a trap that Rove or Roger Stone or some other GOP Sith lord had laid for them. This gets played out in “Truth” (which is based on Mapes’ memoir) at far more length than it deserves, but the short version is that they got hold of photocopied documents that purported to be memos from Bush’s commanding officer, complaining about his young lieutenant who had gone AWOL but who couldn’t be disciplined for it. Those memos provided exactly the evidence Mapes needed to lock down the story, which might have been her first clue that something was wrong. 
Since her source did not have the originals -- and told bizarre and contradictory stories about who had given him the copies -- document experts could only make guesses as to their provenance and authenticity. Hilarious as this may seem in retrospect, none of the professionals paid by CBS News to examine these documents noticed that the typeface and font size and kerning and margins of military memos supposedly composed on an electric typewriter in the early 1970s precisely matched the default settings in Microsoft Word. If the account in “Truth” is any indication, Mary Mapes still believes those documents were authentic, or perhaps that their content was genuine but the physical copies she had Dan Rather present on “60 Minutes” had been retyped by unknown persons for unknown reasons. Of course I have no idea and maybe that’s so, but the whole Microsoft Word default settings problem is pretty difficult to get past. On one level, I get it: Mapes sees herself as an old-school, award-winning journalist who got railroaded for doing her job, and she’s right about that. She doesn’t want to admit that she got bamboozled by an easily detectable forgery into a dumb mistake that undermined a good muckraking story, sent Dan Rather into early retirement and drastically accelerated the downsizing, fluff-ification and corporate evisceration of the entire TV news business. As the title of this mediocre movie suggests, Mapes -- or at least the heroic version of her scripted by James Vanderbilt and portrayed by Cate Blanchett -- can’t get past the idea that the truth matters, whether it’s the truth about those admittedly puzzling documents, the truth about the long-ago military record of a former president we would all rather forget, or truth as the brilliant white beacon toward which all news reporting strives. 
But the factual accuracy of her “60 Minutes” story wasn’t the point in 2004 and is irrelevant today, when Karl Rove and the Koch brothers and the stewards of the global economy have moved long past such limited conceptions of truth. Truth is not about finding a superscript “th” key on a vintage typewriter (an actual question of fact in Rathergate). It’s about telling a story that people want to believe. The kinds of stories that Mary Mapes and Dan Rather used to tell on CBS News didn’t make Americans feel good, and so they became untrue and were left behind.

Published on October 17, 2015 09:00

Bracing for the Third Intifada: Why violence in Jerusalem signals an ugly future

A wave of violence and reprisals flaring in East Jerusalem and occupied Palestine has both Israelis and Palestinians preparing for the potential of a third Intifada, or Palestinian uprising against the Israeli military occupation. Palestinian Authority president Mahmoud Abbas has uttered the powerful word; Hamas leader Ismail Haniyeh and the head of the second-largest party in Israel, Isaac Herzog, likewise see the violence as an incipient third Intifada. The 1987 and 2000 Intifadas claimed more than 5,000 Palestinian and Israeli lives, most of them civilians on both sides. After years of conflict centered on Gaza, in which brutal and bloody invasions of the Hamas-governed Strip became almost routine while the larger occupied West Bank remained relatively calm, a Palestine-wide rebellion may surprise many who haven’t paid much attention since the second Intifada concluded in 2005. But it’s no longer just the alarmists you follow on Twitter who warn of an Intifada; the most important diplomat in the world, Secretary of State John Kerry, boldly took to Israeli television in 2013 to warn citizens that without a halt to settlement growth, a central precondition of peace talks, Palestinian rebellion might become inevitable. The statement came during an interview broadcast simultaneously on Israel's Channel 2 and the Palestinian Broadcasting Corporation, and saw Kerry uttering the following hypothetical: “I mean, does Israel want a third Intifada?” In language that would be fairly radical for any White House official, Kerry delivered directly into Israeli living rooms a warning about the “increasing isolation of Israel” and a continuation of the “delegitimization of Israel that’s been taking place on an international basis” due to its intransigent land-grabbing in the West Bank. 
“If we do not resolve the question of settlements and the question of who lives where and how and what rights they have,” Kerry continued, “if we don’t end the presence of Israeli soldiers perpetually within the West Bank, then there will be an increasing feeling [among West Bank Palestinians] that if we cannot get peace with a leadership that is committed to nonviolence, you may wind up with leadership that is committed to violence.” American largesse notwithstanding, the government of Benjamin Netanyahu and the voters who returned him to office earlier this year weren’t much fazed by Kerry’s warning, and settlement-building has continued mostly unabated. You might even think Israel is provoking an Intifada. Under Ariel Sharon, Israel removed its relative handful of Gaza settlers at the end of the second Intifada in 2005, only to step up its populating of the West Bank, with the intervening decade seeing the settler population rocket from around 250,000 to more than 350,000, with another 300,000 in Israeli-occupied East Jerusalem. The Israeli population in the West Bank has grown at twice the rate of Israel proper. Earlier this year, the New York Times described settlement growth under Netanyahu as “a march toward permanence,” the approach of a threshold of irreversibility and the effective annexation of the territory. In its latest provocation against Palestine -- and most of the international community that isn’t the U.S. government -- Prime Minister Netanyahu’s government in September sent as its ambassador to the United Nations General Assembly Danny Danon, a true radical who openly discusses the annexation of the West Bank and East Jerusalem, per millennia-old scripture. Israeli newspaper Haaretz said the selection of Danon “throws Israel off the diplomatic cliff” ahead of last month’s meeting of the community of nations. 
According to Israel’s second-largest party, the Zionist Union, the appointment is “another nail in the coffin that Bibi [Netanyahu] is putting in Israel’s foreign relations.” David Horovitz, the founding editor of the Times of Israel, wrote, “Undeniably, now, by the prime minister’s own decree, Danny Danon is the true face of Netanyahu’s Israel.” Danon, whom Horovitz calls the “arch-critic of a two-state solution,” proposed in a 2011 New York Times op-ed that Israel should annex the West Bank and establish an apartheid state. Danon wrote that Netanyahu “should annex the Jewish communities of the West Bank, or as Israelis prefer to refer to our historic heartland, Judea and Samaria,” laying bare the desire of many Israelis to establish the Israel of the Torah, stretching from the Mediterranean to the Jordan River. In Danon’s proposal, the nearly 3 million Palestinians caught in between the settlements “would not have the option to become Israeli citizens, therefore averting the threat to the Jewish and democratic status of Israel by a growing Palestinian population.” “I think we should no longer think of Jewish settlements in the West Bank, but Palestinian settlements in Israel,” Danon said in 2013, while still deputy defense minister. Unlike the democratic one-state outcomes that many observers see as increasingly inevitable in the event of ongoing Israeli expansion, Danon’s single state remains a Jewish Israel -- which would require a good bit more ethnic cleansing in order to ensure that an Israel built on top of Palestinian territory doesn’t include whatever Palestinians are left. One imagines that remaining Palestinians would be Jim Crow-style quasi-citizens or shunted off to Jordan somehow, as Danon has suggested. On the flash point of Jerusalem, divvied among Israelis and Palestinians under international law, both Danon and Netanyahu aggressively assert Israeli rule. 
The conclusion of the Six-Day War in 1967 saw Israel gaining control of the West Bank, and, in particular, East Jerusalem’s Temple Mount, site of the Muslim Dome of the Rock and the al-Aqsa Mosque, the third-holiest site in Islam. Sharon’s visit to al-Aqsa in 2000 is widely regarded as the antagonistic spark that ignited the second Intifada, also called the “Al Aqsa Intifada.” Likewise, the current violence is seen to have emerged out of events at the Temple Mount. After a week of clashes between worshippers and the Israeli security force, the Israeli government last month barred men under 40 from the Muslim holy site. More deadly attacks by Palestinians and trigger-happy Israeli security forces have since continued to bear the mark of an Intifada. Israeli citizens, not combatants, are being targeted in knife attacks, and the response by Israeli security forces is seen as disproportionate by many Palestinians, with President Abbas’ spokesman comparing the police “execution” of an alleged 15-year-old assailant to the killing of 12-year-old Mohammad al-Dura in 2000, an event that helped propel the second uprising. Is this the third Intifada Secretary Kerry predicted? Netanyahu’s response since Kerry’s dire warning has demonstrated a surprising willingness to shrug off such admonitions from Israel’s principal benefactor and sole good friend on the international stage. The occupation has proceeded and expanded without pause, and the ultimate aim of annexation can no longer be denied, with Danon presented to the world community as the diplomatic face of Israel. The “settler”-based nomenclature no longer suffices; this is colonization, illegal by any measure. Who expects docility and acquiescence in the face of colonizers? Kerry surely doesn’t: “I’ve got news for you. Today’s status quo will not be tomorrow’s,” he warned in 2013. “As long as the aspirations of people are held down... the possibilities of violence increase.”

Published on October 17, 2015 08:59

Howard Zinn: “The American Empire has always been a bipartisan project”

With an occupying army waging war in Iraq and Afghanistan, with military bases and corporate bullying in every part of the world, there is hardly a question any more of the existence of an American Empire. Indeed, the once fervent denials have turned into a boastful, unashamed embrace of the idea. However, the very idea that the United States was an empire did not occur to me until after I finished my work as a bombardier with the Eighth Air Force in the Second World War, and came home. Even as I began to have second thoughts about the purity of the "Good War," even after being horrified by Hiroshima and Nagasaki, even after rethinking my own bombing of towns in Europe, I still did not put all that together in the context of an American "Empire." I was conscious, like everyone, of the British Empire and the other imperial powers of Europe, but the United States was not seen in the same way. When, after the war, I went to college under the G.I. Bill of Rights and took courses in U.S. history, I usually found a chapter in the history texts called "The Age of Imperialism." It invariably referred to the Spanish-American War of 1898 and the conquest of the Philippines that followed. It seemed that American imperialism lasted only a relatively few years. There was no overarching view of U.S. expansion that might lead to the idea of a more far-ranging empire -- or period of "imperialism." I recall the classroom map (labeled "Western Expansion") which presented the march across the continent as a natural, almost biological phenomenon. That huge acquisition of land called "The Louisiana Purchase" hinted at nothing but vacant land acquired. There was no sense that this territory had been occupied by hundreds of Indian tribes which would have to be annihilated or forced from their homes -- what we now call "ethnic cleansing" -- so that whites could settle the land, and later railroads could crisscross it, presaging "civilization" and its brutal discontents. 
Neither the discussions of "Jacksonian democracy" in history courses, nor the popular book by Arthur Schlesinger Jr., The Age of Jackson, told me about the "Trail of Tears," the deadly forced march of "the five civilized tribes" westward from Georgia and Alabama across the Mississippi, leaving 4,000 dead in their wake. No treatment of the Civil War mentioned the Sand Creek massacre of hundreds of Indian villagers in Colorado just as "emancipation" was proclaimed for black people by Lincoln's administration. That classroom map also had a section to the south and west labeled "Mexican Cession." This was a handy euphemism for the aggressive war against Mexico in 1846 in which the United States seized half of that country's land, giving us California and the great Southwest. The term "Manifest Destiny," used at that time, soon of course became more universal. On the eve of the Spanish-American War in 1898, the Washington Post saw beyond Cuba: "We are face to face with a strange destiny. The taste of Empire is in the mouth of the people even as the taste of blood in the jungle." The violent march across the continent, and even the invasion of Cuba, appeared to be within a natural sphere of U.S. interest. After all, hadn't the Monroe Doctrine of 1823 declared the Western Hemisphere to be under our protection? But with hardly a pause after Cuba came the invasion of the Philippines, halfway around the world. The word "imperialism" now seemed a fitting one for U.S. actions. Indeed, that long, cruel war -- treated quickly and superficially in the history books -- gave rise to an Anti-Imperialist League, in which William James and Mark Twain were leading figures. But this was not something I learned in university either.

The "Sole Superpower" Comes into View

Reading outside the classroom, however, I began to fit the pieces of history into a larger mosaic. 
What at first had seemed like a purely passive foreign policy in the decade leading up to the First World War now appeared as a succession of violent interventions: the seizure of the Panama Canal zone from Colombia, a naval bombardment of the Mexican coast, the dispatch of the Marines to almost every country in Central America, occupying armies sent to Haiti and the Dominican Republic. As the much-decorated General Smedley Butler, who participated in many of those interventions, wrote later: "I was an errand boy for Wall Street." At the very time I was learning this history -- the years after World War II -- the United States was becoming not just another imperial power, but the world's leading superpower. Determined to maintain and expand its monopoly on nuclear weapons, it was taking over remote islands in the Pacific, forcing the inhabitants to leave, and turning the islands into deadly playgrounds for more atomic tests. In his memoir, "No Place to Hide," Dr. David Bradley, who monitored radiation in those tests, described what was left behind as the testing teams went home: "[R]adioactivity, contamination, the wrecked island of Bikini and its sad-eyed patient exiles." The tests in the Pacific were followed, over the years, by more tests in the deserts of Utah and Nevada, more than a thousand tests in all. When the war in Korea began in 1950, I was still studying history as a graduate student at Columbia University. Nothing in my classes prepared me to understand American policy in Asia. But I was reading I. F. Stone's Weekly. Stone was among the very few journalists who questioned the official justification for sending an army to Korea. It seemed clear to me then that it was not the invasion of South Korea by the North that prompted U.S. intervention, but the desire of the United States to have a firm foothold on the continent of Asia, especially now that the Communists were in power in China. 
Years later, as the covert intervention in Vietnam grew into a massive and brutal military operation, the imperial designs of the United States became yet clearer to me. In 1967, I wrote a little book called "Vietnam: The Logic of Withdrawal." By that time I was heavily involved in the movement against the war. When I read the hundreds of pages of the Pentagon Papers entrusted to me by Daniel Ellsberg, what jumped out at me were the secret memos from the National Security Council. Explaining the U.S. interest in Southeast Asia, they spoke bluntly of the country's motives as a quest for "tin, rubber, oil." Neither the desertions of soldiers in the Mexican War, nor the draft riots of the Civil War, nor the anti-imperialist groups at the turn of the century, nor the strong opposition to World War I -- indeed no antiwar movement in the history of the nation reached the scale of the opposition to the war in Vietnam. At least part of that opposition rested on an understanding that more than Vietnam was at stake, that the brutal war in that tiny country was part of a grander imperial design. Various interventions following the U.S. defeat in Vietnam seemed to reflect the desperate need of the still-reigning superpower -- even after the fall of its powerful rival, the Soviet Union -- to establish its dominance everywhere. Hence the invasion of Grenada in 1983, the bombing assault on Panama in 1989, the first Gulf war of 1991. Was George Bush Sr. heartsick over Saddam Hussein's seizure of Kuwait, or was he using that event as an opportunity to move U.S. power firmly into the coveted oil region of the Middle East? Given the history of the United States, given its obsession with Middle Eastern oil dating from Franklin Roosevelt's 1945 deal with King Abdul Aziz of Saudi Arabia, and the CIA's overthrow of the democratic Mossadeq government in Iran in 1953, it is not hard to decide that question. 
Justifying Empire

The ruthless attacks of September 11th (as the official 9/11 Commission acknowledged) derived from fierce hatred of U.S. expansion in the Middle East and elsewhere. Even before that event, the Defense Department acknowledged, according to Chalmers Johnson's book "The Sorrows of Empire," the existence of more than 700 American military bases outside of the United States. Since that date, with the initiation of a "war on terrorism," many more bases have been established or expanded: in Kyrgyzstan, Afghanistan, the desert of Qatar, the Gulf of Oman, the Horn of Africa, and wherever else a compliant nation could be bribed or coerced. When I was bombing cities in Germany, Hungary, Czechoslovakia, and France in the Second World War, the moral justification was so simple and clear as to be beyond discussion: We were saving the world from the evil of fascism. I was therefore startled to hear from a gunner on another crew -- what we had in common was that we both read books -- that he considered this "an imperialist war." Both sides, he said, were motivated by ambitions of control and conquest. We argued without resolving the issue. Ironically, tragically, not long after our discussion, this fellow was shot down and killed on a mission. In wars, there is always a difference between the motives of the soldiers and the motives of the political leaders who send them into battle. My motive, like that of so many, was innocent of imperial ambition. It was to help defeat fascism and create a more decent world, free of aggression, militarism, and racism. The motive of the U.S. establishment, understood by the aerial gunner I knew, was of a different nature. It was described early in 1941 by Henry Luce, multi-millionaire owner of Time, Life, and Fortune magazines, as the coming of "The American Century." 
The time had arrived, he said, for the United States "to exert upon the world the full impact of our influence, for such purposes as we see fit, and by such means as we see fit." We can hardly ask for a more candid, blunter declaration of imperial design. It has been echoed in recent years by the intellectual handmaidens of the Bush administration, but with assurances that the motive of this "influence" is benign, that the "purposes" -- whether in Luce's formulation or more recent ones -- are noble, that this is an "imperialism lite." As George Bush said in his second inaugural address: "Spreading liberty around the world is the calling of our time." The New York Times called that speech "striking for its idealism." The American Empire has always been a bipartisan project -- Democrats and Republicans have taken turns extending it, extolling it, justifying it. President Woodrow Wilson told graduates of the Naval Academy in 1914 (the year he bombarded Mexico) that the U.S. used "her navy and her army... as the instruments of civilization, not as the instruments of aggression." And Bill Clinton, in 1992, told West Point graduates: "The values you learned here will be able to spread throughout the country and throughout the world." For the people of the United States, and indeed for people all over the world, those claims sooner or later are revealed to be false. The rhetoric, often persuasive on first hearing, soon becomes overwhelmed by horrors that can no longer be concealed: the bloody corpses of Iraq, the torn limbs of American GIs, the millions of families driven from their homes -- in the Middle East and in the Mississippi Delta. Have not the justifications for empire, embedded in our culture, assaulting our good sense -- that war is necessary for security, that expansion is fundamental to civilization -- begun to lose their hold on our minds? 
Have we reached a point in history where we are ready to embrace a new way of living in the world, expanding not our military power, but our humanity?With an occupying army waging war in Iraq and Afghanistan, with military bases and corporate bullying in every part of the world, there is hardly a question any more of the existence of an American Empire. Indeed, the once fervent denials have turned into a boastful, unashamed embrace of the idea. However, the very idea that the United States was an empire did not occur to me until after I finished my work as a bombardier with the Eighth Air Force in the Second World War, and came home. Even as I began to have second thoughts about the purity of the "Good War," even after being horrified by Hiroshima and Nagasaki, even after rethinking my own bombing of towns in Europe, I still did not put all that together in the context of an American "Empire." I was conscious, like everyone, of the British Empire and the other imperial powers of Europe, but the United States was not seen in the same way. When, after the war, I went to college under the G.I. Bill of Rights and took courses in U.S. history, I usually found a chapter in the history texts called "The Age of Imperialism." It invariably referred to the Spanish-American War of 1898 and the conquest of the Philippines that followed. It seemed that American imperialism lasted only a relatively few years. There was no overarching view of U.S. expansion that might lead to the idea of a more far-ranging empire -- or period of "imperialism." I recall the classroom map (labeled "Western Expansion") which presented the march across the continent as a natural, almost biological phenomenon. That huge acquisition of land called "The Louisiana Purchase" hinted at nothing but vacant land acquired. 
There was no sense that this territory had been occupied by hundreds of Indian tribes which would have to be annihilated or forced from their homes -- what we now call "ethnic cleansing" -- so that whites could settle the land, and later railroads could crisscross it, presaging "civilization" and its brutal discontents. Neither the discussions of "Jacksonian democracy" in history courses, nor the popular book by Arthur Schlesinger Jr., The Age of Jackson, told me about the "Trail of Tears," the deadly forced march of "the five civilized tribes" westward from Georgia and Alabama across the Mississippi, leaving 4,000 dead in their wake. No treatment of the Civil War mentioned the Sand Creek massacre of hundreds of Indian villagers in Colorado just as "emancipation" was proclaimed for black people by Lincoln's administration. That classroom map also had a section to the south and west labeled "Mexican Cession." This was a handy euphemism for the aggressive war against Mexico in 1846 in which the United States seized half of that country's land, giving us California and the great Southwest. The term "Manifest Destiny," used at that time, soon of course became more universal. On the eve of the Spanish-American War in 1898, the Washington Post saw beyond Cuba: "We are face to face with a strange destiny. The taste of Empire is in the mouth of the people even as the taste of blood in the jungle." The violent march across the continent, and even the invasion of Cuba, appeared to be within a natural sphere of U.S. interest. After all, hadn't the Monroe Doctrine of 1823 declared the Western Hemisphere to be under our protection? But with hardly a pause after Cuba came the invasion of the Philippines, halfway around the world. The word "imperialism" now seemed a fitting one for U.S. actions.
Indeed, that long, cruel war -- treated quickly and superficially in the history books -- gave rise to an Anti-Imperialist League, in which William James and Mark Twain were leading figures. But this was not something I learned in university either.

The "Sole Superpower" Comes into View

Reading outside the classroom, however, I began to fit the pieces of history into a larger mosaic. What at first had seemed like a purely passive foreign policy in the decade leading up to the First World War now appeared as a succession of violent interventions: the seizure of the Panama Canal zone from Colombia, a naval bombardment of the Mexican coast, the dispatch of the Marines to almost every country in Central America, occupying armies sent to Haiti and the Dominican Republic. As the much-decorated General Smedley Butler, who participated in many of those interventions, wrote later: "I was an errand boy for Wall Street." At the very time I was learning this history -- the years after World War II -- the United States was becoming not just another imperial power, but the world's leading superpower. Determined to maintain and expand its monopoly on nuclear weapons, it was taking over remote islands in the Pacific, forcing the inhabitants to leave, and turning the islands into deadly playgrounds for more atomic tests. In his memoir, "No Place to Hide," Dr. David Bradley, who monitored radiation in those tests, described what was left behind as the testing teams went home: "[R]adioactivity, contamination, the wrecked island of Bikini and its sad-eyed patient exiles." The tests in the Pacific were followed, over the years, by more tests in the deserts of Utah and Nevada, more than a thousand tests in all. When the war in Korea began in 1950, I was still studying history as a graduate student at Columbia University. Nothing in my classes prepared me to understand American policy in Asia. But I was reading I. F. Stone's Weekly.
Stone was among the very few journalists who questioned the official justification for sending an army to Korea. It seemed clear to me then that it was not the invasion of South Korea by the North that prompted U.S. intervention, but the desire of the United States to have a firm foothold on the continent of Asia, especially now that the Communists were in power in China. Years later, as the covert intervention in Vietnam grew into a massive and brutal military operation, the imperial designs of the United States became yet clearer to me. In 1967, I wrote a little book called "Vietnam: The Logic of Withdrawal." By that time I was heavily involved in the movement against the war. When I read the hundreds of pages of the Pentagon Papers entrusted to me by Daniel Ellsberg, what jumped out at me were the secret memos from the National Security Council. Explaining the U.S. interest in Southeast Asia, they spoke bluntly of the country's motives as a quest for "tin, rubber, oil." Neither the desertions of soldiers in the Mexican War, nor the draft riots of the Civil War, nor the anti-imperialist groups at the turn of the century, nor the strong opposition to World War I -- indeed no antiwar movement in the history of the nation reached the scale of the opposition to the war in Vietnam. At least part of that opposition rested on an understanding that more than Vietnam was at stake, that the brutal war in that tiny country was part of a grander imperial design. Various interventions following the U.S. defeat in Vietnam seemed to reflect the desperate need of the still-reigning superpower -- even after the fall of its powerful rival, the Soviet Union -- to establish its dominance everywhere. Hence the invasion of Grenada in 1983, the bombing assault on Panama in 1989, the first Gulf war of 1991. Was George Bush Sr. heartsick over Saddam Hussein's seizure of Kuwait, or was he using that event as an opportunity to move U.S.
power firmly into the coveted oil region of the Middle East? Given the history of the United States, given its obsession with Middle Eastern oil dating from Franklin Roosevelt's 1945 deal with King Abdul Aziz of Saudi Arabia, and the CIA's overthrow of the democratic Mossadeq government in Iran in 1953, it is not hard to decide that question.

Justifying Empire

The ruthless attacks of September 11th (as the official 9/11 Commission acknowledged) derived from fierce hatred of U.S. expansion in the Middle East and elsewhere. Even before that event, the Defense Department acknowledged, according to Chalmers Johnson's book "The Sorrows of Empire," the existence of more than 700 American military bases outside of the United States. Since that date, with the initiation of a "war on terrorism," many more bases have been established or expanded: in Kyrgyzstan, Afghanistan, the desert of Qatar, the Gulf of Oman, the Horn of Africa, and wherever else a compliant nation could be bribed or coerced. When I was bombing cities in Germany, Hungary, Czechoslovakia, and France in the Second World War, the moral justification was so simple and clear as to be beyond discussion: We were saving the world from the evil of fascism. I was therefore startled to hear from a gunner on another crew -- what we had in common was that we both read books -- that he considered this "an imperialist war." Both sides, he said, were motivated by ambitions of control and conquest. We argued without resolving the issue. Ironically, tragically, not long after our discussion, this fellow was shot down and killed on a mission. In wars, there is always a difference between the motives of the soldiers and the motives of the political leaders who send them into battle. My motive, like that of so many, was innocent of imperial ambition. It was to help defeat fascism and create a more decent world, free of aggression, militarism, and racism. The motive of the U.S.
establishment, understood by the aerial gunner I knew, was of a different nature. It was described early in 1941 by Henry Luce, multi-millionaire owner of Time, Life, and Fortune magazines, as the coming of "The American Century." The time had arrived, he said, for the United States "to exert upon the world the full impact of our influence, for such purposes as we see fit, and by such means as we see fit." We can hardly ask for a more candid, blunter declaration of imperial design. It has been echoed in recent years by the intellectual handmaidens of the Bush administration, but with assurances that the motive of this "influence" is benign, that the "purposes" -- whether in Luce's formulation or more recent ones -- are noble, that this is an "imperialism lite." As George Bush said in his second inaugural address: "Spreading liberty around the world is the calling of our time." The New York Times called that speech "striking for its idealism." The American Empire has always been a bipartisan project -- Democrats and Republicans have taken turns extending it, extolling it, justifying it. President Woodrow Wilson told graduates of the Naval Academy in 1914 (the year he bombarded Mexico) that the U.S. used "her navy and her army... as the instruments of civilization, not as the instruments of aggression." And Bill Clinton, in 1993, told West Point graduates: "The values you learned here will be able to spread throughout the country and throughout the world." For the people of the United States, and indeed for people all over the world, those claims sooner or later are revealed to be false. The rhetoric, often persuasive on first hearing, soon becomes overwhelmed by horrors that can no longer be concealed: the bloody corpses of Iraq, the torn limbs of American GIs, the millions of families driven from their homes -- in the Middle East and in the Mississippi Delta.
Have not the justifications for empire, embedded in our culture, assaulting our good sense -- that war is necessary for security, that expansion is fundamental to civilization -- begun to lose their hold on our minds? Have we reached a point in history where we are ready to embrace a new way of living in the world, expanding not our military power, but our humanity?

Published on October 17, 2015 08:00

October 16, 2015

Ian McKellen on playing straight: “Heterosexuality is far too interesting a phenomenon to be ignored”

Sir Ian McKellen has had a pretty terrific year. He received warm accolades for his turn as Sir Arthur Conan Doyle’s famous detective in “Mr. Holmes,” a film that reunited him with his “Gods and Monsters” director, Bill Condon. “Mr. Holmes” has been one of the year’s highest grossing independent films, most likely because of McKellen’s participation. He has those qualities of intelligence, introspection and dry wit that make him perfectly suited to the role. The actor also just finished the second season of “Vicious,” playing the gay, egocentric actor Freddie Thornhill. In the show, Freddie and his longtime, long-suffering partner Stuart (Derek Jacobi) contemplate marriage. The series, which has a quaint, old-fashioned style to it, reunites McKellen with Jacobi, as well as co-stars Frances de la Tour and Marcia Warren, all of whom have worked with McKellen on stage in the past. Their familiarity drives much of the show’s humor. And just this month, McKellen was honored for his career in stage, film and TV with a Lifetime Achievement Award from the Mill Valley Film Festival in Northern California. While at the festival, the actor chatted with Salon about his various projects, being a gay icon and aging.

Sir Ian, you have been performing on stage and in film and on television for over 50 years now. You are now receiving a career achievement award at the Mill Valley Film Festival. Is it all downhill from here? What can you say about the highs and lows of the business?

[Laughs.] I don’t feel much exasperation. It’s been a charm, the last half century. There are things in my life that aren’t as good as my career. I regret only two jobs — and I won’t say which ones — but both were on stage. Film and TV have been very enjoyable. It’s all gone according to plan — not that there was a plan. I’m a happy bunny.
You had a career renaissance on screen in part because of “Gods and Monsters” — and those little films you made, “Lord of the Rings” and “X-Men.” You’re a Royal Shakespeare Company actor. Did you ever expect you’d be making blockbuster CGI spectacles?

No! I did have a plan — get better as an actor rather than get rich and famous. I started in a modest way in the theater. I don’t remember getting jealous of Tom Courtenay, Albert Finney or Alan Bates [who had budding screen careers in the 1960s]. My friends say that’s not true, that I wanted to be in film. It changed with “Richard III,” which I did on stage, and toured in the States. We closed in Los Angeles. I then sent the script for the film version of “Richard III” off, and raised the money to make it, and once that was done and approved, that became a calling card. I was entrusted to work for the camera. That was how I got Bryan Singer to cast me in “Apt Pupil” and later “X-Men.” Peter Jackson may have had some desperation that he couldn’t find anyone else to go to New Zealand for “Lord of the Rings.” I always thought it would be appealing to be in a classic film like “Casablanca.” It’s come about. “Lord of the Rings” finds a new generation of fans, judging by the kids who say hello to me.

You reunited with your “Gods and Monsters” director Bill Condon to make “Mr. Holmes” earlier this year. Did you put a bug in his ear to play Sir Arthur Conan Doyle’s detective? How did that film come about?

He called me out of the blue. There’s a line in “Gods & Monsters” that “It’s the most wonderful thing in the world — making movies and entertaining people.” When I went to New Zealand for “Lord of the Rings,” I’d stop by Los Angeles and stay with Bill [Condon] and his partner Jack. We always said we must find something else to do together, and it came about when he brought me “Mr. Holmes.” Now we’re doing our third film together, a live-action “Beauty and the Beast.” I play Cogsworth, the Clock.
On that same point of old friends, let’s discuss “Vicious.” Your character, Freddie, is a vain actor — obviously a stretch for you. What can you say about working on the sitcom and being so bitchy to Derek Jacobi?

[Laughs.] I know! I had to keep saying to Derek, “I don’t really mean it, darling. I love you.” Initially, I thought, “Do I really want to play someone my own age who is gay and an actor and has a vicious tongue?” Derek said he’d play that part, but then I said I’d stick with Freddie. It was a joy to work with old friends. Frances de la Tour and I did Strindberg’s “Dance of Death” together on stage in London. Marcia Warren was understudying Gertrude when I played Hamlet. Once — and it was only once — she went on and played my mother on stage! We decided not to do a third season of “Vicious.” We wanted to leave on a high note. We never saw Freddie and Stuart in bed together, though. I once asked the studio audience if they thought Freddie and Stuart had a double bed or two singles. Unanimously, they said, “Double.” They were two active people still having sex. Freddie and Stuart may be horrible to each other, but they are survivors and heroic.

You have long been openly gay. Back in the early 1990s, you appeared in classic independent queer cinema such as “Ballad of Little Jo,” “Six Degrees of Separation,” and the LGBT-themed TV films “And the Band Played On” and “Tales of the City.” What can you say about the state of queer cinema, which is changing, but still not completely mainstream?

That’s an interesting question. In San Francisco, I was in conversation with Armistead Maupin, and he talked about coming out. For some gay activists, it is a duty and destiny to eschew the mainstream. I can’t do that. Heterosexuality is far too interesting a phenomenon to be ignored. You can’t play “King Lear” if you are an actor who won’t play straight. I wouldn’t be averse at all to working with gay people on a queer project.
The crew of “Gods and Monsters” was gay, and yet it became a mainstream movie. Some gay people don’t want to get married because they think that’s mainstream. They don’t want to assimilate; they want to be proud of their differences. To each his own, I say. But movies should be about truth, whether they are about gay or straight people, old or young. They will attract my attention as an audience or an actor if the story is well told. It seems that you are working now more than ever. What observations do you have about your career longevity? It’s not just actors — everyone is living longer. It’s wonderful that the industry embraces us and lets us get on the screen. I was just watching “The Second Best Exotic Marigold Hotel,” and I kept waiting for me to come on screen. It’s great that all those actors [Judi Dench, Maggie Smith, etc.] are all still in their prime. I never like to think of myself as an old-fashioned actor; I want to be up to date. If I can do films by and for young people, I feel I’ve found a place in society. I do slip into student films that no one gets to see, but that keeps me fresh and young at heart.Sir Ian McKellen has had a pretty terrific year. He received warm accolades for his turn as Sir Arthur Conan Doyle’s famous detective in “Mr. Holmes,” a film that reunited him with his “Gods and Monsters” director, Bill Condon. “Mr. Holmes” has been one of the year’s highest grossing independent films, most likely because of McKellen’s participation. He has those qualities of intelligence, introspection and dry wit that make him perfectly suited to the role. The actor also just finished the second season of “Vicious,” playing the gay, egocentric actor Freddie Thornhill. In the show, Freddie and his longtime, long-suffering partner Stuart (Derek Jacobi), contemplate marriage. 
The series, which has a quaint, old-fashioned style to it, reunites McKellen with Jacobi, as well as co-stars Frances de la Tour and Marcia Warren, all of whom have worked with McKellen on stage in the past. Their familiarity drives much of the show’s humor. And just this month, McKellen was honored for his career on stage, film and TV with a Lifetime Achievement Award from the Mill Valley Film Festival in Northern California. While at the festival, the actor chatted with Salon about his various projects, being a gay icon and aging.

Sir Ian, you have been performing on stage, in film and on television for over 50 years now, and you are receiving a career achievement award at the Mill Valley Film Festival. Is it all downhill from here? What can you say about the highs and lows of the business?

[Laughs.] I don’t feel much exasperation. It’s been a charm, the last half century. There are things in my life that aren’t as good as my career. I regret only two jobs — and I won’t say which ones — but both were on stage. Film and TV have been very enjoyable. It’s all gone according to plan — not that there was a plan. I’m a happy bunny.

You had a career renaissance on screen in part because of “Gods and Monsters” — and those little films you made, “Lord of the Rings” and “X-Men.” You’re a Royal Shakespeare Company actor. Did you ever expect you’d be making blockbuster CGI spectacles?

No! I did have a plan — get better as an actor rather than get rich and famous. I started in a modest way in the theater. I don’t remember getting jealous of Tom Courtenay, Albert Finney or Alan Bates [who had budding screen careers in the 1960s]. My friends say that’s not true, that I wanted to be in film. It changed with “Richard III,” which I did on stage and toured in the States. We closed in Los Angeles. I then sent off the script for the film version of “Richard III,” raised the money to make it, and once that was done and approved, it became a calling card.
I was entrusted to work for the camera. That was how I got Bryan Singer to cast me in “Apt Pupil” and later “X-Men.” Peter Jackson may have had some desperation that he couldn’t find anyone else to go to New Zealand for “Lord of the Rings.” I always thought it would be appealing to be in a classic film like “Casablanca.” It’s come about. “Lord of the Rings” finds a new generation of fans, judging by the kids who say hello to me.

You reunited with your “Gods and Monsters” director Bill Condon to make “Mr. Holmes” earlier this year. Did you put a bug in his ear to play Sir Arthur Conan Doyle’s detective? How did that film come about?

He called me out of the blue. There’s a line in “Gods and Monsters”: “It’s the most wonderful thing in the world — making movies and entertaining people.” When I went to New Zealand for “Lord of the Rings,” I’d stop by Los Angeles and stay with Bill [Condon] and his partner Jack. We always said we must find something else to do together, and it came about when he brought me “Mr. Holmes.” Now we’re doing our third film together, a live-action “Beauty and the Beast.” I play Cogsworth, the clock.

On that same point of old friends, let’s discuss “Vicious.” Your character, Freddie, is a vain actor — obviously a stretch for you. What can you say about working on the sitcom and being so bitchy to Derek Jacobi?

[Laughs.] I know! I had to keep saying to Derek, “I don’t really mean it, darling. I love you.” Initially, I thought, “Do I really want to play someone my own age who is gay and an actor and has a vicious tongue?” Derek said he’d play that part, but then I said I’d stick with Freddie. It was a joy to work with old friends. Frances de la Tour and I did Strindberg’s “Dance of Death” together on stage in London. Marcia Warren was understudying Gertrude when I played Hamlet. Once — and it was only once — she went on and played my mother on stage! We decided not to do a third season of “Vicious.” We wanted to leave on a high note.
We never saw Freddie and Stuart in bed together, though. I once asked the studio audience if they thought Freddie and Stuart had a double bed or two singles. Unanimously, they said, “Double.” They were two active people still having sex. Freddie and Stuart may be horrible to each other, but they are survivors and heroic.

You have long been openly gay. Back in the early 1990s, you appeared in classic independent queer cinema such as “The Ballad of Little Jo” and “Six Degrees of Separation,” and the LGBT-themed TV films “And the Band Played On” and “Tales of the City.” What can you say about the state of queer cinema, which is changing, but still not completely mainstream?

That’s an interesting question. In San Francisco, I was in conversation with Armistead Maupin, and he talked about coming out. For some gay activists, it is a duty and destiny to eschew the mainstream. I can’t do that. Heterosexuality is far too interesting a phenomenon to be ignored. You can’t play “King Lear” if you are an actor who won’t play straight. I wouldn’t be averse at all to working with gay people on a queer project. The crew of “Gods and Monsters” was gay, and yet it became a mainstream movie. Some gay people don’t want to get married because they think that’s mainstream. They don’t want to assimilate; they want to be proud of their differences. To each his own, I say. But movies should be about truth, whether they are about gay or straight people, old or young. They will attract my attention as an audience member or an actor if the story is well told.

It seems that you are working now more than ever. What observations do you have about your career longevity?

It’s not just actors — everyone is living longer. It’s wonderful that the industry embraces us and lets us get on the screen. I was just watching “The Second Best Exotic Marigold Hotel,” and I kept waiting for me to come on screen. It’s great that all those actors [Judi Dench, Maggie Smith, etc.] are still in their prime.
I never like to think of myself as an old-fashioned actor; I want to be up to date. If I can do films by and for young people, I feel I’ve found a place in society. I do slip into student films that no one gets to see, but that keeps me fresh and young at heart.

Published on October 16, 2015 16:00

My unconventional Texas family: A gay dad and a mom who wears men’s clothes, in the Lone Star State

Not long ago, I called an old friend named Scott: a white, straight, 50-year-old, Texas-born-and-bred, religiously conservative Republican. He is also my former boss and current business partner. I’ve known him for almost ten years, and yet when I called him, I was nervous. “I’m, uh, coming to Houston to do a show. You’re welcome to come, but I’m not sure if it would be your cup of tea,” I told him.

I used to work for Scott when I lived in Houston, and had since moved to Los Angeles where, with his help, I started my own business. And while I had told him about my side gig as a comedian, I did not have the heart to tell him the subject matter of my material, which is about my experience being raised by gay men in Texas during the '90s.

Scott said, in his kind Texan accent, “I’d love to see your show, girl! I’ll be there.” While this was very sweet of him, I knew I now had a lot of explaining to do before he came to see it.

Scott was the first real boss I ever had who believed in me AND gave me health insurance. Before I worked for him, I was carless and still living with my mama. By the time I left for Los Angeles, I had a Honda and a beautiful red sleeper sofa to call my own.

He was a patient and fun person to work for. We saw each other day in and day out for almost three years. We went to lunch once a week and had the occasional Friday afternoon scotch.  I knew his daughter, who was 12 at the time. And though I never met his wife, I felt like I knew her through talking to her on the phone when she called to speak to him. I knew so much about Scott, but Scott knew little about me and he thought it was because I was shy.

I have been considered “shy” most of my life. The fact that I do stand-up comedy bewilders people. I have realized, through doing standup, that I am not “shy," but that there is a lot I hide from the world. Even when I was a little girl, I knew my family was not like other families. I had a mom and dad like everyone else, but found myself telling people, when I got to know them, “My mom is kinda like the dad, and my dad is kinda like the mom.”

My dad came out when I was 11 and my parents divorced. After the divorce, I lived with my mother at first. She was a tomboyish woman. In fact, there were many times when people would see her and think that she was my dad. While my mother was not gay, it was difficult to explain to people that Yes, my dad is gay, and my mom wears men’s clothes. Especially in Texas. Especially in the nineties.

Since 2010 I’ve been more open, telling people on stages at comedy clubs from LA to Paris about my family. So why, in the year 2015, was it so hard to tell Scott, someone I considered a close friend? Why couldn’t I just say, “I’m doing a show where I talk about my dad, who is gay”?

Part of it was the memory of rejections I’d faced when I’d had to “come out” in the past. In high school, I lived with my dad and his partner. When I explained our family life to other people, I’d get a mixture of responses. Some of my friends would say, “That is so cool! I wish my dad were gay.” But there were many times when people would surprise me and say, “That there is an abomination.” Even people who knew and loved my dad became confused and distraught when learning that he and his “roommate” Dale were more than friends.

I worked with Scott in my twenties. At that time, since I didn’t live with my dad, it didn’t seem like a necessary topic to bring up. My dad wasn’t dating anyone, so there was no reason to say, “I hung out with my dad and his partner over the weekend.” But I knew about Scott’s mom. I knew that he took care of her after his dad passed away. I knew when she was sick and he had to take time off work to take her to the hospital. I didn’t bring up anything about my dad, because I wasn’t sure where the conversation would go. I worried I would let something spill that would reveal him.

Many people think it’s not necessary to mention a person’s sexuality in casual conversation. But the act of avoiding it cuts off an entire part of a life and a history. I couldn’t talk to Scott about my dad’s partner, Dale, who was like a second father to me when I was growing up. What would Scott say? Okay, fine or That there is an abomination? I wasn’t worried about losing my job; it was about having someone I loved and respected blindside me by rejecting my family and me to my face.

Speaking to Scott on the phone, I thought, “I’m a grown woman in her 30s. It’s 2015. I care about this person, but it’s time I take a risk and reveal a huge part of who I am. He may reject me, but I’ll be okay.”

I told Scott the nature of the show and that it was about my dad, who was gay. In fact, my show was called “Raised By Gays and Turned Out OK!” Scott’s response was muted. He said, “I’ll see if I can go. Talk to you later.”

Honestly, no response was the best response for me. I was elated to let go of this burden. Glad that he did not say, “Whuuut?” The cat was out of the bag and if he went to the show, fine. If he didn’t go, fine. At least I no longer felt like a liar.

Recently, I’ve thought a lot about what it must have been like for my father to come out. He risked rejection by his family, friends, coworkers and society as a whole. Most of our family, who learned the truth, did reject him, not only for being gay, but for hurting my mother. Ultimately, I think the most difficult part was admitting the truth to himself. He loved my mom, my brother, and me. He wanted to be a traditional family man, and the last thing he wanted was to tear our family apart. Through doing my standup and one person show, I have explored this transition in my family’s life and have found nothing but tremendous empathy and admiration for the life my father had to lead. And I've realized that his coming out didn’t tear us apart; it made us better people to see him living as his true self.

I in no way thought my “coming out” experiences were exactly like his. But I was there with him by his side. When our family rejected him, I felt rejected. When he expressed concern about being “out” at work and possibly losing his job, I was concerned too. When we saw reports in the news of attacks against men in areas with concentrations of gay bars and other gay-owned businesses, I feared for his life. Along with having to come out to the kids at school and being the only person I knew with two dads, I often worried, “If my dad is gay, what does that make me?” I discovered it did not make me different from anyone else in school. My dad's orientation didn't make me different; living in a world that didn't accept his orientation made me different. Through watching my standup, my dad has seen my side of the story and has realized that he was not completely alone.

Many times, children of gay parents are overlooked, lost in the shadows of their parents' enormous struggles. But even my father, who had his own challenges, didn’t have to worry about sharing pictures of his mom and dad for fear that he might be outing his parents as a heterosexual couple. He could be proud of them. As the child of gay parents in an often homophobic world, I would sometimes feel ashamed. But I'd also feel ashamed of my shame. I felt that I had to completely cut out a section of my childhood because if I talked about it, it would bring up so many questions, politically-charged conversations and potentially nasty remarks about people I loved. I feared it would end friendships and kill romances. But not talking about my dad, whom I loved and respected, meant not talking about myself. It felt like a betrayal.

The day of my show in Houston, Scott called and said, “I’m gonna be there! You want me to videotape it?” I declined his offer to tape it because the show was my turn to talk about my family with pride and I wanted Scott’s undivided attention. While performing, I was so happy to see him sitting in the audience listening.

I still don't know how Scott feels about gay rights as a social or political issue. But after it was over I was too filled with joy and relief that he was still there at the end, standing beside me, to care about any of that. He patted me on the shoulder and said, “That was awesome, girl!”

Published on October 16, 2015 16:00

“Elizabethtown” isn’t the worst movie ever: Why Cameron Crowe’s flaming fiasco deserves a second look

“As someone once said, there’s a difference between a failure and a fiasco,” Orlando Bloom’s aloof sneaker designer Drew Baylor muses as he begins to awaken from an eight-year trance that left him estranged from his warm family and bound only to a beautiful co-worker (Jessica Biel). Ten years ago this week, Cameron Crowe’s "Elizabethtown" opened to almost unanimously negative reviews, which haven't abated in the last decade. But it wasn’t just a critical failure (commercially, it actually made a little profit). "Elizabethtown" was a fiasco. While “Manic Pixie Dream Girls” — thinly written, budding earth-mother characters who really only function as tools to help a gifted male lead grow from child to adult (Geena Davis won an Oscar for playing one in 1988’s acclaimed "The Accidental Tourist") — have been around for decades, film critics (and I do not consider myself one) seized on Kirsten Dunst’s awkward, too-perfect flight attendant Claire, and one scribe (the AV Club’s Nathan Rabin, a really good writer who was almost certainly not out to coin a phrase) gave the type its name. Every screenwriter (especially the dudes) has checked their head ever since. It’s amazing Final Draft hasn’t developed a Manic Pixie warning in its updated software. Was Rabin's a fair hit? Certainly. (“I can’t help helping,” Claire confesses at one point.) Dunst’s Claire is a rock boy's dream. She has a steady job, but unwinds in a Maker’s Mark t-shirt, listening to stacks of CDs. She’s true blue, but willing to stay the night in Drew’s hotel, order room service and do the walk of shame. Were Renee Zellweger in "Jerry Maguire" and Kate Hudson in "Almost Famous" also Manic Pixie Dream Girls? If so, nobody called Crowe out for it. Were they simply better at hiding it than Dunst? (Hudson’s Penny Lane, aka Lady Goodman, declares herself a “Band Aid” so that nobody, least of all a critic, can call her anything else.) Or were those films just far superior?
"Jerry Maguire" and "Almost Famous" were Oscar winners, and the former even launched two phrases — “You had me at ‘hello,'” and, of course, “Show me the money!” — into the AFI-approved vernacular, which gives Crowe one up on Rabin as far as immortal phraseology goes. The thing is, I know Jerry Maguire. I’ve met him before. And I have had the gift (maybe the curse, given my bank balance these days) of actually living out "Almost Famous." I got hired by a print music magazine back when they were nearly as thick as fashion magazines, and someone paid me to go on tour with bands. I had health insurance and a 401k option … as a rock writer! Crowe was not only familiar, he was a hero to most of us: the wunderkind who sat with a coked-up David Bowie and earned the trust of the notoriously suspicious Led Zeppelin and the king piss-taker Lester Bangs. He thought up and wrote down the “Moving In Stereo”/Phoebe Cates swimming pool scene. If he’d died in ’83, there’d be no "The Wild Life," first of all, but he’d still be a legend, one of the few rock writers, like Bangs and Nick Kent, who seem more like bands themselves. The thing is, I find my favorite bands more interesting these days when they produce fiascos. This may be a byproduct of getting older, more jaded or wise, depending — but I do not know about "Elizabethtown." One minute I'm cringing (Susan Sarandon’s stand up/tap dance routine at her husband’s memorial comes to mind) and the next I'm rooting for Drew and Claire. The film has its romantic comedy triggers (“Come Pick Me Up,” Ryan Adams’ wearing and tearing signature song, plays over a “getting to know you” montage) but its use of an Elton John song (“My Father’s Gun” from the perfect "Tumbleweed Connection") is so much spookier and more affecting than the now iconic “Tiny Dancer” sequence in "Almost Famous." Drew doesn’t know his family.
He’s been in front of a computer for nearly a decade, and now he’s in the titular Kentucky suburb to dress his dead father in a blue suit and bring him home. He doesn’t even know the old man. As Elton sings, Drew stares into the coffin at embalmed dad (we get a corpse POV) in wonder and disgust. The emotions the "Almost Famous" sequence provokes are uniform — rock and roll is great; life is a trip; roll on, boys and Band Aids. This scene has weight. The song continues as Drew is welcomed by his distant Kentuckian relatives, including a pre-fame Paula Deen and Loudon Wainwright III. A small child feeds a pair of twin, salivating dogs a big, glistening country ham and they tear it to pieces. (Also, did I mention Paula Deen?) Crowe’s mother, as she usually does, appears in there somewhere. It’s Polanski and Fellini gothic and one of the darkest Elton John songs. Who doesn’t know what it’s like to feel like an utter stranger among your kin? In "Almost Famous," while among strangers, Kate Hudson assures Patrick Fugit’s William Miller, the Crowe stand-in, that he is “home” with them. Bullshit — I know because I’ve been there. Home is where the ham is. So if Crowe is a band, "Elizabethtown," whose Rotten Tomatoes and Metacritic numbers are seriously low, could very well be his "Self Portrait," "Bitches Brew" or "Berlin," his "Some Time in New York City," his "Trans," his "Kid A," his "Get Behind Me Satan": albums that threw fans a big, fat curveball when they expected more of the same old moves. Most didn’t swing at it. I didn’t at the time, either. I remember, still besotted by "Almost Famous," railing first against "Vanilla Sky," his follow-up. I was sitting with a gaggle of rock writers, many of them well known at the time (there was a time) in a downtown bar wondering who the hell Crowe thought he was, serving up an art film about virtual reality and cryogenic science with Spiritualized and post-Britpop Radiohead on the soundtrack. This wasn’t post-funeral comfort food.
Tom Cruise, whose star power pushed the film, like "Jerry Maguire," into blockbuster status, spends much of his screen time wearing a mask, or with a deformed face. Today, I realize that "Vanilla Sky" is another incredibly brave move for both Cruise and Crowe (who at that time literally could have done whatever they wanted together and gotten it funded). "Jerry Maguire" with a deformed face! Cameron Diaz as a murderous, spurned lover. Jason Lee with … a beard. Okay, some things can’t be fucked around with. Of the two, I still prefer "Elizabethtown," with all its disorienting and often pitch-black textures (early on, Bloom, Mr. Lord of the Rings heartthrob, intends to stab himself with a self-rigged hybrid of an exercise bike and a serrated knife). It’s streaming on Netflix, and if nothing else, you will witness the birth of another immortal: Alec Baldwin’s Jack Donaghy. A full year before "30 Rock," you can see Baldwin sculpting the clay with his early cameo as “Phil,” the guru of a Nike-like footwear giant, strolling the campus and uttering lines like, “My global environmental watchdog project will have to go. Sweet people. We could have saved the planet.” No disrespect to Tina Fey, but watch these scenes and tell me that’s not Jackie Boy. I am also going to stick my neck out here — as Crowe has shown us, fortune rewards the bold — and say that the film’s final 15 or so minutes, the road trip sequence, is maybe the best thing he’s ever done. It's amazing Drew's legs still work after sitting at his console for so long, but he's persuaded to drive (with his father’s urn strapped into the passenger seat) back home and make stops through the Kentucky they call “God’s country,” as well as Memphis.
It’s magical realism: Bloom and Dunst narrate, and we are supposed to believe that she has timed her series of mix CDs to each quarter-mile perfectly — and coordinated “Pride (In the Name of Love)” to play just as wide-eyed, now fully-liberated Drew scatters parts of his dad in front of the Lorraine Motel, now the National Civil Rights Museum. And you will. Had this sequence alone been released today as a digital short, it probably would have been nominated for an Oscar. But as it is, it’s the last pre-MP3, pre-GPS romantic road trip film sequence, probably for all time. Any attempt to recreate it would have to be a period piece. I say it’s stunning, maybe because I, like Crowe, wish someone would do something like that for me. Or because I haven’t spoken to my own father in nearly as much time as Drew has been away. This isn't a romance between the lost boy and his Manic Pixie Dream Girl; it’s Crowe’s own romance with America, not Hollywood. It’s his Lou Reed-ian chomping on the hand that bred, fed and awarded him (“Success, not greatness, was the only God the entire world served,” Bloom says at one point). Yes, Crowe has been puzzlingly risk-free lately, following this up with the relatively safe "We Bought A Zoo," a documentary on Pearl Jam — who doesn’t love Pearl Jam? Grouches, that’s who — and the romantic comedy "Aloha" (which seemed to disappear overnight, despite its promise of sweet, return-to-form style), but only Woody Allen and occasionally Oliver Stone are better than he is at taking major movie stars and allowing them to get in touch with their inner Nic Cages. Now there’s a guy whose snaky curves — "Vampire's Kiss," "Peggy Sue Got Married" — are more interesting than his by-rote, hit-this fastball runs. And in an age of all-Marvel-everything hits, we need our frustrating, intriguing fiascos.

Published on October 16, 2015 15:59

Fox News gets suckered: 11 outrageous lies by their “terror analyst” who was actually a con man

Surprise: One of Fox News' most popular so-called "terror analysts" was actually a con man. Con artist Wayne Simmons created an elaborate life story. It is fake. He identified as a CIA outside paramilitary special operations officer. He wasn't. He wrote a book claiming he worked in the CIA for 27 years. He didn't. Fox News took him at his word. So did the U.S. government. Simmons worked as a subcontractor for the government multiple times, and was even invited to train at an Army facility. He ended up receiving security clearance and served as an intelligence advisor to senior military personnel overseas. So much for background checks. Simmons' website is chock-full of articles, media appearances, and "patriot pictures." He even has a page devoted to patented credit card technology called HADRiAN he claims to have created with a Delaware company that is actually in New Jersey. To say Wayne Simmons is a suspicious character would be to engage in understatement. The feds could tell. A federal grand jury indicted him on numerous counts of fraud and making false statements. This doesn't undo the damage Simmons has already done, however. For what is even more disturbing than the fact that he deceived the media and the government for this long is the fact that Simmons used his faux-authority to spread ludicrous and jingoist right-wing propaganda. For 13 years, Simmons ceaselessly spewed unsubstantiated opinions on Fox News, under the facade of being a CIA veteran and "national security and terrorism expert." The following are just 11 of the preposterous things Fox favorite Wayne Simmons -- a snake oil salesman who peddles odious lies and fear -- proposed and claimed:

Racial profiling

In 2011, Simmons insisted on Fox News that the U.S. government should racially profile people from Muslim-majority countries. Calling himself a "pro-profiler," Simmons proclaimed, "I am just adamant about profiling, and we need to do it."
When Senator John McCain proposed, in Fox's words, "banning some immigrants from radical countries," Simmons replied, "I think it's a great idea; should have been done years ago."

Muslim paramilitary camps

Simmons claimed on Fox News in early 2015 that there are "at least 19 paramilitary Muslim training facilities in the United States," where Muslims are being trained to carry out terrorist attacks on Americans. As a source, he cited the Islamophobic right-wing propaganda outlet the Clarion Project. While spreading flagrantly false rumors about supposed "no-go zones" in Europe that non-Muslims are not allowed to enter, Simmons warned viewers, "We are in a global war, a global war against Islamic jihad."

Assassinating democratically elected leaders

Simmons called for the U.S. government to assassinate democratically elected Venezuelan President Hugo Chávez in 2005. "If a stray bullet from a hunter in Kentucky should find its way between this guy's eyes, no American should lose any sleep over it," Simmons quipped. Fox News hosts Sean Hannity and Alan Colmes egged him on. "Do you want him dead?" Colmes asked, referring to Chávez -- who was democratically elected numerous times and was, by far, the most popular leader in Venezuela's history. "Absolutely," Simmons replied. "He should have been killed a long time ago... It doesn't matter to me who kills this guy. He's to go." Simmons even went so far, at the nudging of Hannity and Colmes, as to compare the Venezuelan president to Hitler.

Executing 'traitors'

Simmons frequently called for bloodletting on air. On Fox News in 2005, he asserted that American "traitors" should be executed by "firing squad." Fox host Alan Colmes asked, "You want America to have firing squads?" Simmons replied, "You doggone right I do, for traitors, that's absolutely what they should have."

Whistleblowers as 'terrorists'

On Fox's Freedom Watch in 2010, Simmons called the whistleblowing journalism organization WikiLeaks "a terrorist organization."
Simmons accused WikiLeaks of "hiding behind" the First Amendment in order "to come against the national security of the United States." The Fox host characterized WikiLeaks and those who leaked to it as "threats to America." He also introduced the con man by saying, "Wayne Simmons, you are a former intelligence agent of the USA who risked his life for national security and other laudable purposes; you also took an oath to uphold the Constitution."

Hidden WMDs

When weapons of mass destruction weren't found in Iraq, Simmons happily went on Fox in 2007 to claim they could have been hidden in other Middle Eastern countries, namely Syria or Lebanon. Simmons was still riffing on this groundless ruse in 2013. He told Fox News that there was a "very high probability" that the late Iraqi President Saddam Hussein had hidden WMDs in Syria.

Mass surveillance

Simmons always came out to bat for mass surveillance. When, in 2006, the FBI admitted that a supposed terrorist plot to attack NFL stadiums was a "hoax," Simmons claimed on Fox News that this was "the perfect example of the president's Military Commissions Act of 2006 and the NSA terrorist eavesdropping program, how vital they are. Without them, we cannot, as the president wants us to do, pre-emptively strike the terrorists." The other guest on the program at the time, Frank Gaffney, founder and president of the right-wing Center for Security Policy think tank, fearmongered about so-called "Islamofascists," who he insisted "are determined to kill as many of us as they can." In a sigh of relief, Fox News host Neil Cavuto declared, "We dodged a bullet here, or presumably a hoaxed bullet, but still."

'9/11s unabated'

In 2005, on Fox News' The O'Reilly Factor, Simmons claimed that if "the Democrats come into power in the United States and re-employ their vision of defense for this country, we will have 9/11s unabated." "This will absolutely be proven to be fact," Simmons insisted. "That's not maybe," he warned.
Anti-war 'psy-ops'

When Democratic Congressman John Murtha criticized the Iraq War in 2006, Simmons said a "psy-ops, or a psychological operation, in and by itself, can decimate the enemy if it's run correctly. The problem is Murtha's running a psy-op against his own people and against his own military."

Media controlled by al-Qaeda

"The terrorists know that they have the press and they have the ACLU in their pocket," Simmons stated on Fox News in 2005. He maintained that news outlets like The New York Times and the LA Times, along with NGOs like the ACLU, were helping terrorist groups by reporting on or criticizing the U.S. government's illegal torture program.

White House conspiracies

In 2011, Simmons was invited on Fox to discuss U.S. military involvement in Pakistan. Fox introduced Simmons as a "former CIA operative who knows the area well," and asked the con artist to comment on military policy. Naturally, Simmons insisted that the Obama administration -- which for years had conducted a covert drone war there that left thousands of people dead, including hundreds of civilians -- was not being aggressive enough. Conspiracy-theory style, Simmons speculated that the White House had ordered General Petraeus to downplay the threat of terrorism in Pakistan in order to "help soft sell the eventual withdrawal."

Terrorism "experts"

For 13 years, Simmons always came out with guns blazing in defense of conservative causes. There are bound to be countless more examples of the con man giving credence to baseless right-wing myths. Wayne Simmons is a paragon of the fraudulent "terrorism expert." Pulitzer Prize-winning journalist Glenn Greenwald has pointed out that there are essentially no official standards in the U.S. media by which "counter-terrorism" pundits' purported "expertise" is measured; they must simply ignore facts, blame Muslims, and trumpet U.S. propaganda. Simmons fulfills each of these preconditions and more.
The question everyone should now be asking is how many more Wayne Simmonses are out there.

Published on October 16, 2015 14:56