Helen H. Moore's Blog, page 147

March 3, 2018

Loneliness is bad for your health

(Credit: Getty/Mixmike)


Imagine a 65-year-old woman who sees her physician frequently for a variety of aches and pains. She might complain of back pain on one visit, headaches another time, and feeling weak on the next. Each time, her physician does a physical exam and runs the appropriate tests, without finding anything to account for her symptoms. Each time, she leaves the office feeling frustrated that “nothing can be done” for what ails her.


However, if we looked more closely, we’d find out that this patient lost her husband five years earlier and has been living alone since. Her three children all live in other states. Although she dotes on her grandchildren, she sees them only about once a year. She has a few friends that she only sees occasionally. If asked, she would probably tell you that, yes, she is lonely.


This is a common picture in a family physician’s office. These ill-defined symptoms without any clear cause might well be the result of social isolation and boredom. Research shows that people who feel lonely have more health problems, feel worse and perhaps die at an earlier age.


Psychiatry, my specialty, has long known that feelings of all kinds can affect our physical health in profound ways. It seems officials are starting to take that seriously – the United Kingdom now even has a minister for loneliness. And for good reason.


Negative effects


In 2015, researchers from Brigham Young University looked at multiple studies on loneliness and isolation. Their results from several hundred thousand people showed that social isolation resulted in a 50 percent increase in premature death.


Loneliness and social isolation are also associated with increased blood pressure, higher cholesterol levels, depression and, if that weren’t bad enough, decreases in cognitive abilities and Alzheimer’s disease.


Humans evolved to be around others. Long ago, we hunted in small hunter-gatherer groups, where social cohesion could help protect from predators. Being alone without support in the wild is dangerous – and stressful. You’d have to be constantly vigilant for dangers, ready to go into “fight or flight” mode at any time.


Over the short term, stress can be healthy. But in the long term, uncontrolled stress becomes a problem. There’s good evidence that chronic stress elevates levels of a hormone called cortisol in the brain. Cortisol can decrease immune system responses to infections. It might even make neurons in the brain less active and lead to cell death. It contributes to inflammation, which is connected to cardiovascular disease, stroke and hypertension and is probably a cause of depression.


Just like the person long ago in the wild, someone who’s lonely over the long term can experience these cortisol responses. Lonely people are stressed much of the time.


Another hormone named oxytocin seems to play a role in social isolation. In popular media, oxytocin is often referred to as the “love hormone.” This is an overstatement, but oxytocin is involved in relationships and pair bonding. For example, after birth, high oxytocin levels are associated with better mother-infant bonding.


Oxytocin also seems to be linked to reduced stress. For instance, it’s associated with decreases in levels of norepinephrine, the “fight or flight” hormone, as well as decreases in blood pressure and heart rate, much the opposite of chronic cortisol. Oxytocin also seems to decrease activity in the amygdala, a part of the brain that activates whenever there’s a perceived threat.


A little less lonely


So what can we do about all this? There are no real medications to treat loneliness, unless one is also depressed or has high levels of anxiety.


Issues with loneliness seem to be more prevalent in older adults. The AARP found that about 17 percent of older Americans are lonely and/or isolated.


CNN reporter and physician Sanjay Gupta suggests that society should start to view loneliness as another chronic disease. If so, then patients need long-term strategies to manage this problem.


Not surprisingly, the currently recommended treatment revolves around establishing social relationships. For older adults, joining the local senior center is a wonderful way to get involved in activities and meet people. What about volunteering? Senior volunteer programs are always looking for older adults who will deliver meals, do mailings and a variety of other activities. It is surprising how small things can also be helpful.


A simple phone call once a day from an adult child is an opportunity to share things from the day or about grandchildren. Even better, video conferencing via computer is easy and cheap. You can actually talk to and see your children and grandchildren who might be on the other side of the country. Studies in long-term care facilities found that pets can also reduce loneliness.


Because people have to be followed for years in order to determine whether these or other interventions actually counteract the effects of loneliness, little work of this kind has been done as yet. It seems reasonable, though, to think that psychosocial interventions are powerful, since healthy adults rely on these kinds of coping skills.


From a medical perspective, the wise physician will schedule people who seem to primarily be lonely for periodic visits just to talk. In my opinion, this could prevent more unnecessary testing and costly care.



Finally, even if you have a rich set of social contacts, maybe your neighbor who walks by alone on occasion doesn’t. Say hello.


Jed Magen, Associate Professor of Psychiatry, Michigan State University



Published on March 03, 2018 17:00

Don’t say what you “would have done” in a crisis

(Credit: AP/Shutterstock/Salon)


In the wake of the Parkland tragedy last month, the current occupant of the White House — a man who avoided serving his country in Vietnam because his feet hurt — boasted that “I really believe I’d run in there even if I didn’t have a weapon, and I think most of the people in this room would have done that, too.” It’s a good thing he’ll likely never have to prove it. Because what we think when we’re safe and cozy and what we actually do when the rubber hits the road are, for a great many of us, two very different things.


Soldiers and first responders go through rigorous training to override our primitive fight-or-flight responses when the amygdala takes the wheel. When my friends who are combat veterans and EMTs tell me how they think they’d behave in an emergency, I believe them. Everybody else gets a hard “maybe.” Yet panic has a strong pull, even for the pros. Why would anyone be brazen enough to say he’d run into a dangerous situation like Parkland when Marjory Stoneman Douglas High School’s own armed security guard reportedly did not?


I’ve been thinking a lot lately about this subject. About the men and women who have come forward with their experiences of physical and sexual abuse and been reflexively told, “I wouldn’t have stood for it. Why didn’t you just leave?” Of parents who brag that if a bully ever said anything like that to their kid, there would be hell to pay. Of elected officials who say a teacher with a gun would be the best defense against a murderer with an AR-15.


I was out with a friend recently, describing how in the midst of a medical crisis I had been struck dumb by another person’s astonishing insensitivity. “How come you didn’t chew him out?” she asked. “I would have reamed him.” Perhaps she would have. The cooler-headed, sitting-in-a-bar-with-a-pal version of me absolutely would have. But, at the time, it took me hours to even register what had happened.



University of Lancaster psychologist John Leach has extensively explored the subject of survival psychology, and why some people freeze and some people take action. “It is not the ‘will-to-live,’ but the ‘won’t to live’ that matters,” he’s written, citing the powerful role that memory and cognitive function play in response to stressful situations. Thanks to their prior conditioning, certain people will simply physically shut down. And if, say, you’ve just jumped out of a plane, that can be a problem.


Leach writes, “We tend to view such survivors as possessing a quality above the ordinary; a strength of character, a purposefulness or drive to overcome the crushing physical and psychological duresses they encounter,” adding “yet personality constructs have proven inadequate.” Just ask any trauma survivor how much their character had to do with what happened during their experience. Similarly, ask someone who’s done something heroic on the spur of the moment, and you might discover, as neuroscientist Robert Sapolsky explained recently on Radiolab, “People don’t think their way to a moral decision … It’s about a type of empathy that allows you to remain detached enough to actually act.” 


We humans have within us the capacity for what’s been labeled “hysterical strength” — the ability, when extraordinary circumstances call for it, to call upon reserves we didn’t even know we possessed. Adrenaline can override the parts of the brain that usually say, “Nope, not going to let you do that; you’ll hurt yourself.” It’s why a person who normally can’t open a jar of pickles might, in a pinch, be able to lift a car. Yet as science has shown, the same event could instead make them go completely deer-in-headlights. The circuitry is unpredictable.


I am the person who furiously chased after my two muggers, who called the cops when I saw a woman randomly assaulted on the street, amazed that no one else seemed to even notice it had happened. I am also the same person who was rendered speechless when a neighbor went on a racist tirade, and who has sat frozen while a man on the subway masturbated in front of me. I have been on the receiving end of unwanted groping. Sometimes I’ve yelled at the person, and sometimes I have slunk away in misguided shame. I have had debilitating panic attacks seemingly unrelated to any exterior provocation. I’m part Wonder Woman, part possum. I wish I could have the confidence to know that I will always be self-preservingly brave and always leap in to help others, but the truth is that whether I will or won’t seems pretty random. What I do know is that it’s a lot easier to be fearless when there’s nothing fearful present, and that you never truly know what you would do in a crisis until you’re called upon to do it.


Published on March 03, 2018 16:30

Antisemitism: how the origins of history’s oldest hatred still hold sway today

(Credit: AP Photo/Steve Helber)


Antisemitism is on the march. From the far-right demonstrators in Charlottesville, Virginia, with their “Blood and Soil” chants and their “Jews will not replace us” placards, to attacks on synagogues in Sweden, arson attacks on kosher restaurants in France and a spike in hate crimes against Jews in the UK, antisemitism seems to have been given a new lease of life.


The seemingly endless conflicts in the Middle East have made the problem worse as they spawn divisive domestic politics in the West. But can the advance of antisemitism be attributed to the rise of right-wing populism or the influence of Islamic fundamentalism? One thing is clear. Antisemitism is here and it’s getting worse.


Antisemitism rears its ugly head in every aspect of public life, whether in internal debates within political parties, in accusations of conspiratorial networks or plots in politics and business, or even in the accusations that Hollywood mogul Harvey Weinstein’s sexually predatory behaviour was somehow linked to his Jewish origins.


But by focusing narrowly on the contemporary context of modern antisemitism, we miss a central, if deeply depressing, reality. Jeffrey Goldberg, the editor of The Atlantic magazine, puts it correctly when he says that what we are seeing is an ancient and deeply embedded hostility towards Jews that is reemerging as the barbarous events of World War II recede from our collective memory.


Goldberg says that for 70 years, in the shadow of the death camps, antisemitism was culturally, politically and intellectually unacceptable. But now “we are witnessing … the denouement of an unusual epoch in European life, the age of the post-Holocaust Jewish dispensation”. Without an understanding of antisemitism’s ancient roots, the dark significance of this current trend may not be fully understood and hatred may sway popular opinion unchallenged.


Antisemitism has been called history’s oldest hatred and it has shown itself to be remarkably adaptable. It is carved from — and sustained by — powerful precedents and inherited stereotypes. But it also takes on variant forms to reflect the contingent fears and anxieties of an ever-changing world. Understood this way, it is the modern manifestation of an ancient prejudice — one which some scholars believe stretches back to antiquity and medieval times.


Ancient tradition of hatred


The word “antisemitism” was popularised by the German journalist Wilhelm Marr. His polemic, Der Sieg des Judenthums über das Germanenthum (The Victory of Jewry over Germandom), was published in 1879. Outwardly, Marr was a thoroughly secular man of the modern world. He explicitly rejected the groundless but ancient Christian allegations long made against the Jews, such as deicide or that Jews engaged in the ritual murder of Christian children. Instead, he drew on the fashionable theories of the French academic Ernest Renan (who viewed history as a world-shaping contest between Jewish Semites and Aryan Indo-Europeans). Marr suggested that the Jewish threat to Germany was racial. He said that it was born of their immutable and destructive nature, their “tribal peculiarities” and “alien essence”.


Antisemites like Marr strove for intellectual respectability by denying any connection between their own modern, secular ideology and the irrational, superstitious bigotry of the past. It is a tactic which is employed by some contemporary antisemites who align themselves with “anti-Zionism”, an ideology whose precise definition consequently excites considerable controversy. But this continuing hostility towards Jews from pre-modern to modern times has been manifest to many.


The American historian Joshua Trachtenberg, writing during World War II, noted:


Modern so-called ‘scientific’ antisemitism is not an invention of Hitler’s … it has flourished primarily in central and eastern Europe, where medieval ideas and conditions have persisted until this day, and where the medieval conception of the Jew which underlies the prevailing emotional antipathy toward him was, and still is, deeply rooted.



In fact, up until the Holocaust, antisemitism flourished just as much in western Europe as in central or eastern Europe. Consider, for example, how French society was bitterly divided between 1894 and 1906, after the Jewish army officer Captain Alfred Dreyfus was falsely accused and convicted of spying for Germany. It saw conservatives squaring up against liberals and socialists, Catholics against Jews.


Yet Trachtenberg was undoubtedly correct in suggesting that many of those who shaped modern antisemitism were profoundly influenced by the older “medieval” tradition of religious bigotry. The Russian editor of the infamous Protocols of Zion — a crude and ugly, but tragically influential, forgery alleging a Jewish world conspiracy — was the political reactionary, ultra-Orthodox, and self-styled mystic Sergei Nilus.


Driven by fear and hatred of the challenges that modernity posed to traditional religion, social hierarchies and culture, Nilus was convinced that the coming of the Antichrist was imminent and that those who failed to believe in the existence of “the elders of Zion” were simply the dupes of “Satan’s greatest ruse”.


So modern antisemitism cannot be easily separated from its pre-modern antecedents. As the Catholic theologian Rosemary Ruether observed:


The mythical Jew, who is the eternal conspiratorial enemy of Christian faith, spirituality and redemption, was … shaped to serve as the scapegoat for [the ills of] secular industrial society.



Antisemitism in antiquity?


Some scholars would look to the pre-Christian world and see in the attitudes of ancient Greeks and Romans the origins of an enduring hostility. Religious Studies scholar Peter Schäfer believes the exclusive nature of the monotheistic Jewish faith, the apparently haughty sense of being a chosen people, a refusal to intermarry, Sabbath observance and the practice of circumcision were all things that marked Jews out in antiquity for a particular odium.


Finding examples of hostility towards Jews in classical sources is not difficult. The politician and lawyer Cicero, 106-43BC, once reminded a jury of “the odium of Jewish gold” and how they “[stick together]” and are “influential in informal assemblies”. The Roman historian Tacitus, c.56-120AD, was contemptuous of “base and abominable” Jewish customs and was deeply disturbed by those of his compatriots who had renounced their ancestral gods and converted to Judaism. The Roman poet and satirist Juvenal, c.55-130AD, shared his disgust at the behaviour of converts to Judaism besides denouncing Jews generally as drunken and rowdy.


These few examples may point towards the existence of antisemitism in antiquity. But there is little reason to believe that Jews were the objects of a specific prejudice beyond the generalised contempt that both Greeks and Romans exhibited towards “barbarians” — especially conquered and colonised peoples. Juvenal was every bit as rude about Greeks and other foreigners in Rome as he was about Jews. He complained bitterly: “I cannot stand … a Greek city of Rome. And yet what part of the dregs comes from Greece?” Once the full extent of Juvenal’s prejudice has been recognised, his snide remarks about Jews might be understood as being more indicative of an altogether more sweeping xenophobia.


The ‘Christ killers’


It is in the theology of early Christians that we find the clearest foundations of antisemitism. The Adversus Judaeos (arguments against the Jews) tradition was established early in the religion’s history. Sometime around 140AD the Christian apologist Justin Martyr was teaching in Rome. In his most celebrated work, Dialogue with Trypho the Jew, Justin strove to answer Trypho when he pointed to the contradictory position of Christians who claimed to accept Jewish scripture but refused to follow Torah (the Jewish law).


Justin responded that the demands of Jewish law were meant only for Jews as a punishment from God. Although still accepting the possibility of Jewish salvation, he argued that the old covenant was finished, telling Trypho: “You ought to understand that [the gifts of God’s favour] formerly among your nation have been transferred to us.” Yet Justin’s concern was not really with Jews. It was with his fellow Christians. At a time when the distinction between Judaism and Christianity was still blurred and rival sects competed for adherents, he was striving to prevent gentile converts to Christianity from observing the Torah, lest they go over wholly to Judaism.


Vilifying Jews was a central part of Justin’s rhetorical strategy. He alleged that they were guilty of persecuting Christians and had done so ever since they “had killed the Christ”. It was an ugly charge, soon levelled again in the works of other Church Fathers, such as Tertullian (c.160-225AD) who referred to the “synagogues of the Jews” as “fountains of persecution”.


The objective of using such invective was to settle internal debates within Christian congregations. The “Jews” in these writings were symbolic. The allegations did not reflect the actual behaviour or beliefs of Jews. When Tertullian attempted to refute the dualist teachings of the Christian heretic Marcion (c.144AD), he needed to demonstrate that the vengeful God of the Old Testament was indeed the same merciful and compassionate God of the Christian New Testament. He achieved this by presenting the Jews as especially wicked and especially deserving of righteous anger; it was thus, Tertullian argued, that Jewish behaviours and Jewish sins explained the contrast between the Old and the New Testament.


To demonstrate this peculiar malevolence, Tertullian portrayed Jews as denying the prophets, rejecting Jesus, persecuting Christians and as rebels against God. These stereotypes shaped Christian attitudes towards Jews from late antiquity into the medieval period, leaving Jewish communities vulnerable to periodic outbreaks of persecution. These ranged from massacres, such as York in 1190, to “ethnic cleansing”, as seen in the expulsions from England in 1290, France in 1306 and Spain in 1492.


Although it was real people who often suffered as a result of this ugly prejudice, antisemitism as a concept largely owes its longevity to its symbolic and rhetorical power. American historian David Nirenberg concludes that “anti-Judaism was a tool that could usefully be deployed to almost any problem, a weapon that could be deployed on almost any front”. And this weapon has been wielded to devastating effect for centuries. When Martin Luther thundered against the Papacy in 1543 he denounced the Roman Church as “the Devil’s Synagogue” and Catholic orthodoxy as “Jewish” in its greed and materialism. In 1790, the Anglo-Irish conservative Edmund Burke published his manifesto, Reflections on the Revolution in France, and condemned the revolutionaries as “Jew brokers” and “Old Jewry.”


From Marxism to Hollywood


Despite Karl Marx’s Jewish ancestry, Marxism was tainted at its very birth by antisemitism. In 1843, Marx identified modern capitalism as the result of the “Judaising” of the Christian:


The Jew has emancipated himself in a Jewish manner, not only by annexing the power of money, but also because, through him and also apart from him, money has become a world power and the practical spirit of the Jew has become the practical spirit of the Christian people. The Jews have emancipated themselves in so far as the Christians have become Jews … Money is the jealous god of Israel before whom no other god may stand … The god of the Jews has been secularised and has become the god of the world.



And there remain those, from across the political spectrum, who are still ready to deploy what Nirenberg referred to as “the most powerful language of opprobrium available” in Western political discourse, commonly using the language of conspiracy, webs and networks. In 2002, the left-leaning New Statesman included articles by Dennis Sewell and John Pilger, debating the existence of a “pro-Israeli lobby” in Britain. Their articles, however, proved less controversial than the cover illustration chosen to introduce this theme, which drew on familiar tropes of secret Jewish machinations and dominance over national interests: a gold Star of David resting on the Union Jack, with the title: “A Kosher Conspiracy?” The following year, veteran Labour MP Tam Dalyell accused the then prime minister, Tony Blair, of “being unduly influenced by a cabal of Jewish advisers”. It is language that is still being used now.


On the far right, white supremacists have been quick to project their own time-honoured fantasies of Jewish malfeasance and power onto contemporary events, however seemingly irrelevant. This was quickly apparent in August 2017, as the future of memorials glorifying those who had rebelled against the union and defended slavery during America’s Civil War became the focus of intense debate in the United States. At Charlottesville, Virginia, demonstrators protesting against the removal of a statue of Confederate General Robert E Lee, began chanting “Jews will not replace us”. When journalist Elspeth Reeve asked one why, he replied that the city was “run by Jewish communists”.


When accusations of serious sexual misconduct by Weinstein were published by The New York Times in October 2017, he was quickly cast by the far right as a representative of the “eternal conspiratorial enemy” of American society as a whole. David Duke, former head of the Ku Klux Klan, would write on his website that the “Harvey Weinstein story … is a case study in the corrosive nature of Jewish domination of our media and cultural industries”.


‘The hatreds of our time …’


Responding to such language, The Atlantic’s Emma Green astutely commented on how “the durability of anti-Semitic tropes and the ease with which they slide into all displays of bigotry, is a chilling reminder that the hatreds of our time rhyme with history and are easily channelled through timeless anti-Semitic canards”.


There is real danger here as the spike in antisemitic hate crimes shows. This peculiar way of thinking about the world has always retained the potential to turn hatred of symbolic Jews into the very real persecution of actual Jews. Given the marked escalation of antisemitic incidents recorded in 2017, we are now faced with the unsettling prospect that this bigotry is becoming “normalised”.


For example, the European Jewish Congress expressed “grave concerns” over an increase in antisemitic acts in Poland under the right-wing Law and Justice government which won the 2015 parliamentary election with an outright majority. The group said the government was “closing … communications with the official representatives of the Jewish community” and there was a “proliferation of ‘fascist slogans’ and unsettling remarks on social media and television, as well as the display of flags of the nationalist … group at state ceremonies”.


In response to these fears, a survey investigating antisemitism within the European Union will be undertaken in 2018, led by the European Union Agency for Fundamental Rights. The agency’s director, Michael O’Flaherty, commented, correctly, that: “Antisemitism remains a grave worry across Europe despite repeated efforts to stamp out these age-old prejudices.”


Given the phenomenon’s deep historical roots and its epoch-defying capacity for reinvention, it would be easy to be pessimistic about the prospect of another effort to “stamp it out”. But an historical awareness of the nature of antisemitism may prove a powerful ally for those who would challenge prejudice. The ancient tropes and slights may cloak themselves in modern garb, but even softly spoken allegations of conspiratorial “lobbies” and “cabals” should be recognised for what they are: the mobilisation of an ancient language and ideology of hate for which there should be no place in our time.


Gervase Phillips, Principal Lecturer in History, Manchester Metropolitan University



Published on March 03, 2018 16:29

“We regret to inform you that your husband has been killed”

(Credit: AP/David Zalubowski)


The following passage illustrates the all-too-common experience of servicemen and women, as well as their loved ones, in discovering and dealing with the tragic loss that is so commonplace in today’s climate of perpetual military conflict and war. Excerpted with permission from “All the Ways We Kill and Die: A Portrait of Modern War” by Brian Castner. Copyright 2018 by Arcade Publishing, an imprint of Skyhorse Publishing, Inc. Available for purchase on Amazon, Barnes & Noble or Indiebound.


I had been an Explosive Ordnance Disposal officer, a leader in the military’s bomb squad. We call it a brotherhood, and there are so few of us we’re all connected by only one or two degrees of separation. The brotherhood is a mindset, an affection, a burden, a bond that endures long after the crucible of EOD school and deployments around the world are over. It’s the covenant we keep with those in the ground, our responsibility to those hobbled before their time, the standard by which I secretly measure everyone I meet.


In EOD, our job is to make bombs safe. Sometimes we can disarm the device before it goes off. Too often, though, the bomb works as designed, and we’re left picking up the pieces, human and mechanical, to figure out what happened. Collecting the forensic evidence, recreating the scene, imagining the attacker’s intentions, noting the effect of each munition on the human body. This is all fundamental to how we are trained to think.


War can be random; you can die from bad luck, being in the wrong place at the wrong time. But other times they pick you out of the crowd, and it’s intentional and premeditated. If it weren’t a war, we’d call that murder.


My story, and the story of so many EOD officers, can be summed up as: All the ways we die, and nearly die, and who and how we kill in response.


* * *


In January 2012, a western mountain warm spell had stolen the modest Christmas snows, and the home of Matthew and Jennifer Schwartz sat among bare trees and dying grass, a pale house on a brown lawn.


The house was nearly empty. The girls were off at school. Jesus and his radiant Sacred Heart stared from the living room wall at a blank television and forgotten couch. Duke the chocolate Lab slept at the foot of the stairs. The only sound in the empty house was the mechanical hum of the treadmill and the regular beat of a runner’s footfalls.


The house was often empty. A new pickup truck and trailer filled the driveway, camping equipment filled the garage, dirty dishes filled the sink, Duke shuffled and huffed about the backyard, the three girls laughed and sang songs, but Matt was gone, always gone, and the hole remained. A toothbrush here, a T-shirt there, the small reminders of him were strewn about the house like so many pretty gold rings, and she but the amputated stump of a hand with no fingers.


That morning Jenny was finishing another long run on her treadmill. She had discovered running on Matt’s second tour. At the start of his deployments, she ran four or six miles. Now that he had been gone three months, she was up to ten and barely out of breath.


Jenny had learned long ago not to pine by the phone; it only made the hours crawl. But she had also learned to save the last recording on the answering machine, not to delete the last email. Matt had been out on a long-distance patrol for over a week, and had managed only a quick and broken sat phone call. So more than anything, it was a last email that kept tumbling through her head. It bothered her that it read like a last email. Heavy zippered sweatshirts in the dryer, tumble, tumble, the email always in the back of her mind as she ran.


Jenny was soaked when she got off the treadmill, dripping the sweaty, unwashed funk that comes from not having showered since, well, who keeps track of these things when your husband is gone and the girls need you? She paced and began her stretching routine, and the doorbell rang. Under no circumstances would she ever answer the door smelling like she did, but she did look out the window.


She saw a sea of uniform blue hats stark against the dry Wyoming prairie.


If I don’t answer the door, she thought, he’s not dead. He’s not dead yet.


The doorbell rang again. Perhaps a third time. They weren’t leaving.


Jenny disconnected her mind and entered a dream. She felt herself drifting across the floor as her feet, under their own programming and direction, moved her body to the door.


“Ma’am, are you Mrs. Jennifer Schwartz?”


Yes, the empty body answered.


“Ma’am, on behalf of the United States government, we regret to inform you that your husband has been killed in action in Afghanistan.”


* * *


That January evening, soon after the New Year, when darkness comes early to New York State’s northern tier and the chill clamps tight, I finished a walk in my woods and shed my snowshoes at the back door to find my wife curled up under a knitted blanket on the couch, nestled in front of the Christmas tree as one would sit before a fire, a still twinkling in an otherwise unlit room.


The kids busied themselves with an embarrassment of new toys, recent Christmas gifts from all members of the family. A pile of papers, my wife’s half-edited PhD dissertation, lay abandoned next to her in this, her favorite of post-holiday spots; Jessie’s efforts to work were stymied by the softness of the seat and the comfort of the blanket, the pleasant glow from so many small white lights and the snow outside. I kissed her and snuggled in and felt the warmth from her back and neck and no one had tried to kill me in five years.


We sat together on the couch, and I pulled out my phone, an unconscious habit. My thumb moved through various Facebook status updates, past children at Disney World, a four-year-old’s birthday party, a new hairstyle and car, political memes like modern prayer cards. I checked on Dan Fye, who had lost a leg half a year earlier and was struggling through rehab with a halo of pins and screws erupting from joints. I checked on Evil, to see if he had time to update while flying out of Bagram. I checked on three dozen other friends, brothers really, closer than any friends, who were in Afghanistan, about to leave for Afghanistan, just back from another tour. Jessie asked me what I was looking at, and I lied and said, “Nothing,” as she stared at the tree in peace.


I was thumbing through my phone, my wife’s head across my chest, my children distantly playing some electronic game in the basement, when it happened. No telegram arrived. The phone did not ring. There was no knock on the front door. The tiny screen on my phone simply flickered as I scrolled to more recent updates.


First, a more distant acquaintance changed his Facebook profile photo. His smiling suntanned face became the bomb squad’s ordnance-and-lightning badge with a thick black band across it, the universal symbol for mourning. Someone had died. Then a second friend changed his photo as well. Someone had died recently, or at least, the news was only now getting out.


So I started over, reviewed when everyone in Afghanistan had last checked in. It was an exercise in frustration. For some it had been weeks; when on patrol in the mountains, a civilian Internet connection was hard to find. I checked the updates of those who normally announce bad news, but there was silence from the Chiefs and commanders. As a last resort, I rechecked the wife network, for offers of vague support and prayers. Nothing from Amanda, but her husband was still recovering from his gunshot wound. Nothing from Monica and Aleesha. Jenny had been silent for hours, which was unlike her, and her husband Matt was deployed. Had he called and told her who it was?


Then a new status update popped up. “Fuck you Afghanistan.”


This was from Pinkham, a much closer friend. My chest clenched. A choke collar around my neck tightened.


Then immediately a direct message to me, from one of the few female techs I served with, Angela Olguin: “I assume u r in the know?”


No, I wanted to shout, I was not in the know. The crossover potential between Pinkham and Angela was small. We had all been assigned to the same unit in New Mexico, a small company of sixteen. Who was there? Kermit, already killed in Iraq. Bill Hailer, retired. Dee, retired. Garet, in Japan. Beau, shot and home. Hamski, already killed. Pinkham, Angela. Matthew Schwartz, who was deployed. Wes Leaverton, was he deployed? Laz? I thought he was in Guam. Piontek, no he got out. Burns too. Who else?


I grew agitated and fidgety, broke the Christmas tree spell.


“What is it?” asked Jessica as she sat up, wary, defensive, holding the blanket to her chin.


“We, we lost someone,” I fumbled.


“Please don’t let it be Matt,” my wife pleaded.


Why Matt? Why did she say Matt?


As fast as my fingers and thumbs could work, I messaged back to Angela: “No, fuck, what happened?”


My mind raced. Who was it? Who’s the worst it could be? Imagine him or her, imagine the worst outcome, and then whoever it is will be a relief.


In our job, we knew there would be casualties. Well, not at first, not when Afghanistan started. But eventually we grew to understand that while our vocation had provided a new family of brothers and sisters, it did so on the condition that too often they would die young. We had all by now learned how to lose acquaintances, a guy you had trained with, a guy you met once on a range clearance or Secret Service mission, a guy whose face appeared in every group photo.


But in time most of us developed a list, buried in the subconscious until moments like these. Five or ten names. The guys you can’t lose. The guys that have to make it back. It is a bargain with Satan. If I have to lose brothers, you tell yourself, I can bear it all as long as you spare these few. Matt and Josh and Phillips and don’t make me say them all. Please, just don’t take this small list that I am hiding in a place I am terrified to look.


Why did Jessie say Matt? Why did she have to say his name out loud?


I sat and shook and repeated my names like a mantra, and Jessie clutched the blanket, and I stared at my phone until it rang not ten seconds later. It was Angela.


“Matt died this morning,” she sobbed.


I nodded my head to Jessie. Her face broke into a thousand pieces, and she collapsed on the floor in front of the glowing Christmas tree.


* * *


There are so many ways to die, and right away, from the first moment, I wanted to know how Matt died, every last detail. It’s a basic human response magnified by my professional calling. It was January of 2012. We thought Iraq was over, but Afghanistan was still bloody, and Matt was just the latest in a terrible string of killed and crippled. Fifteen of my fellow EOD brothers had died in the previous twelve months, a killed-in-action rate of 5 percent, over ten times the average for American soldiers at the time. The year before that had been even worse, and I had lost track of the number of amputees created. For a while there, it seemed like every few days you heard someone lost a leg.


Some of us slip through the war unscathed, and some are lost to it, and some step up to the brink and then are pulled back from the abyss.



Published on March 03, 2018 15:30

“The Looming Tower”: A pre-9/11 thriller for the post-facts world

The Looming Tower

Peter Sarsgaard in "The Looming Tower" (Credit: Hulu)


Given the clockwork precision with which media junkets are typically run, it’s rare, and a bit weird, for a journalist to be pulled back into a hotel suite after their allotted time has expired.


But Dan Futterman, writer and executive producer of Hulu’s new 9/11 procedural “The Looming Tower,” does just that, yanking me into a lavishly decorated antechamber in the mega-luxe Regent Hotel in Berlin, where “The Looming Tower” has just premiered to audiences at the annual Berlin International Film Festival. “I wanted to say something else,” says Futterman, a bit flustered. “It’s this: if there is pushback, it better come with information.”


Futterman is aware that “The Looming Tower,” an espionage drama adapted from Lawrence Wright’s Pulitzer-winning 2006 nonfiction book “The Looming Tower: Al-Qaeda and the Road to 9/11,” is high-voltage material. Tracking the FBI’s and CIA’s mid-’90s reckoning with the mounting threat of Islamic extremism, and the breakdowns (and betrayals) that left America vulnerable, “The Looming Tower” is a wholly politicized work. And such works tend to play right into the hands of ideologues, eager to shift blame between government institutions, or across political aisles.


Just think back to 2012, when director Kathryn Bigelow and screenwriter Mark Boal’s thriller “Zero Dark Thirty” catalyzed a debate about the use of torture in capturing Osama bin Laden. “That was at the very center of a big fight between the FBI and the CIA,” says Futterman of the “Zero Dark Thirty” controversy. “That was a big fucking deal. So if you’re going to make a movie about that, you have to get that right. I mean I love [Bigelow and Boal] as filmmakers, but that’s a big deal.”


Despite fictionalizing a dense nonfiction account, and succumbing to the necessary process of dramatization (creating composite characters, condensing timelines, etc.), Futterman’s anxiety is eased by what he sees as the show’s fealty to the book. Author Wright also serves as an executive producer. “When his book came out, it was hard for people to find a fault with it,” Futterman explains. “And we tell essentially the same story.”


One of the miniseries’ key narrative inventions is Martin Schmidt (played by Peter Sarsgaard), a shrewd CIA al-Qaeda analyst composited from several real-life agents. As Futterman puts it, Schmidt “represents a certain attitude in the CIA.” He’s frequently seen clashing with Jeff Daniels’ FBI counterterrorism honcho, guided by a belief that the CIA is the only institution with the know-how to put down foreign terrorist threats.


A deeply intelligent actor with a long list of credits playing the smartest guy in the room — an influential sexologist in “Kinsey,” a callous know-it-all husband in the 2009 horror flick “Orphan,” the cold-as-ice enviro-terrorist in Kelly Reichardt’s “Night Moves,” and the super villain in DC’s much-maligned 2011 “Green Lantern” movie, in which he plays a guy whose brain is literally swelling — Sarsgaard seems right at home with another character who plays his cards close to the vest, confident (and a bit arrogant) in his ability to outsmart all comers. “He is the smartest guy in the room!” Sarsgaard laughs over early morning coffee. “That’s the thing: Just because you know the answer doesn’t mean you have to act that way.”


Bickering, infighting and double-dealing between intergovernmental agencies may not seem like the stuff of high drama. But “The Looming Tower” feels daring. There is a longstanding lie in American politics that everyone devoted to public life is, in their own ways and despite cosmetic differences, working toward the same end. The institutional drama of “The Looming Tower” puts this lie to rest, showing how much of politics and statecraft, especially among unelected officials, is about plain power-jockeying.


It’s also a highly confrontational work. This is, after all, a mainstream television event with the verve to analyze America’s failure in preventing 9/11. “It’s been 17 years,” says Sarsgaard. “Ready or not, here we come.”


Films made in the immediate aftermath of the cataclysmic terrorist attack tended toward the funerary (Oliver Stone’s “World Trade Center”) or the immersive (Paul Greengrass’s “United 93”). Outside of conspiratorial “truther” docs and essay films, the idea of even examining the geopolitical forces that converged on 9/11 seemed not just unpatriotic, but gauche, like speaking ill of a recently deceased statesman, or poking their still-cooling corpse with a stick. Art and entertainment made in the early years of “post-9/11” were very much stories about The Event. “The Looming Tower” zooms out, subjecting the attacks to the historical long view. “I don’t think the actual event is that interesting,” Futterman confesses. “I don’t say that disdainfully for what happened that day. We know what happened that day. It’s the ‘how?’ and ‘why?’ of what happened that day.”


“In a time of grief, questioning things is not high on the priority list,” says Sarsgaard. “I’ve likened it to a relative who died and we don’t talk about the fact that he was addicted to drugs and that’s why he died. It’s like, ‘Now is not the time. We’ll talk about it later.’ It was like that for a very long time in the States.” Just look at Sarsgaard’s wife, actor Maggie Gyllenhaal, who was denounced in the media in 2005 for merely suggesting the United States was “responsible in some way” for the attacks. “The Looming Tower” brings this long-verboten suggestion to the fore. It’s all about this burden of responsibility.


This is not to say the series downplays Islamic extremism. By the end of the first episode, we see the 1998 al Qaeda bombing of the U.S. embassy in Kenya, an attack that drew attention to the rising threat of anti-Western jihadism. But one structural difference between the new miniseries and Wright’s source text is its early focus. Wright’s “Looming Tower” exhaustively detailed the life of Egyptian author and Islamist martyr Sayyid Qutb, whose anti-secular, anti-Western writing is credited with seeding contemporary Islamic extremism. As Futterman describes him, Qutb “was this strange little guy who was seemingly happy in America. Then had a very, very vile reaction to it.”


For the production team, a focus on Islamic extremism from the Islamic perspective might have proved too challenging for Western, and particularly American, viewers. “It actually drives me crazy,” says Sarsgaard, visibly exasperated. “If you did a show based on the first half of the book, it’d be all brown faces. And it’d be tough to find the hero for an American audience. But it would be fascinating. The story of Qutb is one that every American should know.”


“The Looming Tower” is also unique in that it vests its tense, docu-dramatic thrills in the hands of a genuine documentarian, director Alex Gibney. As the director of “Going Clear: Scientology and the Prison of Belief” (also based on a Lawrence Wright book), “Client 9: The Rise and Fall of Eliot Spitzer,” and the little-seen film about contemporary cyberwarfare “Zero Days,” Gibney has proven a skilled explorer of knotty, institutional stories that come to shape the culture at large. Sarsgaard calls Gibney “very serious-minded” and “deeply investigative,” suggesting a hardened just-the-facts mentality that risks hampering what is ostensibly a dramatic television series.


That’s where Futterman comes in. As a twice-Oscar-nominated screenwriter (for 2005’s “Capote” and 2014’s “Foxcatcher”), he draws out the human drama that he originally recognized in Lawrence Wright’s source text. “Larry’s interested in people,” he explains. “He’s interested in how people shape history. A lot of the story details personal relationships. Yes, there were institutional issues between the FBI and the CIA; they were on missions that were not compatible. But there were human beings at the center of that. There was personal animosity that fit into that. And then at a certain point, human beings lied to other human beings, and withheld information.”


Futterman seems to hope — ambitiously, and a bit anxiously — that “The Looming Tower” will expose the American public to this fateful internal intrigue. While the FBI and CIA are agencies that rely, in some meaningful respect, on secrecy, he says there’s a tendency for them to smother the very debate the series hopes to stimulate under a weighted blanket of silence, with government higher-ups demanding a trust from the public that they’ve already squandered. In an era of “fake news” and “post-facts,” the recourse to verifiable information seems particularly essential.


“People say, ‘just trust us! It’s classified!’” Futterman says, frustrated. “This is a conversation that anyone’s willing to have, but it has to come with information.”



Published on March 03, 2018 15:00

Bill Maher lays into Republicans for deciding that “treason is better than working with Democrats”

Bill Maher

Bill Maher (Credit: Getty Images)


During his most recent episode of “Real Time with Bill Maher,” the titular talk show host called out Republicans for siding with Russia — that is, committing treason — over Democrats regarding President Donald Trump’s alleged collusion with the Russian government.


“We’re supposed to still be part of the American tribe,” Maher opined to Amy Chua, author of the book “Political Tribes: Group Instinct and the Fate of Nations,” while interviewing her on Friday. “You know, they always said if Mars attacked, then we and Russia would get together. But as it is now, Mars hasn’t attacked, so we’re still supposed to be against Russia as Americans. It seems to have gotten a little out of whack here when the Republicans hate the Democrats, that tribe, so much that they’re with the Russian tribe over the Democrats.”


He added that the Republican attitude seemed to be, “Treason is better than working with Democrats.”


It wasn’t the first time on Friday night that Maher explicitly accused Trump and his supporters of treason. During a panel segment with former attorney general Eric Holder and historian Jon Meacham, Maher commented that Democrats should accuse Republicans of treason as a campaign strategy in the 2018 midterm elections.


“So why can’t the Democrats make treason a campaign issue then?” Maher asked Holder and Meacham after the three discussed the Trump administration’s “ostentatious” corruption. “I mean if it’s that open — and I think it is — I don’t know why, they don’t seem to be making it an issue. They seem to be shrinking from it as an issue.” Maher then quoted a recent editorial by billionaire hedge fund manager Tom Steyer which described how the Justice Department indicted 13 Russians for the “electoral attack” that occurred during the 2016 presidential election and how Trump has done nothing in response.


“Trump is derelict in his duties,” Holder said in agreement with Maher and Steyer. “We were attacked. We were attacked. I mean, it wasn’t a physical attack, it was an electronic attack on the most vital of our systems, and he has done absolutely nothing to prepare us for what is to come. ‘Cause they’re still coming, they’re gonna come in 2018, they’re gonna come in 2020. And he’s done nothing to hold the Russians accountable in spite of the fact that, in this dysfunctional world we have, this dysfunctional Congress passed sanctions that he has refused to implement.”


Maher agreed, noting again during the panel segment that Trump had not ordered his subordinates to do anything about the Russian subversion of America’s electoral system.


Maher did not limit his criticisms of Republicans to the Russia issue. He devoted his opening monologue to the resignation of former White House communications director Hope Hicks, the numerous security-related scandals that have emerged regarding Trump’s son-in-law and adviser Jared Kushner and the president’s flip-flopping positions on gun control. Later during the program, he commended two of the Marjory Stoneman Douglas high school shooting survivors for standing up to conservative politicians on the issue of gun control.




Published on March 03, 2018 15:00

Stevie Wonder: Signed, concealed and delivered

Stevie Wonder

(Credit: AP/Bloomsbury Publishing/Salon)


Excerpted from “Stevie Wonder’s Songs in the Key of Life” by Zeth Lundy (Continuum, 2007). Reprinted with permission from Bloomsbury Publishing.



Stevie Wonder’s unique adolescence was compounded by Motown’s interest. His childhood was a prolonged preening for lifelong stardom at the will of Motown president Berry Gordy. Motown, aka “The Sound of Young America”: a charm school for the pop charts; a fantastical place for a formative upbringing, to be surrounded by the likes of Smokey Robinson, Marvin Gaye, Diana Ross and James Jamerson. Wonder’s professional name, given to him by Gordy (the exact story has morphed into a number of possible truths), called attention to the near-miraculous talents of a blind youth like the frantic hype-monger peddling a sideshow attraction at a traveling carnival. Little Stevie Wonder, the twelve-year-old genius! Motown, the emphatic barker and Wonder its logic-defying prize.


Through proximate osmosis, Wonder absorbed the styles of his label mates: Gaye’s paced delivery, Ross’s effervescence, Robinson’s delicacy with the minutiae of song structure. His impersonations transcended parlor-tricked mimicry; they became so legendary among Motown’s artists that the imitated and imitator were hard to separate. The perpetual cheerfulness that Wonder exuded became a sort of omnipresent characteristic that added another generational dimension to the familial togetherness of the label’s public edifice. At a time in his adolescent life when thinking creatively wasn’t necessarily a viable option, he instead sought out an idiosyncratic presence in the established molds that had been set up by his employers. In less than a year from the date of his first single release, that idiosyncratic mold secured Motown the second #1 song in its history with 1963’s “Fingertips, Pt. 2”; the album it came from, “The 12 Year Old Genius” (featuring Gaye on drums), would also be the label’s first #1 LP.


In truth, the deal struck between artist and label reaped only one-sided benefits for almost ten years. The contract that Wonder’s mother signed not only permitted a company lawyer to act as his legal guardian, it tied up the majority of his earnings in a trust fund until he turned twenty-one. This wasn’t an uncommon practice at Motown; most of its acts signed away publishing and performance royalties when they came aboard. Wonder’s deal afforded him the rather paltry allowance of $2.50 per week. So when “Fingertips, Pt. 2” went to #1 in 1963, when “Uptight (Everything’s Alright)” peaked at #3 in 1966, when “I Was Made to Love Her” shot to #2 in 1967, when “For Once in My Life” hit #2 in 1968 and “Signed, Sealed, Delivered I’m Yours” made it to #3 in 1970, Wonder only collected a couple of bucks. Although that weekly allowance was hardly the most generous of rewards, the untouchable accumulation of royalties would end up serving an unexpected (or, perhaps, premeditated) purpose when Wonder came of age.


Like Orson Welles before him, Wonder was declared a child genius, a public designation that has arguably affected the way in which he’s worked (and, consequently, what has been expected of that work). Just as Heisenberg’s Uncertainty Principle illustrates that an observed object is inextricably changed by the mere observation, so do people change when they’re pronounced as unique. Both were young prodigies who were handed the keys to their respective kingdoms at early ages and, perhaps due to the freedom granted to cultivate their abnormal minds, released their most potent creative steam while still in their twenties. Where Welles quickly torched his own career in a vehemently proud blaze of glory, Wonder could do no wrong. Surely RKO execs rued the day they ever greenlighted Welles’s every wish on “Citizen Kane”; conversely, Gordy probably remembers writing that multimillion dollar check fondly. Affording creativity (or, genius) the means to realize its dormant ambition is a calculated risk—a risk made justifiable when the public offered eager validation with its wallet.


Wonder had an admirable string of hits in the 60s, roughly one smash per year during the latter half of the decade. There could have been more—Wonder would have bloomed much earlier in the public eye, but he made a conscious decision to withhold many of his best ideas from Motown beginning around the time he turned eighteen. He recognized that his ideas were, according to the legal documents, not really his at all, but Motown’s, and he had the foresight to compartmentalize. He stockpiled songs in his head, revelatory secrets kept away from those who sought to profit off them. Wonder’s artistic transformation was occurring in utter privacy. Everyone, including his employer, would have to wait until he was no longer a minor before they got a glimpse.


In the meantime, Motown was readying Wonder to be, for all intents and purposes, a Vegas act. Wonder’s live act was in step with Motown’s dinner-theater aspirations (à la the Supremes’ “The Supremes Sing Rodgers & Hart” [1967]); he was leading big bands through uptight versions of hit singles the Detroit brass hoped he could pawn off in perpetuity. As his moment of official maturity approached, Wonder made his artistic intentions incredibly clear in both his words and his actions: he pointed to the Temptations’ psych-funk single “Cloud Nine,” released in late 1968, as an ideal creative direction for both himself and Motown; and, in 1970, he released his version of the Beatles’ “We Can Work It Out,” the first track he performed, produced and arranged completely by himself. Though no one yet knew it, the former song provided a conceptual template for his work throughout the 70s, and the latter demonstrated how his hands-on style would function. Even Motown’s stylistic adjustments in the 70s, spurred by the efforts of artists like Wonder and sometimes referred to as “the greening of Motown,” were emblematic of its own transition away from an adolescent model it had held onto for far too long.


Before Wonder engaged in a symbolic contractual standoff that would usher him into the groundbreaking work of his early adulthood, he released “Where I’m Coming From” (1971). It was the first LP Wonder had produced and performed entirely on his own, and the first time he was granted complete creative control by Gordy. Both its lyrical content (inspired by the ongoing Vietnam debacle) and musical tenor (distanced from the lo-fi Motown aesthetic) indicated a major shift in the ideas and sounds Wonder had become interested in pursuing. Though it was far from Wonder’s most successful album, “Where I’m Coming From” was extremely important as a precursor to a new deal. It completely obliterated any semblance of Little Stevie Wonder that remained on someone’s antiquated radar screen.


On May 13, 1971, the day Wonder turned twenty-one—the day he officially came of age and could, in the eyes of the law, leave adolescence forever—Gordy hosted a big birthday party to celebrate the occasion. The next morning, Wonder had his lawyer notify Gordy that he was terminating all of his Motown contracts and taking control of the eight-year trust fund, which had grown to over $1 million. By the time the message was delivered, Wonder had already fled Detroit for New York City, where he had booked a room at the Holiday Inn on West 57th and studio time at Jimi Hendrix’s Electric Lady Studios. His collected earnings were dumped into the production of what would become “Music of My Mind.” For the first time in his life, he was making a record that would be funded and nurtured entirely by himself.



Published on March 03, 2018 14:30

The tech moguls driving mass layoffs in the economy want universal basic income as a cure

Facebook CEO Mark Zuckerberg

Facebook CEO Mark Zuckerberg (Credit: Getty/Justin Sullivan)


AlterNet


Driverless cars! IBM Watson! News-writing robots! Amazon Go! The future is here, friends, and it apparently excludes humans. People are preparing for the next mass extinction — an evaporation, if you will — not of humans, polar bears or other creatures, but of jobs.


This prospect has been sending shivers through humanity, from New York City to Kenya. How will people earn enough money to support themselves and their families when all the jobs are taken by robots? And how to keep from pointing the proverbial finger at the overlords of Silicon Valley?


Well, the answer certainly doesn’t seem to be “build the wall” — after all, that will probably be done by robots, too.


No, the idea is something more obvious and perhaps even more controversial: universal basic income (UBI).


Simply, UBI is a system where the government gives everyone a set amount of money every month, regardless of position on the socio-economic ladder. You could be homeless or a CEO, or teetering between the two; no matter what, you’d receive a monthly payment. Champions of UBI tend to speak of it in terms of an equalizing floor from which everyone can start, rather than the safety net previous generations talked about.


The general number that’s been thrown out in the U.S. for decades is $1,000 a month, though some experts suggest less, and some suggest more.
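The scale of that figure is worth spelling out. A minimal back-of-envelope sketch, assuming roughly 250 million eligible U.S. adults (an illustrative round number, not a census count):

```python
# Rough gross cost of a $1,000-per-month UBI paid to every U.S. adult.
# The adult-population figure below is an illustrative assumption
# (~250 million), not an official count.
monthly_payment = 1_000      # dollars per person per month
adults = 250_000_000         # assumed number of eligible adults

annual_cost = monthly_payment * 12 * adults
print(f"Gross annual cost: ${annual_cost / 1e12:.1f} trillion")
# prints "Gross annual cost: $3.0 trillion"
```

At roughly $3 trillion a year in gross outlays — before any offsets from eliminated programs or clawbacks through taxes — the proposal would rival the entire current federal budget, which is why the funding question dominates the debate.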


UBI has been gestating for years, even popping up in the late-’60s as a “Family Assistance” proposal from President Richard Nixon, not long after Dr. Martin Luther King Jr. proposed it publicly (Nixon’s proposal failed). The go-go ’80s of the Reagan years pushed it into the nether regions of cultural memory. But with tech titans like Elon Musk and Mark Zuckerberg trying to relate to the poor and hungry masses that are being replaced by the tech they and their social networks are creating, UBI is enjoying a moment of reconsideration in the upper echelons of thought leadership.


It is, seemingly, a simple solution, though if you listen to UBI’s proponents gush, it sounds almost romantically opulent in its ability to solve the future crises of poverty and inequality. Who wouldn’t want to accept their government check and then surf all day, or raise one’s children, rather than work?


The true beauty of UBI, though, lies in its capitalist welfare: Tech giants keep profits high, while the government pays people not to work or to continue to work for ever-lower wages in a gig economy. It’s a check to keep the masses, well, in check.


Ask a tech overlord though, and he’s likely to tout the innovation a monthly government check can foster. Sam Altman, a tech guru and investor in multiple companies including Airbnb, Reddit, Dropbox, and Change.org, supports a UBI as a stimulation of new ideas and wealth. While he guesses that perhaps 90 percent of people receiving UBI would “go smoke pot and play video games,” still, “if 10 percent of people go create new products and services and new wealth, that’s still a huge net win.” He calls UBI a “floor” that everyone can start from. Altman’s company, Y Combinator, is actively researching the idea, going so far as to develop a randomized controlled trial testing the idea’s potential, in conjunction with the Stanford Center on Poverty and Inequality.


Facebook co-founder Mark Zuckerberg, during his 2017 commencement speech at Harvard, optimistically called UBI a “cushion” that would enable everyone to try new ideas that could change the world.


Many UBI proposals include the elimination of other social programs such as SNAP, WIC and Section 8. Folding multiple bureaucracies into one would, the argument goes, make everything streamlined, and again, eliminate jobs and entire agencies. The data say it’s a great idea. UBI, in that regard, would disrupt the safety net that is supposed to catch people whose lives have been disrupted by disruptive technology.


Yet, if all those government agencies have been eliminated and their financing redistributed to UBI, that means money meant for the poor will actually be redistributed upward: Universal means universal. Everyone will potentially get a piece of the poverty pie regardless of income. Call it “luxury communism” and embrace it!


And so, many of the richest men in the world — who are inevitably rich because they are skilled at concentrating wealth into their own hands — are telling the public UBI could ensure dignity and self-esteem.


They’re also saying it’s inevitable. Global business leaders Richard Branson and Elon Musk say a UBI will become a necessity as tech replaces jobs. “[T]he sense of self-esteem that universal basic income could provide to people…can help people struggling just to survive and allow them to get on their feet, be entrepreneurial and be more creative,” Branson wrote on his blog.


Elon Musk told CNBC that “there is a pretty good chance we end up with a universal basic income, or something like that, due to automation.” He also admitted he isn’t “sure what else one would do.”


Erik Brynjolfsson, director of the MIT Initiative on the Digital Economy and a research associate at the National Bureau of Economic Research, suggests UBI is an idea whose time will inevitably come in the future, and that everyone will be able to live in leisure comfort (“assuming we don’t blow ourselves up first”). A positive outcome of a UBI plan will be dependent in part, he says, on how well we adjust “the way we get meaning from life.”


The problem — besides the idea that we must change our very perception of meaning in life — is that the championing of UBI is so loud from Silicon Valley that it drowns out the nuances of conversation that must occur for any society to foster healthy public policy. In fact, the feudal lords of Silicon Valley are pre-empting a meaningful societal conversation about the economic and social ramifications of UBI by setting and controlling the dialogue through their own channels. The trickle-down theory, in this case, might just be working for the proliferation of an idea that serves its biggest proponents more than the poor people it claims to help. When the only information on offer makes UBI seem like the solution to poverty and inequality, it’s tempting for even the most thoughtful leaders to hop on the bandwagon. Even liberal sex columnist Dan Savage seems to be enamored with the idea of UBI, telling Joe Rogan in 2015, “All these programs that we have to address poverty, we could bundle them all together, eliminate them, and give people a guaranteed minimum income.”


Among the evidence touted is a small study out of Canada suggesting that UBI can give new mothers more time at home with their children, and high school-aged students the ability to stay in school. Back in the 1970s, the Canadian government experimented with giving folks in a small Manitoba town a basic income. According to the Guardian, over the course of four years, researchers discovered that while work habits didn’t change, “new mothers…took longer maternity leaves” and teenage boys were “more likely to stay in high school” than drop out to go to work. The experiment, which ended due to lack of funding, demonstrated that “the monthly income became a source of stability, buffering residents from financial ruin in the case of sudden illness, disability or unpredictable economic events. Hospitalizations dropped, as did injuries and mental health issues.”


Now the Canadian government is trying it again, launching a pilot UBI experiment this summer in Ontario. Participants will receive a UBI of nearly $17,000 CAD annually. The government of Finland (the only country in the E.U. actively experimenting with UBI) has also started giving 2,000 citizens a UBI of 560 euros every month; early reports suggest participants are feeling less stressed. Yet Finland and Canada are fundamentally quite different from the U.S., from population size to the effectiveness of their universal health care programs. The Swiss proposed a UBI a few years ago, and their democratic initiative failed spectacularly.


Perhaps one of the loudest drumbeats for UBI comes from Dutch historian and author Rutger Bregman, who reminds us that while the idea of UBI for all may seem utopian, ending slavery was once considered a “utopian fantasy,” too. He makes other good points in his TED talk, mainly that, “Poverty isn’t a lack of character. Poverty is a lack of cash.” He calls UBI “venture capital for the people” and says it won’t just free the poor, “but also the rest of us.”


Bernie Sanders has said he does “absolutely support” UBI, but that we aren’t ready for it in the U.S. yet. In response to a Vox question, he said he was “absolutely sympathetic to UBI” and that “in the wealthiest nation in the history of the world, the top one-tenth of 1 percent should not own almost as much wealth as the bottom 90%. Everybody in this country should in fact have at least a minimum and dignified standard of living.” Perhaps Sanders was thinking of the older model of UBI as a so-called negative tax, and not the tech titan idea.


Economist Milton Friedman developed the idea of a negative income tax to harness the power of the tax system: “Low-income filers would receive checks from the government rather than pay taxes; as their earnings increased, so would their tax burden, but also the total amount the filer took home.” (Though it’s important to note he also believed in the elimination of other welfare programs at the same time.)
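Friedman’s mechanism comes down to simple arithmetic: below some income floor, the “tax” flips sign and becomes a payment proportional to the shortfall. A minimal sketch, using a hypothetical 50 percent phase-out rate and a $30,000 floor (both numbers are illustrative, not drawn from any actual proposal):

```python
def negative_income_tax(income, floor=30_000.0, rate=0.5):
    """Illustrative negative income tax.

    Filers earning below `floor` receive a check equal to `rate`
    times the shortfall; at or above it they pay ordinary tax
    (omitted here) instead of receiving a check.
    """
    if income >= floor:
        return 0.0  # no subsidy; normal taxation applies
    return rate * (floor - income)

# A filer with no earnings gets the maximum payment...
print(negative_income_tax(0))        # 15000.0
# ...and the payment shrinks as earnings rise, but because the
# phase-out rate is below 100 percent, total take-home income
# still increases with every dollar earned.
print(negative_income_tax(20_000))   # 5000.0
```

The key design choice is that the subsidy tapers rather than cutting off, so working more always leaves the filer better off.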


Of course, there are proposals for UBI programs that integrate with social welfare programs already in existence, or that include income caps to ensure a multi-millionaire isn’t receiving UBI checks. Others, like the Roosevelt Institute, argue that “Corporations have attained power over the economy and over our society, and we will not be healthy, economically, democratically, or socially, until that threat is confronted and dealt with. That requires a robust antitrust policy, and it also requires a robust ‘knowledge policy’: a return to the principles of the public good that once powered our national conversation and policy debate.” They suggest using UBI to grow the economy and gain economic independence from our corporate overlords, financed by increasing federal debt.


There is yet another idea on how to generate a UBI for all: Alaska pays out dividends to its residents; the rest of the U.S. could, too. Alaska owns assets in oil and gas, and pays each resident a dividend via the Alaska Permanent Fund. Rather than increasing debt, the federal government could build up a “wealth fund that owns capital assets. Those capital assets would deliver returns and then the returns would be parceled out as a social dividend,” argues Matt Bruenig, a policy analyst.
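Bruenig’s social-dividend model is, at bottom, a division problem: the fund’s annual returns, times whatever share is paid out rather than reinvested, split across every resident. A rough sketch with made-up numbers (the Alaska Permanent Fund’s actual balance, returns, and payout formula are more complicated than this):

```python
def social_dividend(fund_assets, annual_return, payout_share, residents):
    """Per-resident dividend from a hypothetical sovereign wealth fund.

    `payout_share` is the fraction of the year's returns distributed;
    the remainder stays in the fund to compound for future years.
    """
    returns = fund_assets * annual_return
    return (returns * payout_share) / residents

# e.g., a $65 billion fund earning 6%, paying out half its returns
# to 730,000 residents (all figures invented for illustration)
print(round(social_dividend(65e9, 0.06, 0.5, 730_000), 2))  # 2671.23
```

Holding back part of the returns is what lets the fund grow, so the dividend can rise over time instead of merely tracking a fixed asset base.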


And why not? If universal basic income is going to be on the table, a hearty discussion of all the options needs to be there, too — and everyone, especially working-class people, needs to be at the table. After all, the one thing almost universally avoided in UBI discussions is the idea that people want to contribute to their communities using their skills in a meaningful way. Often that manifests in work, but not necessarily. The important point is that the discussion should be inclusive, and bottom-up. UBI is one issue that’s too important to let Zuckerberg and his social network decide for us.



Published on March 03, 2018 14:29

Man shoots himself to death in front of the White House

White House Police

Law enforcement officer near the White House in Washington, Saturday, March 3, 2018. (Credit: AP/Pablo Martinez Monsivais)


An unidentified man fatally shot himself near the White House on Saturday as a large crowd fled the scene.


President Donald Trump was at his Mar-a-Lago estate in Florida, not the White House, at the time of the shooting, according to the BBC. The unidentified shooter approached the White House fence on Pennsylvania Avenue shortly before noon EST, then took out a handgun and fired several rounds. One of those shots killed him; no one else was injured.


The Secret Service provided updates about the shooting on its Twitter account.


BREAKING: Secret Service personnel are responding to reports of a person who allegedly suffered a self-inflicted gun shot wound along the north fence line of @WhiteHouse.


— U.S. Secret Service (@SecretService) March 3, 2018




UPDATE: No other reported injuries related to the incident at @WhiteHouse.


— U.S. Secret Service (@SecretService) March 3, 2018




UPDATE: Pedestrian and Vehicular traffic around the @WhiteHouse is being impacted due to the incident.


— U.S. Secret Service (@SecretService) March 3, 2018




UPDATE: Medical Personnel are responding to the male victim.


— U.S. Secret Service (@SecretService) March 3, 2018




Secret Service statement on shooting incident near the @WhiteHouse fence line this morning: https://t.co/0d4KvI1BjW


— U.S. Secret Service (@SecretService) March 3, 2018




In a statement issued after the shooting, the Secret Service explained that “the deceased has been identified by Secret Service and MPD authorities – name intentionally withheld pending next of kin notifications.” The statement also identified the shooter as a “white male” and added that, although he fired several rounds, none of them “appear at this time to have been directed towards the White House.” The Washington, D.C., Metropolitan Police Department is taking the lead on the investigation.


“We are aware of the incident. The President has been briefed. I refer you to the Secret Service for any more information,” explained deputy press secretary Hogan Gidley, according to CNN.


There have been at least two other recent incidents in which individuals near the White House posed a security risk. In March 2017, a man managed to stay on the White House grounds for more than 16 minutes after jumping the fence (he was armed only with pepper spray). Last month, a 35-year-old woman was detained after she drove her car through a security barrier outside the White House.



Published on March 03, 2018 14:00

March 2, 2018

Are the Academy Awards ready for #MeToo?

Ryan Seacrest

Ryan Seacrest (Credit: AP/Evan Agostini)


The most conspicuous, most speculated-about person at Sunday’s Academy Awards will likely be someone who has never been nominated for an Oscar. And Ryan Seacrest is just the start of it.


One year ago, legions of women across the world saw themselves in Brie Larson. They watched as, right near the end of the 89th annual Academy Awards, an accomplished, hard-working woman had to stand up and give an accolade — and a perfunctory hug — to a man of dubious reputation. Actor Casey Affleck, as everyone that Oscar night knew, had been named in two sexual harassment lawsuits. He had been accused of inappropriate behavior ranging from boasting on set of his sexual exploits to climbing into a female colleague’s bed while she was sleeping. As Larson later put it, her grim demeanor as she handed him the award for best actor — and her quiet refusal to applaud — “spoke for itself.” But now, as we approach another Oscar night, what a difference 12 months makes. 


We’ve come a very long way from child rapist Roman Polanski’s victory for “The Pianist” earning a standing ovation from an Academy Awards crowd that included Harvey Weinstein back in 2003. Yet the Oscars have always been about rich, out-of-touch people celebrating their own, no matter how offensive that behavior may be. The rest of the world may move, however haltingly, forward; Oscar keeps lavishing prizes on guys like Woody Allen. In 2014, Cate Blanchett won Best Actress for “Blue Jasmine,” just one month after Allen’s daughter came forward with her version of longstanding accusations that he had molested her as a child. For decades, the Academy similarly showered its love on films touched by Harvey Weinstein — including his wins for “Shakespeare in Love” and “Gangs of New York” — all while his predatory behind-the-scenes behavior was an apparent open secret.


And then there’s Seacrest. Even with supporters like Kelly Ripa rallying behind him, the red carpet mainstay has lately faced escalating accusations of misconduct. Days ago, allegations emerged from Seacrest’s former stylist of him “grinding his erect penis against her while clad only in his underwear, groping her vagina and at one point slapping her buttock so hard that it left a large welt still visible hours later.” On Wednesday, a former coworker of the woman backed up her version of events, saying he had witnessed some of the behavior. So now, what are the celebrities who usually smile cheerfully and banter with him to do on Sunday night? An anonymous publicist told CNN Wednesday, “I don’t think [Seacrest is] going to have a great time on the carpet.” Oscar winner Jennifer Lawrence said she is “not sure” if she will stop to chat with him this time around. 



The January Golden Globes were a relatively straightforward opportunity for the #MeToo and #TimesUp movements (with the notable exceptions of James Franco and Kirk Douglas) to amplify and celebrate the cause of progress. But the Globes have always prided themselves on being a more outside-the-industry event. The Oscars, meanwhile, are all about Hollywood patting itself on the back, no matter how dirty that back may be.


This year, however, things might truly be a little different. Allen’s “Wonder Wheel” and star Kate Winslet were not nominated. Casey Affleck, who, per tradition, would normally present this year’s award for Best Actress, has announced he is sitting the ceremony out. And Harvey Weinstein was expelled from the Academy in October, shortly after that dam-bursting New York Times story — the one that began with an anecdote of Ashley Judd asking herself, “How do I get out of the room as fast as possible without alienating Harvey Weinstein?” On Sunday evening, Weinstein will not be in attendance, but Judd will, as a presenter. It has taken decades, but the room — and the carpet that leads to it — are changing. And maybe, finally, it is not the women who feel like they need to leave it.


Published on March 02, 2018 16:00