Helen H. Moore's Blog, page 336

August 9, 2017

Babies are losing sleep — and touch screens are to blame

Mother and Baby

(Credit: Getty/martinedoucet)


AlterNet


The sight of tiny babies playing with touch screens is not exactly a new phenomenon. Toddlers strapped in strollers watch cartoons on their parents’ smartphones. Young children with headphones larger than their heads bump into you on the sidewalk. And it’s no longer unusual to walk into a restaurant and notice that at many tables, every family member is chatting or texting away on their phones. But for all of the convenience and entertainment the digital age has given us, there are significant setbacks with lasting consequences.


A study published last year in Scientific Reports revealed a significant correlation between touch screen use and sleep problems in infants and toddlers. Traditional screen time, such as television and video games, is well known to affect sleep, but with the ubiquity of touch screen devices — phones, tablets, cameras — the link between media and lost sleep may extend to infants and toddlers as well. This is particularly worrisome because when infants are unable to get the sleep they need, their cognitive development is threatened.


The study collected data from 715 parents through an online survey that asked about media use (daily exposure to television and to touch screens) and sleep patterns (nighttime and daytime sleep duration, sleep onset, and frequency of night awakenings). The researchers found that 75 percent of children between 6 months and three years of age use a touch screen daily, and that the figure climbs sharply across the age bracket: 51 percent of babies between 6 and 11 months use a screen, compared with a whopping 90.05 percent of children aged 25 to 36 months.


Every additional hour of tablet use by these infants was associated with 15.6 minutes less total sleep per night — roughly 95 fewer hours of sleep a year.
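Taking the article's per-night figure at face value, and assuming it applies every night of the year, the annual total follows directly: 15.6 minutes × 365 nights ≈ 5,694 minutes, or roughly 95 hours.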


Scientists suggest there may be several reasons why screens affect sleep. Time spent on media displaces time that children have available for sleep, pushing back bedtimes and shortening sleep duration. The content of the media may also trigger psychological and physiological arousal, making it more difficult for children to fall asleep and limiting their chances of a good night's rest. Additionally, the bright blue light from screens can disrupt the circadian rhythm by suppressing the release of melatonin.


Without enough sleep, babies will suffer later in life. It is well documented that sleep allows us to build and strengthen connections between the left and right hemispheres of the cerebrum. These connections are particularly important when the brain is still forming. Without sleep, and without those connections, babies lose the opportunity to reach their full cognitive potential.


As a silver lining to the widespread use of touch screens, the study also found the devices can help to improve babies’ motor skills. Celeste Cheung, coauthor of the study, said parents need not be overly concerned, but should “be aware of the potential impact of touch screen devices — both positive and negative.”


Published on August 09, 2017 16:08

Trump’s desperate Vietnam gamble


(Credit: AP/Evan Vucci)


I’m 70 years old. Donald Trump is 71. That means we have in common with every male in our generation one important, if not dominating thing: Vietnam.


We turned 18 and became eligible for the draft in the mid 1960s: ’64 for him, ’65 for me. Turning 18 in either New York, where he lived, or the Washington D.C. area, where I lived, was something we all looked forward to. You could throw away your fake ID and go downtown to a bar and drink anytime you wanted. Freedom! But like so many things in late adolescence, it came with qualifications: the possibility of losing your freedom to the draft was right around the corner.


1964 and 1965 were incredibly important years as the war in Vietnam reached maturity. There were 112,000 young men drafted in 1964, and more than twice as many — 230,000 — in 1965. At the end of 1964, there were 23,000 Americans serving in Vietnam. By December of 1965, eight times as many were in the combat zone — 184,000. In 1964, 216 Americans lost their lives in Vietnam. In 1965, the number killed rose nearly tenfold, to almost 2,000. All in all, 28 million young men reached draft age during the Vietnam years. Nine million served in uniform, and 2.7 million went to Vietnam. Only about 800,000 actually saw combat.


By 1965, Vietnam had become a meat grinder, and with President Lyndon Johnson fearing that he would be accused of “losing” the war to the commies, he was turning the war into a senseless cesspool of slaughter. As Johnson fed the beast, an entire generation of young American men faced a dilemma no other American generation had confronted before. With the war already being criticized as foolhardy and misbegotten, if not criminal, did you don or did you dodge? — pun absolutely intended. Don dodged. I donned.


But it’s not a moral distinction I’m drawing here between me and our erstwhile president. Morals entered into it, of course. Some guys were strongly opposed to the war on religious or various moral grounds. But for others, getting drafted and possibly facing an order to Vietnam had simple, human consequences. For almost everyone in those years and for the years before, going into the service was an abrupt and usually unwanted interruption of your life. Whether you went to college and had a draft deferment for four years, or went straight to work out of high school, getting drafted was going to take at least two years out of your life, one way or another. And with Vietnam looming over there across the Pacific, the draft might cost you more than two years. It might cost you all the rest of the years you had on this earth.


There were other complications. Women weren't subject to the draft back then, so they didn't face the same dilemma. This gave them a freedom we didn't have, to oppose the war on any grounds whatsoever without consequences. If we opposed the war and refused induction, we could go to jail. No such danger for women. This created a split between the sexes that had never existed before. I knew plenty of guys whose girlfriends felt so strongly about the war that they threatened to break up with them if they didn't dodge the draft. When I was at West Point between 1965 and 1969, it was hard to find college girls who would go out with a cadet, much less a Spec 4 with orders to the combat zone. The women weren't to blame. The war was a moral shithole, a consumer of American and Vietnamese bodies that never should have happened. The war was a problem between us, not of us.


As the years moved on, the complications multiplied. Guys who started out in ’64 or ’65 with an “American right or wrong” attitude gradually lost their innocence as the war dragged on, the bodies piled up and scandals like My Lai were forced out of military secrecy into the sunlight. Not one thing about that war was good. By the time Trump was in grad school and I was graduating from West Point, massive demonstrations were influencing public opinion against the war. LIFE magazine famously ran an absolutely jaw-dropping issue with thumbnail photographs of every one of the approximately 500 men who had been killed in a single week. Suddenly the war had 500 faces, and they were so young, every one of them with a future to look forward to!


Now they were dead.


Support for the war went from reflexive to mixed to weak. As casualties mounted, Johnson, and then Nixon, began pulling out troops and replacing them with massive bombing campaigns that didn't work any better than the 500,000 troops we had there at the war's peak.


Young American men didn't just watch this process, they were in it. They were in many ways its subjects. The split within the generation was blasted into sharp relief at Kent State, when a platoon of National Guardsmen, all of them young, all of them male, gunned down students protesting the war. The war had come home.


Trump reacted to all of this as many men did. He had four student deferments as he completed undergraduate and graduate college work, and when they ran out, he got a doctor to certify him as damaged goods due to spurious bone spurs. It was well known at the time that a check for $1,000 would get you out of the draft. I knew a lawyer in New York whose entire practice was getting guys out of the draft. What about those who didn't have a grand or so handy to spend on a lawyer or doctor? Tough luck. This produced an entirely unbalanced and unfair military, of course, with young men of color or from poor backgrounds far more likely to be the ones under fire.


You wouldn’t think it, but it wasn’t much different for West Point graduates at the time. There were 800 guys in my class. Some 90 of them were high school class presidents, 200 in the honor societies of their schools, several dozen Eagle Scouts — all in all, as accomplished and intelligent a student body as you could find in a college anywhere. Every morning The New York Times was delivered to our rooms. Cadets could read as well as everyone else. We saw the war disintegrate before our eyes on the nightly news and in the pages of the Times. At West Point, we saw the effects of the war on the officer corps. It was disastrous. Officers who had lied about body counts and covered up war crimes were promoted into positions of authority at West Point. They brought the effects of that corrupt war with them. Resignations skyrocketed in my class, with more than 100 leaving in one year alone. In 1965, I wanted nothing more in the world than to be an Army officer. By 1969, I wasn’t so sure.


My father had been relieved of his battalion command in Vietnam in the 9th Infantry Division for refusing to shell a village of civilians with his mortars one night, an act that would have been a war crime no less savage than My Lai. Doing the right thing instead of the wrong thing effectively ended the career of a good man who had served his country with honor for more than 20 years. Friends of ours in the Army returned from the war with horror story after horror story. Drugs. Random shootings of civilians for sport. Financial corruption on an industrial scale.


It does the word “disillusioned” no favor to use it to describe me by the time I became a Second Lieutenant in the Infantry. When I became a platoon leader at Fort Carson, Colorado (to make a long, complicated story short), I discovered that approximately 15 to 20 percent of my unit was addicted to heroin, most of them vets just returned from Vietnam who brought their habits home with them. I tried to get an amnesty program established to deal with the rampaging problem of heroin addiction with treatment instead of incarceration. For that crime, a Lieutenant General ordered me punitively to Vietnam. I refused that order and was expelled from the Army — for being gay! — with a bad discharge. I know I’ve said it before, but that tattered discharge certificate somewhere in my files is my silver star. I was 23 years old.


But like everything else about that godforsaken time, it was complicated. I had very, very mixed feelings about the Army and about what I did. I didn’t feel like I was letting down my family, because I had already seen what the war did to my father. But I felt in a very basic, gut-level way that I was letting down my soldiers at Fort Carson, because I was a really good platoon leader. I would have been a good platoon leader in Vietnam as well. And under other circumstances, I would have gone. Other guys from my class at West Point went through their own private and not so private conflicts. Two guys became the second and third West Point officers in history to be granted conscientious objector status and allowed to resign from the Army. Other guys figured out various ways to avoid Vietnam by finagling their career assignments. When the class of 1969 reached the end of its obligation to serve after five years, nearly 50 percent of the class resigned at once. The reason: Vietnam.


So I don’t judge harshly Donald Trump and his history with avoiding the draft, and I doubt that many in our generation do. I came to describe that time as the “damned if you did, damned if you didn’t” years. If you allowed yourself to get drafted and served, you were damned by those who opposed the war. If you dodged the draft and avoided Vietnam — for any reason, moral or otherwise — there were those who damned you as not living up to your obligations as a citizen and moreover as a man. This produced an incredibly confused split in our generation which more than 50 years has not lessened. Our history is so strife torn and riven with contradictions and pain that none of us should be using it to harm one another.  It doesn’t accomplish anything, it’s not fair, and it’s just not right, so of course it’s exactly what Trump is doing.


Trump is reflexively using that split over Vietnam in the service of his career as president, most recently in his unhinged attack on Senator Richard Blumenthal for his mixed record on Vietnam. “Interesting to watch Senator Richard Blumenthal of Connecticut talking about hoax Russian collusion when he was a phony Vietnam con artist!” Trump tweeted on Monday morning after watching Blumenthal supporting the Russia investigation on CNN.


Trump’s thumbs continued to thwack: “Never in U. S. history has anyone lied or defrauded voters like Senator Richard Blumenthal. He told stories about his Vietnam battles and . . . conquests, how brave he was, and it was all a lie. He cried like a baby and begged for forgiveness like a child.” Apart from the fact that Trump was attacking Blumenthal for advocating upholding the law, and the fact that much of Trump’s attack is an outright lie — Blumenthal said that he had served in the war when in fact he had remained stateside while serving in the Marine Reserves — Blumenthal never “told stories about his battles and conquests” or “how brave he was,” much less “cried like a baby and begged for forgiveness like a child.” On the contrary, when Blumenthal was caught out with his misstatement about his career — almost eight years ago — he apologized, and until Monday, he and everyone else in the world had moved on.


Not Trump. He used Vietnam as a bludgeon to clobber Blumenthal, as usual lying about him outrageously as he did it. Why? That’s the question I have. The war in Vietnam is the third rail for our generation, politically and in every other way. Remember the attacks on Clinton for dodging the draft? They went after him with a howitzer when it was revealed that Clinton had attended a demonstration against the war overseas when he was at Oxford. Heavens! Not only was he a draft dodger, he wasn’t a patriot! It hardly bears repeating to go over Trump’s various unpatriotic acts and statements while overseas, not to mention the role overseas Russians played in his election campaign. So let’s leave that to history.


What we shouldn’t ignore, however, is the reason Trump completely lost it on Monday morning with his insane tweets attacking Senator Blumenthal. He is so frightened of the investigation of himself and his campaign by Special Counsel Mueller that he actually conflated Blumenthal’s military record with Russian “collusion.” Huh? That’s not reaching; that’s pure unfettered desperation. Our president is like a rabid raccoon, lashing out wildly, teeth bared, apparently prepared to fight to the death against the investigation which threatens his presidency more and more every day.


The thing about raccoons is, they can climb trees to avoid being caught. Trump is marooned on the earth, where he finds himself in the crosshairs of Mueller’s investigation. Poor guy. He should have put on the uniform and gone through basic training and even taken that long dreadful flight to the combat zone whether he would carry a rifle when he got there or not. Might have given him some of what the Army used to call “character,” because he — and we — sure as hell could use some right now.


Published on August 09, 2017 16:00

Destin Daniel Cretton makes another sincere beauty with “The Glass Castle”


"The Glass Castle" (Credit: Lionsgate Films/Jake Giles Netter)


When filmmaker Destin Daniel Cretton read Jeannette Walls’s 2005 memoir, “The Glass Castle,” about Walls’s experience being raised by eccentric, free-spirited, sometimes neglectful and abusive parents, he thought, “Oh, some of these stories are so extreme, they might be slight fictionalization of memory.”


In the book, Walls recalls the family’s dramatic, sometimes dark escapades, which include fleeing bill collectors, hospitals and child welfare workers, galling parenting decisions (letting an unsupervised three-year-old boil hot dogs) and touching gestures (Jeannette’s father, Rex, giving her the planet Venus as a Christmas gift); she renders her father a complicated giant, equal parts seductive and destructive. The Walls children may have lacked for basic care, but never for adventure.


Walls’s childhood was so rich in drama, and her storytelling so deft, that the memoir became a quick and undeniable hit. It received glowing reviews, and it spent 261 weeks on The New York Times bestseller list — more than five years. It took hardly a thought for the book to be optioned as a movie: in 2005, Paramount Pictures and Brad Pitt’s Plan B Entertainment acquired the rights to the story.


Twelve years have passed since the book’s original publication and the story’s initial film optioning. During that time, several screenwriters have taken a stab at the story and the cast has gone through numerous incarnations. Jennifer Lawrence was initially slated to star as Jeannette, with Mark Ruffalo once rumored to play Rex and Claire Danes to play Jeannette’s mother, Rose Mary.


Sometimes a film’s intermediate stages are more compelling than the finished product. What would Edgar Wright’s “Ant-Man” have looked like? What about a Francis Ford Coppola “The Great Gatsby”? Other times, the finished film seems like the one that was destined, and it renders all intermediate incarnations unimaginable. That is the case with “The Glass Castle.”


Destin Daniel Cretton adapted the book with Andrew Lanham. The pair, according to Walls, were “really smart about getting at the heart of the book.” She said: “A couple of other screenwriters had taken a stab at it, and they were good screenplays, but Destin immediately said, ‘This is about the relationship between the daughter and the father,’ and he went into that, and I thought he cracked it right open.”


Cretton is best known for his second feature, “Short Term 12,” which starred Brie Larson as a supervisor at a residential treatment facility who must confront the problems of the kids in her care as well as her own. Both films star Larson, and both are affecting dramas that show Cretton to be a profoundly earnest filmmaker whose work tugs at the heartstrings without sending the eyes rolling in their sockets. Subtle humor is essential to his success. So are his actors’ performances.


Rex Walls (Woody Harrelson), who is not the protagonist but is the star of the film, is a mercurial character prone to grand statements and gestures. He is his children’s self-appointed teacher and he tells them things like, “You learn from living. Everything else is a damn lie” and, when teaching Jeannette how to swim, “I can’t let you cling to the side your whole life because you’re afraid to swim.” Harrelson voices these lines with such conviction and charisma that they don’t sound grandiloquent or phony. The character is as enchanting to the audience as he is to the family. So when his family forgives him for acts that seem unforgivable, it makes sense.  


Larson and Naomi Watts play more subtle roles, but each portrays her character multidimensionally and naturally. Each is a great actor, so perhaps it is a given that they would render their characters familiar, practically tactile. But each — Harrelson too — is as good as ever under Cretton’s direction. Part of the reason may be his shooting style: Cretton has said that he shoots handheld “so that it loosens it up a bit and we can respond to things.” As poetically written as some of the dialogue is, the style makes it all seem spontaneous.


Cretton did fictionalize parts of the story, adding dramatic embellishments and narrative tissue. But his greatest feat may have been telling the story in such a way that viewers don’t leave the theater thinking, “Oh, some of these stories are so extreme, they might be slight fictionalization.” They’re too consumed by the ride.


Published on August 09, 2017 15:59

Louisiana congressman helping lead GOP effort to limit federal definition of gender


New Orleans Pride parade (Credit: AP/Gerald Herbert)


AlterNet


Back in June, in the very first week of LGBT Pride Month, subscribers to the newsletter of Rep. Ralph Abraham, R-La., read that their congressman was working diligently to restore congressional authority over how the U.S. government can define “sex” and “gender.” Abraham is one of five co-sponsors — all Republican — of the House resolution formally named the “Civil Rights Uniformity Act of 2017.” The resolution is a response to a controversial Obama-era mandate issued last September that withheld federal funds from school districts if they prohibited students from using bathrooms or locker rooms that matched their gender identity, regardless of their biological sex.


If this resolution passes, “sex” or “gender” could not be interpreted to mean “gender identity” at the federal level. It would also require that “man” or “woman” refer exclusively to a person’s genetic sex when interpreting certain federal laws or regulations.


Under the resolution, one cannot interpret a federal civil rights law to treat gender identity or transgender status as a protected class unless “gender identity” or “transgender status” are explicitly stated as protected classes.


The bill was referred to the Subcommittee on the Constitution and Civil Justice last month. It now awaits its first debate in that subcommittee, where Republicans outnumber Democrats 8-3. If it passes there, it will move on to votes in the House and Senate, both of which have GOP majorities.


The term “gender identity” has been a contentious topic of debate across the country, and Louisiana is not immune. At the state level, Louisiana’s legislature this year shot down a bill that would protect LGBT residents from employment discrimination —  just as it has repeatedly for the past 24 years, as different lawmakers have tried time and again to pass it through a majority Republican body. In July, Rep. Abraham declared he backed Trump’s proposed ban on transgender men and women in the military “100 percent.”


Louisiana Gov. John Bel Edwards, the lone Democratic governor in the Gulf South, issued a statewide executive order last year that would have protected state workers on the basis of sexuality and gender identity. But that order was challenged by the state attorney general and more than a dozen Republican state lawmakers, partly because they were unclear on what the term “gender identity” meant. The order was eventually ruled unconstitutional in court — not for its “gender identity” language but for alleged governmental overreach. Edwards and the order’s main challenger, state Attorney General Jeff Landry, head to an appeals court in August to rehash the issue.


Right now, there is no federal law protecting workers on the basis of sexuality or gender identity. Without a state or local law in place to prevent it, it is entirely legal to fire or deny a job to an employee or applicant for being gay, transgender or otherwise a member of the LGBT community.


But some Louisiana lobbyists and lawmakers have expressed willingness to drop their opposition to changing state law to protect LGBT workers, as long as federal law changes with it.


President Barack Obama issued an executive order similar to the Louisiana governor’s in 2014. It protected federal workers based on sexual orientation and gender identity. President Donald Trump revoked that order in March as part of his “two-for-one” pledge to eliminate two federal regulations with every new one introduced.


New Orleans Mayor Mitch Landrieu’s office estimated this past spring that New Orleans gained $100 million from hosting the 2017 NBA All-Star Game on Feb. 19. The city was the NBA’s second choice to host the lucrative event, which had local hotels at 99 percent occupancy that weekend. Charlotte, N.C., had been the first choice, but it was bumped off the list because of a later-repealed state law that forbade transgender people from using the bathroom of their identified gender — what is now famously known as the controversial “bathroom bill.”


In 2017, 16 states have considered, but not passed, similar legislation. Louisiana was not one of them, and reports suggest the NBA All-Star business boom contributed to the absence of such a bill in the state this year.


Published on August 09, 2017 15:58

Slouching toward Mar-a-Lago

Donald Trump, Melania Trump, Shinzo Abe, Akie Abe

(Credit: AP Photo/Susan Walsh)


Like it or not, the president of the United States embodies America itself. The individual inhabiting the White House has become the preeminent symbol of who we are and what we represent as a nation and a people. In a fundamental sense, he is us.


It was not always so. Millard Fillmore, the 13th president (1850-1853), presided over but did not personify the American republic.  He was merely the federal chief executive.  Contemporary observers did not refer to his term in office as the Age of Fillmore.  With occasional exceptions, Abraham Lincoln in particular, much the same could be said of Fillmore’s successors.  They brought to office low expectations, which they rarely exceeded.  So when Chester A. Arthur (1881-1885) or William Howard Taft (1909-1913) left the White House, there was no rush to immortalize them by erecting gaudy shrines — now known as “presidential libraries” — to the glory of their presidencies.  In those distant days, ex-presidents went back home or somewhere else where they could find work.


Over the course of the past century, all that has changed.  Ours is a republic that has long since taken on the trappings of a monarchy, with the president inhabiting rarified space as our king-emperor.  The Brits have their woman in Buckingham Palace.  We have our man in the White House.


Nominally, the Constitution assigns responsibilities and allocates prerogatives to three co-equal branches of government.  In practice, the executive branch enjoys primacy.  Prompted by a seemingly endless series of crises since the Great Depression and World War II, presidents have accumulated ever-greater authority, partly through usurpation, but more often than not through forfeiture.


At the same time, they also took on various extra-constitutional responsibilities.  By the beginning of the present century, Americans took it for granted that the occupant of the Oval Office should function as prophet, moral philosopher, style-setter, interpreter of the prevailing zeitgeist, and — last but hardly least — celebrity-in-chief.  In short, POTUS was the bright star at the center of the American solar system.


As recently as a year ago, few saw in this cult of the presidency cause for complaint.  On odd occasions, some particularly egregious bit of executive tomfoolery might trigger grumbling about an “imperial presidency.” Yet rarely did such complaints lead to effective remedial action.  The War Powers Resolution of 1973 might be considered the exception that proves the rule.  Inspired by the disaster of the Vietnam War and intended to constrain presidents from using force without congressional buy-in and support, that particular piece of legislation ranks alongside the Volstead Act of 1919 (enacted to enforce Prohibition) as among the least effective ever to become law.


In truth, influential American institutions — investment banks and multinational corporations, churches and universities, big city newspapers and TV networks, the bloated national security apparatus and both major political parties — have found reason aplenty to endorse a system that elevates the president to the status of demigod.  By and large, it’s been good for business, whatever that business happens to be.


Furthermore, it’s our president — not some foreign dude — who is, by common consent, the most powerful person in the universe.  For inhabitants of a nation that considers itself both “exceptional” and “indispensable,” this seems only right and proper.  So Americans generally like it that their president is the acknowledged Leader of the Free World rather than some fresh-faced pretender from France or Canada.


Then came the Great Hysteria.  Arriving with a Pearl Harbor-like shock, it erupted on the night of November 8, 2016, just as it became apparent that Hillary Clinton was losing Florida and appeared certain to lose much else besides.


Suddenly, all the habits and precedents that had contributed to empowering the modern American presidency no longer made sense.  That a single deeply flawed individual along with a handful of unelected associates and family members should be entrusted with determining the fate of the planet suddenly seemed the very definition of madness.


Emotion-laden upheavals producing behavior that is not entirely rational are hardly unknown in the American experience.  Indeed, they recur with some frequency.  The Great Awakenings of the eighteenth and early nineteenth centuries are examples of the phenomenon.  So also are the two Red Scares of the twentieth century, the first in the early 1920s and the second, commonly known as “McCarthyism,” coinciding with the onset of the Cold War.


Yet the response to Donald Trump’s election, combining as it has fear, anger, bewilderment, disgust, and something akin to despair, qualifies as an upheaval without precedent.  History itself had seemingly gone off the rails.  The crude Andrew Jackson’s 1828 ousting of an impeccably pedigreed president, John Quincy Adams, was nothing compared to the vulgar Donald Trump’s defeat of an impeccably credentialed graduate of Wellesley and Yale who had served as first lady, United States senator, and secretary of state.  A self-evidently inconceivable outcome — all the smart people agreed on that point — had somehow happened anyway.


A vulgar, bombastic, thrice-married real-estate tycoon and reality TV host as prophet, moral philosopher, style-setter, interpreter of the prevailing zeitgeist, and chief celebrity?  The very idea seemed both absurd and intolerable.


If we have, as innumerable commentators assert, embarked upon the Age of Trump, the defining feature of that age might well be the single-minded determination of those horrified and intent on ensuring its prompt termination. In 2016, TIME magazine chose Trump as its person of the year.  In 2017, when it comes to dominating the news, that “person” might turn out to be a group — all those fixated on cleansing the White House of Trump’s defiling presence.


Egged on and abetted in every way by Trump himself, the anti-Trump resistance has made itself the Big Story.  Lies, hate, collusion, conspiracy, fascism:  rarely has the everyday vocabulary of American politics been as ominous and forbidding as over the past six months.  Take resistance rhetoric at face value and you might conclude that Donald Trump is indeed the fifth horseman of the Apocalypse, his presence in the presidential saddle eclipsing all other concerns.  Pestilence, War, Famine, and Death will just have to wait.


The unspoken assumption of those most determined to banish him from public life appears to be this: once he’s gone, history will be returned to its intended path, humankind will breathe a collective sigh of relief, and all will be well again.  Yet such an assumption strikes me as remarkably wrongheaded — and not merely because, should Trump prematurely depart from office, Mike Pence will succeed him.  Expectations that Trump’s ouster will restore normalcy ignore the very factors that first handed him the Republican nomination (with a slew of competitors wondering what hit them) and then put him in the Oval Office (with a vastly more seasoned and disciplined, if uninspiring, opponent left to bemoan the injustice of it all).


Not all, but many of Trump’s supporters voted for him for the same reason that people buy lottery tickets: Why not?  In their estimation, they had little to lose.  Their loathing of the status quo is such that they may well stick with Trump even as it becomes increasingly obvious that his promise of salvation — an America made “great again” — is not going to materialize.


Yet those who imagine that Trump’s removal will put things right are likewise deluding themselves.  To persist in thinking that he defines the problem is to commit an error of the first order.  Trump is not cause, but consequence.


For too long, the cult of the presidency has provided an excuse for treating politics as a melodrama staged at four-year intervals and centering on hopes of another Roosevelt or Kennedy or Reagan appearing as the agent of American deliverance.  Donald Trump’s ascent to the office once inhabited by those worthies should demolish such fantasies once and for all.


How is it that someone like Trump could become president in the first place?  Blame sexism, Fox News, James Comey, Russian meddling, and Hillary’s failure to visit Wisconsin all you want, but a more fundamental explanation is this: the election of 2016 constituted a de facto referendum on the course of recent American history.  That referendum rendered a definitive judgment: the underlying consensus informing U.S. policy since the end of the Cold War has collapsed.  Precepts that members of the policy elite have long treated as self-evident no longer command the backing or assent of the American people. Put simply: it’s the ideas, stupid.


Rabbit poses a question


“Without the Cold War, what’s the point of being an American?”  As the long twilight struggle was finally winding down, Harry “Rabbit” Angstrom, novelist John Updike’s late-twentieth-century Everyman, pondered that question. In short order, Rabbit got his answer.  So, too, after only perfunctory consultation, did his fellow citizens.


The passing of the Cold War offered cause for celebration.  On that point all agreed.  Yet, as it turned out, it did not require reflection from the public at large.  Policy elites professed to have matters well in hand.  The dawning era, they believed, summoned Americans not to think anew, but to keep doing precisely what they were accustomed to doing, albeit without fretting further about Communist takeovers or the risks of nuclear Armageddon.  In a world where a “single superpower” was calling the shots, utopia was right around the corner.  All that was needed was for the United States to demonstrate the requisite confidence and resolve.


Three specific propositions made up the elite consensus that coalesced during the initial decade of the post-Cold-War era.  According to the first, the globalization of corporate capitalism held the key to wealth creation on a hitherto unimaginable scale.  According to the second, jettisoning norms derived from Judeo-Christian religious traditions held the key to the further expansion of personal freedom.  According to the third, muscular global leadership exercised by the United States held the key to promoting a stable and humane international order.


Unfettered neoliberalism plus the unencumbered self plus unabashed American assertiveness: these defined the elements of the post-Cold-War consensus that formed during the first half of the 1990s — plus what enthusiasts called the information revolution.  The miracle of that “revolution,” gathering momentum just as the Soviet Union was going down for the count, provided the secret sauce that infused the emerging consensus with a sense of historical inevitability.


The Cold War itself had fostered notable improvements in computational speed and capacity, new modes of communication, and techniques for storing, accessing, and manipulating information.  Yet, however impressive, such developments remained subsidiary to the larger East-West competition.  Only as the Cold War receded did they move from background to forefront.  For true believers, information technology came to serve a quasi-theological function, promising answers to life’s ultimate questions.  Although God might be dead, Americans found in Bill Gates and Steve Jobs nerdy but compelling idols.


More immediately, in the eyes of the policy elite, the information revolution meshed with and reinforced the policy consensus.  For those focused on the political economy, it greased the wheels of globalized capitalism, creating vast new opportunities for trade and investment.  For those looking to shed constraints on personal freedom, information promised empowerment, making identity itself something to choose, discard, or modify.  For members of the national security apparatus, the information revolution seemed certain to endow the United States with seemingly unassailable military capabilities.  That these various enhancements would combine to improve the human condition was taken for granted; that they would, in due course, align everybody — from Afghans to Zimbabweans — with American values and the American way of life seemed more or less inevitable.


The three presidents of the post-Cold-War era — Bill Clinton, George W. Bush, and Barack Obama — put these several propositions to the test.  Politics-as-theater requires us to pretend that our 42nd, 43rd, and 44th presidents differed in fundamental ways.  In practice, however, their similarities greatly outweighed any of those differences.  Taken together, the administrations over which they presided collaborated in pursuing a common agenda, each intent on proving that the post-Cold-War consensus could work in the face of mounting evidence to the contrary.


To be fair, it did work for some. “Globalization” made some people very rich indeed.  In doing so, however, it greatly exacerbated inequality, while doing nothing to alleviate the condition of the American working class and underclass.


The emphasis on diversity and multiculturalism improved the status of groups long subjected to discrimination.  Yet these advances have done remarkably little to reduce the alienation and despair pervading a society suffering from epidemics of chronic substance abuse, morbid obesity, teen suicide, and similar afflictions.  Throw in the world’s highest incarceration rate, a seemingly endless appetite for porn, urban school systems mired in permanent crisis, and mass shootings that occur with metronomic regularity, and what you have is something other than the profile of a healthy society.


As for militarized American global leadership, it has indeed resulted in various bad actors meeting richly deserved fates.  Goodbye, Saddam.  Good riddance, Osama.  Yet it has also embroiled the United States in a series of costly, senseless, unsuccessful, and ultimately counterproductive wars.  As for the vaunted information revolution, its impact has been ambiguous at best, even if those with eyeballs glued to their personal electronic devices can’t tolerate being offline long enough to assess the actual costs of being perpetually connected.


In November 2016, Americans who consider themselves ill served by the post-Cold-War consensus signaled that they had had enough.  Voters not persuaded that neoliberal economic policies, a culture taking its motto from the Outback steakhouse chain, and a national security strategy that employs the U.S. military as a global police force were working to their benefit provided a crucial margin in the election of Donald Trump.


The response of the political establishment to this extraordinary repudiation testifies to the extent of its bankruptcy.  The Republican Party still clings to the notion that reducing taxes, cutting government red tape, restricting abortion, curbing immigration, prohibiting flag-burning, and increasing military spending will alleviate all that ails the country.  Meanwhile, to judge by the promises contained in their recently unveiled (and instantly forgotten) program for a “Better Deal,” Democrats believe that raising the minimum wage, capping the cost of prescription drugs, and creating apprenticeship programs for the unemployed will return their party to the good graces of the American electorate.


In both parties embarrassingly small-bore thinking prevails, with Republicans and Democrats equally bereft of fresh ideas.  Each party is led by aging hacks.  Neither has devised an antidote to the crisis in American politics signified by the nomination and election of Donald Trump.


While our emperor tweets, Rome itself fiddles.


Starting over


I am by temperament a conservative and a traditionalist, wary of revolutionary movements that more often than not end up being hijacked by nefarious plotters more interested in satisfying their own ambitions than in pursuing high ideals.  Yet even I am prepared to admit that the status quo appears increasingly untenable. Incremental change will not suffice.  The challenge of the moment is to embrace radicalism without succumbing to irresponsibility.


The one good thing we can say about the election of Donald Trump — to borrow an image from Thomas Jefferson — is this: it ought to serve as a fire bell in the night.  If Americans have an ounce of sense, the Trump presidency will cure them once and for all of the illusion that from the White House comes redemption.  By now we ought to have had enough of de facto monarchy.


By extension, Americans should come to see as intolerable the meanness, corruption, and partisan dysfunction so much in evidence at the opposite end of Pennsylvania Avenue.  We need not wax sentimental over the days when Lyndon Johnson and Everett Dirksen presided over the Senate to conclude that Mitch McConnell and Chuck Schumer represent something other than progress.  If Congress continues to behave as contemptibly as it has in recent years (and in recent weeks), it will, by default, allow the conditions that have produced Trump and his cronies to prevail.


So it’s time to take another stab at an approach to governance worthy of a democratic republic.  Where to begin?  I submit that Rabbit Angstrom’s question offers a place to start:  What’s the point of being an American?


Authentic progressives and principled conservatives will offer different answers to Rabbit’s query.  My own answer is rooted in an abiding conviction that our problems are less quantitative than qualitative.  Rather than simply more — yet more wealth, more freedom, more attempts at global leadership — the times call for different.  In my view, the point of being an American is to participate in creating a society that strikes a balance between wants and needs, that exists in harmony with nature and the rest of humankind, and that is rooted in an agreed upon conception of the common good.


My own prescription for how to act upon that statement of purpose is unlikely to find favor with most readers of TomDispatch.  But therein lies the basis for an interesting debate, one that is essential to prospects for stemming the accelerating decay of American civic life.


Initiating such a debate, and so bringing into focus core issues, will remain next to impossible, however, without first clearing away the accumulated debris of the post-Cold-War era.  Preliminary steps in that direction, listed in no particular order, ought to include the following:


First, abolish the Electoral College.  Doing so will preclude any further occurrence of the circumstances that twice in recent decades cast doubt on the outcome of national elections and thereby did far more than any foreign interference to undermine the legitimacy of American politics.


Second, roll back gerrymandering.  Doing so will help restore competitive elections and make incumbency more tenuous.


Third, limit the impact of corporate money on elections at all levels, if need be by amending the Constitution.


Fourth, mandate a balanced federal budget, thereby demolishing the pretense that Americans need not choose between guns and butter.


Fifth, implement a program of national service, thereby eliminating the All-Volunteer military and restoring the tradition of the citizen-soldier.  Doing so will help close the gap between the military and society and enrich the prevailing conception of citizenship.  It might even encourage members of Congress to think twice before signing off on wars that the commander-in-chief wants to fight.


Sixth, enact tax policies that will promote greater income equality.


Seventh, increase public funding for public higher education, thereby ensuring that college remains an option for those who are not well-to-do.


Eighth, beyond mere “job” creation, attend to the growing challenges of providing meaningful work — employment that is both rewarding and reasonably remunerative — for those without advanced STEM degrees.


Ninth, end the thumb-twiddling on climate change and start treating it as the first-order national security priority that it is.


Tenth, absent evident progress on the above, create a new party system, breaking the current duopoly in which Republicans and Democrats tacitly collaborate to dictate the policy agenda and restrict the range of policy options deemed permissible.


These are not particularly original proposals and I do not offer them as a panacea.  They may, however, represent preliminary steps toward devising some new paradigm to replace a post-Cold-War consensus that, in promoting transnational corporate greed, mistaking libertinism for liberty, and embracing militarized neo-imperialism as the essence of statecraft, has paved the way for the presidency of Donald Trump.


We can and must do better. But doing so will require that we come up with better and truer ideas to serve as a foundation for American politics.


Published on August 09, 2017 01:00

Why the NAACP said “enough” to school privatization

Back to School Things to Know

(Credit: AP Photo/Seth Perlman, File)


AlterNet


The reaction to the NAACP’s hard-hitting new report  on charter schools, calling for tighter regulation and an end to for-profit schools, was swift and furious. Charter advocates and school choice proponents painted the NAACP as out of touch, or worse, doing the bidding of the teachers unions. These critics are missing what’s most important about the civil rights group’s strong statement. School privatization has allowed state governments to avoid their obligation to educate children of color, especially the poor. The NAACP said “enough” this week.


First, some background. Last year, the NAACP passed a resolution calling for a moratorium on the expansion of charter schools until problems with accountability and the loss of funding from traditional public schools are addressed. The civil rights organization then formed an education task force that spent the year visiting cities, including New Haven, Memphis, New Orleans and Detroit. The report issued this week expands on the previous resolution and reflects the testimony of parents and practitioners. Among the task force’s recommendations: tighter regulations and oversight for existing charters, a ban on for-profit charters, and a reinvestment in traditional public schools.


In response, charter advocates were quick to take their case to the press, citing school performance data and polls on the popularity of charter schools among Black parents. Think pieces and commentaries have called out the NAACP as irrelevant, under the sway of the nation’s two teachers unions, and too willing to ignore the failings of public education. As school choice advocate Chris Stewart wrote in his response to the NAACP report: “If the NAACP is honest, it will apply the same demands to all public schools. If it cares about the education of Black kids, it will stay focused on the inequities in the system.”


I’m convinced that the NAACP is doing just that. The conversation about the failings of public education is worth having. However, too many folks are missing the larger point that the NAACP is making: state governments must be held accountable for the education of Black and Brown children. School privatization has allowed state governments to avoid their obligation to educate children of color, especially poor children.  The erosion of the public commitment to educating all kids is sadly ironic when you consider that it was Black people who pushed hardest to make free public education a reality for all.


Here is what the abandonment of the public responsibility for educating all kids looks like. State policymakers declare themselves fed up with overseeing underperforming public schools in poor Black and Brown neighborhoods. Rather than seeking solutions to improve these schools, some policymakers, at the behest of their constituents, tell families that the schools are so bad they can’t be improved. Families are then told that “experts” will be invited to improve the education of their children. This “strategy” takes the burden of educating poor children of color off of the state — spending that many consider a waste of tax money, given the continuous underperformance of city schools. It also paints education policymakers in an innovative light; they look as though they are thinking outside the box to attack a problem largely created by the state’s own negligence.


According to the NAACP task force, charter schools are exacerbating racial segregation, a topic that critics of the civil rights group have largely dismissed. The larger issue is that outsourcing the education of students of color abandons the spirit of integration. School choice and the proliferation of charter schools prevent people of color from holding state governments to the obligation required of them under Brown v. Board of Education. The NAACP report documents the consequences of this abandonment: inadequate funding of urban schools; a lack of accountability and oversight for charter schools, most of which are concentrated in urban communities; the disproportionate exclusionary discipline of Black students; high teacher turnover; and an absence of teachers of color in both charters and traditional public schools.


The “failure” of public education is not the fault of the poor children of color who attend underperforming schools, but the failure of government to get integration right. Had school integration been a true process involving genuine fellowship of Black and White students and a co-laboring of Black and White practitioners, our children wouldn’t be subject to the injustices they deal with on a daily basis. But instead, Black students were sent to White schools that had no desire to affirm their identity, while Black teachers and administrators were sent away by the thousands in the wake of Brown v. Board.


Some critics are painting the NAACP as being in opposition to the efforts of Black and Brown families to determine the education of their own children by choosing to send them to charter schools. What these folks fail to see is that school competition isn’t competition at all. It is simply the passing of the buck away from public responsibility. As a former charter school teacher, I believe that some charters can succeed where public schools haven’t. But let’s not fool ourselves. We can’t outsource our way to the physical and psychological liberation of Black and Brown students. Privatization and school choice let public officials walk away from the responsibility of educating all kids. Advocates will make the claim that charters and voucher programs offer poor students of color the same opportunities for access and success that students in wealthier communities enjoy. But no one is talking about the state abandoning its responsibility to affluent students.


With its powerful statement this week, the NAACP sent a message to elected officials and state and local governments — that they are accountable for educating children of color, and need to get back on the job. I am not sure why anyone would have a problem with that.


Published on August 09, 2017 00:59

How affordable housing can chip away at residential segregation


(Credit: Reuters/Mark Blinch)


With the health care debate stalling, Republicans are beginning to make more noise about tax reform. President Donald Trump has promised to make his bid to alter the code his next big battle, as has House Speaker Paul Ryan.


The low-income housing tax credit could land on the chopping block, though its long history of bipartisan support makes that less likely. Along with politicians from both sides of the aisle, developers and many banks and nonprofits embrace it because the tax credit makes creating new affordable housing units financially feasible and less risky. Still, the program, which is the only significant federal subsidy for building affordable housing, could be in jeopardy as lawmakers seek to close tax loopholes and lower tax rates.


As a tax law researcher who has studied where properties built with this tax credit are located, I see good reasons to preserve it. Above all, this program has the untapped potential to help solve the intractable problems of residential segregation by race, ethnicity and class.


Affordable housing


Each year, the federal government delivers approximately US$8 billion in low-income housing tax credits to housing developers that agree to set aside a certain number of units as rent-controlled affordable housing for qualified tenants. Since it began in 1986, the program has helped create at least 45,905 affordable housing projects with nearly three million units.


Some recent research suggests that the affordable housing properties built with the tax credits help to integrate and revitalize otherwise poverty-stricken neighborhoods.


Notwithstanding these encouraging findings, I still worry that the program may reinforce racial and economic segregation in some cities. For example, affordable housing advocates have voiced concern about the program’s harmful effects on neighborhood choice in Dallas and New York City. In my own research about low-income tenants in Philadelphia, I have noted that they have few options to live in low-income housing tax credit projects outside of high-poverty neighborhoods where most residents are people of color.


A new mandate


Here’s one possible explanation for the disagreement. Because builders face no geographic restrictions on where affordable housing may go in order to qualify for the tax credit, and there is no mandate that eligible projects help break up pockets of poverty, the program’s impact inevitably varies.


Instead of leaving outcomes to chance, some affordable housing advocates and I are suggesting a solution: Housing authorities — which determine which affordable housing projects will be awarded the tax credits — should approve only properties consistent with new, broader anti-poverty and anti-segregation objectives.


For more than a decade, researchers have noted that affordable housing properties boosted by this tax credit are disproportionately located in low-income neighborhoods. Perhaps for this reason, an important line of research has sought to understand the tax credit’s impact on the communities surrounding new affordable housing projects. Though findings have varied, several researchers have found positive spillover effects.


In a recent report, Rebecca Diamond and Tim McQuade from Stanford’s Graduate School of Business offered new empirical evidence to support the view that the tax-subsidized properties benefit surrounding areas. They found that the projects increased property values, lowered the crime rate and spurred economic and racial integration — as long as the buildings were located in low-income neighborhoods where more than half the population was black or Latino.


The study didn’t detect these benefits, however, for affordable housing located in more affluent, predominantly white areas. Does that mean builders should go out of their way to site projects in low-income neighborhoods? Not necessarily.


A Philadelphia case study


Establishing affordable housing in low-income neighborhoods may give surrounding areas a small boost. But doing so exclusively may severely restrict housing options available to low-income tenants, leaving many without opportunities to live in other kinds of places.


My own research on siting patterns has focused on Philadelphia, a city where a history of residential segregation still persists. I found that the number of low-income housing tax credit properties in a Philadelphia ZIP code is strongly correlated with the ZIP code’s poverty rate, meaning the percentage of residents living below the federal poverty line, a threshold that varies with family size. (Families of four earning less than $24,755, for example, fall below it.)
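The correlation described here can be made concrete with a small calculation. Below is a minimal sketch in Python, using entirely made-up ZIP code figures (not the study’s actual Philadelphia data), to show the kind of computation behind the claim: a Pearson correlation between each ZIP code’s count of tax credit properties and its poverty rate.

```python
# Illustrative sketch only: hypothetical ZIP code figures, not the study's data.
# Computes a Pearson correlation between the number of low-income housing tax
# credit properties in each ZIP code and that ZIP code's poverty rate.
from statistics import correlation  # requires Python 3.10+

# (property count, poverty rate) per hypothetical ZIP code
zip_data = {
    "zip_1": (42, 0.48),
    "zip_2": (35, 0.44),
    "zip_3": (8, 0.21),
    "zip_4": (4, 0.15),
    "zip_5": (30, 0.39),
}

counts = [count for count, _ in zip_data.values()]
poverty_rates = [rate for _, rate in zip_data.values()]

r = correlation(counts, poverty_rates)  # Pearson's r
print(f"Correlation between property count and poverty rate: r = {r:.2f}")
```

A value of r near 1 would indicate that ZIP codes with more tax credit properties also tend to have higher poverty rates, which is the pattern the author reports.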


My findings suggest that the projects have been — intentionally or not — clustered in low-income neighborhoods. In fact, 40 percent of Philadelphia’s 465 low-income housing tax credit properties built or rehabilitated since 1987 were located in just five low-income ZIP codes.


Since my study didn’t look at neighborhood change, I can’t say with certainty whether siting 184 low-income housing tax credit projects in those five ZIP codes has increased the racial and economic diversity of those neighborhoods over the past few decades. I don’t know whether the neighborhood homeowners benefited from having 30 or more low-income housing tax credit properties down the street.


What I can say is that the average poverty rate in those five ZIP codes was still 43 percent in 2015 (the city average is 26 percent), and 83 percent of the residents were nonwhite, compared with 58.3 percent of all Philadelphia residents. Meanwhile, most of these low-income housing tax credit properties are zoned for highly segregated public schools.



These are troubling statistics, given recent findings by New York University sociologist Patrick Sharkey that children who live in high-poverty, racially segregated neighborhoods are more likely to be even poorer than their parents when they grow up. The effect takes a toll not only on the children living there now but also on the generation that follows.


Mixed-income properties


Given the risks tied to living in overwhelmingly segregated neighborhoods, housing policies should encourage builders to construct affordable housing in more affluent areas.


Though the tax code calls for a larger tax credit for projects located in certain high-poverty census tracts, it lacks geographic restrictions or guidance on where they should go. In other words, the federal tax law is designed to increase the supply of affordable housing without saying where to put it.


Without siting mandates, the tax credit is relatively flexible and could, at least theoretically, help make poverty less concentrated. One possibility is to draw higher-income tenants to low-income neighborhoods through low-income housing tax credit-financed mixed-income housing.


The tax law allows for mixed-income projects, but Yale Law professor Robert Ellickson has noted that more than 80 percent of low-income housing tax credit properties are exclusively low-income. For the tax program to support a mixed-income strategy, developers would have to reserve fewer units for poor tenants. And that change might undermine the program’s primary goal.


Housing vouchers


For this reason and others, policymakers should instead look to the program’s potential to aid other housing programs. For instance, the $18 billion-per-year Housing Choice Voucher program is designed to give low-income renters choices about where they will live, including places where poverty is less concentrated than where they currently reside.


This program gives low-income tenants vouchers to help pay their rent. They agree to spend up to 30 percent of their income on rent, state housing authorities pick up the rest of the tab, and the federal government reimburses the states for that expense.
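To illustrate that split, here is a minimal sketch of the arithmetic, assuming a hypothetical tenant income and rent; the 30 percent share comes from the description above, while the dollar figures and the helper function are invented for illustration and omit details such as payment standards and income adjustments.

```python
# Minimal sketch of the voucher rent split described above (hypothetical numbers).
# Real voucher calculations also involve payment standards and adjusted income,
# which are omitted here for simplicity.
def voucher_split(monthly_income: float, monthly_rent: float) -> tuple[float, float]:
    """Return (tenant_share, subsidy): the tenant pays up to 30% of income."""
    tenant_share = min(0.30 * monthly_income, monthly_rent)
    subsidy = monthly_rent - tenant_share
    return tenant_share, subsidy

tenant_share, subsidy = voucher_split(monthly_income=1_500, monthly_rent=1_100)
print(f"Tenant pays ${tenant_share:.0f}; the voucher covers the remaining ${subsidy:.0f}")
# Tenant pays $450; the voucher covers the remaining $650
```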


Many landlords won’t accept vouchers, sometimes because they worry that low-income tenants won’t pay their rent. Even the landlords who take vouchers can get skittish over compliance and inspection requirements.


But landlords renting out affordable housing units built through the low-income housing tax credit program aren’t allowed to refuse to lease to tenants merely because they plan to use vouchers. Disproportionately siting projects in poor neighborhoods may limit the tax law’s capacity to make the most out of this federal program.


In contrast, encouraging builders to place affordable housing in more affluent neighborhoods with this tax credit may give low-income renters more housing location options. For parents facing economic hardship, the ability to move to an affluent neighborhood may make it more likely that their kids will grow up to be better off.


Michelle D. Layser, Research Fellow, Adjunct Professor of Law, Georgetown University


 •  0 comments  •  flag
Share on Twitter
Published on August 09, 2017 00:58

August 8, 2017

Who becomes a saint in the Catholic Church, and is that changing?

APTOPIX Vatican Pope Canonizations

A large screen for public display shows Pope John Paul II after sunset outside St. Peter's Square at the Vatican, Saturday, April 26, 2014. (Credit: AP Photo/Vadim Ghirda)


Pope Francis has created a new category for beatification, the level immediately below sainthood, in the Catholic Church: those who give their lives for others. This is called “oblatio vitae,” the “offer of life” for the well-being of another person.


Martyrs, a special category of saint, also offer up their lives, but they do so for their “Christian faith.” And so, the pope’s decision raises the question: Is the Catholic understanding of sainthood changing?


Who’s a “saint”?


Most people use the word “saint” to refer to someone who is exceptionally good or “holy.” In the Catholic Church, however, a “saint” has a more specific meaning: someone who has led a life of “heroic virtue.”


This definition includes the four “cardinal” virtues of prudence, temperance, fortitude and justice, as well as the three “theological” virtues of faith, hope and charity. A saint displays these qualities in a consistent and exceptional way.


When someone is proclaimed a saint by the pope — which can happen only after death — public devotion to the saint, called a “cultus,” is authorized for Catholics throughout the world.


Canonization


The process for being named a saint in the Catholic Church is called “canonization,” the word “canon” meaning an authoritative list. Persons who are named “saints” are listed in the “canon” as saints and given a special day, called a “feast,” in the Catholic calendar.


Before approximately the year 1000, saints were named by the local bishop. For example, St. Peter the Apostle and St. Patrick of Ireland were considered “saints” long before any formal procedures had been established. But as the papacy increased its power, it claimed the exclusive authority to name a saint.


The investigation


Today there are four stages in canonization.


Any Catholic or group of Catholics can request that the bishop open a case. They will need to name a formal intermediary, called the “postulator,” who will promote the cause of the saint. At this point, the candidate is called “a servant of God.”


A formal investigation examines the servant of God’s life. Those who knew the candidate are interviewed, and affidavits for and against the candidate are reviewed. The candidate’s writings, if any exist, are also examined for consistency with Catholic doctrine. A “promoter of justice” named by the local bishop ensures that proper procedures are followed, and a notary certifies the documentation.


The proceedings of the investigation, called “Acta” or “The Acts,” are forwarded to the Congregation for the Causes of the Saints in Rome. The Congregation for the Causes of the Saints is large, with a prefect, a secretary, undersecretary and a staff of 23 people. There are also over 30 cardinals and bishops associated with the congregation’s work at various stages.


The Congregation for the Causes of the Saints appoints a “relator” (one of five who currently work for the congregation) who supervises the postulator in writing a position paper called a “positio.” The positio argues for the virtues of the servant of God and can be thousands of pages long. The congregation examines the positio and members vote “yes” or “no” on the cause. “Yes” votes must be unanimous.


The final decision lies with the pope. When he signs a “Decree of Heroic Virtue,” the person becomes “venerable.” Then two stages remain: beatification and sainthood.


Throughout most of Catholic history, the canonization process was rigorous. One of the key figures in the Vatican’s investigation was the “devil’s advocate,” who functioned like an opposing attorney by challenging the candidate’s holiness. This is the origin of the familiar English phrase for someone who argues a contrary position in order to force another person to prove a point more fully.


Relatively few people have received the title of “saint,” though the Catholic Church venerates more than 10,000 of them. Even Thomas à Kempis, the famous 15th-century German spiritual writer, didn’t make it through the process. His body was exhumed and examined during his cause for sainthood, and there are stories that scratch marks were found on the inside of his coffin and splinters of wood under his fingernails. These discoveries suggested that he had tried to escape after being buried alive. The concern was that Thomas à Kempis had not peacefully accepted death as a saint should. His case did not move forward.


Changes to the process


In the early 1970s, Pope Paul VI revised the canon of the saints to exclude those whose historical existence could not be verified. For example, St. Christopher, the protector of travelers, was removed, although many Catholics still have a St. Christopher medal in their automobiles.


In 1983, John Paul II, who would become a saint himself, changed the waiting period from 50 to five years after the candidate’s death. He also reduced the role of the “devil’s advocate.”


These changes led to criticism that the Vatican had become “a saints’ factory.” This quicker process, however, has not reduced the six-figure cost that supporters of a cause must bear to fund an investigation and hire a postulator.


Types of saints


While the title “saint” is used for all those who are canonized, there are different categories of saints, such as “martyr” and “confessor.”


A “martyr” has been killed for his or her Christian beliefs; a “confessor” has been tortured or persecuted for his or her faith, but not killed. If a saint had been a bishop, a widow or a virgin, that becomes part of their title as well.


For example, St. Blaise is both a bishop and a martyr. Katherine Drexel of Philadelphia has the title “St. Katherine Drexel, Virgin.” She was the second American-born saint and the founder of Xavier University of Louisiana, the only American Catholic university established primarily for African-Americans.


At this point, it is unclear whether a special title is associated with the new category of saint declared by Pope Francis.


Miracles and martyrs


Miracles are an important part of canonization.


A miracle is an event that cannot be explained by reason or natural causes. To be named “blessed,” the candidate must be shown to have had one miracle take place through his or her influence. The process begins with a person praying to the prospective saint, who “intercedes” with God, usually to cure an illness. The potential miracle is then investigated by a medical board of nine members, who are sworn to secrecy. They can be paid for their work only through bank transfer, a rule designed to prevent under-the-table payments that could corrupt the process.


After the occurrence of a second miracle is established, the candidate’s title will change from “blessed” to “saint.” With St. John Paul II, this happened in the record time of nine years. First, there was a French nun who was cured of Parkinson’s disease. Then there was the healing of a Costa Rican woman from a brain aneurysm.


Martyrs have a different path to sainthood. They become “blessed” when the pope makes a “Decree of Martyrdom.” After a single miracle, martyrs are “raised to the glory of the Altars,” a phrase that refers to the public ceremony in which a person is formally named a saint.


A new kind of saint?


Given this complex history of Catholic sainthood, it’s fair to ask whether Pope Francis is doing anything new.


The pope’s declaration makes it clear that someone who gives his life for others should demonstrate virtue “at least as ordinarily possible” throughout life. This means that someone can become “blessed” not just by living a life of heroic virtue, but also by performing a single heroic act of sacrifice.


Such heroism might include dying while trying to save someone who is drowning or losing one’s life attempting to rescue a family from a burning building. A single miracle, after death, is still necessary for beatification. Now saints can be persons who lead a fairly ordinary life until an extraordinary moment of supreme self-sacrifice.


From my perspective as a Catholic scholar of religion, this is an expansion of the Catholic understanding of sainthood, and yet another step toward Pope Francis making the papacy and the Catholic Church more relevant to the experiences of ordinary Catholics.


Mathew Schmalz, Associate Professor of Religion, College of the Holy Cross


 •  0 comments  •  flag
Share on Twitter
Published on August 08, 2017 16:10

The ugly, pseudoscientific history behind that sexist Google manifesto

Housewife in the kitchen

(Credit: Getty/cyano66)


If you haven’t read the full text of the leaked memo that now-fired Google software engineer James Damore sent around to his co-workers, here’s the CliffsNotes version: A pervasive “left” bias at Google has “created a politically correct monoculture that maintains its hold by shaming dissenters into silence,” Damore claims. He states his belief that the reason the company doesn’t have “50% representation of women in tech and leadership” may be because of “biological differences.”


“The distribution of preferences and abilities of men and women differ in part due to biological causes and that these differences may explain why we don’t see equal representation of women in tech and leadership,” Damore writes. “We need to stop assuming that gender gaps imply sexism.”


Damore continues by suggesting that the reason that there are few women in “top leadership positions” may be because of biological reasons, namely, “men’s higher drive for status.” His recommendation is that they accept these biological differences and assign men and women to different roles: “Women on average look for more work-life balance while men have a higher drive for status on average,” Damore writes.


The reason these “facts” of his have been ignored, he writes, is that “We all have biases and use motivated reasoning to dismiss ideas that run counter to our internal values.” He suggests that “the Left tends to deny science concerning biological differences between people (e.g., IQ and sex differences).”


You might be keen to ask: are Damore’s claims entirely false? Damore is good with rhetoric — to the layperson, or to anyone who doesn’t follow cultural politics or scientific debates, his ideas unfold quite rationally.


Way back in 1984, three scientists — biologist and zoologist R.C. Lewontin, biologist Steven Rose, and psychologist Leon J. Kamin — published a book called “Not In Our Genes” that debunked the myth that biological sex determined interests and behavior, a belief sometimes called “biological determinism.” There is a long history of screeds akin to Damore’s being penned and taken seriously; in fact, so common are these types of manifestos that Lewontin, Rose, and Kamin described their common rhetorical through-line:


The biological determinist argument follows a by now familiar structure: It begins with the citation of “evidence,” the “facts” of differences between men and women … These “facts,” which are taken as unquestioned, are seen as depending on prior psychological tendencies which in turn are accounted for by underlying biological differences between males and females at the level of brain structure or hormones. Biological determinism then shows that male-female differences in behavior among humans are paralleled by those found in nonhuman societies — among primates or rodents or birds . . .  giving them an apparent universality that cannot be gainsaid by merely wishing things were different or fairer. . . And finally, the determinist argument endeavors to weld all currently observed differences together on the basis of the now familiar and Panglossian sociobiological arguments: that sexual divisions have emerged adaptively by natural selection, as a result of the different biological roles in reproduction of the two sexes . . . the inequalities are not merely inevitable but functional too.



Sound familiar? Damore’s arguments are nothing new. And because they follow such a well-trodden pattern, these three biologists were already debunking his argument — in 1984. Let’s go through and pick apart Damore’s fallacies, shall we?


Men are better at certain fields like engineering


Let’s start with the idea that men or women are better disposed for STEM. Lewontin, Kamin and Rose note that an “exhaustive review of the literature on sex differences and performance” by Hugh Fairweather revealed that, “despite persistent claims to the contrary,” there are


“no substantial sex differences in verbal sub-tests of IQ tests; in reading; in para-reading skills; and in early linguistic output; in articulatory competence; in vocabulary; and in laboratory studies of the handling of verbal concepts and processing of verbal materials.”



Yet even if there were substantial sex differences in the test results, the authors note that this wouldn’t necessarily prove anything: children are socialized from such a young age that gendered expectations are imprinted on them very early on. “All cultures must generate expectations of behavior among parents and hence ensure that certain types of behavior are going to be consciously or unconsciously reinforced or discouraged from the beginning,” they write. In other words, social imprinting starts young.


Hormones make us who we are


Another of Damore’s persistent claims is that the “biological differences” between men and women “have clear biological causes and links to prenatal testosterone.” This myth, too, has been debunked.


“Insofar as sex differences are determined by hormones, they are not a consequence of the activities of uniquely male or female hormones, but rather probably of fluctuating differences in the ratios of these hormones and their interactions with target organs,” Lewontin, Rose and Kamin write. In other words: there isn’t hard science that shows that “testosterone = drive for leadership.”


Career choices prove biological difference


And finally, we come to Damore’s claim that men and women’s biological differences are “universal across human cultures.” This can be easily disproven by noting how differently professions are gendered around the world — sure evidence that our interests and the careers we are steered toward stem more from how we are socialized than from biology. Doctors in the United States are more likely to be male than female by a 2:1 ratio, but in the Soviet Union in the 1980s the situation was the opposite. That female-majority pattern persists in former Soviet republics like Estonia, where about 70% of doctors are women.


In other fields, previously theorized “inherent” gender differences have been overcome. The number of women earning biology PhDs has steadily risen to the point that today, more than half of all biology PhDs in the United States are awarded to women. These are but a few examples.


The dark history of biological determinism


Many of the ideas that Damore presents as self-evident “facts” — that men have a “higher drive for status,” or that “women on average are more cooperative”  — seem remarkably convenient to the narrative of his own success in life. His thesis is quite self-assured: not only does he deny the need for affirmative action or the existence of any systemic bias, but he simultaneously asserts his own position and status in society as emergent from no systemic privilege whatsoever. In effect, he asserts his own victimhood. (He even claims systemic bias against “conservatives.”)


As far as deterministic claims go, Damore’s are redundant — he could have just copy-pasted the text of one of thousands of these written in the early 20th century — and also milder than many. In the past few centuries, the same line of argument has been used to argue for the racial superiority of whites and the inferiority of women, and to justify transphobia. (Indeed, Damore does present sex as binary and rigid — there’s no discussion that the mere idea of two sexes may be a myth, a fact that is supported by science.)


Damore doesn’t go as far as to advocate for eugenics, to his credit; he argues that men and women have differences that need to be respected, and he stops short of saying men are “superior” — even if, in his estimation, men are predisposed to be traditionally successful by our current social metric. Still, biologically deterministic arguments like his can easily slip into the eugenicist doctrines of yore.


 •  0 comments  •  flag
Share on Twitter
Published on August 08, 2017 16:00

TCA Report: The Fall TV outlook for the broadcast networks isn’t pretty

Young Sheldon; The Gifted; The Mayor

Young Sheldon; The Gifted; The Mayor (Credit: CBS/FOX/ABC)


Officially, the broadcast fall TV season begins the week after the Emmys telecast, which, for the reader’s reference, airs Sunday, September 17 on CBS. I have to make that qualification — “officially” — because a number of series for which viewers harbor affection and enthusiasm premiere before that date. But those series run on cable or streaming services, among them “BoJack Horseman” (debuting Friday, Sept. 8 on Netflix), “Outlander” (Sunday, Sept. 10 on Starz) and “Broad City” (Wednesday, Sept. 13 on Comedy Central).


In truth, programming television schedules has been a year-round endeavor for more than a decade, though efforts to eliminate the concept of seasons have been going on for even longer than that. Regardless, the surge of series releases across various channels and services (but mainly on Netflix, let’s call it like we see it) makes the concept of a season a bit nebulous. Americans will continue to follow the rhythms of fall television regardless of the industry’s evolution; it coincides with the beginning of the school year, for one thing, and besides, it’s a convenient way to keep track of when established series return.


And for ABC, NBC, CBS, Fox and The CW, the series worth anticipating aren’t the new crop but the established winners. ABC has the final season of “Scandal” coming up. CBS and its slew of procedurals keep middle America happy, as do its reliable comedy veterans, headed, of course, by “The Big Bang Theory.”


Fox keeps chugging along with “Empire.” The CW has critically acclaimed awards bait in “Crazy Ex-Girlfriend” and “Jane the Virgin” while continuing to please fans of the DC comic book universe in the face of a tsunami of Marvel-related titles across television, including on ABC, Fox and Netflix. NBC maintains its path of resurgence via Dick Wolf’s “Chicago”-branded procedurals and the multiple Emmy-nominated “This Is Us.”


In other words, there are plenty of sitcoms and dramas that are known quantities and that somehow survived a prior season in which it proved difficult for any new contender to cut through the static created by a loud and vicious election campaign cycle and everything that followed in its wake.


At least the new fall shows won’t have to struggle against that soul-sucking energy. Oh, we still feel like the world’s events are slowly draining our life force out of our pores. We’ve simply gotten used to it.


A greater problem facing series debuting in the 2017-2018 season is that they’re not particularly good. “Spectacularly mediocre” is the most apt way to describe this new fall slate, host to a number of shows that are long on flash and short on feeling.


To wit, both ABC and Fox are rolling out new Marvel projects — “Inhumans” on ABC, “The Gifted” on Fox — and ABC’s entry, at least, is short on quality and substance. “Inhumans” is getting an early debut in IMAX theaters on Sept. 1, which probably isn’t the wisest move given how ridiculously amateurish it looks in its current state.


Marvel’s head of television Jeph Loeb is well aware that the pilot resembles a flaming bin of trash; when a reporter politely asked him how he felt about it, he testily replied, “I can tell you that it was written on the material that you were given that the show that you have seen is not the finished product. If you’re asking me whether or not it was done, it’s not. So to be perfectly honest, I don’t understand your question.”


NBC’s grand move consists of bringing back “Will & Grace” and underscoring its belief that audiences will love, love, love these new episodes of a show that’s been dead for a decade by greenlighting a second season before the first premieres. The comedy’s debut . . . return? . . . comes before ABC, the new home of Ryan Seacrest, raises “American Idol” from its gilded grave.


Fox is countering “Idol” with its own singing competition, tentatively titled “The Four,” which may or may not resemble “The Voice” in some respects. With music stars launching and building their careers on YouTube these days, a person may rightly wonder what value these talent competitions have in today’s marketplace anyway.


Network television has long been an easy target, and slapping the networks around doesn’t provide much of an upper-body workout relative to the sundry complex problems plaguing the industry at large. Instead, here are five new shows that could actually be entertaining, though honestly, there are no clear guarantees of this.


As stated in our previous story on cable’s better fall slate, in-depth reviews of many if not all of these titles will be published closer to their premieres.


“Young Sheldon.” Premieres 8:30 p.m. Monday, September 25 on CBS. Jim Parsons narrates this single-camera nostalgia fest that leads us through the life and times of his younger self, played by the precocious Iain Armitage. The series begins with nine-year-old Sheldon entering high school in his small Texas town, and its charming portrait of the familiar character’s youth, as well as his relationship with his mother (Zoe Perry), gives the show a warmth that’s reminiscent of “The Wonder Years.”


“Young Sheldon” also is the first single-camera comedy Chuck Lorre has produced. “It’s more intimate. The pacing, obviously, is very different,” he said at a press conference for the show. “The actors aren’t having to hold for laughs. They’re not playing to the proscenium. They’re not playing out. They’re working with one another. You know, a four camera show is played like a theatrical presentation. They’re playing to the audience, and it changes the tone and the pitch and the pacing.”


“Law & Order True Crime: The Menendez Brothers.” Premieres 10 p.m. Tuesday, September 26 on NBC. Procedural king Dick Wolf gets in on the true crime craze by dramatizing the celebrity murder case that pre-dated the O.J. Simpson trial, involving a pair of rich kids accused of murdering their parents. The headliner here is “Sopranos” star Edie Falco as defense attorney Leslie Abramson.


And Wolf unabashedly admits his show has an agenda.  “It’s absolutely horrible, but when you see the information, I think people are going to realize, well, yeah, they did it, but it wasn’t first degree murder, with no possibility of parole,” he told critics. “They probably should have been out eight or 10 years ago, because they should have been convicted of first degree manslaughter, which is a different punishment than first degree murder.”


“The Gifted.”  Premieres 9 p.m. Monday, October 2 on Fox.  You may have noticed that the above spate of abuse was reserved for “Inhumans” and not Fox’s series, which is set within the “X-Men” universe. That’s because it’s not terrible. The pilot’s not particularly great either, but it has a decent cast in Stephen Moyer (“True Blood”), Jamie Chung (“Gotham”) and Amy Acker (“Angel,” “Person of Interest”) and a talented showrunner in Matt Nix, the brains behind the USA action hit “Burn Notice.”


Bryan Singer directed the pilot, and while that doesn’t save it from its stumbles, at least it looks pretty good. And remember, plenty of decent series have launched with middling pilots; this one could still recover from its shaky start.


“The Mayor.” Premieres 9:30 p.m. Tuesday, October 3 on ABC. This is one of the series touted as part of the Trump voter’s influence on fall TV, but it flouts the preconceptions of what such a series might look like. The premise has young rapper Courtney Rose (newcomer Brandon Micheal Hall) running for mayor of his small California hometown as a publicity stunt and actually winning, much to his horror.


In contrast to what’s unspooling on the national stage in the real world, Courtney’s family is actually an asset to his political career: his mother Dina Rose (Yvette Nicole Brown) keeps him on the right track, and his competitor’s campaign manager (Lea Michele) steps into his camp to help him succeed.


Executive producer Jeremy Bronson spent seven and a half years as a producer for “Hardball with Chris Matthews.” Based on his experience with national politics, “I knew that I wanted to do some show that tackled a community, people coming together in a nonpartisan way to figure out if we all fundamentally care about improving the lot of our people, what can we do?” he said to critics.


“Now, obviously, given sort of the politics of the past year, it’s helped everybody, I think. I should say everybody is a lot more focused on what they can do, what we could all do, to sort of improve the country, improve our situations. So it’s given us a lot of inspiration for the show.”


“Dynasty.” Premieres 9 p.m. Wednesday, October 11 on The CW. Speaking of reboots and era-appropriate programming, “Gossip Girl” executive producers Josh Schwartz and Stephanie Savage have rekindled one of the most opulent and tacky ‘80s primetime soaps with a slew of young actors in re-imagined versions of familiar roles, and with “Melrose Place” star Grant Show cast in the role of Blake Carrington. The general consensus is that this new “Dynasty” might qualify as so-awful-it’s-brilliant, although the pilot itself wasn’t warmly received by most. That said, one thing many viewers will be amazed by is how kind the years have been to Show (to quote a very prominent critic in the room, “Grant Show can still get it”), which may provide enough of a reason to at least sample this bad candy. And maybe that’s the best that a network can expect in this binge-crazed world of ours.


 •  0 comments  •  flag
Share on Twitter
Published on August 08, 2017 15:59