Oxford University Press's Blog, page 136

August 17, 2020

Human rights must be the foundation of any COVID-19 response

The escalating Coronavirus Disease (COVID-19) pandemic has challenged global health as never before. Within months, the disease swept across every country, exposing the fragility of our globalized world. In a crisis unlike anything seen since the influenza pandemic of 1918, health systems have faltered under the strain, with cascading disruptions as borders closed, businesses shuttered, and daily life unraveled. As governments and international institutions seek to find a way out of the darkness, human rights provide a light to guide national responses and global solidarity in facing this unprecedented threat.

Human rights are central to safeguarding global health, offering global standards that articulate government obligations under international law. Grounded in international law as a basis for public health, human rights have evolved to provide both a normative justification and a political catalyst for advancing global health.

The right to health, codified under the 1966 International Covenant on Economic, Social and Cultural Rights, requires national governments to take steps for the “prevention, treatment and control of epidemic, endemic, occupational and other diseases” and create conditions to assure “medical service and medical attention in the event of sickness.” Seeking a state of complete physical, mental, and social wellbeing, the right to health extends beyond medical care, with a wide range of health-related human rights necessary to uphold determinants of public health. Complementing these substantive obligations, human rights frame fair processes in public health practice, with the human rights-based approach to health requiring that policymakers pursue equality, support participatory engagement, and facilitate accountability for rights violations.

These health-related human rights obligations provide a foundation for public health prevention, healthcare services, and global health solidarity in the COVID-19 response.

Implementing human rights in the pandemic response, states must take immediate and progressive steps to prevent the spread of COVID-19. As in the early years of the AIDS response, it is marginalized populations that are most vulnerable to infection. Yet protecting these populations will require restrictions on individual liberties to mitigate public health threats—as seen through mandatory lockdowns, mask requirements, and contact tracing. Where it is necessary to limit individual freedoms to address this public health emergency, governments must ensure that any human rights limitations are reasonable, proportionate, non-discriminatory, and grounded in law.

Beyond rights limitations, it is also crucial to consider underlying determinants of health amidst disease prevention efforts. Highlighting the interdependence of all human rights, physical distancing measures have posed inequitable challenges for a range of human rights that underlie public health, including rights to food and nutrition, safe housing, gender equality, and water, sanitation, and hygiene. The interconnected nature of health-related human rights requires a comprehensive response to secure the livelihoods of impoverished populations, adapt working conditions to lessen the risk of infection, and alleviate the impact of the pandemic on basic health needs.

For those who are infected, medical treatment will have fundamental implications for the realization of the right to health, raising an imperative to ensure appropriate COVID-related health care. With budget cuts in many countries eroding national capacity to provide essential care, COVID-19 patients have overwhelmed health centers. It is crucial that governments devote the maximum available resources to the progressive realization of the right to health, developing a coordinated healthcare response for treatment and recovery and mobilizing resources to secure necessary health care for patients (including ventilators and oxygen) and personal protective equipment for healthcare workers and other frontline staff.

Policymakers must be held accountable for ensuring universal health coverage in the COVID-19 response. As governments falter in meeting their international obligations, accountability can be facilitated through human rights advocacy to defend marginalized populations, litigation of rights violations in national courts, and monitoring and review of state actions by the human rights system. These national and international mechanisms can ensure accountability for government efforts to realize human rights throughout a prolonged pandemic response.

However, no nation can do this alone. COVID-19 is a global public health crisis that calls for global solidarity, yet some governments have responded with nationalist approaches that ignore scientific evidence, repress vulnerable populations, and neglect coordinated action in facing this common threat. Despite repeated pleas for global solidarity, the rise of populist nationalism has curtailed international assistance to countries in need, weakened global health governance through the World Health Organization, and threatened the health and human rights of the most marginalized in the world.

The global health and human rights challenge of the COVID-19 pandemic requires a dramatic shift toward global solidarity in coordinating the pandemic response. Human rights law has long recognized that international assistance and cooperation is an obligation of all states in a position to assist. International support for the World Health Organization remains essential, as global governance will remain central to advancing human rights in global health. Given that this pandemic threat will only truly end with the development of an effective vaccine, rights-based governance will continue to be crucial in progressively realizing universal access to this prospective scientific breakthrough, bringing the world together through human rights to assure the highest attainable standard of health for all.

Featured Image Credit: by caniceus via Pixabay

The post Human rights must be the foundation of any COVID-19 response appeared first on OUPblog.

Published on August 17, 2020 02:30

August 16, 2020

Winston Churchill and the media in the 1945 British general election

Seventy-five years ago this week, the House of Commons in Britain began debating the legislative programme of Clement Attlee’s Labour government, elected by a landslide at the end of the previous month. John Freeman, one of the fresh intake of socialist MPs, declared boldly: “Today, we go into action. Today may rightly be regarded as ‘D-Day’ in the Battle of the New Britain.” The roots of Labour’s victory can be traced back to the point where the party joined Winston Churchill’s Coalition government in May 1940. However, Churchill and the Conservatives made a series of campaigning missteps and failed to control the media narrative – despite the considerable press firepower that was available to them.

In the aftermath of VE Day in May 1945, Labour and the Liberals withdrew from the coalition; Churchill formed a new caretaker government which was to run the country until the upcoming general election. On 4 June, he launched the campaign with a broadcast in which he made the notorious claim that, if elected, a Labour government would have “to fall back on some form of Gestapo, no doubt very humanely directed in the first instance.” This, like all the other election broadcasts that year, was recorded and retransmitted at four different times the following day via short-wave, in order that overseas servicemen would have the same opportunities to listen as voters at home. All speakers were asked to provide the BBC with advance copies of their scripts, but the BBC checked the recordings carefully too. “The value of this precaution was proved when an enquiry was received from No. 10 Downing Street as to what had actually been said in a very contentious passage in the prime minister’s speech of 30th June.” This was probably a reference to Churchill’s claim that socialist ministers might be obliged to disclose state secrets to the Labour Party’s National Executive Committee, which he portrayed as a sinister and undemocratic body.

Churchill’s decision to focus on the rather obscure issue of Labour’s constitution may seem surprising. But he had been handed a weapon by Professor Harold Laski, a well-known socialist intellectual who happened to hold the rotating chairmanship of the party’s National Executive Committee in this particular year. Out of courtesy, Churchill had extended an invitation to Attlee, as Labour leader, to attend the Potsdam meeting with Joseph Stalin and President Harry Truman, which would begin after polling day but before the election results were announced. Laski put out a press statement which said that the Labour Party could not be committed to any decisions taken at the conference, as it would be discussing matters that the National Executive Committee and the Parliamentary Labour Party had not yet debated. Laski was much less important than he thought he was or than the Tory press professed to believe; but his action gave Churchill the chance to paint him as Attlee’s puppet-master.  The battle was fought out to a great extent in letters between Attlee and Churchill published in the press. On one occasion Churchill released his text shortly before 11.30 pm, apparently to ensure that his opponent’s reply could not be published side-by-side the next day.

Churchill’s friend and ministerial colleague Lord Beaverbrook, a powerful press baron, lent his papers’ full-throated support. Beaverbrook was a major figure in the campaign, which he thought should concentrate on exploiting Churchill’s personal popularity with the public.  Journalist Albert Hird noted in his diary: “We at the D.E. [Daily Express] are apparently to take the same (or even a more prominent) part in this election as the Mail in old Rothermere’s time did after the last war, for we are taking on extra reporters to cope with the work – a quite unnecessary thing if we were merely playing the part of an ordinary newspaper.” Moreover: “The Beaver is having a great time. He is stumping up and down the country and most of the opposition credits him with being the real leader of the Tory party and the evil (that’s what they call it) genius behind Churchill. All of which he finds entirely to his liking.”

The support of the Daily Mail was somewhat less ardent than that of the Express, but Churchill could also depend on the backing of a range of other Conservative newspapers. However, in contrast to previous contests, Labour now had truly substantial press support too. Pro-Labour London-based papers between them had six million readers, and Tory ones 6.8 million; The Times, for its part, “was peculiarly detached about the election.” Churchill, who obtained much coverage for his apparently triumphal national election tour, himself published one newspaper article during the final days of the campaign. This was in the News of the World, and he sought to explode the popular notion that people could vote Labour and nonetheless somehow retain him as prime minister. Churchill’s newsreel election pitch – which may have been seen by as many as 20 million people – revealed, if nothing else, his sheer physical exhaustion. Attlee’s much snappier effort was no rhetorical masterpiece, but it demonstrated the no-nonsense competence of a man who had long been overshadowed by rivals within his party.

As Conservative opinion poll ratings actually improved during the course of the election battle, one should not be too harsh about the party’s strategy – except to say that by the time the campaign started the Tories had already lost the initiative to Labour on the key issue of post-war reconstruction. There were many factors that contributed to the election result, but the most important was Churchill’s inability to find a compelling message to counter the themes of modernity and hope put forward by an ideologically astute and media-canny Labour Party.

Featured image: public domain via Unsplash.

The post Winston Churchill and the media in the 1945 British general election appeared first on OUPblog.

Published on August 16, 2020 02:30

August 15, 2020

Coping with COVID deaths and what cinema tells us

It has come to this. We have reached an arbitrary new landmark in COVID-19 deaths in the United States. With the virus inexorably oncoming, some respected epidemiologists are spooked by the specter of more waves and say we may reach 1 million. Such numbers would not make this pandemic unique. These large numbers, like any large numbers in catastrophes, are depersonalized, except for the health care workers who see them. Some commentators are good at minimizing these enormities, and we build defenses by saying it is only the vulnerable. Then the statistic changes, however, and nobody is spared.

Before COVID-19, when death was imminent, families could be at the bedside and they could stay overnight. The state of palliation in the intensive care unit was unrestrained and accommodating. Families would even sometimes negotiate to keep their loved ones alive so grandchildren could visit from afar or “don’t let him die on his birthday.” We understood the sentiment and complied. It was horrible enough and if we could adapt, we did not hesitate.

Negotiations to be with a loved one during the COVID-19 surge did not cease despite quite sensible hospital restrictions (although these lack evidence when protective garments are properly used). The situation was further aggravated by families having to talk to masked healthcare workers who could hardly be identified (mask use itself being unquestionably evidence-based). Most relatives were not allowed to be physically present. This was further hampered by travel restrictions. I have seen it first-hand and felt awful about it. For a family not to be present means no time to say goodbye. Families have literally begged me for an exception to the rule, and not allowing family to be present went against all I stand for; grief and mourning gone awry. These bereaved already have a name: the “COVID-19 mourners.” Although the dying person is often unaware of what is going on around them, for loved ones, the experience of being nearby is very helpful in the grieving process. How will families cope with this experience? With death and dying in intensive care units, there should be bereavement support for families. Bereavement-associated post-traumatic stress disorder is frequent in families of patients who die in the intensive care unit and lasts many months. Can we console them later?

It brought my thoughts back to cinema. With all this depersonification, cinema found a way to personify death. In addition to violent death from viruses, aliens, climate change, and nuclear Armageddon, studios made serious movies about disease and dying. Early European filmmakers wondered if we could ask death a favor. Victor Sjöström and Ingmar Bergman asked if negotiation with the Grim Reaper was an option. In The Phantom Carriage and The Seventh Seal, we see the personification of death to great effect, and invariably it is a cloaked and hooded elderly male. The dying sister in The Phantom Carriage begs for one more opportunity to see if the drunkard she cared for has cleaned up his act. The crusader Block in The Seventh Seal challenges Death to a chess game, and Death, a better player, agrees. To postpone the inevitable, Block proposes a deal: “If I checkmate you, I go free.” Postponement through negotiation was a major theme. Equally sadistic was Fritz Lang’s film Destiny, which was reportedly based on a nightmare. A woman in love can only get her dying lover back if she saves a life or finds another soul to replace her lover in death, even going so far as to offer her child, but she fails. Cinema has dealt with grief, and lack of resolution is common in recent films (Don’t Look Now, Three Colors: Blue, Manchester by the Sea, Antichrist). Death in the movies is agonizing, and many explore common themes of despair, the randomness of fate, and everything associated with trying to come to grips with it. In cinema, death from illness is unavoidable and non-negotiable. Our current reality is not much better.

Featured image credit: Pixabay

The post Coping with COVID deaths and what cinema tells us appeared first on OUPblog.

Published on August 15, 2020 02:30

August 13, 2020

Cyntoia Brown and the legacy of racism for children in the legal system

In 2004, 16-year-old Cyntoia Brown shot and killed a man who paid her for sex – a position she was forced into by an older man who took advantage of her. Brown never denied shooting the man (in fact, she was the one who called the police the next day), but she claimed it was an act of self-defense because she believed the man was grabbing a gun to shoot her. Brown was charged and convicted of aggravated robbery and first-degree murder and sentenced to life in prison. In 2017 her case gained national attention when celebrities and others petitioned for her clemency. In 2019 the Governor of Tennessee granted clemency and Brown was released from prison.

Brown’s case, like those of other Black girls, ultimately reflects the legacy of racism in the United States—a long legacy that can shape the life trajectory of racial minority youth, leading them at a vastly disproportionate rate toward involvement in the legal system.

A history of race-based oppression, intergenerational trauma, and resulting poverty puts racial minorities at risk for becoming victims. Child sex traffickers prey on already-vulnerable child victims, convincing them that they have no other options in life than to sell their bodies and that no one else will want them. Brown’s childhood was difficult. She was in and out of the juvenile justice system and eventually ran away and began living on the streets. There she began a relationship with an older man who abused, raped, and sold her as a prostitute. She was manipulated and disparaged by this man, who told her “some people were born whores, and that [she] was one… and nobody’d want [her] but him.” Fearing the beating and rape she expected if she returned to her abuser without money, Brown stole from the man she shot in self-defense.

Once racial minority juveniles come into contact with the legal system, racism can exacerbate negative outcomes at many levels. For example, police officers, like laypeople, hold stereotypes that can influence how they perceive and respond to juveniles of color, including the way officers interrogate those suspected of committing crimes. Racial minorities suffer from the added stress and anxiety associated with stereotype threat (i.e., concerns about being stereotyped as criminal because of their race), which may impair their ability to understand their Miranda Rights and pressure them to waive their rights—exactly what Brown did. She did not understand the value of having an attorney during her initial interrogations.

Racism’s legacy has also been fear and suspicion—the fear that white Americans have of African Americans in particular, even youngsters. Compared to white adolescents, racial minorities are often perceived as older and more mature than they actually are, and they are in turn blamed more for their actions. Psychological research has established that cognitive and psychosocial capacities are not fully developed until adulthood, leaving youth of all races more likely to engage in impulsive, risky decision making without considering the long-term consequences of their actions. But whereas white juveniles’ actions may be attributed to these developmental limitations, minority youth do not receive the same benefit of the doubt. Brown was not only Black and mature looking, but she also suffered from vulnerabilities that further impaired her cognitive functioning: Alcohol-Related Neurodevelopmental Disorder (a form of Fetal Alcohol Spectrum Disorder), which impaired and delayed the development of various cognitive abilities, including her ability to control her behavior and to understand the consequences of her actions. Cognitively, she was younger than her actual age, and far younger than she appeared.

Not only are racial minority youth perceived as older and more culpable than white youth, but minority girls are stereotyped as promiscuous and deviant. As a result, girls of color who are subjected to sexual exploitation are often blamed for their own victimization and labeled as offenders, not victims. Moreover, when they are charged with crimes, they are also more likely to be tried in adult criminal court rather than a rehabilitation-oriented juvenile court. Brown’s case was no different—rather than consider her a child victim of sexual exploitation, Brown was labeled a teen prostitute and tried as an adult.

Cyntoia Brown’s case illustrates the multifaceted ways in which racism influences minority youth’s life experiences as well as their interactions with the legal system. Although much progress has been made to end racism in the United States, there is still much that needs to be done to ensure that all children, regardless of race or ethnicity, have an equal opportunity to live and thrive in society. Meanwhile, social science has much to offer in exposing the legacies of racism in the legal context, and developing ways to combat it.

Featured Image Credit: Image by Fifaliana-joy via Pixabay

The post Cyntoia Brown and the legacy of racism for children in the legal system appeared first on OUPblog.

Published on August 13, 2020 02:30

August 12, 2020

Whatever happens, the Oxford Etymologist will never jump ship!

One does not have to be a linguist to know that English is full of naval metaphors and phrases. How else could it be in the language of a seafaring nation?! Dozens, if not hundreds of metaphors going back to sailors’ life and experience crop up in our daily speech, and we don’t realize their origin. Nor should we, for speakers are not expected to think of the etymology of the words and collocations they use. Contributors to the excellent periodical The Mariner’s Mirror noted the role of naval phrases in our daily speech and sometimes composed short stories with them. Here is a sample of one of their texts: “Smith is sailing under false colors, or perhaps he has only lost his bearings and would be taken aback or might even fall foul of you, if you told him the truth,” and so on. Try to translate this passage into French, Spanish, or German and see what will happen. You will easily find equivalents, but their naval basis will probably change, and the stylistic coloring will be gone.

In my database, naval phrases are many. The most curious among them are such as do not at once reveal their origin or leave us puzzled. Some others are fully or partially transparent and reflect, among other things, seamen’s attitude toward life. For example, reliability has always been considered a great virtue. A story was told of a British captain, a devoted ship keeper, who, to a lieutenant remonstrating on the little privilege of leave enjoyed by junior officers, replied: “Sir, when I and the sheet anchor go ashore, you may go with us.” Hence, allegedly, the idiom to go ashore with the sheet anchor “the ultimate expression of attention to duty.”

One need not have absolute trust in such tales, because they are often invented in retrospect. The idiom may go back to somebody’s joke. Perhaps that captain never existed. We’ll never know. Yet the idiom had some currency in the navy a century ago. Perhaps it is still known. Compare the phrase to swallow the anchor. At the beginning of the twentieth century, the inference was one of giving up the career or premature departure from sea-going, rather than of normal retirement. This metaphor (like, for example, to swallow a bitter pill) is fairly transparent.

A monument to the man who nailed the colors to the mast in Mowbray Park, Sunderland. Photo by Craigy144 via Wikipedia. CC-by-SA 3.0.

The existence of stories and legends that supposedly gave rise to proverbial sayings should be investigated, rather than denied, and the same is true of all etiological legends (those purporting to explain the origin of some facts or phenomena). They may be based on true events, contain echoes of what was true years or centuries ago, or be pure fiction. Hardly anyone believed that Troy had existed until Heinrich Schliemann excavated it.

Here is an example of a tale we can probably believe, even though the veracity of all sayings based on alliteration (in this case, j…j and even j…j…j) arouses suspicion. There was a certain John Jackson, we are told, who nearly wrecked the ship in 1787 because he refused to listen to the pilot’s advice. The ship struck heavily, and someone on board asked ironically: “How does she go now, Jackson?” Among seamen, the phrase jammed like Jackson became proverbial in situations when obstinacy leads to disaster. Perhaps the man’s name (John Jackson) was added later, but the episode described here must have happened. Also, heroism on board a ship could not but leave its traces in language. To nail the colors to the mast means “to stick to one’s position; refuse to budge.” In 1797, in the Battle of Camperdown, fought between the British and the Dutch, Jack Crawford did nail the colors to the mast of the ship. The episode became famous, and the OED has references to it going back to 1800. The saying don’t give up the ship!, that is, “fight to the end,” also goes back to a real battle that took place in 1813.

The Battle of Camperdown, 11 October 1797. By Thomas Whitcombe, 1798. © National Maritime Museum Collections.

I cannot judge how popular such phrases are (perhaps their use is limited to professional circles), but the popularity of all idioms is a problem no one has ever investigated. Some such phrases have always been local, while others are known to many but used by few. I have often dealt with the situation in which I use a proverb or an idiom and find to my surprise that no one in the audience understands it. I once had a passing acquaintance with a handyman who opened a business under the name “Let George do it.” His name was George, and I found the sign on his shop funny and witty, because let George do it means “let someone else do it” (in this case, George!), but I have not yet met anyone among my students who recognizes the idiom. While compiling an explanatory and etymological dictionary of idioms, I often tried similar experiments, and I wonder how many of George’s customers smiled at the joke.

Admiral Nelson. His heroic life is a subject always worthy of broaching. Hoppner, after John, circa 1823-24. © National Maritime Museum Collections.

But back to our subject. Sailors have never been teetotal. Half-seas-over means “drunk,” though at one time, the phrase meant “half across the sea(s).” Its connotation remains a matter of debate. According to one opinion (just an opinion!), the reference is to sea sickness; other people insisted that the phrase referred to “semi-intoxication.” Nor is it entirely clear whether seas is indeed seas or sea’s (if sea’s, then “half of the sea”). In the publications dealing with this phrase, one sometimes finds references to a non-existing Dutch phrase as the source (such references are the curse of etymological studies; people repeat what they have read in unreliable books and don’t bother to look up such “Russian,” “Hebrew,” “Yiddish”, or “Dutch” words and phrases in a dictionary of those languages). Be that as it may, the phrase does mean “drunk” and must have something to do with sea voyages. We face the familiar problem: what shall we do with a drunken sailor?

More enigmatic is the phrase to broach (or tap) the admiral, also meaning “to get drunk.” An apocryphal story exists that, when the body of Lord Nelson was brought home for burial, it was preserved in a cask of rum, but the sailors had, before the arrival of the corpse, drained the cask completely dry by means of a straw. Like so much slang, the origin of broach the admiral remains unknown, but the story inspires little confidence.

Pirates were of course sailors too, and I wonder how many people know what fifteen men have to do with a dead man’s chest. The phrase, made famous by Robert L. Stevenson, was borrowed from Charles Kingsley’s book At Last: A Christmas in the West Indies. This is what one can read in Notes and Queries, vol. 166 (1934), p. 212: “I have always understood that a ‘chest’ in the West Indies means a small uninhabited island, so that ‘the dead man’s chest’ = ‘Treasure Island’.” If this is how the song originated, many admirers of Stevenson’s novel will feel grateful to the author of the short letter to the editor of the once immensely popular periodical.

A drunken sailor, or three sheets in the wind. Sailors drinking in a crypt. Coloured etching by W. Elmes. Credit: Wellcome Collection. CC BY 4.0.

And now something for a showy finale. Jibber the kibber means “to lure a vessel to destruction by giving a false signal from the shore.” The signal, according to folklore, was tied to a horse. Jibber is indeed slang for “horse.” But nothing else is known about the origin of this saying. Its sounding is ominous and almost too good to be true. Yet my information about it, like almost everything I have written above, comes from The Mariner’s Mirror and should be taken with respect.

If such posts, devoted to thematic idioms, are interesting to read, leave a comment, and I’ll go on in the same vein. If they are not, I’ll also be grateful for your opinion.

Feature image credit: from “Ilios: the city and country of the Trojans” by Heinrich Schliemann, 1880. No known copyright restrictions; via the Internet Archive Book Images on Flickr.

The post Whatever happens, the Oxford Etymologist will never jump ship! appeared first on OUPblog.

Published on August 12, 2020 05:30

Petrostates in a post-carbon world

“This is our biggest compliment yet.” With these words Greta Thunberg responded to comments by OPEC Secretary General Mohammed Barkindo that climate concerns were becoming the organization’s “greatest threat.” An increasing number of people view fossil fuels, and petroleum in particular, as the key cause of climate change and thus as the greatest threat to humanity.

The Organization of the Petroleum Exporting Countries (OPEC) will turn 60 in September and is facing unprecedented challenges as oil demand experiences its most significant drop since petroleum became the world’s most important primary energy source. The price of petroleum is falling due both to booming oil production, particularly in the United States where the shale industry has increased production from 5 million barrels a day in 2008 to nearly 13 million barrels a day in 2019, and to reduced demand because of COVID-19.

When OPEC was created in 1960 its five founding members, Iran, Iraq, Kuwait, Saudi Arabia and Venezuela, extracted almost all of the petroleum exported globally. Oil production in these countries was entirely controlled by an oligopoly of Anglo-American oil companies. Petroleum was just about to become the most important global energy source, overtaking coal in the midst of an age that saw a massive boost in the consumption of fossil fuels and thus in the release of CO2 emissions.

Today the Vienna-based organization has a relatively smaller share of global oil production (30%) and of global exports (about 40%), even though its members still hold 80% of global crude oil reserves. Most of the oil production in petrostates is now controlled by national oil companies that compete on equal terms with international oil companies such as Exxon or Shell. A key difference from 1960 is that today climate scientists, environmental activists, policy-makers, and financial institutions are actively promoting a transition away from fossil fuels. Not only are OPEC countries relatively weaker in the global oil market, thanks to the rise of shale oil in the United States, but they are also challenged in their very survival as petrostates. This is because these nations are vitally dependent on the exports of a natural resource the demand for which might have peaked this year.

When viewed through the lenses of the Anthropocene and of rising CO2 emissions, the history of OPEC in the 20th century is far more interesting than the familiar story of rich oil sheiks with their shameless wealth. OPEC has in fact been the only organization that made an effort to control and reduce oil production. Very few people know today that OPEC’s founding father, Venezuelan petroleum minister Juan Pablo Pérez Alfonzo, came to define petroleum as the “Devil’s Excrement” and was a consistent promoter of conservationist policies for the Venezuelan oil industry. Pérez Alfonzo kept a rusty British Singer automobile in the garden of his villa in Caracas as a reminder of the dangers of consumerism and waste.

John Maynard Keynes had repeatedly warned about the need for global coordination to stabilize the price of commodities. Stable oil prices are important to plan a speedy transition away from fossil fuels, while avoiding at the same time the political and economic collapse of oil-producing countries. The United States, as well as OPEC and non-OPEC states such as Russia, Mexico, and Brazil, need to begin serious discussions to pro-ration oil production at a global level.

Whatever its format and however difficult it may be to change a neoliberal ideology that rules out state-led regulation of production, the time for a global dialogue on production levels and oil prices has come. Deregulation of the energy market must give way to a new era of regulation of the oil industry at both national and international levels.

The alternative will leave commercially oriented oil companies, both national and international, free to engage in a destructive price war that will maximize environmental degradation and squander natural resources. This will ultimately endanger decarbonization efforts (carmakers are already pressing governments to relax emissions standards) and increase political and economic instability in OPEC countries that are key regional actors.

Featured Image Credit: by Zbynek Burival via Unsplash

The post Petrostates in a post-carbon world appeared first on OUPblog.

Published on August 12, 2020 02:30

August 10, 2020

How the healthcare system is failing people with eating disorders

One death every 52 minutes occurs in the United States as a direct result of an eating disorder, according to a report by the Strategic Training Initiative for the Prevention of Eating Disorders, the Academy for Eating Disorders, and Deloitte Access Economics. I have studied eating disorders for over 30 years, and I was shocked by this finding. From 2018 to 2019, roughly 6,910 females and 3,290 males between the ages of 15 and 64 years lost their lives due to an eating disorder. The personal tragedy for each family member and friend is beyond the scope of even the most comprehensive study of social and economic costs of eating disorders. But the economic cost of these deaths came to $8.8 billion ($8,800,000,000), dwarfing the $4.6 billion investment our nation makes to understand and treat eating disorders. If I were an economist, I might try to explain how the difference between investment and costs represents a margin, but I’m not. Instead, I want to focus on how these deaths occur in the margins of our health care system and the steps needed to prevent these fatalities.

The report includes a case study of Hannah and demonstrates the need for better health care coverage for eating disorders. When Hannah was 13 years old, she developed anorexia, lost a third of her body weight, and developed a serious heart problem that required hospitalization. After approximately six months of inpatient treatment, she was discharged and experienced a relapse that required intensive outpatient care. This was not covered by her parents’ insurance. Her mother estimated spending $10,000 in a single month to keep her daughter alive. Hannah is now recovered and advocates for better insurance coverage for eating disorder treatment, particularly for military families, like her own.

This story does not represent an isolated case. Two weeks after the report was released, the following crossed my Twitter feed: “Friend diagnosed with AN (female in midlife). Kaiser wrongfully denying her authorization to a higher LOC [level of care]. Although she’s suffered for many years, she is not treatment resistant, as Kaiser claims. Resting heart rate is 35.”

A resting heart rate of 35 beats per minute meets criteria for hospitalization. Low heart rate signifies that the body is shutting down. Denying medical care to an eating disorder patient in medical crisis is wrong. But it happens too often in a health care system that interprets parity for mental health as only applying to brain-based disorders. The myth that people choose to have an eating disorder is as misguided as the idea that someone would choose to have cancer.

Researchers and advocates have made huge efforts to increase recognition that eating disorders are biologically-based disorders, with genetic make-up contributing as much to their development as it does to the development of schizophrenia or asthma. If this stops a single person from being denied treatment to save their lives, good. But I’m still waiting for any explanation for why the cause of an eating disorder has anything to do with whether or not we should try to save a person’s life. In one study, we found that the leading cause of death in anorexia nervosa was suicide. If treatment prevents a person from taking her life, that saved life is easily worth the investment. It makes no difference whether the prevented death would have been caused by self-poisoning, self-starvation, or self-induced vomiting.

We need more research funding to find better treatments, preventions, and cures for eating disorders. In 2019, the single largest funder of research, the National Institutes of Health, invested roughly $8/person with an eating disorder compared to $60,831/person with tuberculosis. The gap between funding for mental health and almost any other domain of health is well-known. But beyond that gap, eating disorders are marginalized within the field of mental health, receiving $8/person compared to $33/person for depression and $69/person for schizophrenia research in 2019. This funding gap contributes to the gaping difference between 28 FDA-approved medications for depression and zero for anorexia. Without greater funding to produce better treatments, we won’t see better outcomes. And the best time to intervene is at the first sign of a problem. Catching any disease in its earliest stages produces the best outcomes.

We need universal screening to promote early identification and intervention before the eating disorder progresses to a critical stage. While eating disorders lurk in the margins of awareness, we won’t see who is affected, we won’t see how they are affected, and we won’t know the full cost we pay for our ignorance. In the absence of universal screening, we will never know the true number of people killed by their eating disorder. No pathologist will ever “see” the eating disorder in a post-mortem exam, no matter how obvious the signs may be. In that same study where we found that suicide was the leading cause of death, we found that an eating disorder was never listed as the cause of death on the death certificate. Instead, we saw causes such as heart and liver failure in a 39-year-old woman who weighed less than half what a woman her age and height should weigh – less than half. In the absence of universal screening, we lose our best opportunity to prevent those deaths.

We need to recognize eating disorders as serious mental illnesses that pose a threat to the health of our nation.

We need better health care coverage.

We need more funding for research.

We need universal screening.

Featured Image Credit: by Daan Stevens via Unsplash

The post How the healthcare system is failing people with eating disorders appeared first on OUPblog.

Published on August 10, 2020 05:30

We hear Beethoven’s music as autobiography, but that wasn’t always the case

At a pre-COVID live performance of one of Ludwig van Beethoven’s cello sonatas, I was in the front row and had a great view of the musicians. Beethoven was watching, too, in the form of a scowling bust at the back of the stage. And the two just didn’t match. The music was playful and jesting, but Beethoven looked deeply unhappy. It struck me at that moment that the assumption of composers expressing themselves in their music was not as straightforward as it is often taken to be.

The Beethoven Syndrome is my name for the inclination of listeners to hear music—particularly instrumental music—as the projection of a composer’s innermost self. It’s a common response. People often hear Beethoven’s music as a kind of sonic diary, as a soundtrack of his own life, so to speak. But that’s not how his contemporaries heard it. They knew very little about him as a person, and it was only after his death that they began hearing his music as an outpouring of his soul. Yet by 1850, just a few decades after his death, audiences were hearing not just Beethoven’s music but pretty much all instrumental music by any composer as a form of autobiography.

Why was that? Why did listening habits change so radically in the years around 1830?

A big part of the answer had to do with changing concepts of expression. During the lifetimes of Joseph Haydn, Wolfgang Amadeus Mozart, and Beethoven, listeners and composers alike thought of expression as the representation of emotions, not as the revelation of the composer’s personal feelings. Music was regarded as a rhetorical art: It was the responsibility of composers to create works that listeners could follow and that would move those listeners emotionally. Haydn and Mozart didn’t want to puzzle their audiences. They made unusual moves in their music from time to time, but they didn’t want to write in a way that left people scratching their heads and wondering what that was all about.

In this sense, listeners of that time regarded composers more in the way we think of actors today: They inhabit their roles and portray them convincingly and with ease. What distinguishes great actors from character actors, after all, is that the great ones can take on a wide variety of roles, while the lesser ones are limited in their range. Beethoven was a great actor. This doesn’t mean he was insincere. It means that he could enter into—and convincingly portray—a great many facets of the human condition.

The fact is that Beethoven had an amazing ability to step outside himself. And that was something his contemporaries valued. One reviewer from the time proudly called him “Our Proteus,” comparing him to that shapeshifting figure of Greek mythology. That’s one of the reasons why the music that doesn’t fit the later image of Beethoven as the scowling, storming Titan falls by the wayside. Too often it’s perceived as not being the real Beethoven.

Why did Beethoven write particular pieces the way he did? It was long assumed—but again, only after his death—that the “Moonlight” Sonata, op. 27, no. 2, was his response to a passionate love affair (of which Beethoven had quite a few). And maybe it was. It’s easy to map this onto the piece, with its brooding first movement and turbulent finale. But then how do we explain its companion piece published alongside it, the Piano Sonata op. 27, no. 1, which conveys a completely different mood?

The Fifth and Sixth Symphonies offer another good case in point: Beethoven wrote them at more or less the same time, yet it’s hard to imagine two more different symphonies. Early audiences, moreover, heard the “Pastoral” Symphony as a depiction of nature while later ones heard it as Beethoven’s response to nature.

From Hector Berlioz onward, it became standard practice to hear the music of all composers, not just Beethoven, as a form of autobiography. Berlioz’s Symphonie fantastique, premiered in 1830, three years after Beethoven’s death, wears its autobiographical heart on its sleeve. And audiences were grateful for the prose program Berlioz issued with the work, because it provided them a decoder ring of sorts, a way into this very strange new symphony. Listeners might or might not like the work, but they could now hear it as its composer’s way of expressing himself.

This was a new way of composing and a new way of listening. Listeners embraced the idea of music as autobiography because it gave them something to hang on to. And composers liked it as well: it helped make their life story part of their marketing appeal. Composer biographies became all the rage. It’s hard to imagine, but there were no biographies of Haydn, Mozart, or Beethoven until after they had died. But those biographies soon became an important key for unlocking the apparent meaning of their music. In fact, it’s in the middle of the nineteenth century that the whole industry of music appreciation really gets rolling. Journals start publishing accounts of new works and their living composers, which sends the message: “Listen for this. This is how you can understand what you’re hearing.”

What’s really interesting is that modernist composers of the early twentieth century like Arnold Schoenberg, Igor Stravinsky, and Claude Debussy began to distance themselves from the idea of self-expression. Musical expression, they maintained, was objective, and the “New Objectivity” that was all the rage in the 1920s and 30s proved foundational for the mid-century modernism of composers like John Cage, Milton Babbitt, and Elliott Carter. In recent decades, at least some composers (the “New Romantics” or “Postmodernists” as they’re sometimes called) have been more willing to promote their music as an expression of their experiences and of their inner selves. Plus ça change….

Featured image by Maria Lupan on Unsplash

The post We hear Beethoven’s music as autobiography, but that wasn’t always the case appeared first on OUPblog.

Published on August 10, 2020 02:30

Nine books on philosophy and race [reading list]

Featuring a selection of new titles from leading voices, and major works from across the discipline, the OUP Philosophy team has selected several of its important books exploring race from different philosophical perspectives.

From David Livingstone Smith’s On Inhumanity, which provides an unflinching guide to the phenomenon of dehumanization, to Naomi Zack’s The Oxford Handbook of Philosophy and Race, containing a wealth of voices addressing a comprehensive range of key topics, all the books featured here are thoughtful, powerful, and important contributions to thinking, speaking, and acting positively about race.

Visible Identities: Race, Gender, and the Self by Linda Martín Alcoff
Drawing on philosophical sources, as well as theories and empirical studies in the social sciences, the author argues that identities are not like special interests, nor are they doomed to oppositional politics, nor do they inevitably lead to conformism, essentialism, or reductive approaches to judging others. Read the introduction here.

Unmuted: Conversations on Prejudice, Oppression, and Social Justice by Myisha Cherry
Focussing on subjects too often omitted from mainstream philosophy, this book presents a collection of interviews the author conducted with a vibrant and diverse group of philosophers on her podcast, UnMute. The author and her interviewees cover issues including social protests, Black Lives Matter, climate change, education, integration, LGBTQ issues, and the Me Too movement. This collection of 31 interviews shows what philosophy can contribute in divisive times.

What is Race? Four Philosophical Views by Joshua Glasgow, Sally Haslanger, Chike Jeffers, and Quayshawn Spencer
Four prominent philosophers and race theorists debate how best to answer a series of difficult questions: Do we know what race is? Is it a social construct or a biological object? Is it a bankrupt holdover from a time before sophisticated scientific understanding and genetics; or, is race real? They apply philosophical tools and the principles of social justice to cutting-edge findings from the biological and social sciences. Read a chapter here.

The Color of Our Shame: Race and Justice in Our Time by Christopher J. Lebron
The author argues that it is the duty of political thought to address the moral problems that attend racial inequality and to make those problems salient to a democratic polity. Thus he asks two questions: Given the success of the Civil Rights Act and the sharp decline in overt racist norms, how can we explain the persistence of systemic racial inequality? Once we have settled on an explanation, what might political philosophy have to offer in terms of a solution? Read the introduction here.

Structural Injustice: Power, Advantage, and Human Rights by Madison Powers and Ruth Faden
The authors here develop a theory of structural injustice that links human rights norms and fairness norms, explaining how human rights violations and structurally unfair patterns of power and advantage are so often interconnected. The theory they put forth is informed by and responsive to critical perspectives in social justice movements, including Black Lives Matter and Me Too. Read the introduction here.

Socially Undocumented: Identity and Immigration Justice by Amy Reed-Sandoval
What does it really mean to be undocumented? Many define the term “undocumented migrant” legalistically, in terms of lacking legal authorization to live and work in one’s current country of residence. In this book, the author challenges this understanding by arguing that being socially undocumented is to possess a real, visible, and embodied social identity that does not always track one’s legal status. Read the introduction here.

On Inhumanity: Dehumanization and How to Resist It by David Livingstone Smith
The Rwandan genocide, the Holocaust, the lynching of African Americans, and the colonial slave trade are horrific episodes of mass violence spawned from racism and hatred. We like to think that we could never see such evils again–that we would stand up and fight. But something deep in the human psyche–deeper than prejudice itself–leads people to persecute the other: dehumanization, or the human propensity to think of others as less than human. The award-winning author takes an unflinching look at the mechanisms of the mind that encourage us to see someone as less than human.

On Race: 34 Conversations in a Time of Crisis by George Yancy
The need for clarity surrounding the significance of race and racism in the United States is more pressing than ever. In a series of interviews originally conducted for The Stone, the New York Times’s online philosophy series, with such major thinkers as bell hooks, Judith Butler, Cornel West, Kwame Anthony Appiah, Peter Singer, and Noam Chomsky, the author probes the historical origins, social constructions, and lived reality of race along political and economic lines. He fully interrogates race’s expressions, its transcendence of Black/white binaries, its link to neo-liberalism, its epistemological and ethical implications, and, ultimately, its future.

The Oxford Handbook of Philosophy and Race edited by Naomi Zack
The handbook provides up-to-date explanation and analyses by scholars of contemporary issues in African American philosophy and philosophy of race. These original essays encompass the major topics and approaches that support demographic inclusion and diversity while at the same time strengthening the conceptual arsenal of social and political philosophy.

Featured Image Credit:

The post Nine books on philosophy and race [reading list] appeared first on OUPblog.

Published on August 10, 2020 02:30

August 9, 2020

How to prepare students for jobs in the 21st century

A common goal for educators is to identify, and then teach, cognitive skills that are needed for the workplace. In 2017 a group of investigators at the Educational Testing Service in Princeton, New Jersey, investigated which skills are needed as a result of the rapid changes occurring as the United States shifted from an industrial to an information-based economy. The organization analyzed 142,000 online job advertisements that were posted between February and April of that year. The most highly requested skills were oral communication (28%), written communication (23%), collaboration (22%), and problem solving (19%). The Educational Testing Service decided to label them “21st-century skills.” These skills are a combination of cognitive (nonroutine problem solving, critical thinking, metacognition), interpersonal (social), and intrapersonal (emotional, self-regulation) talents.

The World Economic Forum’s Future of Jobs Report 2018 took a different approach by asking executives of some of the world’s largest employers to report on the latest employment, skills, and human investment trends across their industries. The survey revealed an accelerating demand for new specialist roles related to understanding and using the latest emerging technologies – AI and machine learning specialists, big data specialists, process automation experts, information security analysts, human-machine interaction designers, and robotics engineers. Skills such as creativity, originality, critical thinking, negotiation, and complex problem solving will, according to the report, become more valuable. The World Economic Forum predicted that there will also be more demand for leadership, emotional intelligence, and social influence skills.

These cognitive skills should be incorporated within courses taught at high schools, community colleges, and universities. An introduction to technological skills can begin in high school. In 2016, the College Board’s Advanced Placement Program oversaw the largest course launch in the program’s 60-year history with the release of AP Computer Science Principles. The course introduces students to the foundational concepts of the field and challenges them to explore how computing and technology can impact the world. In the 2018-19 school year more than 5,000 schools were offering the course. A second course, AP Computer Science A, focuses on computing skills related to programming in Java. The course teaches the foundational concepts of computer science to encourage broader participation. Its big ideas are creativity, abstraction, data and information, algorithms, programming, the internet, and global impact.

The College Board announced that nearly 100,000 students took the AP Computer Science Principles Exam in 2019, more than double the participation since the course launched in the 2016-17 school year. During that period the number of female students, the number of Black/African American students, and the number of Hispanic/Latino students more than doubled.

A challenge for community colleges and universities is to design instruction for the increasing number of students who enter higher education without a declared major. A 2019 article in The Chronicle of Higher Education discussed how some institutions are preventing these students from wandering among courses. One solution is to ask incoming students to select an academic focus area or meta-major to help them explore topics that should appeal to them. In the fall of 2019, the entire University System of Georgia asked incoming freshmen to declare an academic focus area if unable to select a major. The University of Houston has a similar program based on the same concept.  These programs have been influenced by studies that find college students have trouble succeeding because they are faced with an overwhelming number of options.

Community colleges are evaluating a related project. The Association of American Colleges and Universities announced that it had selected 20 colleges to participate in a two-year evaluation project. The goals of the project are to map students’ goals to pathways, help students choose and enter a program pathway, keep students on the path, and ensure that students are learning.

A guided approach can also be helpful for reducing changes in majors. Within three years of their initial enrollment, approximately 30% of undergraduates in associate and bachelor’s degree programs change their major at least once. But Georgia State University has reduced changes of major by 30 percent since it began its meta-majors program seven years ago. Those students who did change majors were more likely to have crossover credits.

Although clustering courses around a meta-major may be helpful for students with undeclared majors, the courses do not necessarily teach skills that are important for the future workplace. These skills include oral communication, written communication, collaboration, and problem solving. They also include creativity, originality, critical thinking, negotiation, and complex problem solving. These cognitive skills should be incorporated within cluster courses to prepare freshmen for skills that are likely to be in demand when they graduate. The creation of generic courses for all majors should also focus on these skills because they are relevant to all college graduates. Such generic courses have the advantage that they do not require students to select a major or even a cluster area.

Featured Image credit: Image by Gerd Altmann via Pixabay.

The post How to prepare students for jobs in the 21st century appeared first on OUPblog.

Published on August 09, 2020 02:30
