Oxford University Press's Blog, page 275

February 28, 2018

Playerless playtesting: AI and user experience evaluation

Over the past few decades, the digital games industry has taken the entertainment market by storm, transforming a niche pastime into a multi-billion-dollar industry and captivating the hearts of millions along the way. Today, the once-deserted space is overrun by new games clamouring for recognition. Competition is especially fierce in emerging segments, such as the mobile and free-to-play markets, where designers must provide an engaging experience or face having their games forgotten. This has put increasing pressure on developers to pursue the utmost creative and technical precision in their releases.


Given this incredibly competitive climate, it’s no surprise that Games User Research (GUR) has emerged as a field of key commercial and academic importance. Game development, after all, demands a deep understanding of human factors, user experience (UX), and player behaviour. Gone are the days of closed-environment, single-cycle development. Modern game development is an inherently iterative process, and one that is necessarily player-centric if it intends to be successful. For developers, UX evaluation is crucial, aiming to bolster the quality of finished products through insights acquired from analysing the actions and reactions of players within a game’s target audience.


Many different user research methodologies can be applied to the process of UX evaluation, from straightforward gameplay observation to complex experimental set-ups featuring physiological sensors, eye-tracking, and configurable recording equipment. Aggregated from several players over a given testing period, data obtained from multiple methods can be analysed to derive a range of insights, from identifying basic usability issues to providing a near-complete profile of the “average” player’s experience.


Practically speaking, the user evaluation process poses several open challenges for developers and researchers. Real-world user populations boast significant diversity in motivation, behaviour, and past experiences, which is astonishingly difficult to replicate when recruiting participants. Furthermore, the practice of testing early and often—while favourable to the quality of the finished product—can be prohibitively expensive and time-consuming, particularly for small- and medium-sized studios. But what if we could lessen this burden while improving our ability to conduct more representative and comprehensive testing sessions? And what if we managed to accomplish this by cutting the player, at least temporarily, out of the equation? What if we developed a UX evaluation system driven by artificial intelligence?



“Modern game development is an inherently iterative process, and one that is necessarily player-centric if it intends to be successful.”



We’re still a few years away from computer-controlled agents that can pick up and play any game the way a human would, but AI has already overtaken human skill in many complex games. For the purposes of UX evaluation, we’re less interested in developing an ideal AI agent, and more interested in maximising its tendency to behave like a human player—complete with an imperfect memory, at-times flawed spatial reasoning, and goals that can diverge from a game’s intent based on its own simulated motivations. We then clone that agent into thousands of variants, representing a population with richly diverse demographics, experience levels, and playing styles. The resulting system could test a few thousand “players” overnight, rather than a few dozen participants over the course of several weeks. And while such a system wouldn’t aim to replace current user evaluation methodologies, it might serve as a supplementary technique in the early stages of development. But how might such a framework be used in practical terms?


Imagine you’re a developer looking to identify glaring flaws or opportunities for optimisation in level prototypes: corridors where players might get lost, easily missed objectives, and so on. Ideally, you’d want a dozen participants to play through each level, tracking their navigation and noting any unexpected behaviour. But this is expensive, time-consuming, and hardly repeatable for every single design change. The solution? An AI-driven framework capable of standing in for human participants. You set up a population of AI “players” representing your target demographic and instruct the system to simulate a number of trials. Within a few minutes, the results of the test have been logged, and after a few more clicks, you’ve brought up an overlay showing the aggregate navigation data of a hundred different AI agents. You note some interesting insights: an area where agents have trouble finding their way out, and an objective trigger that might be a bit too difficult to reach. After a few tweaks, you’re ready to test again, and when the time comes for trials with human players, your more sophisticated UX evaluation won’t be bogged down by basic frustrations like players getting lost or missing key areas.
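To make this workflow concrete, here is a minimal sketch in Python. It is an illustration only, not any studio’s actual tooling: the toy level, the two agent parameters (a skill value governing how reliably an agent moves toward the objective, and a memory value governing how well it avoids retreading explored tiles), and the population profiles are all invented for the example. The sketch simulates a population of imperfect “players” and aggregates their paths into the kind of navigation data described above.

```python
import random
from collections import Counter

# Toy grid level: 'S' start, 'G' goal/objective, '#' wall, '.' floor.
LEVEL = [
    "S..#....",
    ".#.#.##.",
    ".#...#..",
    ".###.#.#",
    ".....#.G",
]

def find(ch):
    """Locate the first tile containing the given character."""
    return next((r, c) for r, row in enumerate(LEVEL)
                for c, cell in enumerate(row) if cell == ch)

def neighbours(pos):
    """Walkable tiles adjacent to pos."""
    r, c = pos
    for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        nr, nc = r + dr, c + dc
        if 0 <= nr < len(LEVEL) and 0 <= nc < len(LEVEL[0]) and LEVEL[nr][nc] != "#":
            yield (nr, nc)

def simulate_agent(skill, memory, max_steps=200, seed=None):
    """One simulated 'player'. skill = chance of moving toward the goal;
    memory = chance of avoiding tiles the agent has already visited."""
    rng = random.Random(seed)
    goal, pos = find("G"), find("S")
    visited, path = {pos}, [pos]
    for _ in range(max_steps):
        if pos == goal:
            break
        options = list(neighbours(pos))
        if rng.random() < skill:
            # "Skilled" move: step that minimises Manhattan distance to the goal.
            pos = min(options, key=lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1]))
        else:
            # Imperfect play: wander, preferring unexplored tiles if memory holds.
            fresh = [p for p in options if p not in visited]
            pos = rng.choice(fresh if fresh and rng.random() < memory else options)
        visited.add(pos)
        path.append(pos)
    return path, pos == goal

# A population of varied "players": casual, average, and expert profiles.
profiles = [(0.3, 0.4)] * 40 + [(0.6, 0.6)] * 40 + [(0.9, 0.9)] * 20
heatmap, completions = Counter(), 0
for i, (skill, memory) in enumerate(profiles):
    path, reached = simulate_agent(skill, memory, seed=i)
    heatmap.update(path)   # aggregate navigation data across the population
    completions += reached

print(f"{completions}/{len(profiles)} agents reached the objective")
print("Most-trafficked tiles:", heatmap.most_common(5))
```

Tiles that accumulate disproportionate traffic, or a low completion rate among the casual profile, would flag exactly the kinds of problems the scenario above describes: players getting lost, or an objective that is too hard to reach.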


At a high level, current and future AI-driven approaches to user evaluation rely heavily on adapting knowledge from existing research on machine learning, computer vision, and the modelling of human behaviour. While we may still be several years away from an AI capable of predicting the finer complexities of user experience, the field as a whole is moving towards integrating computational modelling and analytical techniques alongside qualitative approaches. If this trend continues, playerless playtesting might just be the next great frontier in games user research.


Featured image credit: “K I N G S” by Jeswin Thomas. Public Domain via Unsplash.




Published on February 28, 2018 03:30

February 27, 2018

Dr. Victor Sidel: a leader for health, peace, and social justice

Victor (Vic) Sidel, M.D., who died in late January, was a national and international champion for health, peace, and social justice. Among his numerous activities, he co-edited with me six books on war, terrorism, and social injustice that were published by Oxford University Press. Vic left an extensive legacy in the residents and students whom he trained, in the organizations that he strengthened, in the scholarly books and papers that he edited and wrote, and in the policies and programs that he promoted for a healthier, more peaceful, and more equitable world.


For 16 years, Vic directed the Department of Social Medicine at Montefiore Medical Center and Albert Einstein College of Medicine, which promoted health and prevented disease in the community; addressed health-related issues in housing, nutrition, education, and employment; and helped to reduce inequities in health care. Vic taught social medicine by placing residents and students in the community to observe patients’ health problems in a socioeconomic context and to identify community resources to address these problems. He trained community health workers to promote health and prevent disease in the neighborhoods where they lived.


Vic advocated for peace. He envisioned a world where conflicts are settled without violence and where the dignity of every person is respected. He resisted militarism and the diversion of resources for military purposes. He opposed international arms sales and easy access to guns and assault weapons. In 1961, Vic co-founded Physicians for Social Responsibility (PSR) to raise awareness of the threat of nuclear weapons – what he called “the final epidemic.” In the early 1980s, he helped establish the International Physicians for the Prevention of Nuclear War (IPPNW). He served as president of PSR and co-president of IPPNW, which won the Nobel Peace Prize in 1985. Although Vic used statistics as hard evidence of the impacts of militarism and war, he observed that statistics are people with the tears washed off.



Dr. Victor Sidel. Image used with permission of the author.

Vic promoted social justice and recognized that the denial of human rights adversely affects health. He supported a woman’s right to choose. He supported a person’s right to die with dignity. He asserted that health is a basic human right. He believed that government is responsible for equitable financing and equitable provision of health care. And he demonstrated the linkages among health, human rights, and civil liberties, and among social justice, environmental justice, and justice in the workplace. As president of the American Public Health Association (APHA) and the Public Health Association of New York City, Vic addressed the growing gaps between rich and poor people in the United States and between rich and poor countries.


In 1971, Vic and his wife, Ruth, were among the first Americans in 22 years to visit the People’s Republic of China. Their subsequent books and lectures enlightened many people about health and social services in China. They wrote and spoke about its key principles: (1) put prevention first; (2) train community members as “barefoot doctors” to provide health services in underserved areas; and (3) base the health system on the concept “serve the people,” in which the gratification of health workers comes from the opportunity to serve — not from status or financial reward. In many ways, Vic served the people.


As an educator, Vic instructed, mentored, and inspired residents and students. He trained thousands of physicians and other health workers, many of whom have become leaders in government, academia, civil society, and elsewhere. Like ripples in a pond, each of them has educated and trained many more. Vic used creative teaching techniques to rivet the attention of students and others and to illustrate his main points. In one, he held a beating metronome as he pointed out that every other beat represented a child’s death that could have been prevented and also represented $50,000 spent for military purposes. In another, he dropped hundreds of steel balls into a container, the pinging sound of each one representing an atomic bomb and the loud clanging of all of them together representing the global stockpile of nuclear weapons.


Vic had a profound influence on the careers of many students. One student, for example, almost left medicine. But after hearing Vic speak, she discovered opportunities to make her career in medicine meaningful and consistent with her beliefs. She became a local health officer and then a national public health leader. Her work contributed significantly to reducing teen pregnancy, decreasing health disparities, and supporting health care reform.


As a scholar, Vic wrote many journal articles and gave numerous lectures with insight and vision, clarity and passion. In the early 1960s, he co-authored with Jack Geiger and Bernard Lown landmark papers in the New England Journal of Medicine and elsewhere on the health impacts of thermonuclear war. And he wrote many other papers about preventing war, reducing poverty, addressing racism, improving access to health care, and other issues related to health, peace, and social justice. As a scholar, he recognized that research had to be conducted ethically. For years, he chaired the Institutional Review Board at Montefiore Medical Center to ensure that research subjects were protected and respected.


As an organizational leader, Vic not only did things right, but he chose to do the right things. When he saw suffering, he worked to heal it. When he saw the threat of violence, he worked to prevent it. When he saw social injustice, he worked to stop it. When the Allende government in Chile was overthrown in 1973 and many health workers there were fired, imprisoned, or killed, Vic led the response by Montefiore and APHA to publicize their plight, and to save and resettle Chilean health workers.


I worked closely with Vic for more than 25 years. I remember his wisdom as we developed and co-edited books and wrote articles on war, terrorism, and social injustice. I remember his creativity as we presented lectures together. I remember his inclusiveness as we organized annual sessions on preventing war and promoting peace at APHA annual meetings. I remember his vision as we established the APHA Award for Peace.


Now, it is up to each of us and the many others whom Vic inspired to carry on his work for health, peace, and social justice: to carry on his work with commitment, solidarity, and moral courage.


Featured image credit: “surf” by Frank Mckenna. CC0 via Unsplash.





Published on February 27, 2018 05:30

In celebration of twentieth century African American literature

Since the first poems published by former slaves Phillis Wheatley and Jupiter Hammon around the time of the American Revolution, African American literature has played a vital role in the history and culture of the United States. The slave narratives of figures such as Frederick Douglass and Harriet Wilson became a driving force for abolitionism before the Civil War, and the tumultuous end of Reconstruction brought about the exploration of new genres and themes during the height of the Jim Crow era. The Harlem Renaissance was a particularly vibrant time for African American writers, and the mid-twentieth century saw a creative spell that has yet to wane. Most significantly, African American women have been front and center during this period.


In honor of Black History Month, we have collected facts about nine of the most important African American writers of the past century, in the hope that their works will retain their pivotal place in the American literary canon.


Maya Angelou (1928-2014)


In 1993, Angelou was chosen by US President Bill Clinton to be the poet at his inauguration, making her the first African American and the first woman to assume this role. Her performance of “On the Pulse of Morning” reached the largest audience for poetry in history, and increased the sales of Angelou’s acclaimed autobiography, I Know Why the Caged Bird Sings, by 500 percent.


James Baldwin (1924-1987)


Go Tell It on the Mountain, Baldwin’s semi-autobiographical novel, was nominated for the National Book Award in 1954; in a later interview, Baldwin said he had been told he didn’t win because Ralph Ellison’s Invisible Man had won the previous year, and America was not ready for two consecutive black winners.



Cover of The Crisis (volume 1 issue 5, March 1911), with black pharaoh illustration. Public Domain via Wikimedia Commons.

Octavia Butler (1947-2006)


A voracious reader, Butler turned to writing in her childhood after watching Devil Girl from Mars, a “silly” science fiction movie about Martian women attempting to colonize Earth. She determined she could write better stories herself, thus beginning her career as a preeminent science fiction writer.


W.E.B. Du Bois (1868-1963)


Sociologist Du Bois, one of the co-founders of the National Association for the Advancement of Colored People (NAACP), famously described the NAACP’s periodical The Crisis in 1910 as an “organ of propaganda” that would bring about “one of the most effective assaults of liberalism upon prejudice, and reaction that the modern world has ever seen.” In 1916, he led efforts to boycott D.W. Griffith’s controversial film The Birth of a Nation in the first formal anti-propaganda campaign in African American history.


Ralph Ellison (1914-1994)


National Book Award recipient Ellison was given the middle name Waldo because his father, a construction foreman, wanted to name his son after Ralph Waldo Emerson in the hope that he would grow up to be a poet.


Langston Hughes (1902-1967)


Hughes’ poetry often drew on the African American musical tradition for form and style. He experimented with jazz in Montage of a Dream Deferred and cast other poems in blues form or as spirituals. His works appeal to composers of different musical genres and have been set to music over 200 times.


Zora Neale Hurston (1891-1960)


Folklorist Hurston thrived during the Harlem Renaissance, though her more conservative political beliefs, such as her opposition to the Brown v. Board of Education Supreme Court ruling, contributed to her exclusion from literary circles at the end of her career. She died in obscurity and was buried in an unmarked grave. Alice Walker revitalized interest in Hurston’s works in 1975, after they had been out of print for thirty-five years.



“Ralph Ellison, noted author and professor” originally by the National Archives and Records Administration. Public Domain via Wikimedia Commons.

Toni Morrison (1931- )


In 1993, Pulitzer Prize-winning novelist Toni Morrison attained the highest literary accolade possible: the Nobel Prize in Literature. Morrison was the first African-American recipient and only the eighth woman in the world to receive this honor.


Alice Walker (1944- )


Walker’s 1992 novel Possessing the Secret of Joy garnered controversy for its depiction of female circumcision, with critics arguing she was an outsider interfering with African culture. Walker defended her book, claiming she understood how it felt to be physically maimed as she had been partially blinded as a child by her brother’s BB gun. She called her blinded eye and the wounds borne by women suffering from female circumcision “warrior marks.”


Featured image credit: Maya Angelou reciting her poem “On the Pulse of Morning” at President Bill Clinton’s inauguration in 1993. William J. Clinton Presidential Library. Public Domain via Wikimedia Commons.




Published on February 27, 2018 03:30

February 26, 2018

Prohibition: A strange idea

American politics is frequently absurd, often zany, and sometimes downright crazy. Among the most outrageous past ideas was the legal Prohibition of alcohol, written into the US Constitution as the Eighteenth Amendment and in force from 1920. Prohibition lasted until 1933, when the Twenty-First Amendment brought repeal and tight government regulation of alcohol.


How and why one of the world’s hardest-drinking societies embarked upon a scheme to ban alcohol is a tale worth remembering. Prohibition shows the difference between good intentions and bad results. It also offers a cautionary note to all who would propose government-mandated legal bans to other social ills; such bans may have unanticipated negative results.


After the American Revolution the widespread availability of cheap whiskey caused drinking to soar. By the 1820s the typical white American man drank a half pint of whiskey a day. Workplace drunkenness, alcohol-induced poverty, public fights, wife-beating, and child abuse led evangelical Protestants to organize the Temperance Movement. Half the population stopped drinking. To dry out the other half, reformers enacted local and then state Prohibition, which 11 states adopted during the 1850s. These laws, however, quickly disappeared, partly due to heavy-drinking immigrants. The Irish loved their whiskey and the Germans their beer, which gradually became the country’s favorite alcoholic beverage after the Civil War.


In 1874 the Woman’s Christian Temperance Union (WCTU) began to push local, state, and national Prohibition. By 1890 the WCTU’s 200,000 members made it the largest women’s organization in the world. Under the brilliant leadership of Frances Willard, the WCTU advocated for Prohibition, women’s suffrage, and many other reforms, but Willard’s “Do Everything” policy may have weakened its Prohibition mission.


In 1895 the Anti-Saloon League (ASL), the first national single-issue political action group, joined the dry crusade. The ASL elected dry legislators to pass statewide Prohibition and dry members of Congress to pass a constitutional amendment. The main political opposition came from national brewers and the thousands of saloons that they controlled. These “tied-house” saloons, which served only one brand, often were voting precincts for corrupt political machines. Many saloons harbored prostitutes and illegal gambling. Most observers, however, doubted that the ASL could ever pass a dry constitutional amendment.



“In January 1919, the amendment was ratified, and a year later legal alcohol production and sales ended, but most of the country was already dry—at least in theory.”



The First World War, which began in 1914, changed the political odds. Until 1917 the United States was neutral, but American business interests and sympathies lay with the British and French rather than the Germans. Most brewers were of German ancestry, as were one-quarter of all Americans. The German government sponsored sabotage inside the United States against American businesses that supplied the British and French. There was fear that, should the United States join the war against Germany, German agents would use saloons to recruit spies and saboteurs.


By 1916 very few American politicians wanted money from brewers or their saloons, and the Anti-Saloon League won two-thirds majorities in both the House and Senate. In early 1917, German U-boats attacked American shipping in the Atlantic Ocean. As a result, President Woodrow Wilson got Congress to declare war against Germany in April 1917. Congress quickly adopted wartime anti-liquor measures. No one in uniform could be served alcohol, which was also blocked from areas near military bases or defense contractors. Congress passed the Eighteenth Amendment and sent it to the states for ratification.


Because the United States shipped large quantities of food to Europe, shortages developed. Congress banned distilling hard liquor such as whiskey from any food substance, and the legal strength of beer was cut to 2.75 percent alcohol. Eventually, Congress passed wartime Prohibition, which was designed to end alcohol consumption even before the Eighteenth Amendment went into effect. In January 1919, the amendment was ratified, and a year later legal alcohol production and sales ended, but most of the country was already dry—at least in theory.


In the early 1920s alcohol consumption may have dropped by two-thirds. Moonshine and hard liquor smuggled from Canada often replaced beer. By the mid-twenties, alcohol consumption was rising again, as the number of people willing to defy the law increased. By the late twenties gangsters controlled much of the illegal alcohol industry. They made millions and used political payoffs and violence to protect their business. In the early thirties the Great Depression left all governments with declining tax revenues and growing demands for social services.


In 1932, when Franklin Roosevelt ran for president, he promised to repeal Prohibition. Congress authorized 3.2 percent beer in April 1933. The Twenty-First Amendment brought total repeal eight months later. Both the federal and state governments imposed high alcohol taxes, and the substance became highly regulated. Eighteen states sold hard liquor only in state stores in order to put bootleggers out of business. Almost every state legally separated producers, wholesalers, and retailers to prevent the return of the prewar tied-house saloons.


Americans went through three distinct phases with alcohol. Before Prohibition, alcohol had been mostly unregulated with many dubious business practices and harmful results. As a consequence, a backlash had produced Prohibition, which enriched gangsters, eroded confidence in government, and deprived government of revenues. Finally, repeal brought tight government regulation and significant alcohol taxation.


Featured image credit: “Orange County Sheriff’s deputies dumping illegal booze, Santa Ana, 3-31-1932” courtesy of Orange County Archives via Creative Commons.




Published on February 26, 2018 04:30

T.E. Lawrence and the forgotten men who shaped the Arab Revolt

T.E. Lawrence, known as “Lawrence of Arabia,” has provoked controversy for a hundred years. His legend was promoted in the 1920s by the American Lowell Thomas’s travelogue; renewed in 1935 by the publication of Lawrence’s own Seven Pillars of Wisdom; and revived in 1962 by the epic film Lawrence of Arabia. The hype should not blind us to the fact that Lawrence’s contribution to the Arab Revolt of 1916-18 against the Turks was indispensable. His skills in organizing and coordinating, his daring and courage, his intuitive grasp of guerrilla warfare and how to harness it, his influence over Emir Feisal (the leader of Arab forces in the field), and his talent for manipulating his own leaders when necessary were all crucial to the hollow success of the revolt.


Yet Lawrence was a team player. In particular, there was a nexus of influence over the revolt that has stayed below the radar. While Lawrence and other British, Arab, and French officers were blowing up the Hejaz Railway, a forgotten band of British officers at Jeddah, far from the desert campaign, carried out vitally important diplomatic and intelligence work that prevented the revolt from collapsing. This untold story centres on Colonel Cyril Edward Wilson, the British representative at the Jeddah Consulate. Wilson was a dependable officer of the old school—the antithesis of the brilliant and mercurial Lawrence. But his strong relationship with Sherif Hussein of Mecca, the leader of the revolt, drew Hussein, a suspicious and controlling man, back from the brink of despair, suicide, and the abandonment of the revolt. Wilson’s undervalued influence over Hussein during critical phases of the revolt was at least as important as the well-known influence of Lawrence over Emir Feisal, Hussein’s son.


Wilson’s core team included Captain Norman Bray, a highly-strung Indian Army intelligence officer who rooted out anti-British and anti-Hussein jihadists. These men were incensed that Hussein had dared to rebel against the Turkish sultan, who was also the caliph (leader) of all Sunni Muslims. The stakes were high because the jihadists based at Jeddah and Mecca wanted to discredit both Hussein and the British by disrupting the Hajj—the Muslim pilgrimage—and to encourage Indian pilgrims (passing through Jeddah on their way to Mecca) to rebel against British rule in their homeland. With the aid of a resourceful Persian spy named Hussein Ruhi, Bray helped keep the revolt on course by neutralising the jihadists and having their leader deported to prison in Malta.


Ruhi is one of the most intriguing and influential players in the Arab Revolt. His cover was as Wilson’s Arabic interpreter, and he did invaluable intelligence work for the colonel in other respects too, at times even putting his life in danger.


Emir Abdullah (seated), Hussein Ruhi (far left), and Colonel Cyril Wilson (third from left) at Jeddah. Used with permission of Anthea Gray.

Wilson’s two deputies, both with intelligence backgrounds, helped him with vital diplomatic work. In the colonel’s absence, the eccentric, half-deaf Major Hugh Pearson helped steady Hussein when he lost his nerve. Later, the genial and imperturbable Colonel John Bassett stood in for Wilson while the colonel spent five months in Cairo recovering from life-threatening dysentery. Bassett encouraged and cajoled Hussein when Hussein fell out with his son Feisal, resigned as King of the Hejaz, spoke of suicide, and threatened to withdraw all of Feisal’s Bedouin tribesmen from the planned advance into Syria. If those fighters had returned to the Hejaz (Hussein’s territory), the revolt would have dissolved.


Sherif Hussein (seated centre) on board HMS Hardinge. Bassett is third from right and Hussein Ruhi second from right, both seated. Used with permission of Anthea Gray.

Another member of Wilson’s small team at Jeddah was a junior intelligence officer, who at first sight had less influence than his comrades at Jeddah. Yet the amiable Lieutenant Lionel Gray, who knew almost all the key British players in Arabia, helped Wilson by gaining the trust of Sherif Hussein himself and was even invited by Hussein to photograph him in his palace. Gray is also important for another reason: his hundreds of remarkable photographs, intelligence documents, and letters home, including those to his fiancée from whom he was to be parted for nearly five years. This collection offers unparalleled insights into the twists and turns of the revolt.


The compelling story of Wilson and his close-knit band points to an inescapable conclusion: the Jeddah Consulate was a vitally important hub of the revolt whose influence has been considerably undervalued. The military campaign in the desert was important, but Jeddah—with its artery to Mecca and Sherif Hussein—was the beating heart of the revolt, whose irregular rhythm needed the vital interventions of Wilson and his team. Without their quiet diplomacy and intelligence work, the revolt would have collapsed and the world would never have heard of “Lawrence of Arabia”.


Cyril Wilson was the outstanding forgotten shaper and sustainer of the revolt. Near the end of Wilson’s life, General Reginald Wingate wrote to him praising his indispensable role and his “great work” in the Arab Revolt, without which, he said, it could never have succeeded. Wilson and his circle deserve to be commemorated, a century after their vital work fell through the cracks of history. It is not unreasonable to believe that Lawrence—complex and unfathomable as he was—would have acknowledged that this was so.


Featured image credit: Used with permission of Anthea Gray.




Published on February 26, 2018 03:30

Barristers, solicitors, and the Four Inns of Court of England

After many years of attempting to explain the need for two kinds of lawyer in the United Kingdom to exasperated and confused European colleagues – and even US ones – I have lighted on the following language. Solicitors are the primary market for legal services. They work in profit-sharing organisations in which senior lawyers manage teams of junior lawyers to do almost everything their clients want. They operate just like any law firm around the world, save that their life is made easier by the existence of a secondary legal services market – known as barristers, or the Bar – which must be accessed either through solicitors or through lawyers overseas.


Barristers operating in the secondary market have their lives run by their clerks, non-lawyers who administer the cooperative, cost-sharing organisations called ‘chambers’, which support them in their practices. Barristers are therefore spared the distraction of running a legal business, which is a necessary part of practising from a profit-sharing firm. They are also spared the exhausting business of keeping both their clients and their partners happy. That arduous work, and the profit-sharing entities it requires, is left to solicitors in the primary market, who stand between lay client and barrister as a ‘professional client’ of the barrister, filtering out the essentials that the barrister needs to know to be effective in court or on paper.


The result of this arrangement is that, from the beginning to the end of his career at the Bar, the barrister focuses exclusively on legal advice and advocacy in the courts. Unlike solicitors, he is not pulled away from this legal work as his career progresses, into management of other fee-earners or into developing the firm’s strategy or business development. In that sense, the independent English Bar – as barristers are collectively known – is a unique means of attracting and retaining large volumes of legal and advocacy talent to the dispute resolution market during their entire career as an advocate.


If that is a little abstract, then try this analogy. Think of a piece of litigation as a project to build a building. Solicitors (or lawyers overseas) are the building contractors who arrange for people to dig holes in the ground and to build steel frames and to clad them. Some buildings can be built just with a building firm. Other, more specialist or complex buildings benefit from the input of an architect. That is the barrister. An architect just has the ideas in his head, the books on his shelf and his means of communication: his pencil, his computer, and his voice. But the architect can’t build anything without a building contractor to work with. So we have two industries, with different roles, different cultures and, of course, different people working in them. For the solicitors, it’s probably a great relief that they can access the talents of barristers without actually having the trouble and expense of employing these unusual people.



Inns of Court, London : 1. Lincoln’s Inn, 2. Middle Temple, 3. Inner Temple, 4. Gray’s Inn. Photo by Marc Baronnet. CC BY-SA 2.5 via Wikimedia Commons.

If barristers’ non-profit chambers attract and retain talent for the entire career of an advocate, then the four Inns attract and retain legal and advocacy talent beyond the natural lifespan of a barrister’s career as an advocate, both into the world of judging from the Bench and further, into semi- and full retirement. The good wine, good food, good conversation, and accompanying entertainment offered by the Middle Temple and the other three Inns of Court in London give people a reason to hang around when they move on from seeing the law as a way of earning a living.


What is the significance, for the legal services industry and its users, of concentrating this legal and advocacy talent from across generations in any one of four convivial places in London? Each of the four Inns is a uniquely valuable interface between the Bar and the Bench (judiciary). This can perhaps best be explained with the following anecdote. A while ago I argued a troublesome case against an opponent to whom that adjective also applies. Not long after, while carrying a sorry little sandwich along Middle Temple Lane to my desk in chambers to burn yet more midnight oil, I bumped into the judge before whom we had argued, rolling up from Middle Temple Hall after a dinner of countless courses.


We greeted each other, grateful for the opportunity to display a degree more intimacy than would be appropriate in court. We then trod a little delicately around the subject of the case. The judge’s eyes lit up, and he exclaimed with pleasure that his judgment had been reported and included in the footnotes of several key textbooks. He thanked me warmly for my assistance in helping him get the facts straight and the law right in a difficult case – and, by doing so, avoid an appeal. We waved each other a cheery goodnight. Next time I appear before the judge, I will be sure to be as helpful as I was last time – or more so – so that I can have an even jollier and more respectful exchange when we next meet; the judge will be as pleasant to me in return in order to get the best assistance he can from the barrister before him. This human dimension provides a powerful motivation for achieving high professional standards, independent of pursuing a client’s best interests or making a living.


So the non-profit Inns help bind Bar and Bench together in the common enterprise of making and administering law. They promote civility, good sense, and mutual understanding. Where opportunities for such encounters are absent, mutual suspicion and resentment naturally flourish. When levels of aggression rise, cooperation is impeded and the quality of the legal product tends to drop. The Inns of Court are truly the institutions that caused Shakespeare to have two of his characters aspire to common law dispute resolution: ‘[a]nd do as adversaries in law,/ Strive mightily, but eat and drink as friends’ (The Taming of the Shrew, Act 1, Scene 2, line 251f.).


Featured image credit: “Bookshelf old library books”, by Free-Photos. CC0 public domain via Pixabay.




Published on February 26, 2018 00:30

February 25, 2018

The fungus that’s worth $900 billion a year

From the dawn of history, human civilizations have prospered through partnership with the simple single-cell fungus we call yeast. It transforms sugars into alcohol, puffs up bread dough with bubbles of carbon dioxide, and is used to produce an assortment of fermented foods. It has become the workhorse of modern biotechnology as the source of life-saving medicines and industrial chemicals. And, most recently, the manufacture of ethanol as a biofuel from corn and sugarcane has launched yeast onto the frontline of our efforts to slow climate change by reducing carbon dioxide emissions.


Our bond with yeast probably began tens of thousands of years ago, with the unconscious use of the fungus to brew palm wine. This practice spread as Homo sapiens migrated from the Rift Valley, and our unusual symbiosis with Saccharomyces cerevisiae (the sugar fungus) deepened and diversified. The development of brewing with cereal grains and of winemaking from grapes is thought to have fostered agricultural settlement. According to this idea, civilization began in villages surrounded by golden fields of barley and rows of grapevines on the hills. The use of yeast for making bread followed when yeast escaped from a beer vat. We were tamed by yeast, and it has served as the most important microbe in human culture and commerce.



“The industrial applications of this single species of fungus support more than five million American workers, or 3% of the US workforce.”



One way to measure the importance of yeast is to evaluate its vast economic impact. In the United States, for example, the Gross Domestic Product of the world’s largest economy topped $19 trillion in 2017. According to the sources detailed below, 5% of this figure—more than $900 billion—is directly reliant on the biochemical activities of yeast. Brewing, winemaking, and baking dominate this mycological enterprise, but the diverse roles of yeast in the pharmaceutical industry, biofuel production, and other market sectors complete the picture.



Brewing: $311 billion, 2.23 million jobs
Wine: $220 billion, 1.7 million jobs
Baking: $311 billion, 1.8 million jobs
Bioethanol: $44 billion estimated value
Yeast insulin: $15 billion estimated value
Other yeast products: $1 billion estimated value (Report FB2233)
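As a rough cross-check (the arithmetic here is mine, not a figure from the sources above), the line items are consistent with the headline numbers:

\[
311 + 220 + 311 + 44 + 15 + 1 = 902 \;(\$\text{ billion}), \qquad 2.23 + 1.7 + 1.8 = 5.73 \;\text{million jobs},
\]

which matches the “more than $900 billion” (roughly 5% of a $19 trillion GDP) quoted above and the “more than five million American workers” quoted below.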

The industrial applications of this single species of fungus support more than five million American workers, or 3% of the US workforce. This is more than twice the number of people employed in car manufacturing and sales.


A similar analysis could be performed on any national economy. In the United Kingdom, for example, the alcohol market represents 2.5% of GDP and employs 770,000 workers in brewing, distilling, and associated retail jobs. In the European Union, household spending on alcoholic beverages exceeded €130 billion in 2016. Yeast is important in every country where the majority of citizens do not adhere to a faith, such as Islam, that prohibits alcohol consumption. There is a yin and a yang to alcoholic fermentation, with the mood-elevating effect of beer and wine offset by the damage caused by overindulgence.


The value of yeast is even higher than these numbers suggest, because insulin produced by genetically modified yeast allows millions of patients with diabetes to pursue productive lives. Other bioengineered strains of the fungus produce the vaccine against the human papillomavirus (HPV) and an injectable medicine to treat eye degeneration. Capsules of yeast are sold as a probiotic treatment for all manner of health issues and seem to be effective as a therapy for traveller’s diarrhea. Yeast is the champion microorganism of biomedical research and is playing a crucial role in genetic investigations of cell development, aging, and cancer. Three Nobel Prizes in Physiology or Medicine have been awarded to yeast researchers in the last decade.


For most of our shared history we had no idea what made sweet liquids ferment, and the blessings of many gods were imagined in these seemingly miraculous transformations. The gods have disappeared, as they tend to do, but we should continue to worship the brilliance of yeast. Now that we know more about the workings of yeast cells than about almost anything else in nature, we should revere the sugar fungus as much as the warmth of the sun.


Featured image credit: Bottling Prince Tuesday 2014 by Allagash Brewing. CC BY 2.0 via Flickr.




Published on February 25, 2018 04:30

Zhongguo and Tianxia: the central state and the Chinese world

China is playing an ever-increasing role on the world stage of international relations, and it is starting to bring its own vocabulary to the part. The terminology that comprises the core lexicon of international relations theory originates from Greek and Latin, and it was developed to describe and interpret the configurations of power that have been common in Western history, from ancient Athens to the British Empire. Chinese scholars are now actively mining the Chinese historical experience to develop new terms to apply both to their own past and to an ever-changing present.


Different histories call for different vocabularies. Over three millennia, China has developed a distinct vocabulary that is well-adapted to its own place at the center of the East Asian world-system. The Chinese word for China, Zhongguo, literally means ‘central state or states’ (there is no plural inflection in Chinese), giving rise to the poetic sobriquet “Middle Kingdom.” Thus, China is not the land of the Han or the empire of the Qin, but simply the country in the middle.


But in the middle of what? Again, the Chinese language has a distinctive term: tianxia, literally “sky beneath” but more idiomatically translated as “all under heaven.” Sometimes taken to mean “the world,” tianxia instead refers to “a world,” in the sense of a world-system. Ancient Chinese historians were aware of the existence of places far beyond the borders of China (like India, Persia, and even the Roman Empire) but they did not consider these places to be part of “their” world, the Chinese tianxia.


For the historian Wang Gungwu, tianxia depicts “an enlightened realm that Confucian thinkers and mandarins raised to one of universal values that determined who was civilized and who was not.” Its nearest Western equivalent might be the medieval concept of Christendom, a particular type of tianxia tied together not just by shared Christian belief but also by participation in shared rites. We might also speak of an ancient Greek tianxia in the Aegean Sea in the years before the Persian Wars, or of an ancient Indian tianxia centered on Brahmanic tradition.


The difference between China and these other tianxia candidates is that while the others were transitory, China’s tianxia seemed to be eternal—at least until the incorporation of China into the Modern World-System centered on Western Europe. From the beginning of history until the fall of the Ming dynasty in 1644, China was the central state or states, the lower-case zhongguo, of the East Asian tianxia. Chinese culture was for two millennia the mother, and measure, of East Asian civilization. That pattern only ended with the dissolution of the previously isolated Chinese tianxia into the larger, globalizing world.


“Over three millennia, China has developed a distinct vocabulary that is well-adapted to its own place at the center of the East Asian world-system.”

The Chinese tianxia thus differed from other historical examples in two ways: it was a permanent (not transient) configuration, and it had a universally acknowledged center. The Mongols may have conquered China in the thirteenth century, but they didn’t attempt to govern it from Mongolia. China was the center of their world, every bit as much as it was the center of Japan’s world, Korea’s world, and Vietnam’s world. These relationships were embodied in the tributary system, which sometimes expressed Chinese power to demand tribute and at other times China’s need to beg for support, but always placed China at the center of the system.


Then came the West. From the 1600s until the end of the twentieth century, international relations vocabulary flowed in only one direction: from that larger world into China. Western military power forced China to accommodate itself to Western notions of sovereignty, to redefine itself first as an empire (under the Manchu Qing dynasty) and then as a republic (under Sun Yat-sen and later Chiang Kai-shek and Mao Zedong). Today Xi Jinping seems eager to redefine China once again, this time as a nation-state. All of these are Western terms suited to the modern world.


Drawing on pre-modern Chinese history and philosophy, the emerging Chinese School of international relations is, however, developing a new terminology of Confucian relationality. Philosopher Zhao Tingyang’s historically-inspired tianxia system sits at the center of this approach. Zhao sees the tianxia understanding of a unified world as the basis for a new form of globalism. I myself have incorporated Zhao’s ideas into an interpretation of the post-modern world-system as a new central state system focused on the United States, an American Tianxia based on a philosophy of liberal individualism.


The Chinese School of international relations is unlikely to supplant established international relations theory, but it’s certain to enrich it. The concepts of the central state or states (zhongguo) and of a world-system based on universal values (tianxia) are useful additions to our terminological toolkit for understanding the world. They may apply to historically-existing world-systems for which Western vocabulary is inadequate, or they may indeed apply to the new world-system form that is emerging from contemporary globalization.


China is certainly a key component of the twenty-first century world. Its philosophical heritage is indispensable for interpreting Chinese history to that world. It may also prove helpful for making today’s world meaningful to ourselves.


Featured image credit: Building water historical by danist07. Public domain via Unsplash.




Published on February 25, 2018 03:30

February 24, 2018

Excessive gambling and gaming recognised as addictive disorders

There is no doubt that excessive gambling can take a huge mental, personal, and financial toll on the gambler and the members of their family. The nature of excessive gambling, and whether it constitutes a disorder, has been the subject of much research, debate, and controversy in recent years. Originally included with the impulse control disorders as “pathological gambling” in the American Diagnostic and Statistical Manual (DSM), it has in the most recent edition, DSM-5, been moved to the addictive disorders section alongside the various psychoactive substance use disorders. The criteria for what is termed “gambling disorder” in DSM-5 now share many features with those for substance use disorders, including priority given to gambling over other life activities and responsibilities, urges to gamble, and the need to gamble increasing amounts.


There is also increasing acceptance of withdrawal states that arise when gambling suddenly ceases, whether because the person is forced to stop or because there is no further money with which to gamble. A distinct criterion is “chasing” previous losses by gambling in an effort to recoup them, a phenomenon that has no exact counterpart in the substance disorders. The associations and comorbidities of gambling disorder with predisposing factors and other psychiatric disorders also show a high level of similarity to those of the substance use disorders. Gambling disorder has now been included in the latest (eleventh) revision of the International Classification of Diseases (ICD-11), published by the World Health Organization.



“The criteria for what is termed ‘gambling disorder’ in DSM-5 now share many features with those for substance use disorders, including priority given to gambling over other life activities and responsibilities, urges to gamble, and the need to gamble increasing amounts.”



Research into the features of excessive gaming is of more recent origin and reflects the vast array of internet-based/online games devised over the past 10-15 years. These games harness the power of the internet to maximise their interactivity, offering gamers the ability to form teams in the online environment and play against each other, often for most of the waking day. An examination of the various features of online gaming indicates that excessive online gaming conforms to the pattern of the behavioural addictions, its central features being impaired control over gaming, priority given to it over other activities and responsibilities, and continuation of gaming despite harmful consequences—much as is seen in substance dependence. Internet gaming disorder is included in DSM-5 as a candidate diagnosis. Gaming disorder is included in the draft version of the ICD-11, in the chapter on disorders due to substance use and addictive behaviours. Physiological features of tolerance and withdrawal have been described in gaming disorder, but given the infancy of the literature on these phenomena, it may be premature to include them as diagnostic features at the present time.


These developments in the classification systems are important because they identify gambling disorder and gaming disorder, and their addictive nature, as significant health problems. Already, treatments developed for substance disorders, such as relapse prevention therapies and medications that act on the reward and reinforcement systems (for example, naltrexone), are being applied to both of these disorders. A public health perspective incorporating prevention and health-promoting public policy measures would also be valuable, given how much such measures have contributed to the decline in cigarette smoking and the reduction of alcohol-related harm in many countries.


In Addiction Medicine, Second Edition, new chapters have been incorporated on gambling disorder and gaming disorder, as well as a chapter examining the features, comorbidities, and clinical management approaches for a range of other behavioural disorders with addictive features. It may well be that as research advances, more evidence will emerge of other forms of excessive consumption that can be properly identified as addictive disorders.


Featured image credit: Lights by _HealthyMond. Public Domain via Unsplash.




Published on February 24, 2018 04:30

Alain Locke, Charles S. Johnson, and the establishment of Black literature [excerpt]

In March of 1924, Charles S. Johnson, sociologist and editor of Opportunity: A Journal of Negro Life, approached Alain Locke with a proposal: a dinner was being organized with the intention of securing interracial support for Black literature. Locke would attend the dinner as “master of ceremonies,” with the responsibility of creating a bridge between Black writers and potential White allies. Both Johnson and Locke recognized that a literary movement centered on the Black experience in America would need White support in order to gain momentum. The following excerpt from The New Negro details how Locke secured a space for young Black writers within the larger literary community.


As March 21 approached, participants became nervous. A week before the dinner, Bennett wrote Locke, “I am so glad that you have agreed to come. I feel the utter necessity of your being there.” Bennett was even more grateful he would be out front when the day of the event arrived. “You are particularly appreciated,” she wrote to Locke, “because of the tremendous and unswerving confidence that you have in us. Your faith in our utter necessity is particularly helpful as I find that my mind is not clear on the eve of this momentous event.” Even Johnson seemed nervous. After confiding in Locke that “the thing has gone over big, nothing can be allowed to go wrong now,” Johnson asked him to come up early that day to help him finalize the evening’s program. “I would like to see you as early as possible to have the first talk about plans, and probably, we shall have to do most of the arranging of the program then.” After a short introduction by Johnson, Locke would discuss the significance of these new writers, and then introduce Carl Van Doren who would outline his hopes for the Negro writer. Then, Horace Liveright, the publisher of Cane and There Is Confusion, would make a few remarks about the publishing scene for Negro books, a market-reassuring strategy most likely recommended by Johnson. Afterward, Gwendolyn Bennett, Countee Cullen, and a few others would read poems and give testimonials. Jessie Fauset would give the closing remarks.


Work on the program concluded, two of the smallest African Americans—Johnson was only 5´2˝—proceeded to the Civic Club dinner, dressed to the nines that Friday evening. Locke began his remarks by arguing that a new sense of hope and promise energized the young writers assembled, because they “sense within their group—meaning the Negro group—a spiritual wealth which if they properly expound will be ample for a new judgment and re-appraisal of the race.” Although Locke’s optimism has led critics to claim that he promised Black literature would solve the race problem, his language was actually quite cautious. Negro literature would “be ample,” that is, sufficient, to contradict those Whites who claimed Blacks were intellectually inferior “if they [i.e., the Black writers] properly expound [it].” Their success would allow “for a new judgment . . . of the race” if Whites were willing to render it. But there are no guarantees.



“Carl Van Doren then laid out what the White literary press wanted from the younger Negroes: art not anger.”



More powerfully expressed was Locke’s belief that by avoiding a literature of racial harangue, the new group of writers could make a broader contribution than those who had come before them. Locke advanced a new concept, that of generation, to suggest this was a new cohort of Black writers possessed of a devotion to literary values that set them apart from their forerunners. For example, Locke introduced Du Bois “with soft seriousness as a representative of the ‘older school’ ” of writing. That seemed to put Du Bois slightly on the defensive, and he felt called upon to justify writers of the past as being “of necessity pioneers and much of their style was forced upon them by the barriers against publication of literature about Negroes of any sort.” Locke introduced James Weldon Johnson—a writer of poetry, music lyrics, and a novel—“as an anthologist of Negro verse”—another dig, since Johnson was a novelist, a lyricist, and a poet, in addition to editing The Book of American Negro Poetry and writing a powerful introductory essay. Locke did acknowledge him for having “given invaluable encouragement to the work of this younger group.” By defining these NAACP literary scions as elderly fathers and uncles, Locke implied their virtual sons and daughters were Oedipal rebels whose writings rejected the stodginess of their literary parents. Against the backdrop of an ornate Civic Club dinner, with its fine china, polished silverware, and formally attired White patrons, Locke issued a generational declaration of independence for the emerging literary lions of the race.


Carl Van Doren then laid out what the White literary press wanted from the younger Negroes: art not anger. Gingerly but tellingly, Van Doren ventured that the literary temperament of the African American, whether produced by African or American conditions, was distinguished by its emotional transcendence, its reputed ability to avoid haranguing White America for its obvious wrongs and turning suffering into works of unparalleled beauty. Young Black writers had to sit and listen to Van Doren declare that “long oppressed and handicapped, [Negro artists] have gathered stores of emotion and are ready to burst forth with a new eloquence once they discover adequate mediums. Being, however, as a race not given to self-destroying bitterness, they will, I think, strike a happy balance between rage and complacency—that balance in which passion and humor are somehow united in the best of all possible amalgams for the creative artist.” A certain amount of condescension was the cost of liberal White support.


Featured image credit: “Books Notepad Pen Education Notebook Student” by Free-Photos. CC0 via Pixabay




Published on February 24, 2018 03:30
