Oxford University Press's Blog

May 28, 2015

Can the Sequential Intercept Model help with behavioral health justice?

There is now pending legislation in the United States Senate and the U.S. House involving the diversion of justice-involved individuals with behavioral health disorders from standard prosecution. Both bills use the Sequential Intercept Model (SIM), developed by Mark Munetz and Patty Griffin, in collaboration with Hank Steadman, as an organizing tool to help structure the proposed law. What is the SIM? How can it be used?


Deciding on appropriate and fair dispositions for individuals who have behavioral health problems such as mental illness, substance abuse, or trauma symptoms is very challenging. Standard prosecution means that such individuals, some of whom would not re-offend if provided with appropriate treatment, are instead convicted and perhaps incarcerated. Returning to the community following incarceration, they face the same challenges they experienced prior to jail or prison. The standard process of arrest, conviction, incarceration, and reentry is simply not the best approach for many such individuals.


Various kinds of support for this conclusion have been growing for years. Therapeutic jurisprudence is a legal philosophy describing how the law can act as a therapeutic agent–in this case, to provide appropriate services while simultaneously reducing the risk of re-offending–rather than exerting a primarily punitive influence. Treatment in the community is less expensive than incarceration in a jail or prison, or involuntary commitment to a hospital. Some people really do commit crimes that are basically a public display of behavioral health problems. For these individuals, being provided with the appropriate treatment in the community can both promote behavioral health and reduce the cost and public safety risk to society.



Statue Of ‘Justice’, Old Bailey, by Lonpicman. CC-BY-3.0 via Wikimedia Commons.

But how do we decide where to intervene? How do we make well-informed decisions about those who are appropriate for diversion versus those who need prosecution and incarceration as usual? This is where the Sequential Intercept Model can be useful. The model identifies five points of “interception”: first contact with police, first appearance, jail incarceration following arrest, reentry planning while incarcerated, and specialized parole after returning from incarceration. Each of these points can focus efforts to develop alternatives to standard prosecution. For example, at the first intercept (initial police contact), training officers in specialized responding, such as the Crisis Intervention Team (CIT) model, provides them with a better understanding of behavioral health problems and additional tools for dealing with such problems without unnecessarily escalating an already-tense situation. Another example involves “problem-solving courts” such as drug court, mental health court, or veterans court. Clients enter problem-solving court because they have the potential to respond to rehabilitation in a way that both improves their lives and reduces the risk that they will commit another crime.


The SIM can help communities consider their current criminal justice and behavioral health system functioning, and decide whether and how they would like to provide alternatives to standard prosecution. It can help practitioners make sense of both systems, and administrators and policy-makers to use their available public funding most efficiently. It can help legislators collaborate in proposing needed legislation that is likely to be effective.


Headline image: Lady Justice statue on top of the Old Bailey in London, © chrisdorney, via iStock.


The post Can the Sequential Intercept Model help with behavioral health justice? appeared first on OUPblog.


Published on May 28, 2015 02:30

College education for emerging adults [infographic]

College education trends have been changing a lot over the past few decades — from tuition fees to enrollment rates to reasons for attending. While it may seem as though today’s emerging adults aren’t satisfied with these trends, 9 out of 10 high schoolers expect to continue their education in some way after graduation, and 84% of college graduates believe their education was a good investment.


Jeffrey Jensen Arnett examines the college experience of those in their late teens and early twenties in the second edition of Emerging Adulthood, uncovering the effect that years of rising higher education costs and expanding post-high school education opportunities have had on the attitudes of those trying to assert their independence. The infographic below illustrates some of the positive and negative college data from his work.




Download the pdf or jpg of the infographic.


Featured image credit: University of Toronto. CC0 via Pixabay.


The post College education for emerging adults [infographic] appeared first on OUPblog.


Published on May 28, 2015 01:30

Surveillance and privacies

In its recent report, Privacy and Security: A modern and transparent legal framework, the Parliamentary Intelligence and Security Committee pondered the scale of public concern about digital surveillance:


“It is worth noting that this debate does not seem to arise in the context of the Agencies intercepting letters, or listening to people’s home or office landline calls. So what is it about the internet that makes it different? For many, the free and open nature of the internet represents liberty and democracy, and they consider that these values should not be compromised for the sake of detecting a minority who wish to use it for harmful purposes.”


A feature of the current controversy is its narrow chronology. The decades before 9/11 correspond to the medieval period and the centuries before the internet are lost in the mists of time. The legislation that controls the behaviour of the security agencies, particularly the Acts of 1989, 1994, and 2000, is generally seen as obsolete. And it is evident that the Intelligence and Security Committee has a limited grasp on the history of public attitudes to surveillance.


The last time Parliament devoted a specific report to the issue was in 1957, when a committee of three Privy Councillors chaired by Lord Birkett reviewed the interception of communications following the tapping of the phone of a barrister acting for a London gangster. It had no doubt that the practice was widely regarded as ‘inherently objectionable’:


“Whether practised by unauthorised individuals or by officials purporting to act under authority, the feeling still persists that such interceptions offend against the usual and proper standards of behaviour as being an invasion of privacy and an interference with the liberty of the individual in his right to be ‘let alone when lawfully engaged upon his own affairs.'”


The committee looked back to the period before electronic communication and found a counterpart to the concern about telephones in the controversy of 1844, when Robert Peel’s government was caught intercepting the mail of political exiles just after the creation of a mass postal system. It cited the opinion of the Home Secretary Sir James Graham that the practice of opening letters was generally seen as “odious, invidious and obnoxious.” So explosive has the subject been since then that governments of every political hue have striven to avoid any form of Parliamentary debate.



Surveillance Camera by Antranias. Public Domain via Pixabay.

Between 1844 and 2015 there has been a critical ambiguity in the deployment of “privacy” as the measure of what is attacked by the surveillance of communications. On the one hand it refers to the invasion of the personal archive. There is held to be a fundamental claim, embodied in the post-war declarations of human rights, to the protection of information relating to the individual. Revelations of interception are immediately framed in terms of theft and loss. Conversely the security agencies and their official sponsors seek to reassure the protesters that they are only obtaining “metadata”, not substantive knowledge about actual persons, and only assault the privacy of those seeking to commit serious crime.


On the other hand it relates to what Birkett termed “the liberty of the individual.” This refers to the basic settlement of liberal democracy that emerged during the eighteenth and nineteenth centuries. The monopoly of violence and limited powers in areas such as raising taxes were ceded to elected governments which in turn agreed not to invade the realm of the family and the conduct of public debate. The state deliberately refrained from knowing what its citizens knew and thought. In its full form, as expressed by John Stuart Mill, this category of protected information is the arena in which liberties are advanced and abuses checked. In this case it is not the substance of the information that is in question, but the ability freely to give voice to it.


Despite all the headlines about government-sponsored snooping into personal affairs, it is the second version of privacy and not the first that is the fundamental concern.  In Citizenfour, the film about Edward Snowden, one of the leading digital activists, Jacob Appelbaum, captured the slide in terminology: “what people used to call liberty and freedom we now call privacy. And we say, in the same breath, that privacy is dead … When we lose privacy we lose agency, we lose liberty itself, because we no longer feel free to express what we think.”


Featured image credit: Radar Dish. Public Domain via Pixabay.


The post Surveillance and privacies appeared first on OUPblog.


Published on May 28, 2015 00:30

May 27, 2015

Monthly etymology gleaning for May 2015

Sir James A. H. Murray

In the United States everything is planned very long in advance, while in Europe one can sometimes read about a conference that will be held a mere three months later. By that time all the travel money available to an American academic will have been spent a millennium ago. In the United States, we have visions rather than short-range plans. Knowing all that, I still want to remind our readers that on 26 July the world is expected to mark the centennial (centenary) of James Murray’s death. No symposium can be held in the middle of summer, but perhaps somebody somewhere will observe the date and honor the great man. Let the event be in September or even in October, but let it not be lost in the welter of everyday activities.


Here are a few passages from the article “N.E.D.” published in The Nation, vol. 124, 1927, p. 660:


“The current year is expected to see the appearance of the last volume of the ‘New English Dictionary’. Probably most people will not remember 1927 chiefly for that, but it is not likely that any of its other achievements will more deserve to be remembered…. James A. H. Murray… had hoped that he might live to see it [The N.E.D.] complete, and three years before he died in 1915 he was working seventeen hours a day, but the task was too vast to be accomplished by even such intemperate labor…. The ‘New English Dictionary’ is one of those magnificent achievements which are only possible when an organization exists capable of drawing to itself during several generations the precise men for the task in hand.”


As a matter of fact, the dictionary was completed in 1928. In 1913 Bernhard Kellermann brought out his futuristic and most successful novel The Tunnel (Der Tunnel). Its protagonist spends twenty-six years constructing a tunnel between Europe and America. Although handicapped at every step, he succeeds, but “The train arrived in Europe with a twenty-six minutes’ delay” (one minute for each year, I assume). By that time airplanes crossed the distance between the continents in several hours (according to a prediction made in 1913), and no one ever used the tunnel. Fortunately, the OED cannot be outdated, but it too needed and needs supplements.



Paint brushes put to good use. (Self Portrait as an Artist. Vincent van Gogh. 1888. Public domain via Wikiart.org)

Although Murray is famous, relatively little has been written about him, and no book analyzes the whole of his voluminous correspondence. I would like to quote his letter printed in Notes and Queries, 9th Series/XII, 17 October 1903, p. 307. It gives an idea of his activities.


“PAINT-BRUSH.—This term appears to be recent in literary use. I do not find it in any dictionary before Cassell’s ‘Encyclopedic’ in 1886, and I have not come upon a literary example before 1882. All the same, I remember it in colloquial use more than fifty years ago; indeed I can remember having a paint-box and paint-brushes, and buying ‘camel’s-hair paint-brushes’ in 1845. Was it then a child’s word? In works on art one finds only ‘the brush’, the ‘product of his brush’, &c. In 1792 the Gentleman’s Magazine speaks of a ‘painter’s brush’, and in a ‘Book of Trades’ of 1848, under ‘The Brushmaker’, where scrubbing-brushes, shoe-brushes, clothes-brushes, and tooth-brushes all appear, one finds paint-brushes referred to only as ‘the brushes used by painters’. But surely tradesmen who sold them called them ‘paint-brushes’ fifty years ago? Can any one furnish examples?”

From Kellermann to O. Henry, who wrote a story titled “Shearing a Wolf.”


This is what it took to shear the vocabulary of English for twelve centuries. Read and admire.


Getting down to brass tacks

The comments and letters on this subject have been most useful. One of our correspondents called my attention to the colloquial meaning of brass balls. The phrase is well-known, but it hardly explains the origin of brass tacks. More to the point is the publication by Peter Reitan in the periodical Comments on Etymology, which Gerald Cohen summarizes in Vol.44/8: “Brass tacks—emblem of the only inevitable and last friend, the undertaker. Studded over our final ligneous adornment, brass tacks are suggestive of stern, inexorable reality.” This is part of a note in Wyandot Pioneer for 14 May 1848. Reitan explained: “Brass coffin tacks once served as a reminder of the humble fate awaiting us all. Get/come down to brass tacks was a call to set aside pretense/sham/false fronts; deal humbly and seriously with the task at hand.”


Perhaps the riddle has been solved, but a few questions remain. It is surprising that in the past not everybody knew the origin of such a seemingly transparent idiom. Also, to get down to brass tacks does not have and has apparently never had grim associations or meant “to deal humbly with the task at hand” (seriously is not synonymous with humbly). The phrase means “come to business” and is interchangeable with get down to bedrock. Finally, the funerary explanation of the idiom may have been the product of folk etymology invented in retrospect. It is not my intention to fill the otherwise excellent ointment with flies, but in such cases the motto should always be: “Better safe than sorry.”


The bishop and his foot

Here too I am grateful for the comments. Obviously, the bishop’s foot was at one time associated with all kinds of bad things. I am still wondering where milk came in. We are missing a link. Unfortunately, in all kinds of reconstruction links exist mainly for this purpose.



Alexander John Ellis, an outstanding philologist and a passionate advocate of spelling reform. (Portrait by William John Naudin; albumen carte-de-visite, 8 March 1886; CC BY-NC-ND 3.0; © National Portrait Gallery, London)

English spelling

Slowly but steadily the Spelling Society is moving toward convening a congress that will make the first practical steps toward the eagerly awaited and as eagerly resisted reform. There is no lack of interest in the media, but it is my impression that the public in Great Britain is more engaged than those who use spell checkers in the United States. What has happened to the famous American activism? We are so sensitive to everything that looks unjust. For example, I have seen an ad promoting “humanely raised poultry” and was immensely pleased to know that the chicken in my soup was raised humanely.


In 1848 Alexander J. Ellis, a great philologist, brought out the second edition of A Plea for Phonetic Spelling; or the Necessity of Orthographic Reform. Some statements from a review of this book (The Westminster Review 61, 1849) might entertain our readers.



“How is it that, of twenty five million inhabitants of the British Islands who speak English as their native tongue, and speak no other [hear! hear!]… so small a number read and write it with such a facility as to make doing so an agreeable relaxation instead of a painful task?”
“Is it that to ‘spell English, is the most difficult of human attainments?’ Mr. Ellis says that it is the cause.”
“English would be that best fitted for universal adoption, ‘were it not’, says he, ‘obscured by a whimsically antiquated orthography; and the other nations of Europe may esteem themselves fortunate, that the English have not yet made this discovery’.”

Aren’t humanely raised and educated children at least as precious as chickens?


I see America writing, or who laid down in the street and whom did what

“American Special Operation forces mounted a rare raid… killing a senior leader… as well as freeing a… woman whom Pentagon officials said had been held as a slave.” (New York Times)
“Prime Minister David Cameron… hailed a statesman whom he said knew that Britain was ‘not just a place on the map but a force in the world, with a destiny to shape events and a duty to stand up for freedom’.” (Associated Press)
“On the opening day of the 2015 session, protesters laid down in the Capitol rotunda, chanting phrases such as “No justice, no peace.” (From a local newspaper)

I wish I could lay down in front of some very important house, chant “No rational orthography, no education,” and meet a statesman whom would mobilize the masses for spelling reform.


Some questions remain on the back burner. Kindly wait for the answers until June!


The post Monthly etymology gleaning for May 2015 appeared first on OUPblog.


Published on May 27, 2015 05:30

The importance of continuing professional development in medicine

We all want our doctors to be familiar with the latest developments in medicine, and to be able to offer us as patients the very best and informed healthcare. It is important that doctors in the fields of anaesthesia, critical care, and pain are up to date and familiar with the latest developments in these rapidly developing areas of medicine, with new techniques and drugs emerging which improve outcomes for patients. As professionals, we cannot stand still and we must always strive to improve outcomes for our patients.


Anaesthesia is a relatively young speciality, which emerged around 1850. Operations as we know them today would be impossible without anaesthesia. In the early years, drugs such as ether and chloroform provided unconsciousness; however with modern developments we are now able to support all body functions during the perioperative period, and, importantly, pain relief. Patients who would have been denied lifesaving surgery due to excessive risks 25 years ago, are now able to undergo surgery safely with developments in the perioperative care of patients. Modern anaesthesia allows operations to be performed today as day cases, which in the past would have required a week in hospital.


With the worldwide expansion in medical information, new technology, and constant research, it is increasingly difficult for doctors to keep up with new developments in practice, with new articles, recommendations, and guidance appearing almost weekly.


The General Medical Council (GMC) in the UK launched Revalidation in December 2012, and defines it as the process by which all licensed doctors are required to demonstrate, on a regular basis, that they are up to date, fit to practise, and are able to provide a good level of care in their chosen field. In order to be revalidated, and so continue to hold a licence to practise, doctors must collect a portfolio of supporting information for discussion at their appraisals which demonstrates how they are meeting the professional values in Good Medical Practice, the GMC’s core guidance which describes what is expected of all registered doctors.



Medic by DarkoStojanovic. Public domain via Pixabay.

One of the key supporting information requirements is evidence that doctors have participated in Continuing Professional Development (CPD), which the GMC defines as any learning outside of undergraduate education or postgraduate training that helps doctors maintain and improve their performance. CPD includes both formal and informal learning activities, covering all areas of the doctor’s professional practice. It can also support specific changes in practice that will benefit patients. Doctors must evidence their participation in CPD on an annual basis. A variety of activities can count towards completion of CPD, including e-learning and private reading from relevant books and journals, medical and academic writing, working with another consultant to learn or refresh specific techniques or skills, and attending courses and meetings, both within the doctor’s hospital and those organised by regional, national, or international providers, where there are opportunities for participants to share ideas and good practice with other doctors.


Journals such as Continuing Education in Anaesthesia, Critical Care & Pain (soon to become BJA Education) provide succinct reviews of areas of clinical practice that practising doctors need to be aware of. With the introduction of revalidation, the importance of publications such as this has increased, assisting both trainees preparing for examinations and trained practitioners wishing to keep up to date in areas of practice. Developments in e-learning have also helped in the delivery of educational material, using both online MCQs and additional video material designed to assist learning. Users can complete online tests and produce certificates which can then be used to support their revalidation.


Most Medical Royal Colleges and Faculties in the UK have developed CPD guidance and, as a general rule, achievement of at least 50 hours of CPD per year is recommended as the minimum likely to be required in order to remain up to date. Participation in a broad range of activities is also strongly encouraged.


As new techniques and practices develop, it is vital that doctors and other professionals working in the fields of anaesthesia, critical care, and pain keep up to date. This will lead to improved perioperative outcomes for all of us as patients in the future, as it is likely that all of us will at some point in our lives require surgery and anaesthesia.


Featured image: Doctor technology by skeeze. CC0 via Pixabay.


The post The importance of continuing professional development in medicine appeared first on OUPblog.


Published on May 27, 2015 03:30

Can your diet make you feel depressed?

I am often asked whether eating particular foods can enhance mood and treat the symptoms of depression. With very few exceptions, the answer is no. In contrast, our mood can be easily depressed by our diet. Why? For adults, the brain responds primarily to deficits, not surpluses, in the diet.


For example, scientists once thought that drinking a glass of warm milk before bed or eating a large meal of protein made us drowsy because of tryptophan loading – the current evidence does not support this explanation but the claim makes an important point: we must get enough of any particular nutrient into our brain in order for us to notice any effects. Unfortunately, tryptophan has difficulty getting into our brain, particularly when consumed alongside a large variety of other amino acids, as in meat.


So, what’s the scientific evidence for considering the cognitive effects of these foods? Mostly, it’s related to what happens when we do not get enough of them. For example, numerous studies have shown that consuming too little tryptophan makes us depressed and angry (Journal of Neural Transmission 2014;121:451-455); historians now blame low-tryptophan diets for multiple wars and acts of cannibalism. Too little of the water-soluble vitamins (the B vitamins and C) in the diet will induce changes in brain function that we will begin to notice after a few weeks of deprivation. Many authors naively jump to the conclusion that giving high doses of such nutrients will rapidly improve our mood or thinking: sadly, this is rarely the case.



Eggs by PDPics. CC0 Public Domain via Pixabay.

Can dietary supplements increase brain tryptophan levels and improve mood? The answer is no. There is no evidence for improving mood through dietary manipulation of tryptophan primarily because it is difficult to change plasma tryptophan levels through diet alone. Tryptophan supplementation and depletion studies suggest that altering tryptophan levels may only affect certain groups of patients who already have a personal, or family history, of depression. Popular media articles often recommend diets and foods to increase blood tryptophan levels, suggesting that this will lead to increased activity and function of serotonergic neurons. Such recommendations, while superficially appealing, are misleading and not supported by any scientific evidence.


If you eat less tryptophan, your brain produces less serotonin. Conversely, providing additional tryptophan in the diet may lead to increased production of serotonin within neurons; however, producing more serotonin does not guarantee that the neuron will actually release it. If too much serotonin is produced inside the brain, the excess is simply discarded. The depletion of tryptophan from the diet can negatively influence serotonin-controlled brain processes such as mood and sleep.


A recent study published in the journal Neuropsychopharmacology (2014;39) investigated whether it was possible to deplete the brain’s reward chemical dopamine in humans, by restricting access to the amino acid tyrosine that is required for its synthesis by the brain. Within a few hours the subjects showed a blunted reaction by their brain’s dopamine centers in response to a monetary reward. Overall, their mood was also slightly depressed. Fortunately, the effects of this experimental diet were temporary because the subjects were young and the brain was able to compensate quickly.


The overall lesson here is that if you’re feeling down, it might be your diet. However, do not expect your diet to make you feel happy; most often, a good diet will only prevent you from feeling depressed.


The post Can your diet make you feel depressed? appeared first on OUPblog.


Published on May 27, 2015 02:30

What is ‘Zen’ diplomacy? From Chinese monk to ambassador

In 1654, a Chinese monk arrived in Japan. His name was Yinyuan Longqi (1592-1673), a Zen master who claimed to have inherited the authentic dharma transmission—the passing of the Buddha’s teaching from teacher to student—from the Linji (Rinzai) sect in China. This claim gave him tremendous authority in China, as without it a Zen teacher cannot be considered fit to lead a Zen community.


Considering the long history of interactions between China and Japan, Chinese monks arriving in Japan with teachings, scriptures, relics and such were very common, and were welcomed by Japanese monks and rulers. Before Yinyuan, there were already eminent Chinese monks who had established themselves in Japan, and Yinyuan was simply one among many. So why was Yinyuan’s arrival so important?


Before we address this question, let us first consider a few examples of similarly ‘insignificant’ historical events that became hugely important in the long term. In 1971, a group of American table tennis players arrived in Beijing after competing in the 31st World Table Tennis Championship in Japan. They were welcomed by the Chinese government and played table tennis with Chinese players. Nothing particularly extraordinary, right? And yet, this event is now known as ‘Pingpong Diplomacy,’ which opened the door to President Richard Nixon’s visit to China in 1972, the first visit of a US President since China’s Communist Revolution of 1949. Similarly, in early 2014, in an event now referred to as ‘Basketball Diplomacy,’ a group of retired NBA players led by ex-star Dennis Rodman arrived in Pyongyang, North Korea to play basketball with a North Korean team. While it is still not clear what this will mean for American and North Korean relations, the common characteristic of these two events is that at the time when these groups of American civilians arrived, there was no formal diplomatic relationship between the United States and either China or North Korea. This lack of formal ties created a strong incentive to turn ordinary civilian activities into symbolic gestures for cultural and political gain.



“Portrait of Yinyuan Longqi” by Kita Chobei, at Kobe City Museum. Public Domain via Wikimedia Commons.

This can also be said about the Chinese Zen master Yinyuan’s arrival in Japan in 1654. China and Japan did not have a formal relationship at that time and had different visions of the political future of East Asia. For centuries, Chinese diplomatic relationships with other countries had been handled within a concentric tribute system with China in the center. China would only have a relationship with countries that accepted a China-centric world order; this would mean recognizing the Chinese emperor as the ruler of the Universe, paying regular tributes to China, having leaders accept the title of vassal kings bestowed by the Chinese emperor, and adopting the Chinese calendar and reign names for international and domestic use. Since the fourteenth century, many East and Southeast Asian countries were accepted into this system except Japan, which vehemently resisted the Chinese notion of world order. After the reunification of the country under Toyotomi Hideyoshi and Tokugawa Ieyasu in the late sixteenth and early seventeenth centuries, Japan clearly developed its own thoughts about the international order—it wanted a Japan-centered world order in East Asia, mimicking the Chinese version.


At this juncture, the Chinese monk Yinyuan arrived with a claim of authentic transmission from China. The honor and prestige the Tokugawa shoguns (the de facto rulers of Japan at that time) lavished on Yinyuan was quite remarkable; Yinyuan was granted an audience with the fourth shogun in 1658 and was given land in Uji, Kyoto to build a new Chinese-style temple, Manpukuji, in 1660, which still stands today. His sect was allowed to proselytize in Japan, finally resulting in a new Zen sect called Obaku, the third sect of Japanese Zen alongside Rinzai and Soto. More extraordinary was the government regulation that allowed only Chinese monks to be abbots of Manpukuji until 1784, when the last Chinese monk passed away and the Tokugawa government was unable to recruit more monks from China. These Chinese monks were requested to visit Edo castle regularly, especially on the occasion of a new shogun’s accession ceremony or a deceased shogun’s funeral service, similar to the missions carried out by the Korean and Ryukyu (now Okinawa) embassies at that time. This unusually preferential treatment of Chinese monks in Tokugawa Japan points to a strong Japanese intention to use the presence of the monks as a symbolic representation of China.


Although Yinyuan was a monk living in the seventeenth century, his legacy proved long-lasting and eventful, as Japanese emperors bestowed many honorific titles on him after his death. The most recent of these, ‘the Light of China,’ was conferred by the Showa emperor Hirohito (1901-1989) on 27 March 1972. Six months later, on 29 September 1972, China and Japan restored formal diplomatic relations after a long period of antagonism dating from the end of the nineteenth century. Coincidence? Possibly, but it shows once again that many things can happen when a monk arrives in a foreign land.


Image Credit: “Manpuku temple cherry” by sdkfz183. CC BY-SA 3.0 via Wikimedia Commons.


The post What is ‘Zen’ diplomacy? From Chinese monk to ambassador appeared first on OUPblog.


Published on May 27, 2015 01:30

Ten facts about economic gender inequality

Gender is a central concept in modern societies. The promotion of gender equality and women’s empowerment is key for policymakers, and it is receiving growing attention in business agendas. However, gender gaps remain a widespread phenomenon. While gender gaps in education and health have decreased remarkably over time and their differences across countries have narrowed, gender gaps in the labour market and in politics are more persistent and still vary widely across countries.


Understanding the determinants of gender gaps is essential for any country that wants to put forward effective ways to realize equality between men and women and promote a balanced pattern of economic growth. However, there is no unique determinant of gender gaps, and the ‘gender gap’ itself is a multidimensional, complex indicator. Thus, identifying the determinants of gender gaps is a challenging, though fundamental, task.


The following list of ten facts about economic gender inequality may represent a useful guide to identify the determinants of gender gaps.


1. Gender gaps have historical roots. These roots can be traced back to the organization of the family and to the traditional agricultural practices in the pre-industrial period, which influenced the gender division of labour, the role of women and the evolution and persistence of gender norms.


2. Culture matters in determining gender gaps. Gender stereotypes are well-established, both among men and among women. They influence the extent to which men and women share the same responsibilities, in particular in domestic work and childcare, and they contribute to explaining gender gaps in the labour market.


3. Men and women have different attitudes and behaviours. On average, women are significantly less likely than men to make risky choices and to engage in competition. These differences can help explain gender pay gaps, glass ceilings, and the lower presence of women in high-paying jobs or in highly competitive environments.


4. Maternity does not explain it all. Maternity is a penalty in the labour market, but there is no trade-off between fertility and female employment: countries where women work more also have higher fertility rates. Thus, low female employment is not necessarily due to maternity decisions.


5. Education is the first engine of gender equality. Women and men are currently equally educated, and women often surpass men’s educational attainments in developed countries. However, differences across fields of study remain, with a limited share of women in STEM disciplines, and may explain part of the still-existing gender gaps in access to the labour market and careers.


6. Gender gaps in employment and the glass ceiling are different phenomena, although they often go hand in hand. Even in countries where the problem of women’s access to the labour market has been substantially solved, women still encounter obstacles in careers and in reaching top positions (i.e. the glass ceiling).


7. Labour demand is as important as individuals’ choices. Firms’ decisions and employers’ attitudes and beliefs, i.e. labour demand, are as important as individuals’ incentives and choices in determining the gender composition of the workforce, female careers, and the overall extent of gender gaps. The selection process itself is typically not gender-neutral.


8. Institutions play a crucial role in supporting female employment. Family policies, parental leave, and formal childcare provisions may help support female labour supply.


9. Institutions play a crucial role in determining the glass ceiling. How to promote female leadership and the presence of women in top positions is a highly debated issue, which countries are addressing through a variety of policies, from the introduction of gender quotas to voluntary regimes. Gender quotas have recently attracted wide attention; they have proved effective not only in increasing the number of women in top positions, but also in inducing a better selection process and a beneficial renewal of the ruling class.


10. Women’s empowerment and economic development are interrelated. On one side, economic development improves women’s conditions and reduces inequality between men and women; on the other, the involvement of women in the economy is a key engine for growth.


These ten facts suggest that the determinants of gender gaps range from culture and history to attitudes and behaviour, educational choices, family choices including maternity, firms’ behaviour, policy interventions, and economic development. These determinants are also strictly interrelated. Understanding these ten facts could provide a useful guideline for shaping future policy on gender inequality.


Headline image credit: ‘Mind the Gap’, by Sarah Stierch. CC-BY-2.0 via Flickr.


The post Ten facts about economic gender inequality appeared first on OUPblog.


Published on May 27, 2015 00:30

May 25, 2015

Is Christian feminism an oxymoron?

Is Christian feminism an oxymoron? For the past century or so, it’s often seemed that way. But it wasn’t all that long ago that many women not only considered Christianity and feminism compatible, but in fact believed each was essential to the other. Perhaps no figure makes this case more powerfully than Katharine Bushnell.

An internationally known anti-trafficking activist in the late 19th and early 20th centuries, Bushnell repeatedly encountered Christian men who had perpetrated acts of appalling cruelty against women, often without remorse or consequence. Ultimately, she concluded that “the crime is indirectly the fruit of the theology”—that men’s cruelty toward women must be rooted in patriarchal theology.

Yet Bushnell refused to abandon Christianity in its entirety. Suspecting that Christian patriarchy was a distortion of the true gospel, the result of misogynistic mistranslations of God’s word, she turned instead to Hebrew and Greek texts. Spending years retranslating portions of the Scriptures, she developed new readings of the biblical narrative, from Genesis to Revelation. Upending traditional notions of women’s subordination, her translations defined patriarchy as a sin, and women’s liberation as redemption. According to Bushnell, Christianity—rightly understood—provided an essential foundation for women’s rights.

What’s most remarkable about Bushnell’s work is that she achieved her dramatic revisions while upholding the authority of the Scriptures. Indeed, by the 1920s she had come to identify as a fundamentalist, staunchly opposed to theological modernism. For this reason, her writings continue to speak powerfully today to those who hold a high view of Scripture, and to those who understand the gospel of Christ as one of liberation for women, as well as for men.

The following quotes provide a small glimpse into Katharine Bushnell’s intriguing theology.

“The Bible is all that it claims for itself. It is inspired…infallible…and inviolable.”

“We would rather believe that the expositor is mistaken, than that the very term “Gospel,”—“Good News,”—proclaims oppression to women.”

“The world, the Church, and women are suffering sadly from woman’s lack of ability to read the Word of God in its original languages. There are truths therein that speak to the deepest needs of a woman’s heart, and that give light upon problems that women alone are called upon to solve.”

“What wonder that all versions [of the Bible], having for all time been made by men, should disclose the fact that, on the woman question, they all travel more or less in a circle, in accordance with sex bias, hindering the freedom and progress of women, since…the self interest of man led him to suppose that woman served God best as his own undeveloped subordinate?”

“Cows were made before men—even before theologians—[therefore] men must be subordinated to cows.”

“Any argument drawn from the ‘image’ [of God] idea must apply surely quite as equally to woman, who was created at the same time as man, and by the same act. It is the spirit of phallic worship which contends that this image inheres in physical sex, not the spiritual characteristics.”

“Servility and weakness are two contemptible vices. They have been too often recommended to women clothed in the names of “humility” and “meekness,” to which virtues they are as opposed as north is to south.”

“We imagine [some male expositors] would have been pleased had God sent into the world, an additional female Christ, to set women a female example; but since God did not see fit to do so, women are under obligation to endeavor, as best they are able, to follow the ‘manly’ example of Jesus Christ, and leave the consequences with God. [This] is woman’s truly humble place. Any other is sham humility.”

“Woman can never be matured as a useful instrument in God’s hands or an efficient servant of His church until she comes to understand that ‘she is not her own; she is bought with a price,’ and it is neither her duty nor her privilege to give herself away to any human being, in marriage or in any other way.”

“Here is where the great mistake is being made on the ‘woman question.’ Is it ‘prudent’ to allow women to do thus and so? men ask themselves at every step of woman’s progress. The only question that should be asked is, ‘Does justice demand this?’ If so, ‘let justice be done though the heavens fall’; anything short of justice is mere mischief-making.”

Image credit: “Standing in the Light of your halo, Arequipa” by Geraint Rowland. CC BY-NC 2.0 via Flickr.

The post Is Christian feminism an oxymoron? appeared first on OUPblog.

Published on May 25, 2015 05:30

Parsing schizophrenia

The effective treatment of schizophrenia has long presented a challenge to clinicians and scientists. Common misunderstandings around symptoms and behaviors, and inadequate approaches to diagnosis and physiology, have hindered significant progress for patients and professionals. However, with recent advances in both methodology and research, it might be possible to design specific treatment regimens to address the various forms of the illness. Personalized medicine for the individual with schizophrenia is on the horizon.

The concept of schizophrenia began over a hundred years ago when Emil Kraepelin defined two major psychoses: manic-depressive psychosis (now termed bipolar disorder) and dementia praecox, a disorder comprising disorganization of thought and avolition pathologies with a chronic course. Eugen Bleuler coined the term schizophrenia, emphasizing the concept of “splitting” or schism—the dissociation within thought and between thought, emotion, and behavior. To differentiate schizophrenia from other forms of psychosis, Kurt Schneider provided first-rank symptoms for diagnosis and shifted the concept towards a reality distortion disorder based on special forms of hallucinations and delusions.

Despite the clarity with which these pioneers defined schizophrenia, there are two major impediments to further scientific enquiry. First, schizophrenia is a heterogeneous clinical syndrome: a grouping of patients with important similarities in the absence of known causal pathways. (Dementia, for example, is a syndrome where the memory impairment is central but can be caused by distinct diseases such as Alzheimer’s or stroke.) Second, the various pathologies associated with the syndrome are not unique to schizophrenia.

Scientists and clinicians have been slow in addressing the problems associated with a heterogeneous clinical syndrome. There are noteworthy exceptions, but even today most genetic studies are based on schizophrenia as a phenotype, regulatory bodies approve drugs for the syndrome, and most scientific studies are designed as though schizophrenia is a single disease entity rather than a syndrome.

However, the most recent edition of the Diagnostic and Statistical Manual of Mental Disorders (DSM-5) introduced eight symptom dimensions to be assessed in patients with psychotic disorders. Deconstructing the syndrome into these component domains of psychopathology provides more discrete targets for causal and therapeutic discovery. While antipsychotic drugs are approved for schizophrenia by regulatory bodies, the scope of their therapeutic actions does not include all the relevant domains. (Negative symptoms, such as low drive/motivation and impaired emotional processing, and cognitive impairments represent leading unmet therapeutic needs.)

Meanwhile, at a sub-clinical level, investigators have assessed features associated with schizophrenia and bipolar disorders that more closely align with underlying brain dysfunction. The National Institute of Mental Health (NIMH) recently sponsored the Bipolar Schizophrenia Network on Intermediate Phenotypes to assess cognitive, neuroimaging, electrophysiological, and psychological markers to clarify the nature of our current disorders and explore new approaches to their classification.

Furthermore, NIMH has introduced the Research Domain Criteria (RDoC) to orient clinical research towards behavioral constructs with known neural networks, hypothesizing that this will enhance discovery at the cellular, molecular, and genetic levels that can then be translated back to aspects of currently defined disorders. For example, behavioral constructs such as fear processing are thought to relate to anxiety; impairments in a social processes construct may relate to interpersonal pathologies. The five domains developed so far are negative valence, positive valence, cognitive, social processes, and arousal/regulatory systems. These systems are viewed on a continuum from normal to pathological. It is expected that currently defined disorders will have impairments related to one or more of the five selected behavioral constructs, and identified pathologies will cut across current diagnostic boundaries.

This shift should have substantial effects on scientific enquiry. The field may move from a large number of genes contributing small effects on a heterogeneous clinical syndrome to assessment methods that enable genetic interrogation in relation to specific behavioral constructs, phenotypes, symptom dimensions, or neural circuit function. Disrupted brain physiology may be related to specific pathways underlying equally specific components of psychopathology–enabling therapeutic discovery related to a specific symptom or behavioral cluster.

An informed mapping of human illness phenotypes onto pre-clinical animal models may increase the validity of translational science. The how and why questions can be examined in rodent models of behavioral constructs, generating hypotheses relevant to specific clinical pathology rather than to the syndrome as a whole. The fundamental phenotypes of schizophrenia can be identified and their relationship with clinical symptoms explored.

Clinicians may one day be able to see a patient with schizophrenia, specify the particular symptoms in each case, understand a connection with basic behavioral constructs and underlying brain dysfunction, and select suitable therapeutics.

Featured Image Credit: By Lee Scott. CC0 via Unsplash.

The post Parsing schizophrenia appeared first on OUPblog.

Published on May 25, 2015 03:30
