Oxford University Press's Blog, page 224

October 6, 2018

Research, collection, preservation, and more: Japan’s Kyoto International Manga Museum

Many people both in and out of Japan may be acquainted with the word “manga,” even if they don’t follow it. Manga has played a significant role in Japanese culture for the last century and has recently gained the respect of a wider audience. Manga is often presented as the Japanese iteration of the comics medium, though many manga publications are longer than, and formatted differently from, comics in other cultures. Manga also has a distinct visual style that comes from the dramatic manipulation of how story components (e.g. scenery, characters, and items) are represented in the panels and on the pages, which tends to engross readers in the story.

Recently, a new phenomenon has emerged of people collecting and archiving manga items, such as books, magazines, goods, and original drawings. This move to preserve manga history is comparable to other media archive projects around the world. In the United States, libraries and academic institutions were at first slow to embrace comics as a viable medium worthy of collecting and preserving, but the effort gained traction during the late 1990s.

In Japan, there have been individual collectors of manga, but open access to their collections is uncommon. In addition, it is difficult for some collectors to care for their collections over a long period of time. Even though the importance of manga research has been gaining more attention recently, there are still not many establishments or facilities in Japan focused on compiling comprehensive collections of manga-related items. Because of this, Kyoto city and Kyoto Seika University—famous for its faculty of manga that was the first of its kind in Japan, as well as its collection of manga and its role in manga research—opened the Kyoto International Manga Museum in 2006. It is known as the first comprehensive center for manga culture in Japan, and today the museum is still one of the few places to preserve and document the layers of Japanese manga culture and history. The museum has many functions beyond its exhibition and research duties, including preserving manga materials, administering creative workshops for the public, acting as a community center space, and providing performances of Kamishibai—a traditional form of Japanese storytelling.

“Kyoto Manga Museum” by Tatyana Temirbulatova. CC BY 2.0 via Wikimedia Commons.

The museum is located in the heart of Kyoto, the ancient capital of Japan, in a building that was once an elementary school. At the museum, there are approximately 300,000 manga-related items, such as manga and comic books, magazines, old newspapers, and woodcut prints from the Edo era. Some 50,000 of these items are available on open bookshelves as well as in exhibition showcases. Most of the remaining 250,000 items are accessible to registered visitors (over 18 years old only) in the reference research room and can be loaned under certain conditions.

The museum has many creative formats to display manga materials, including a Manga Hall of Fame, with influential manga from 1945 to 2005, and a Manga Expo, where visitors can read manga translated in their own language. One highlight of the museum is the “Wall of Manga,” a colorful 200-meter wall that runs throughout the building, showcasing 50,000 manga publications. The museum also has a children’s library with thousands of picture books.

The museum also features a “Genga’ (Dash)” collection. Genga’ (Dash) are elaborate reproductions that often appear identical to the original manga manuscripts. The project is led by Takemiya Keiko, a manga artist and the former president of Kyoto Seika University, in order to aid in the preservation and public exhibition of perishable manga manuscripts. The aim is to preserve original manuscripts in order to prevent fading as much as possible, while also creating reproductions to showcase the original vibrant colors.

The field of manga has grown more in the last few years than in the past several decades combined. Researching manga or building a manga museum would have seemed unimaginable to most people even in the 1980s, but manga preservation has since become routine. It’s exciting to imagine how manga culture will change in the future, with new facilities and elements that have not existed before. The manga museum has been playing a key part in this period of change, and we will continue to support manga culture in the future.

Featured image credit: “Manga Reader” by Miika Laaksonen. Public Domain via Unsplash.

October 5, 2018

The history of manned space flight [infographic]

The Soviet Union launched the first man-made satellite, Sputnik, into space in October 1957, initiating the scientific rivalry between the USSR and the United States at the height of the Cold War. In the subsequent decades, the Soviet and American space programs traded milestones as they each embarked upon manned space flight and the exploration of space. Soviet cosmonaut Yuri Gagarin became the first man in space in 1961, and in 1969 the US closed out the decade with astronauts walking on the surface of the moon, a culmination of John F. Kennedy’s famous 1961 proclamation: “I believe that this Nation should commit itself to achieving the goal, before this decade is out, of landing a man on the Moon and returning him safely to earth.”

From early Earth orbits and the moon landings to space stations and commercial spacecraft, we’ve collected the key programs that fueled the Space Race and continue to make scientific discoveries to this day.

Download the infographic as a PDF or JPEG.

Featured image credit: Space Shuttle Atlantis lifts off at Kennedy Space Center in Florida, May 2009. Scott Andrews, NASA. Public Domain via Wikimedia Commons.

Renewing the Centre?

Have recent events – notably the election (and re-election) of Jeremy Corbyn as leader of the Labour party following the Conservative victory in the 2015 general election, and the 2016 vote to leave the EU leading to a ‘hard Brexit’ strategy from the Conservative government – revitalised British politics by breaking from the centrist politics of the preceding period? Was ‘centrism’ a problem that needed to be solved or is the problem that the main parties have vacated the centre ground of politics, creating an urgent need for the centre to be renewed? Is centrism the problem or the solution?

Tony Blair has consistently argued that centrism is the solution, epitomised in the 1998 statement of his ‘personal political philosophy’, The Third Way. Back then the case for a centrist politics was made in terms of moving beyond ‘new right’ and ‘old left’, or ‘neo-liberalism and a highly statist brand of social democracy’. Twenty years later Blair is making the case for ‘renewing the centre’ all over again, only this time the problem is that the centre-ground has been vacated by ‘the Brexit-dominated Tory party and a hard-left Labour party’.

The rationale for centrist politics is expressed by Blair’s claim that a successful modern party ‘must be in the centre, speaking for the mainstream majority’. The point is to find a position ‘that can get the support to win in order that you can do things for the people that desperately need help’ (Blair in Smith 2016). 

The idea that the route to power runs through the centre ground because that route brings you in contact with the ‘mainstream majority’ has become something of a conventional wisdom. As recently as March 2018 former chancellor George Osborne, speaking with Blair, echoed the argument that ‘The centre ground was where general elections used to be fought and won … [and] … that it was where many voters still remained’ (Coughlan 2018).

There are two conceptions of the centre that need to be distinguished: the ideological centre and the centre of debate. In ideological terms the centre is defined in relation to the left-right spectrum, as the space in between. In contrast, the centre of debate refers to those ideas that at any time tend to dominate and set the terms of political debate. The centre of debate might not correspond with the ideological centre.

The economic left-right spectrum can be seen, essentially, as contesting the nature of a capitalist economy and the role of the state. The ideological centre can be seen as quite expansive, providing space for a meaningful debate between social liberalism and social democracy. In the postwar decades in Britain (viz. roughly the late 1940s to the mid-1970s) this is what the centre ground meant and it provided a basis for consensus politics, i.e. it was also the centre of debate. During the 1970s this consensus broke down in the face of critiques from left and right in the context of mounting economic difficulties. The alternatives were either, on the left, more statism (e.g. public ownership etc.) and, on the right, to ‘roll back the state’ (e.g. privatisation etc.). It was the latter, neoliberal, position that provided the basis for a new form of statecraft in the guise of ‘Thatcherism’ (and Reaganism in the US). 

This conception of the ideological centre enables us to view Blair’s advocacy of occupying the centre ground in a critical light. First, the centrist postwar consensus constituted a political shift from the ideas that dominated British politics in the first half of the 20th century – a shift to the left. It can be seen as a synthesis of two trends: a shift within liberalism to social liberalism in order to save capitalism from itself through reform, and the pressure of the labour movement in the form of a social democratic reformist brand of socialism.

Second, when this consensus broke down in the 1970s, Thatcherism succeeded by seeking to challenge and overturn the central elements of the centrist postwar politics. For Thatcherism, centrism was definitely not the solution but the problem, to be solved by a shift to the right. 

Third, Blair’s ‘third way’ strategy of occupying the centre ground was definitely not a return to postwar social democracy, now characterised as the ‘old left’. It was, in that sense, a repudiation of the centre. Instead, New Labour and the third way represented a significant adjustment to Thatcherism.

Jeremy Corbyn, Labour Leader, speaking at a political rally during the Labour leadership election, in Matlock, Derbyshire. (Source: Wikimedia Commons)

In seeking to ‘renew the centre’ in reaction to a ‘hard left’ Labour party, Blair fails to recognise that it is the shift to the right to embrace neoliberalism in the guise of ‘centrism’ that contributed to disillusionment with party politics, and that ‘Corbynism’ can make a stronger claim to centrism as a revival of traditional social democratic politics. Perhaps the centre is the solution, but not in the way Blair conceives it.

Featured image credit: Palace of Westminster by Michael Beckwith via Pixabay.

October 4, 2018

‘Unnecessary’ and ‘risky’ – the end of ENT surgery in the NHS?

In July this year, NHS England announced that it planned to cease funding four surgical procedures entirely, and to limit funding for thirteen others. Within this list, three ear, nose, and throat procedures were identified: tonsillectomy for tonsillitis, grommet insertion for glue ear, and surgery for snoring. The mainstream media offered mixed opinions, from support for the Department of Health’s aspiration to reduce the provision of “unnecessary or risky procedures” to outcry that essential treatments would become unavailable. Some commentators have reported that “…sometimes doing nothing, or doing less, really is the best approach”.

Who should decide?

As doctors, we are taught to examine the available evidence and provide balanced advice to aid our patients in determining the best treatment for their individual circumstances. Guidelines and contemporary clinical practice are often based on ‘landmark’ papers in the field, but despite the frequency with which certain ear, nose and throat procedures are performed, there remains a distinct paucity of high-quality research. For example, a Cochrane review of tonsillectomy, involving a search of all published scientific literature, concluded that, for children, “tonsillectomy leads to a reduction in the number of days with sore throat in the first year after surgery”, but the size of the effect is very modest; this means that further research is likely to have an important impact on how confident we are in the results and may change those results. The quality of evidence for tonsillectomy in adults is low. Tens of thousands of tonsillectomies are performed every year in the United Kingdom but, due to the lack of good evidence available, many national guidelines are primarily based on a single study performed over thirty years ago.

Grommet surgery has been performed regularly in ENT practice since 1945.  Strict guidelines from NICE have been in place for many years to guide who should be offered surgery, and evidence supports that the procedure provides short term benefits in hearing which diminish after six to nine months in healthy children. This may seem a short period of time, but six to nine months can span the majority of an academic year, during which a child may fall behind as a consequence of their hearing loss, leading to potential loss of educational opportunities, poor behaviour, and balance issues. This can have a further impact on a child’s confidence moving forward, even when hearing improves, and have implications for forming relationships with their peers.  An alternative to having grommets inserted is to adopt a ‘watch and wait’ policy, but there are potential risks for adopting this approach, which have been publicised in the press.

Cochrane have provided their own conclusions based on the available evidence, and their systematic review found the quality of the evidence for the placement of grommets in children with recurring acute middle ear infections to be “low to very low”.

Snoring can also be a very troublesome problem for patients and their families. Surgery for snoring can reduce apnoea, the most serious associated symptom, by up to 50%, but it is now rarely available on the NHS and, in light of the Department of Health announcement, is set to all but disappear. Whilst surgery has always been a last resort for snoring, the condition can have widespread implications not only for the snorer, whose disturbed sleep may affect productivity, but also for their bed partner, who suffers a poor night’s sleep as well.

‘Absence of evidence is not evidence of absence’

The NHS has always operated within a sensitive financial and political framework. In a healthcare system that has limited funds and increasing demands, the funding of healthcare interventions has to be somehow rationalised.  As doctors, our primary goal is to provide the best care for the individual patient sitting in front of us. If we are to justify the provision of surgical procedures for our patients, we need to either cite the evidence to support these interventions, or be striving to develop research strategies to address the many gaps in our evidence base.

Featured image credit: “Portrait of woman who can not sleep because her husband snores” by Josep Suria. Used under licence via Shutterstock.

Bioremediation: using microorganisms to clean up the environment

Microorganisms are known for their ability to adapt to any environment. We can find them in the most hazardous places on Earth. Their invisible work has led to visible results ― terraforming the planet billions of years ago and converting it into the viable green world that it is today. Their ability to utilize and adapt to any available substrate in order to gain energy kept the balance in the ecosystem until humans became the dominant species. Since the industrial revolution, human activity has produced a broad range of novel substances to which microorganisms can naturally adapt. The problem is that biodegradation can’t keep pace with the amount of substances being produced. Thankfully, modern science offers a technology which employs microorganisms’ adaptability. It is called bioremediation.

What is Bioremediation? 

Bioremediation is a biotechnological procedure in which microorganisms adapted to degrade the pollutants at a contaminated site are stimulated, by enrichment with fertilizers and/or oxygen, to achieve a better biodegradation rate. Another variation of the technology is to introduce microorganisms adapted in a laboratory into the contaminated site. In recent years, scientists have bypassed the adaptation steps microorganisms take by creating genetically modified microorganisms that integrate the genes necessary for biodegradation of a specific contaminant. The microorganisms involved in the process aren’t only bacteria, but also archaea, microalgae, and fungi. The practice poses minimal risk to the environment, because once the contaminant that serves as their substrate is removed the microorganisms can no longer survive; the process is also cost-effective. In situ bioremediation, in which the contaminated site is treated in place, is the most common form of the process. Bioremediation can also be done ex situ, by excavating polluted material and transporting it to another site for treatment.

The idea of employing microorganisms in waste management is not as new as one may think. It was used in ancient Roman wastewater treatment. To fulfil their water supply needs, ancient Romans built aqueducts, which led to an excessive amount of wastewater. This problem was solved by the construction of a sophisticated sewage system – the Cloaca Maxima – where water was drained and cleaned by natural biodegradation. The process was slow and inefficient, but demonstrated nature’s potential to degrade waste substances.

Image credit: In situ bioremediation by Lidiya Angelova. Author owned.

Years later, the same process was employed by George M. Robinson, who developed modern bioremediation in the 1960s. He was a curious petroleum engineer working for a petrol company in Santa Maria, California. Robinson mixed bacterial cultures with petroleum products and figured out that they could be a potential “cure” for the numerous oil spills, and could help improve the management of waste. He collected enough evidence to convince the local authorities and the public to apply his findings in practice. This led to the implementation of bioremediation for preventing environmental disasters, such as cleaning the British passenger ship the Queen Mary’s fuel storage tanks before her “retirement” as a tourist attraction in Long Beach, California, in 1967. He later helped design the first large-scale microbial clean-up of an oil spill. California’s large oil and tourism industry was an enormous burden for the environment, and Robinson’s development of bioremediation helped to reduce its impact. His work didn’t go unnoticed by the scientific community, and in 1975 Ananda Mohan Chakrabarty ‒ a microbiologist working for General Electric – designed the first genetically engineered oil-degrading bacterium, from the genus Pseudomonas.

In addition to the successful decontamination of oil spills, which have a long-term impact on the health of ecosystems, this technology has countless other applications, from the recovery of coal or metal mines to the environmentally friendly disposal of radioactive waste. BTEX compounds (benzene, toluene, ethylbenzene, and xylenes) are toxic derivatives from the oil industry which often end up in the groundwater. Bacteria and archaea species have bioremediation potential which could help break them down into less harmful substances.

Recent studies have shown the potential of microorganisms to decrease the environmental impact of landfills which are suffocating the land near metropolises. Even human carcinogens like vinyl chloride can’t stop bacteria. Their ability to degrade it could be harnessed to decontaminate groundwater.

Despite its advantages, bioremediation is not well known amongst the people who could benefit from it and those who can make important decisions regarding environmental issues ‒ legislators, regulators, politicians and other influential figures outside the environmental science world.

Scientists and science communicators can help fill the gap by spreading awareness about the abilities of microorganisms to clean up our environment.

Featured image credit: Earth sustainability by Ann Ca. CC0 via Pixabay.

Dignified debates: a better way to argue about politics

Rebecca Roache expressed a common feeling when in 2015 she blogged, “I am tired of reasoned debate about politics.” Many people today find arguments unpleasant and useless. That attitude is both sad and dangerous because we cannot solve our social problems together if we know that we disagree but do not understand why.

Luckily, arguments can help us accomplish a lot even in extreme cases. Consider the case of Megan Phelps-Roper, who proudly held signs condemning gays and Jews to hell while she was a child in Westboro Baptist Church. Later, she found interlocutors on Twitter who displayed calm, concern, and curiosity while asking about her views. Their affability opened her to their views, but Phelps-Roper reports in her TED Talk, “As kind as my friends on Twitter were, if they had not actually made their arguments, it would have been so much harder for me to see the world in a different way.” She adds elsewhere that her new friends pointed out inconsistencies in her old views so clearly that she could not deny them. Reason and argument thus helped her to understand her opponents and herself, and then to change her deeply held religious beliefs.

The spread of bad arguments is undeniable but not inevitable.

Another example involved Ann Atwater, who was a leader of the civil rights movement in Durham, NC, and C. P. Ellis, who was Exalted Cyclops of the local Ku Klux Klan. They could not have started further apart, but they became close friends. How? They began by asking questions, listening to each other, and giving reasons. Atwater fought to improve housing because she wanted her children to have better lives. Ellis opposed integration in public schools, but mainly because he wanted his children to get a good education. When each learned the other’s reasons, they could build on shared values, respect each other, and work together.

If these sworn enemies could become friends, so can Republicans and Democrats today. Admittedly, extremists often hide in their echo chambers and homogeneous neighborhoods. They never listen to the other side. When they do venture out, the level of rhetoric on the internet is abysmal, abusive, and absurd. Trolls resort to slogans, name-calling, and jokes. Of course, much of this rhetoric is based on emotions and tribalism. When trolls do bother to give arguments, their arguments often simply justify what suits their feelings instead of following from reasons. The spread of bad arguments is undeniable but not inevitable.

Rare but valuable examples like Atwater, Ellis, and Phelps-Roper show us how we can escape our cultural rut.

Reach out. Nothing would have happened if Phelps-Roper and her friends had not gone onto Twitter and looked for opponents. Similarly, all of us will remain polarized if we never leave our isolated cells.

Ask questions. If Atwater and Ellis had merely asserted their views and assumed that they already understood each other, then they never would have learned that they both cared about their children and were frustrated by their poverty. By asking the right questions in the right way, we can show opponents that we are curious and really want to understand them as individuals.

Be patient. Atwater and Ellis spent eight hours a day for ten days in a charrette before they finally came to understand and appreciate each other. Like them, we all need to slow down and fight the tendency to interrupt and retort with quick quips and slogans that demean opponents.

Give arguments. It is not enough to announce what we believe. We need to add why we believe it so that others can understand us and our basic values. On controversial issues, neither side is obvious enough to escape demands for evidence and reasons, which are presented in the form of arguments.

Demand sound reasoning. When people do not expect to be held accountable for their claims, they are less careful to base those claims on relevant facts and evidence. Then disagreements will be unjustified, antagonistic, and harder to resolve.

None of these steps are easy or quick, but resources are available to teach us how to appreciate and develop arguments. These skills will enable us to do our part to shrink the polarization that stunts our societies and our lives.

Featured image credit: “character-back-to-back-male-woman-1797362/” by Fxq19910504. CC0 via Pixabay.

October 3, 2018

Etymology gleanings for September 2018

Many thanks to those who have commented on the recent posts and written me privately. My expertise is in Germanic, with occasional timid inroads into the rest of Indo-European. Therefore, I cannot answer questions about Arabic and Chinese. Below, I’ll say something about Hittite, but, obviously, for my information I depend on the authority of others. The same holds for Classical Greek, Celtic, and Old Slavic.

The word hate, its putative cognates in Hittite and Greek and its connection with heat.

Many closely related Hittite words begin with kart-. All of them refer to anger. The specialists’ unanimous opinion is that their root is the same as in Latin cord- and Engl. heart, even though no sources at our disposal suggest that in Hittite beliefs the heart was the center of anger. Greek kertoméō “I taunt” is more problematic. Its origin is unknown (mere guesswork), but it seems to be of sound-symbolic or sound-imitative origin, like many words belonging to this semantic sphere (probably “low slang”), some of which are borrowings: compare Engl. sneer, jeer, fleer, scoff, chaff, and josh, among quite a few others. German Hitze ~ Engl. heat and hate have incompatible root vowels and should be kept apart. German hunting term Hatz “hunt” is of course akin to hetzen, discussed in the post on hatred.

Is blood red?

Ion Carstoiu is an active researcher living in Rumania and publishing in Rumanian (search for Orginea limbajulu, that is, “language origins,” his site dealing with etymology). He believes that the idea of finding the word for “blood” came to many people from the fact that blood is red (which made speakers think of sacrifices, fight, war, etc.), and he found pairs across languages, allegedly showing that “blood” and “sun,” our glowing luminary, are associated in people’s consciousness. Among his examples are wi (“blood” in Tora) ~ wi in Dakota; ra in Malagash and Sawu and ra in Egypt; gore in English and garri “sun” in Ngadjon (Australia), as well as gorri “red” in Basque; Latin sanguis, along with senggi “blood” in Manchu and sanggwa in Angave (Papua), etc., among quite a few others. Although I have a high opinion of Wilhelm Oehl’s works on primitive creation, I find it hard to see a guiding principle in the material cited above, as I told Mr. Carstoiu privately, but he asked me to publish his material, so that others could think about it, and I promised him to do so.

Dollop, its etymology

The prevalent opinion that this word traveled (perhaps from German) to Norwegian (Norw. dolp “lump”) and thence to England seems to be rather well-founded. I once looked at the history of similar words, such as collop, gallop ~ wallop, jalopy, trollop, lollop, and shallop. None of them seems to be native; a few are of undiscovered origin in English and in the lending languages. There is something innately funny about them and other such (symbolic? onomatopoeic?) formations. The same is true of clop, flop, whop, swop, sop (compare milksop), and their likes.

A dollop: its appearance suggests its etymology. Image credit: Blot Blur Dab by Clker-Free-Vector-Images. CC0 via Pixabay.

More on lying

Yes, indeed, the noun lie can and does sometimes go with epithets: compare bold-faced lie, cited in the comment, brazen, blatant, downright, whopping lie (German grelle Lüge: grell “shrill, glaring,” etc.), along with a few others (see Morton Benson, Evelyn Benson, and Robert Ilson’s The BBI Combinatory Dictionary of English), and even white lie, also noted there. It appears that one can lie on an ascending scale. By contrast, truth, when modified, tends to refer to the depth of the statement: it can be naked, plain, unvarnished, absolute (but opposed to relative!), and so forth.

Danish lyve “to lie” is “regular.” Gothic liugan and all the Scandinavian forms (Old Icelandic ljúga) are phonetic variants of the same protoform. In Danish, g (which we see in ljúga) changed to some semblance of the w-sound, and then to v; hence Danish lav “low” and mave “stomach” versus Old Icelandic lagr and magi. Sometimes this sound disappeared altogether (as in Danish flue “a fly” versus Swedish fluga, and others).

House

Is this a house? Image credit: Palau house with yapese stones NOAA by Dr. James P. McVey, NOAA Sea Grant Program. Public Domain via Wikimedia Commons.

In connection with the post of January 21, 2015, a correspondent asked me why the origin of the word house is still unknown. The origin of this word is not unknown: it is rather not quite clear and therefore debatable. The answer evades us because we don’t and cannot know what exactly house once meant. Did it refer to a human habitat or to some shed, granary, hut, or whatever? In many cases, even when we believe that we know what the oldest meaning was, the root continues to be opaque. The history of husband (mentioned in the question) has been traced, so that there is no riddle. The question continues: “What word did people use to describe the house before the Viking Age?” Since there are no written records for the ancient epoch, we know very little about its vocabulary. Very little is not a euphemism for nothing: sometimes very old words were borrowed by people’s neighbors, and their meaning becomes partly clear. In some way, this holds for house, as explained in the post. But a still more “original” word cannot be known.

Etymological dreams

Question: “Is there any law in etymology like Grimm’s Law for phonetics”? To discover such a law has been historical linguists’ dream for about two centuries. Alas, the answer to the question is “No.” But some general directions exist, for, although there is no law, quite a few patterns have been discovered. They relate to semantic change and show that certain meanings often and rather naturally produce other meanings. Unfortunately, it does not follow that such changes must occur. For this reason, historical phonetics is rather reliable, while historical semantics contains only non-obligatory recipes.

The way we write (and speak). From newspapers:

“The U.S. government maintains a bevy of watch lists. The most well known, maintained by the FBI, is a large database, etc.” (1) Can bevy be applied to inanimate objects? I doubt it. (2) Should well known be hyphenated? I am sure that here, in its attributive function, it should, as opposed to this rule is well known to everybody (where it is used predicatively). (3) Finally, what is attractive about the phrase most well-known, as opposed to best-known?

“The crash left the bus laying on its side….” I have once quoted this sentence. No comment. Obviously, this was a horrible lay.

The article describes a blood-curdling incident: a man assaulted a woman, bit off part of a man’s finger, and attacked a sheriff’s deputy. This tirade, we are informed, lasted until, etc. Moral: don’t use the words whose meaning you don’t know.

Quite a tirade. Image credit: Peter Paul Rubens Massacre of the Innocents by Ken Thompson the Collector: The Thompson Collection at the Art Gallery of Ontario. Public Domain via Wikimedia Commons.

Featured image credit: Urban Sunset by Chase Emmons. Public Domain via Unsplash.

Humanities and scientific explanation: the need for change

For too long, presentations of science for the general public, and education in schools, have suggested that science wields a sort of hegemonic power, as if its terms and methods gradually replace and make redundant all other discourse; the only reason it has not yet completed its conquest is that the world is complicated, but it is only a matter of time…

I think this way of understanding science is utterly misleading and wrong. I think it is already wrong within my own discipline (fundamental physics), and I think that what poets do, and what literary critics do, and what musicians and moral philosophers do, is every bit as truthful and insightful about the nature of reality as is what physicists and chemists and biologists do (and mathematicians, engineers, historians, etc.). In short, a description at one level does not render a description at another level redundant, and the notion of “explanation” does not involve or invite ignorance of this. But there is, announced by breathless popularisers and now looming throughout our system of education, another idea: the claim or the hint that moral and aesthetic opinions are “really” just subtle reproductive strategies, or pre-scientific names for hormone changes, or value-free movements of a universal wave-function.

To some readers, the fact that such claims are illogical and unfounded will be clear. But others will be puzzled about what is wrong with the hegemonic way of presenting science. This puzzlement reveals a deep failure in our public discourse and system of education.

The rhetoric to watch out for is “you thought X (some rich subject such as social dynamics) but it is really Y (some low-level description such as brain chemistry or genetics or quantum mechanics).”

Here are two illustrations of what is wrong with this.

First, consider the arched shape commonly used in buildings and bridges. This shape is strong under compressive forces. This property is owing to the overall form: a force from above tends to push the parts together, not apart. Suppose an arch is made of stones. Analysis of the individual stones will not yield an insightful explanation of the global property. Therefore the arch is not fully understood or explained that way. Even if all arches everywhere were made of stone, it would still be false to say that arches are explained by stones, because the property of interest (strength under compression) would still hold if the arch were made of something else.

Image credit: ‘Los Baños’ by Stefano Zocca. Public Domain via Unsplash.

In a similar way, brain chemistry is of only marginal relevance to an understanding of social dynamics among humans. The idea that ‘all the explanatory arrows point downwards’, as Steven Weinberg put it, is quite wrong.

My second illustration is more technical: it is the role of symmetry principles in fundamental physics.

Suppose an engineer investigates a car engine. He or she will use Newton’s laws of motion (such as the one that says the acceleration is proportional to the force) and various laws of chemistry describing the burning of petrol, and things like that. The engineer might never make explicit use of a certain symmetry principle of physics, called ‘translational invariance’. So we can imagine the engineer taking the following point of view:

“Look, here is the equation of motion: the force causes the piston head to accelerate, and is given by the formula for pressure in a hot gas. That’s it. That’s all we need. We can find the solution, for given starting conditions, and we have everything. That symmetry principle you mentioned—translation symmetry, or whatever it was—that is completely irrelevant. It has no role at all. I have no need of that hypothesis.”

What this reaction fails to grasp is that the symmetry principle already makes its contribution before we ever write or discover the formulas and equations, because it expresses conditions on what sorts of equations could make sense. And science is all about making sense, or finding the sense that can be made. Symmetry principles in fact play an important role, because they amount to meta-laws which express broad principles that laws of motion must respect if they are to make certain types of sense. Such principles have been central to making progress in fundamental physics for over a century. Symmetry both helps us to formulate physical models and provides deeper insight.
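
One standard illustration of such a constraint, sketched here in outline rather than drawn from the engine example itself, is this: for an isolated system of N particles, translational invariance requires the potential energy to be unchanged when every position is shifted by the same vector,

\[
V(\mathbf{r}_1 + \mathbf{a}, \dots, \mathbf{r}_N + \mathbf{a}) = V(\mathbf{r}_1, \dots, \mathbf{r}_N) \quad \text{for all } \mathbf{a},
\]

which forces the internal forces to cancel,

\[
\sum_{i=1}^{N} \mathbf{F}_i = -\sum_{i=1}^{N} \nabla_i V = \mathbf{0},
\]

so the total momentum \( \sum_i m_i \dot{\mathbf{r}}_i \) is conserved. The symmetry thus restricts which potentials \( V \) can appear in the equations of motion before any particular equation is written down or solved.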

These examples illustrate what is going on throughout scientific discourse in all areas: the use of multiple sets of terms and relations, suited to multiple types of question and phenomenon, in which no layer or level of discussion is even capable of expressing the concepts of another, let alone replacing another or making it redundant.

Rather than teaching GCSE students the current curriculum in physics, chemistry and biology, we need to cut back a little in all areas and devote a few lessons, and an exam question, to the broad understanding of how science works within its own domain and how it connects to our wider values. We could teach the following valuable idea, which I dub the “Embodiment Principle”:

The embodiment principle

Science is about building up an insightful picture in which the underlying microscopic dynamics do not replace, nor do they explain, the most significant larger principles, but rather they give examples of how those larger principles come to be physically embodied in particular cases. The lower level and higher level principles are in a reciprocal relationship of mutual consistency in which each illuminates the other.

Featured image credit: ‘sky-clouds-construction-brick-layer-78113’ by 12019. CC0 1.0 via Pixabay.

October 2, 2018

Danger, devotion, and domestic life in Renaissance Italy

Two summers ago, I was holidaying in Central Italy with my family. We’d rented a house in a beautiful spot, a few miles outside a small cathedral town. One night, we were shaken awake by an invisible force. My first, rash impulse was to stay put: I’d experienced earthquakes before, in Venice and even in Cambridge; they were a talking point, nothing more. It was only in the morning, when we started to receive messages from concerned friends and family back home, that we realised the extent of the destruction that had hit the region.

In that situation, a lot of things go through your mind—some predictable, others less so. The image I couldn’t get out of my head was of a sixteenth-century family, neatly attired, mother, father, and four children, kneeling in prayer on the first floor of their home as their town collapses around them. The painting, roughly executed on wooden board and bearing the name ‘VIADANA’, was an ex-voto: a gift presented by the family at the shrine of St. Nicholas of Tolentino to thank their local saint for saving them from an earthquake. They—like us—were the lucky ones.

An earthquake brings down walls and exposes hidden interiors. The image of the Viadana family is an invaluable historical source since it allows us to see something we cannot usually see: the devotional life of the Renaissance home. Domestic religion is an elusive topic, poorly recorded in official archives and often ignored in the writings of priests and theologians. And yet, for ordinary lay people, Christ, the Virgin Mary, and saints came closest in moments of crisis experienced within the household.

Image credit: Parents at Prayer beside Their Baby’s Cradle, 1530, tempera on panel, 20.5 cm. x 23.5 cm. Lonigo, Madonna dei Miracoli, Museo degli ex-voto. Used with permission.

Renaissance Italians had many ways of warding off danger. They would hang strings of coral above their beds or place Agnus Dei—small pendants decorated with the Lamb of God and containing fragments of wax from the Easter candle burned at St Peter’s in Rome—in their infants’ cribs. When families in financial difficulty were reduced to seeking loans from charitable pawn banks, run by the Franciscans, they often left as security a bundle of trinkets, tied up in an old handkerchief: a typical assemblage of items might include a little cross, a coral rosary, a charm shaped like a crescent moon, and a wolf’s tooth. These were objects that ignored the formal boundaries between sacred and profane and which worked together to keep their owners safe.

From crossing oneself at the threshold of the home to sprinkling holy water by the bedside, domestic devotional practices were profoundly protective. Images of the Madonna, often positioned above a door or by the bed, were not only objects of veneration; whether or not children and their parents remembered to hail the Virgin as they passed by, she was always there, looking out for them. Religious books, which we tend to study for their content rather than their form, were often most powerful as material objects, kept close to the body at times of sickness or danger. 16th-century publishers were aware of this material function of texts and printed prayers against ‘thunder, earthquakes and pestilence’ on tiny scraps of paper, which were designed to be folded up and worn in a pouch hanging from a girdle.

It’s not often that personal experiences resonate so closely with one’s research. But, on that August night in 2016, my glimpse of a timeless terror brought home to me a matrix of Renaissance objects and practices. We have tended to think of this period as one of increasing worldliness, when classicism and consumerism dissipated a ‘medieval’ piety. But spiritual concerns ran into every capillary of daily life and every stage of the life-cycle, in ways that we are only now beginning to understand.

Featured image credit: The Viadana family prays to St Nicholas to save them from an earthquake, sixteenth century, tempera on panel, 20.5 cm. x 26.7 cm. Museo di San Nicola, Tolentino. Used with permission.
