Oxford University Press's Blog
July 27, 2014
The month that changed the world: Monday, 27 July 1914
July 1914 was the month that changed the world. On 28 June 1914, Archduke Franz Ferdinand was assassinated, and just five weeks later the Great Powers of Europe were at war. But how did it all happen? Historian Gordon Martel, author of The Month That Changed The World: July 1914, is blogging regularly for us over the next few weeks, giving us a week-by-week and day-by-day account of the events that led up to the First World War.
By Gordon Martel
By the time the diplomats, politicians, and officials arrived at their offices in the morning more than 36 hours had elapsed since the Austrian deadline to Serbia had expired. And yet nothing much had happened as a consequence: the Austrian legation had packed up and left Belgrade; Austria had severed diplomatic relations with Serbia and announced a partial mobilization; but there had been no declaration of war, no shots fired in anger or in error, no wider mobilization of European armies. What action there was occurred behind the scenes, at the Foreign Office, the Ballhausplatz, the Wilhelmstrasse, the Consulta, the Quai d’Orsay, and at the Chorister’s Bridge.
Some tentative, precautionary steps were taken. In Russia, all lights along the coast of the Black Sea were ordered to be extinguished; the port of Sevastopol was closed to all but Russian warships; flights were banned over the military districts of St Petersburg, Vilna, Warsaw, Kiev, and Odessa. In France, over 100,000 troops stationed in Morocco and Algeria were ordered to metropolitan France; the French president and premier were asked to sail for home immediately. In Britain the cabinet agreed to keep the First and Second fleets together following manoeuvres; Winston Churchill, First Lord of the Admiralty, notified his naval commanders that war between the Triple Alliance and the Triple Entente was ‘by no means impossible’. In Germany all troops were confined to barracks. On the Danube, Hungarian authorities seized two Serbian vessels.

Winston Churchill with the Naval Wing of the Royal Flying Corps, 1914. Public domain via Wikimedia Commons.
Throughout the day the Serbian reply to the Austrian ultimatum was communicated across Europe. Austria appeared to have won a great diplomatic victory. Sir Edward Grey thought the Serbs had gone farther to placate the Austrians than he had believed possible: if the Austrians refused to accept the Serbian reply as the foundation for peaceful negotiations it would be ‘absolutely clear’ that they were only seeking an excuse to crush Serbia. If so, Russia was bound to regard it as a direct challenge and the result ‘would be the most frightful war that Europe had ever seen’.
The German chancellor concluded that Serbia had complicated things by accepting almost all of the demands and that Austria was close to accomplishing everything that it wanted. The Kaiser, who arrived in Kiel that morning, presided over a meeting in Potsdam at 3 p.m. where he, the chancellor, the chief of the general staff, and several more generals reviewed the situation. No dramatic decisions were taken. General Hans von Plessen, the adjutant general, recorded that they still hoped to localize the war, and that Britain seemed likely to remain neutral: ‘I have the impression that it will all blow over’.
The question of the day, then, was whether Austria would be satisfied with a resounding diplomatic victory. Russia seemed prepared to offer them one. In St Petersburg on Monday Sazonov promised to go ‘to the limit’ in accommodating them if it brought the crisis to a peaceful conclusion. He promised the German ambassador that he would ‘build a golden bridge’ for the Austrians, that he had ‘no heart’ for the Balkan Slavs, and that he saw no problem with seven of the ten Austrian demands.
In Vienna, however, Berchtold dismissed Serbia’s promises as totally worthless. Austria, he promised, would declare war the next day, or by Wednesday at the latest – in spite of the chief of the general staff’s insistence that war operations against Serbia could not begin for two weeks.
Grey was distressed to hear that Austria would treat the Serb reply as if it were a ‘decided refusal’ to comply with Austria’s wishes. The ultimatum was ‘really the greatest humiliation to which an independent State has ever been subjected’, and the reply was surely enough to serve as the foundation of a settlement.
By the end of the day on Monday, uncertainty was still widespread. Two separate proposals for reaching a settlement were now on the table: Grey’s renewed suggestion for à quatre discussions in London, and Sazonov’s new suggestion for bilateral discussions with Austria in St Petersburg. Germany had indicated that it was encouraging Austria to consider both suggestions. The German ambassador told Berlin that if Grey’s suggestion succeeded in settling the crisis with Germany’s co-operation, ‘I will guarantee that our relations with Great Britain will remain, for an incalculable time to come, of the same intimate and confidential character that has distinguished them for the last year and a half’. On the other hand, if Germany stood behind Austria and subordinated its good relations with Britain to the special interests of its ally, ‘it would never again be possible to restore those ties which have of late bound us together’.
Gordon Martel is a leading authority on war, empire, and diplomacy in the modern age. His numerous publications include studies of the origins of the first and second world wars, modern imperialism, and the nature of diplomacy. A founding editor of The International History Review, he has taught at a number of Canadian universities, and has been a visiting professor or fellow in England, Ireland and Australia. Editor-in-chief of the five-volume Encyclopedia of War, he is also joint editor of the longstanding Seminar Studies in History series. His new book is The Month That Changed The World: July 1914.
Subscribe to the OUPblog via email or RSS.
Subscribe to only history articles on the OUPblog via email or RSS.
The post The month that changed the world: Monday, 27 July 1914 appeared first on OUPblog.

A revolution in trauma patient care
Major trauma impacts on the lives of young and old alike. Most of us know or are aware of somebody who has suffered serious injury. In the United Kingdom over five thousand people die from trauma each year. It is the most common cause of death in people under forty. Many of the fifteen thousand people who survive major trauma suffer life-changing injuries; some will never fully recover and will require life-long care. Globally it is estimated that injuries are responsible for sixteen thousand deaths per day, together with a large burden of people left with permanent disability. These sombre statistics are driving a revolution in trauma care.
A key aspect of the changes in trauma management in the United Kingdom and around the world is the organisation of networks to provide trauma care. People who have been seriously hurt, for example in a road traffic accident, may have suffered a head injury, injuries to the heart and lungs, abdominal trauma, broken limbs, and serious loss of skin and muscle. The care of these injuries may require specialist surgery including neurosurgery, cardiothoracic surgery, general (abdominal and pelvic) surgery, orthopaedic surgery, and plastic surgery. These must be supported by high-quality anaesthetic, intensive care, radiological, and laboratory services. Few hospitals are able to provide all of these services in one location. It therefore makes sense for the most seriously injured patients to be transported not to the nearest hospital but to the hospital best equipped to provide the care that they need. Many trauma services around the world now operate on this principle, and since 2010 these arrangements have been established in England. Hospitals are designated to one of three tiers: major trauma centres, trauma units, and local emergency hospitals. The most seriously injured patients are triaged to bypass trauma units and local emergency hospitals and are transported directly to major trauma centres. While this is a new system and some major trauma centres in England have only “gone live” in the past two years, it has already had an impact on trauma outcomes, with monitoring by the Trauma Audit and Research Network (TARN) indicating a 19% improvement in survival after major trauma in England.
Not only have there been advances in the organisation of trauma services, but there have also been advances in the immediate clinical management of trauma. In many cases it is appropriate to undertake “early definitive surgery/early total care” – that is, definitive repair of long bone fractures within twenty-four hours of injury. However, patients who have suffered major trauma often have severe physiological and biochemical derangements by the time they arrive at hospital. The concepts of damage control surgery and damage control resuscitation have emerged for the management of these patients. In this approach resuscitation and surgery are directed towards stopping haemorrhage, performing essential life-saving surgery, and stabilising and correcting the patient’s physiological state. This may require periods of surgery followed by intervals for the administration of blood and clotting factors and time for physiological recovery before further surgery is undertaken. The decision as to whether to undertake early definitive care or to institute a damage control strategy can be complex and is made by senior clinicians working together to formulate an overview of the state of the patient.
Modern radiology and clinical imaging have helped to revolutionise trauma management. There is increasing evidence to suggest that early CT scanning may improve outcome in the most unstable patients by identifying life-threatening injuries and directing treatment. When a source of bleeding is identified it may be treated surgically, but in many cases interventional radiology, with the placement of glue or metal coils into blood vessels to stop the bleeding, offers an alternative and less invasive solution.
The evolution of the trauma team is at the core of modern trauma management. Advances in resuscitation, surgery, and imaging have undoubtedly moved trauma care forward. However, the care of the unstable, seriously injured patient is a major challenge. Transporting someone who is suffering serious bleeding to and from the CT scanner requires excellent teamwork; parallel working so that several tasks are carried out at the same time requires coordination and leadership; making the decision between damage control and definitive surgery requires effective joint decision-making. The emergence of modern trauma care has been matched by the development of the modern trauma team and of specialists dedicated to the care of seriously injured patients. It is to this, above all, that the increasing numbers of survivors from serious trauma owe their lives.
Dr Simon Howell is on the Board of the British Journal of Anaesthesia (BJA) and is the Editor of this year’s Postgraduate Educational Issue: Advances in Trauma Care. This issue contains a series of reviews that give an overview of the revolution in trauma care. The reviews expand on a number of presentations given at a two-day meeting on trauma care organised by the Royal College of Anaesthetists in the spring of 2014. They trace the trauma patient’s journey from the moment of injury through care in the field and triage to arrival in a trauma centre, and finally to resuscitation and surgical care.
Founded in 1923, one year after the first anaesthetic journal was published by the International Anaesthesia Research Society, the British Journal of Anaesthesia remains the oldest and largest independent journal of anaesthesia. It became the Journal of The College of Anaesthetists in 1990. The College was granted a Royal Charter in 1992. Since April 2013, the BJA has also been the official Journal of the College of Anaesthetists of Ireland and members of both colleges now have online and print access. Although there are links between BJA and both colleges, the Journal retains editorial independence.
Subscribe to the OUPblog via email or RSS.
Subscribe to only health and medicine articles on the OUPblog via email or RSS.
Image credit: Female doctor looking at x-ray photo, © s-dmit, via iStock Photo.
The post A revolution in trauma patient care appeared first on OUPblog.

Are schools teaching British values?
In June, (now former) Education Secretary Michael Gove announced that all primary and secondary schools should promote “British values”. David Cameron said that the plans for values education are likely to have the “overwhelming support” of citizens throughout the UK. Cameron defined these values as “freedom, tolerance, respect for the rule of law, belief in personal and social responsibility and respect for British institutions”. At root, such a policy gets at the emotional conditioning of children. To adhere to a certain ideological conceptualization of “freedom,” to feel “tolerant,” or to be “respectful” (whether of parents, teachers, authorities or institutions), is to act according to implicit feelings of rightness.
Values are never just abstract ideas, but are expressed and experienced through emotions. And they are not ideologically neutral. To stress the education of British values is to put a form of emotional education on the agenda. Though many commentators have pointed out that the broad outlines of such an education already exist in schools, the fear of “extremism”, of the promotion of the “wrong” sort of values, has triggered a vigorous debate. What has largely gone unrecognized in this debate, however, is that it is emphatically not new.
In the nineteenth and early twentieth centuries, politicians and educationalists promoted a new education based on character training and the emotions, precisely to build British citizens who would respect and uphold British institutions. This brand of education was to be accomplished at school, but also at home, and in religious and youth organizations.
Herbert Fisher, the President of the Board of Education who spearheaded the Education Act of 1918, argued that the masses should be educated “to stimulate civic spirit, to promote general culture … and to diffuse a steadier judgement and a better informed opinion through the whole body of the community.” Other educational commentators broadly agreed with this mission. Frederick Gould, a former Board School teacher and author of many books on education, argued that “The community cannot afford to let the young people pass out with a merely vague notion that they ought to be good; it must frame its teaching with a decisive and clear vision for family responsibilities, civic and political duties”.

Michael Gove, by Paul Clarke, CC-BY-3.0 via Wikimedia Commons.
Civic duties – the civic spirit – were to be taught to the extent that they would become ingrained, implicit, felt. This was to be primarily a moral education. Educators stressed character training, linking moral education to British imperialism or nationalism in an unashamedly patriotic spirit. Education reform was to improve future citizens’ productivity and develop national character traits.
Like Gould, educator John Haden Badley stressed the need to teach active citizenship and service. Education on these lines would provide “a deeper understanding of the human values that give to life its real worth”, cultivating and maximizing the potential of a “superior” Britishness. Meanwhile, in a speech in Manchester in 1917, Fisher argued that “the whole future of our race and of our position in the world depends upon the wisdom of the arrangements which we make for education.” He observed, in language strikingly reminiscent of contemporary political rhetoric, that “we are apt to find that the wrong things are being taught by the wrong people in the wrong way.”
But even in 1917 the rhetoric was clichéd. A generation of commentators before Fisher argued that the civic shortfalls in mass formal education could be fixed by informal education in youth groups and religious organizations and through improved reading matter. Much juvenile and family literature, whether motivated politically or religiously, stressed emotional socialization, especially in the building of morality and character, as critical for national cohesion.
The trouble with visions of national cohesion, as the last century and a half of educational debate bears out, is the difficulty in getting any two parties to agree what that vision looks like. At the turn of the twentieth century all agreed that children mattered. How they were to be educated was important not just to individual children and their families, but equally importantly, to the community and the nation.
Yet some reformers had patriotic aims, others religious; some civic, some imperial; some conservative, others socialist. Many combined some or all of these aims. All, whether explicitly stated or not, wanted to train, instrumentalize and harness children’s emotions. Children’s reading matter, the stories they were told, and the lessons they heard were known to be powerful forces in cultivating the emotions. Hence the high stakes, then and now, on the narratives supplied to children.
Michael Gove, in common with his Victorian forebears, turns to the “great heroes of history” to serve as models of emulation. Back in the early 1900s, Gould thought history “the most vital of all studies for inspiration to conduct.” The study of history is certainly no stranger to being manipulated for didactic ends in order to impart “British values.”
While Gove is only the latest in a long line to link British history, British values and education, there are surely lessons to be learnt from past attempts and past failures to implement this strategy. A generation of boys and young men at the turn of the twentieth century had grown up learning the positive value of patriotic service. In this memorial year, marking a century since the outbreak of the First World War, it seems appropriate to reflect on what values we might want to instil in the young. What feelings do we want them to learn?
Stephanie Olsen is based at the history department, McGill University (Montreal) and the Max Planck Institute for Human Development, Center for the History of Emotions (Berlin). She was previously a postdoctoral fellow at the Minda de Gunzburg Center for European Studies at Harvard University. The co-author of Learning How to Feel: Children’s Literature and the History of Emotional Socialization, c. 1870-1970, she is currently working on children’s education and the cultivation of hope in the First World War.
Subscribe to the OUPblog via email or RSS.
Subscribe to only politics articles on the OUPblog via email or RSS.
The post Are schools teaching British values? appeared first on OUPblog.

July 26, 2014
Confidence and courage in mentoring
Mentorship is one of the most compelling assets for professional success. The mentor-mentee relationship offers one of the most priceless of all human qualities — transparency. The mentor offers the mentee hope for the future by sharing both wisdom and past challenges. Mentors help mentees be their best selves by helping them overcome their fears of failure and apprehension of taking risks.
Everyone struggles and gets scared. It takes courage to ask for help. Many of us are afraid to take the risk of being vulnerable. So we pretend to know. In fact, we are often encouraged to “fake it until we make it.” But if we never talk about our challenges and fears openly, we will never get help with those challenges. More importantly, we miss out on key authentic moments. Fear about our imperfections, our abilities, and the future is a universal human emotion — and it is at the intersection of these authentic moments that we learn, accept, and grow. If we pretend to know it all, no one reaches out to us. When we ask for help and guidance, many hands are extended.
There has been a paradigm shift as to how professional knowledge is passed on. It no longer happens naturally through traditional professional grooming and succession rituals. With greater turnover, less time, lower budgets, and more uncertainty, traditional mentorship models have become nearly obsolete in today’s workplace. This dramatic upheaval in the professional landscape has changed how 21st century professionals can most effectively cultivate career success. Mentorship is more important now than ever before.
Some benefits of mentoring are:
Enhanced career development initiatives
Creation of a “learning organization”
Improved onboarding and training programs
Improved diversity initiatives
Improved adjustment to the workplace culture
Improved employee engagement and retention
Targeted skill and leadership development
Narrowed skills gaps
Mentoring has existed throughout the ages as an effective way to develop talent. More formal mentoring programs comprise structured components, such as training and onboarding programs. These programs are often tied to specific, quantifiable business goals and objectives. There are many new mentoring styles too, including:
Reverse mentoring: Senior employees are mentored by junior employees to fill a specific skill gap.
Team mentoring: Work teams are mentored by a supervisor.
Group mentoring: Groups from within the same or different departments are mentored by a senior manager.
Distance mentoring: Mentor and mentee work in different locations.
Less formal mentoring relationships are less hierarchical. There is an equal partnership where both parties greatly benefit — and learn — from the relationship.
Mary Pender Greene, LCSW-R, CGP is a psychotherapist, relationship expert, clinical supervisor, career & executive coach, trainer, and consultant, with a private practice in Midtown Manhattan. Mary’s background also includes executive management roles at America’s largest non-profit organization, The Jewish Board of Family Services in NYC. Mary is the author of Creative Mentorship and Career-Building Strategies: How to Build your Virtual Personal Board of Directors.
Subscribe to the OUPblog via email or RSS.
Subscribe to only social work articles on the OUPblog via email or RSS.
Image: Computer industry entrepreneur workshop by Dell’s Official Flickr Page. CC-BY-2.0 via Wikimedia Commons.
The post Confidence and courage in mentoring appeared first on OUPblog.

How I created the languages of Dothraki and Valyrian for Game of Thrones
My name is David Peterson, and I’m a conlanger. “What’s a conlanger?” you may ask. Thanks to the recent addition of the word “conlang” to the Oxford English Dictionary (OED), I can now say, “Look it up!” But to save you the trouble, a conlanger is a constructed language (or conlang) maker — i.e. one who creates languages.
Language creation has been around since at least the 12th century, when the German abbess Hildegard von Bingen created her Lingua Ignota — Latin for “unknown language” — an invented vocabulary she used for writing hymns. In the centuries that followed, philosophers like Leibniz and John Wilkins would create languages that were intended to serve as grand classification systems, and idealists like L. L. Zamenhof would create languages intended to simplify international communication. All these systems focused on the basic utility of language — its ability to encode and convey meaning. That would change in the 20th century.
Tolkien: the father of modern conlanging
Before crafting the tales of Middle-earth, J. R. R. Tolkien was a conlanger. Unlike the many known to history who came before him, though, Tolkien created languages for the pure joy of it. Professionally, he became a philologist, but he continued to work on his own languages, eventually creating his famous Lord of the Rings series as an extension of the linguistic legendarium he’d been crafting for many years. Though his written works would become more famous than his linguistic creations, his conlangs, in particular Sindarin and Quenya, would go on to inspire new generations of conlangers throughout the rest of the 20th century.
Due to the general obscurity of the practice, many conlangers remained unknown to each other until the early 1990s, when home internet use started to become more and more common. The first dedicated meeting place for conlangers, virtual or otherwise, was the Conlang Listserv (an online mailing list). Some list members came out of interest in Tolkien’s languages, as well as other large projects, like Esperanto or Lojban, but the majority came to discuss their own work, and to meet and learn from others who also created languages.
Since the founding of the original Conlang Listserv, many other meeting places have sprung up online, and through a couple of decades of regular conlanger interaction, the practice of conlanging has evolved.
Conlang typology
Conlangs have been separated into different types since at least the 19th century. First came the philosophical languages, as discussed, then the auxiliary languages like Esperanto (also known as auxlangs), but with Tolkien emerged a new type of language: the artistic language, or artlang. At its most basic, an artlang is a conlang created for artistic purposes, but that broad definition includes many wildly divergent languages (compare Denis Moskowitz’s Rikchik to Sylvia Sotomayor’s Kēlen). Finer-grained distinctions became necessary as the community grew, and so emerged the naturalistic conlang.
This is where the languages of HBO’s Game of Thrones and Syfy’s Defiance come in. The languages I’ve created for the shows I work on come out of the naturalist tradition. The goal with a naturalistic conlang is to create a language that’s as realistic as possible. The realism of a language is grounded in the reality (fictional or otherwise) of its speakers. If the speakers are more or less human (or humanoid) and are intended to be portrayed in a realistic fashion, then their language should be as similar as possible to a natural language (i.e. a language that exists here on Earth, like Spanish, Tagalog, or Cham).
The natural languages we speak are large, but also redundant and imperfect in a uniquely human way. Conlangers have gotten pretty good at emulating them over the years, usually employing one of two different approaches. The first, which I call the façade method, is to create a language that looks like a modern natural language by replicating the various features of a modern natural language. Thus, if English has irregular plurals, such as mouse~mice, then the conlang will have irregular plurals, too, by targeting certain nouns and making their plurals irregular in some way.
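At its simplest, the façade method amounts to a lookup of hand-picked exceptions consulted before the regular rule. A minimal sketch follows; the words and the regular -s rule here are invented placeholders for illustration, not data from any actual conlang:

```python
# Façade method, minimally: irregularity is stipulated by hand.
# All words below are invented placeholders, not real conlang data.
IRREGULAR_PLURALS = {"masu": "mesi", "toka": "tokran"}  # hand-picked exceptions

def pluralize(noun):
    """Return a stipulated irregular plural if one exists, else the regular -s form."""
    return IRREGULAR_PLURALS.get(noun, noun + "s")

print(pluralize("masu"))  # mesi: irregular, by fiat
print(pluralize("lira"))  # liras: regular rule applies
```

The irregulars exist only because the table says so — which is exactly the façade: the surface looks like a natural language, but there is no history behind the exceptions.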
The historical method: making sense of irregular plurals in Valyrian
A contrasting approach is the method that Tolkien pioneered called the historical method. With the historical method, an ancestor language called a proto-language is created, and the desired language is evolved from it, via simulated linguistic evolution. The process takes a lot longer, but in some ways it’s simpler, since irregularities will naturally emerge, rather than having to be created by hand. For example, in Game of Thrones, the High Valyrian language Daenerys speaks differs from the Low Valyrian the residents of Slaver’s Bay speak. In fact, the latter evolved from the former. As the language evolved, it produced some natural irregularities. Consider the following nouns and their plurals from the Valyrian spoken in Slaver’s Bay:
hubre “goat” → hubres “goats”
dare “queen” → dari “queens”
aeske “master” → aeske “masters”
Given that the singular forms all end in ‘e’, one has to say at least two of the plurals presented are irregular. But why the arbitrary differences in the plural forms? It turns out it’s because the three nouns with identical singular terminations used to have very different forms in the older language, High Valyrian, as shown below:
hobres “goat” → hobresse “goats”
dāria “queen” → dārī “queens”
āeksio “master” → āeksia “masters”
Each of these alternations is quite regular in High Valyrian. In the simulated history, a series of sound changes which simplified the ends of words produced identical terminations for each of the three words in the singular, leaving later speakers having to memorize which have irregular plurals and which regular.
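The historical method lends itself to simulation: write the sound changes as ordered rewrite rules and run the proto-forms through them. The sketch below uses invented proto-words and invented changes, not the actual Valyrian rules, but it shows how perfectly regular changes produce surface irregularity:

```python
import re

# Ordered sound changes, applied oldest first. These rules and the
# proto-words below are invented for illustration only.
SOUND_CHANGES = [
    (r"o", "u"),        # o > u in all positions
    (r"ss", "s"),       # geminate s simplifies
    (r"[aiou]$", "e"),  # all word-final vowels merge into -e
]

def evolve(word):
    """Run a proto-form through every sound change, in order."""
    for pattern, replacement in SOUND_CHANGES:
        word = re.sub(pattern, replacement, word)
    return word

# Two proto-nouns whose singulars end differently (singular: plural).
proto = {"kota": "kotas", "miro": "mirossa"}

for sg, pl in proto.items():
    print(evolve(sg), evolve(pl))  # kute kutas / mire miruse
```

From a later speaker’s point of view, kute and mire now look identical in the singular yet pluralize differently — the same kind of emergent irregularity as the Valyrian forms above.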
Conceptualizing time
Simulated evolution applies to the grammar and the lexicon as well. For example, natural languages often derive terminology for abstract concepts metaphorically from terminology for concrete concepts. Time, for instance, is an abstract concept that is frequently discussed using spatial terminology. How it’s done differs from language to language. In English, events that occur later in time occur after the present (where “after” derives from “aft,” a word meaning “behind”), and events that occur earlier in time occur before the present. Thus, the speaker is conceptualized as standing in the present, facing the past, with the future behind them.
In Irathient, a language I created for Syfy’s Defiance, time is conceptualized vertically, rather than horizontally. The word for “after”, in temporal terms, is shei, which derives from a word meaning “above”; “before”, on the other hand, is ur, which also means “below” or “underneath”. The general metaphor that the future is up and the past is down bears out throughout the rest of the language, where if one wanted to say “Go back to what you were saying before”, the literal Irathient translation would be “Go down to what you were saying underneath”.
Ultimately, what one hears on screen sounds and feels like a natural language, regardless of whether or not one knows the work that went on behind the scenes. Since the prop used on screen is a language, though, rather than a costume or a piece of the set, the words can be recorded and analyzed at any time. Consequently, a conlang needs to be real in a way that a throne or a 700-foot wall of ice does not.
It’s still extraordinary to me that in less than 25 years, we have gone from a time when many conlangers were not aware that other conlangers existed to a time when our work adds to the authenticity of some of the best productions the big and small screen have to offer. The addition of the word “conlang” to the OED is a fitting capper to an unbelievable quarter century.
David J. Peterson is a language creator who works on HBO’s Game of Thrones, Syfy’s Defiance, and Syfy’s Dominion. You can find him on Twitter at @Dedalvs or on Tumblr.
Subscribe to the OUPblog via email or RSS.
Subscribe to only language articles on the OUPblog via email or RSS.
Subscribe to only television and film articles on the OUPblog via email or RSS.
Images: Game of Thrones Season 3 – Dragon Shadow Wallpaper and Game of Thrones Season 3 - Daenerys Wallpaper. ©2014 Home Box Office, Inc. All Rights Reserved.
The post How I created the languages of Dothraki and Valyrian for Game of Thrones appeared first on OUPblog.

The month that changed the world: Sunday, 26 July 1914
July 1914 was the month that changed the world. On 28 June 1914, Archduke Franz Ferdinand was assassinated, and just five weeks later the Great Powers of Europe were at war. But how did it all happen? Historian Gordon Martel, author of The Month That Changed The World: July 1914, is blogging regularly for us over the next few weeks, giving us a week-by-week and day-by-day account of the events that led up to the First World War.
By Gordon Martel
When day dawned on Sunday, 26 July, the sky did not fall. Shells did not rain down on Belgrade. There was no Austrian declaration of war. The morning remained peaceful, if not calm. Most Europeans attended their churches and prepared to enjoy their day of rest. Few said prayers for peace; few believed divine intervention was necessary. Europe had weathered many storms over the last decade. Only pessimists doubted that this one could be weathered as well.
In Austria-Hungary the right of assembly, the secrecy of the mail, of telegrams and telephone conversations, and the freedom of the press were all suspended. Pro-war demonstrations were not only permitted but encouraged: demonstrators filled the Ringstrasse, marched on the Ballhausplatz, gathered around statues of national heroes and sang patriotic songs. That evening the Bürgermeister of Vienna told a cheering crowd that the fate of Europe for centuries to come was about to be decided, praising them as worthy descendants of the men who had fought Napoleon. The Catholic People’s Party newspaper, Alkotmány, declared that ‘History has put the master’s cane in the Monarchy’s hands. We must teach Serbia, we must make justice, we must punish her for her crimes.’
Kaiser Wilhelm
Just how urgent was the situation? In London, Sir Edward Grey had left town on Saturday afternoon to go to his cottage for a day of fly-fishing on Sunday. The Russian ambassadors to Germany, Austria, and France had yet to return to their posts. The British ambassadors to Germany and France were still on vacation. Kaiser Wilhelm was on his annual yachting cruise of the Baltic. Emperor Franz Joseph was at his hunting lodge at Bad Ischl. The French premier and president were visiting Stockholm. The Italian foreign minister was still taking his cure at Fiuggi. The chiefs of the German and Austrian general staffs remained on leave; the chief of the Serbian general staff was relaxing at an Austrian spa.

Could calm be maintained? Contradictory evidence seemed to be coming out of St Petersburg. It seemed that some military steps were being initiated – but what these were to be remained uncertain. Sazonov, the Russian foreign minister, met with both the German and Austrian ambassadors on Sunday – and both noted a significant change in his demeanour. He was now ‘much quieter and more conciliatory’. He emphatically insisted that Russia did not desire war and promised to exhaust every means to avoid it. War could be avoided if Austria’s demands stopped short of violating Serbian sovereignty. The German ambassador suggested that Russia and Austria discuss directly a softening of the demands. Sazonov, who agreed immediately to suggest this, was ‘now looking for a way out’. The Germans were assured that only preparatory measures had been undertaken thus far – ‘not a horse and not a reserve had been called to service’.
By late Sunday afternoon, the situation seemed precarious but not hopeless. The German chancellor worried that any preparatory measures adopted by Russia that appeared to be aimed at Germany would force the adoption of counter-measures. This would mean the mobilization of the German army – and mobilization ‘would mean war’. But he continued to hope that the crisis could be ‘localized’ and indicated that he would encourage Vienna to accept Grey’s proposed mediation and/or direct negotiations between Austria and Russia.
By Sunday evening more than 24 hours had passed since the Austrian legation had departed from Belgrade and Austria had severed diplomatic relations with Serbia. Many had assumed that war would follow immediately, but there had been no invasion of Serbia or even a declaration of war. The Austrians, in spite of their apparent firmness in refusing any alteration of the terms or any extension of the deadline, appeared not to know what step to take next, or when additional steps should be taken. When asked, the Austrian chief of staff suggested that any declaration of war ought to be postponed until 12 August. Was Europe really going to hold its breath for two more weeks?
Gordon Martel is a leading authority on war, empire, and diplomacy in the modern age. His numerous publications include studies of the origins of the first and second world wars, modern imperialism, and the nature of diplomacy. A founding editor of The International History Review, he has taught at a number of Canadian universities, and has been a visiting professor or fellow in England, Ireland and Australia. Editor-in-chief of the five-volume Encyclopedia of War, he is also joint editor of the longstanding Seminar Studies in History series. His new book is The Month That Changed The World: July 1914. Read his previous blog posts.
Subscribe to only history articles on the OUPblog via email or RSS.
Image credit: Kaiser Wilhelm, public domain via Wikimedia Commons.
The post The month that changed the world: Sunday, 26 July 1914 appeared first on OUPblog.
July 25, 2014
The month that changed the world: Saturday, 25 July 1914
July 1914 was the month that changed the world. On 28 June 1914, Archduke Franz Ferdinand was assassinated, and just five weeks later the Great Powers of Europe were at war. But how did it all happen? Historian Gordon Martel, author of The Month That Changed The World: July 1914, is blogging regularly for us over the next few weeks, giving us a week-by-week and day-by-day account of the events that led up to the First World War.
By Gordon Martel
Would there be war by the end of the day? It certainly seemed possible: the Serbs had only until 6 p.m. to accept the Austrian demands. Berchtold had instructed the Austrian representative in Belgrade that nothing less than full acceptance of all ten points contained in the ultimatum would be regarded as satisfactory. And no one expected the Serbs to comply with the demands in their entirety – least of all the Austrians.
When the Serbian cabinet met that morning they had received advice from Russia, France, and Britain urging them to be as accommodating as possible. No one indicated that any military assistance might be forthcoming. They began drafting a ‘most conciliatory’ reply to Austria while preparing for war: the royal family prepared to leave Belgrade; the military garrison left the city for a fortified town 60 miles south; the order for general mobilization was signed and drums were beaten outside of cafés, calling up conscripts.

Kaiservilla in Bad Ischl, Austria: the summer residence of Emperor Franz Joseph I. By Blue tornadoo, CC-BY-SA-3.0, via Wikimedia Commons
How would Russia respond? That morning the tsar presided over a meeting of the Russian Grand Council where it was agreed to mobilize the thirteen army corps designated to act against Austria. By afternoon ‘the period preparatory to war’ was initiated and preparations for mobilization began in the military districts of Kiev, Odessa, Moscow, and Kazan.
Simultaneously, Sazonov tried to enlist German support in persuading Austria to extend the deadline beyond 6 p.m., arguing that it was a ‘European matter’ not limited to Austria and Serbia. The Germans refused, arguing that to summon Austria to a European ‘tribunal’ would be humiliating and would mean the end of Austria as a Great Power. Sazonov insisted that the Austrians were aiming to establish hegemony in the Balkans: after they devoured Serbia and Bulgaria, Russia would face them ‘on the Black Sea’. He tried to persuade Sir Edward Grey that if Britain were to join Russia and France, Germany would then pressure Austria into moderation.
How would Britain respond? Sir Edward Grey gave no indication that Britain would stand shoulder-to-shoulder with the Russians in a conflict over Serbia. His only concern seemed to be to contain the crisis, to keep it a dispute between Austria and Serbia. ‘I do not consider that public opinion here would or ought to sanction our going to war over a Servian quarrel’. But if a war between Austria and Serbia were to occur ‘other issues’ might draw Britain in. In the meantime, there was still an opportunity to avert war if the four disinterested powers ‘held the hand’ of their partners while mediating the dispute. But the report he received from St Petersburg was not encouraging: the British ambassador warned that Russia and France seemed determined to make ‘a strong stand’ even if Britain declined to join them.
When the Austrian minister received the Serb reply at 5:58 on Saturday afternoon, he could see instantly that their submission was not complete. He announced that Austria was breaking off diplomatic relations with Serbia and immediately ordered the staff of the legation to leave for the railway station. By 6:30 the Austrians were on a train bound for the border.
That evening, in the Kaiservilla at Bad Ischl, Franz Joseph signed the orders for mobilization of thirteen army corps. When the news reached Vienna the people greeted it with the ‘wildest enthusiasm’. Huge crowds began to form, gathering at the Ringstrasse and bursting into patriotic songs. The crowds marched around the city shouting ‘Down with Serbia! Down with Russia’. In front of the German embassy they sang ‘Wacht am Rhein’; police had to protect the Russian embassy against the demonstrators. Surely, it would not be long before the guns began firing.
Gordon Martel is a leading authority on war, empire, and diplomacy in the modern age. His numerous publications include studies of the origins of the first and second world wars, modern imperialism, and the nature of diplomacy. A founding editor of The International History Review, he has taught at a number of Canadian universities, and has been a visiting professor or fellow in England, Ireland and Australia. Editor-in-chief of the five-volume Encyclopedia of War, he is also joint editor of the longstanding Seminar Studies in History series. His new book is The Month That Changed The World: July 1914.
The post The month that changed the world: Saturday, 25 July 1914 appeared first on OUPblog.
Re-thinking the role of the regional oral history organization
Jason Steinhauer. Photo by Amanda Reynolds
What is the role of a regional oral history organization?

The Board of Officers of Oral History in the Mid-Atlantic Region (OHMAR) recently wrestled with this question over the course of a year-long strategic planning process. Our organization had reached an inflection point. New technologies, shifting member expectations, and changing demographics compelled us to re-think our direction. What could we offer new and existing members that local or national organizations could not, and how would we offer it?
Our strategic planning committee set out to answer these questions, and to chart a course for 2014 and beyond. Four board members served on the committee: Kate Scott of the Senate Historical Office; LuAnn Jones of the National Park Service; Anne Rush of the University of Maryland; and myself, of the Library of Congress, acting as director. OHMAR dates back to 1976 and has been a vibrant organization for nearly 40 years. Therefore, our goal was not to re-invent but rather to re-focus. To start, we identified OHMAR’s core values. We determined them to be:
Openness
Passion
Community
Education
Expertise
Whatever our new direction, we would stay true to these ideals.
For months, the committee discussed how OHMAR could better serve members with these values in mind. We also polled membership and consulted with past organization presidents about what they valued in OHMAR and what they wanted in the future. What emerged was a plan with several key considerations for how any regional organization can serve its membership:
Build community. Through digital technology, formal and informal events, and low-cost membership, regional organizations can foster meaningful professional networks, offer support, and create opportunities for intimate interaction on an ongoing basis.
Provide targeted resources. Local knowledge can allow regional organizations like OHMAR to provide targeted educational, professional, and monetary resources. For example, oral historians working for the federal government in and around Washington, D.C., have unique challenges to which OHMAR can provide specific tools, tips, and advice.
Leverage expertise. Our region boasts tremendous expertise courtesy of oral historians such as Don Ritchie, Linda Shopes, Roger Horowitz, and more. These experts can help educate new members, especially those from fields such as journalism, the arts, public history, and advocacy on best practices.
Offer meaningful opportunities. By forming new committees, we can offer members meaningful ways to get involved and gain leadership experience.
We presented our findings in the form of a new Strategic Plan at our April 2014 annual meeting. The intimate two-day event was attended by more than 60 oral historians and reaffirmed the value of regional conferences. In fact, feedback stated that for some, ours was the best conference they had ever attended. On the afternoon of the second day, our members ratified OHMAR’s Strategic Plan for 2015-2020. Accordingly, next year, we will focus on improving our internal operations, updating our bylaws, and overhauling our website, member management system, and e-newsletter. In the following years, we will also introduce several new initiatives, including a Martha Ross Memorial Prize for students, named for our beloved founder.
We will be discussing our strategic plan and the role of regional oral history organizations in a panel at the Oral History Association’s upcoming 2014 annual meeting in Madison, Wisconsin. We hope you’ll join us and share your ideas.
Jason Steinhauer serves on the Board of Oral History in the Mid-Atlantic Region (OHMAR). He directed the organization’s strategic planning process from 2013-2014. You can follow Jason on Twitter at @JasonSteinhauer and OHMAR at @OHMidAtlantic.
The Oral History Review, published by the Oral History Association, is the U.S. journal of record for the theory and practice of oral history. Its primary mission is to explore the nature and significance of oral history and advance understanding of the field among scholars, educators, practitioners, and the general public. Follow them on Twitter at @oralhistreview, like them on Facebook, add them to your circles on Google Plus, follow them on Tumblr, listen to them on Soundcloud, or follow their latest OUPblog posts via email or RSS to preview, learn, connect, discover, and study oral history.
The post Re-thinking the role of the regional oral history organization appeared first on OUPblog.
Microbes matter
We humans have a love-hate relationship with bugs. I’m not talking about insects — although many of us cringe at the thought of them too — but rather the bugs we can’t see, the ones that make us sick.
Sure, microorganisms give us beer, wine, cheese, and yoghurt; hardly a day goes by without most people consuming food or drink produced by microbial fermentation. And we put microbes to good use in the laboratory, as vehicles for the production of insulin and other life-saving drugs, for example.
But microbes are also responsible for much of what ails us, from annoying stomach ‘bugs’ to deadly infectious diseases such as tuberculosis and plague. Bacteria and viruses are even linked to certain cancers. Bugs are bad; antibiotics and antivirals are good. We spend billions annually trying to rid ourselves of microorganisms, and if they were to all disappear, well, all the better, right?
This is, of course, nonsense. Even the most ardent germaphobe would take a deep breath and accept the fact that we could no more survive without microbes than we could without oxygen. No matter how clean we strive to be, there are 100 trillion bacterial cells living on and within our bodies, 10 times the number of human cells that comprise ‘us’. Hundreds of different bacterial species live within our intestines, hundreds more thrive in our mouths and on our skin. Add in the resident viruses, fungi, and small animals such as worms and mites, and the human body becomes a full-blown ecosystem, a microcosm of the world around us. And like any ecosystem, if thrown off-balance bad things can happen. For example, many of our ‘good’ bacteria help us metabolize food and fight off illness. But after a prolonged course of antibiotics such bacteria can be knocked flat, and normally benign species such as ‘Clostridium difficile’ can grow out of control and cause disease.
Given the complexity of our body jungle, some researchers go as far as to propose that there is no such thing as a ‘human being’. Each of us should instead be thought of as a human-microbe symbiosis, a complex biological relationship in which neither partner can survive without the other. As disturbing a notion as this may be, one thing is indisputable: we depend on our microbiome and it depends on us.
And there is an even more fundamental way in which the survival of Homo sapiens is intimately tied to the hidden microbial majority of life. Each and every one of our 10 trillion cells betrays its microbial ancestry in harboring mitochondria, tiny subcellular factories that use oxygen to convert our food into ATP, the energy currency of all living cells. Our mitochondria are, in essence, domesticated bacteria — oxygen-consuming bacteria that took up residence inside another bacterium more than a billion years ago and never left. We know this because mitochondria possess tiny remnants of bacterium-like DNA inside them, distinct from the DNA housed in the cell nucleus. Modern genetic investigations have revealed that mitochondria are a throwback to a time before complex animals, plants, or fungi had arisen, a time when life was exclusively microbial.
As we ponder the bacterial nature of our mitochondria, it is also instructive to consider where the oxygen they so depend on actually comes from. The answer is photosynthesis. Within the cells of plants and algae are the all-important chloroplasts, green-tinged, DNA-containing factories that absorb sunlight, fix carbon dioxide, and pump oxygen into the atmosphere by the truckload. Most of the oxygen we breathe comes from the photosynthetic activities of these plants and algae—and like mitochondria, chloroplasts are derived from bacteria by symbiosis. The genetic signature written within chloroplast DNA links them to the myriad of free-living cyanobacteria drifting in the world’s oceans. Photosynthesis and respiration are the biochemical yin and yang of life on Earth. The energy that flows through chloroplasts and mitochondria connects life in the furthest corners of the biosphere.
For all our biological sophistication and intelligence, one could argue that we humans are little more than the sum of the individual cells from which we are built. And as is the case for all other complex multicellular organisms, our existence is inexorably linked to the sea of microbes that share our physical space. It is a reality we come by honestly. As we struggle to tame and exploit the microbial world, we would do well to remember that symbiosis—the living together of distinct organisms—explains both what we are and how we got here.
John Archibald is Professor of Biochemistry and Molecular Biology at Dalhousie University and a Senior Fellow of the Canadian Institute for Advanced Research, Program in Integrated Microbial Biodiversity. He is an Associate Editor for Genome Biology & Evolution and an Editorial Board Member of various scientific journals, including Current Biology, Eukaryotic Cell, and BMC Biology. He is the author of One Plus One Equals One: Symbiosis and the Evolution of Complex Life.
Subscribe to only science and medicine articles on the OUPblog via email or RSS.
Image credit: Virus Microbiology. Public domain via Pixabay
The post Microbes matter appeared first on OUPblog.
Occupational epidemiology: a truly global discipline
Occupational epidemiology is one of those fascinating disciplines that span important areas of human life: health, disease, work, law, public policy, the economy. Work is fundamental to any society, and the importance a society attaches to the health of its workers varies over time and between countries. Because of the lessons to be learned by looking at other countries as well as one’s own, occupational epidemiology is a truly global discipline. Emerging economies often prioritize productivity over other issues, but they can also learn from the long history of improvement in working conditions which has taken place in developed countries. Looking the other way, the West can learn from fresh insights gained in studies set in low- and middle-income countries.
Exposures are usually higher in emerging economies, and epidemiological methods are an important tool in detecting and quantifying outbreaks of occupational disease which may have been controlled in the West. A recent study of digestive cancer in a Chinese asbestos mining and milling cohort provides additional evidence that stomach cancer may be associated with high levels of exposure to chrysotile asbestos, for example. This was a collaborative study between researchers in China, Hong Kong, Japan, and the United States, and illustrates the way that studying an “old” disease in a new context can provide results which are of global benefit.
Issues in occupational epidemiology are never static. Work exposures change along with materials and processes. The ubiquitous printing industry, for example, is always developing new inks, cleaning agents, and processes. A cluster of cases of the rare liver cancer, cholangiocarcinoma, was noted in Japanese printers and this finding was replicated in the Nordic printing industry by using one of the large Nordic population-based databases. This replication is important because it shows that the association is unlikely to be due to a lifestyle factor specific to Japan.
“Big data” sharpens statistical power and there are now specific data pooling projects in occupational epidemiology, to supplement the use of existing large databases. The SYNERGY study, for example, pools lung cancer case-control studies with the aim of teasing out occupational effects from behind the masking effect of smoking, which remains by far the most important driver for lung cancer. A recent analysis with around 20,000 cases and controls was able to show that bakers are not at increased risk of lung cancer, whereas the many previous smaller studies had given inconsistent results.
The addition of systematic reviews to the toolkit has strengthened the evidence base in occupational epidemiology, allowing policy about occupational risks and their prevention to be made with confidence. Health economics, also, can be applied to findings from occupational epidemiology to clarify policy issues.
Development brings its own issues to which occupational epidemiology can be applied. We now live longer in the West, and we will have to work into old age, often while carrying chronic diseases. Despite frequently expressed concerns about an ageing workforce, a recent study in an Australian smelter confirmed earlier findings that older workers maintained their ability to work safely, and that the highest injury rates were in young workers. Patients with previously fatal diseases survive into adult life and, potentially, the workforce; a survey of patients with cystic fibrosis, for example, found that disease severity was less important as a predictor of employment than social factors such as educational attainment and locality. A loss of heavy industry in the West, combined with cheap transport, means that many of us spend most of our waking hours sitting down, promoting obesity and its complications. A sample of UK office workers spent 65% of their work time sitting and did not compensate for this by being more active outside work. The economic downturn is a major political and social preoccupation, bringing uncertainty about future employment, which may fuel dysfunctional behaviour such as ‘presenteeism’. A Swedish study suggested that this may be associated with poor mental wellbeing.
Katherine M. Venables is a Reader in the Department of Public Health at the University of Oxford. Her research has always focused on aetiological epidemiology. At Oxford, she has worked on a cohort study of mortality and cancer incidence in military veterans exposed to low levels of chemical warfare agents, and also on the provision of occupational health services to university staff. She is editor of Current Topics in Occupational Epidemiology.
Image: Woman smoking a cigarette by Oxfordian Kissuth. CC-BY-SA-3.0 via Wikimedia Commons.
The post Occupational epidemiology: a truly global discipline appeared first on OUPblog.