Oxford University Press's Blog
November 19, 2014
Luciano Floridi responds to NYROB review of The Fourth Revolution
In the October 9th edition of the New York Review of Books, philosopher John Searle criticized Luciano Floridi’s The Fourth Revolution, noting that Floridi “sees himself as the successor to Copernicus, Darwin, and Freud, each of whom announced a revolution that transformed our self-conception into something more modest.” In the response below, Floridi disputes this claim and many others made by Searle in his review of The Fourth Revolution.
John Searle’s review of The Fourth Revolution – How the Infosphere is Reshaping Human Reality (OUP, 2014) is astonishingly shallow and misguided. The silver lining is that, once its factual errors and conceptual confusions are removed, an informed and insightful reading can still be enjoyed.
The review erroneously ascribes to me a fourth revolution in our self-understanding, which I explicitly attribute to Alan Turing. We are not at the center of the universe (Copernicus), of the biological kingdom (Darwin), or of the realm of rationality (Freud). After Turing, we are no longer at the center of the world of information either. We share the infosphere with smart technologies. These are not some unrealistic AI, as the review would have me suggest, but ordinary artefacts that outperform us in ever more tasks, despite being no cleverer than a toaster. Their abilities are humbling and make us re-evaluate our unique intelligence. Their successes largely depend on the fact that the world has become an IT-friendly environment, where technologies can replace us without having any understanding or semantic skills. We increasingly live onlife (think of apps tracking your location). The pressing problem is not whether our digital systems can think or know, for they cannot, but what our environments are gradually enabling them to achieve. Like Kant, I do not know whether the world in itself is informational, a view that the review erroneously claims I support. What I do know is that our conceptualization of the world is. The distinction is trivial and yet crucial: from DNA as code to force fields as the foundation of matter, from the mind-brain dualism as a software-hardware distinction to computational neuroscience, from network-based societies to digital economies and cyber conflicts, today we understand and deal with the world informationally. To be is to be interactable: this is our new “ontology”.
The review denounces dualisms yet uncritically endorses a dichotomy between relative (or subjective) and absolute (or objective) phenomena. This is no longer adequate because today we know that many phenomena are relational. For example, whether some stuff qualifies as food depends on the nature both of the substance and of the organism that is going to absorb it. Yet relativism is mistaken, because not just any stuff can count as food; sand never does. Likewise, semantic information (e.g. a train timetable) is a relational phenomenon: it depends on the right kind of message and receiver. Insisting on mapping information as either relative or absolute is as naïve as pretending that a border between two nations must be located in one of them.
The world is getting more complex. We have never been so much in need of good philosophy to understand it and take care of it. But we need to upgrade philosophy into a philosophy of information of our age, for our age, if we wish it to be relevant. This is what the book is really about.
Feature image credit: Macro computer circuit board, by Randy Pertiet. CC-BY-2.0 via Flickr.
The post Luciano Floridi responds to NYROB review of The Fourth Revolution appeared first on OUPblog.










The other side of El Sistema: Music education, discipline, and profit
The Venezuelan youth orchestra scheme El Sistema is perhaps the world’s most famous music education program today. It’s lauded as a revolutionary social program that has rescued hundreds of thousands of Venezuela’s poorest children. Simon Rattle has called it “the most important thing happening in music anywhere in the world.” Classical music education is back in vogue, now aligned with the rhetoric of social justice.
However, the training of social or ethnic Others (the poor, the destitute, the non-white) as classical music performers is hardly a new idea, and delving into its long history may improve understandings of El Sistema outside Venezuela, which are currently very limited. While this training might have provided a helping hand to the most disadvantaged in society, it also had less benevolent aspects.
The music conservatoire has its roots in the conservatorios, or orphanages, of Renaissance Venice. Young female orphans were trained in music at institutions such as the Ospedale della Pietà, where Vivaldi worked. These institutions served a clear charitable purpose, providing for destitute children and aiming to turn them into productive citizens. Yet it’s worth noting a couple of further points: discipline and profit.
The Ospedali Grandi’s purpose was primarily to regulate the city’s social environment, and along with the opportunities provided to impoverished girls went strict control over their day-to-day lives. As Vanessa M. Tonelli notes, the young musicians had to submit to an inflexible monastic routine: silence, lots of work, and little leisure time. When the English music historian Charles Burney visited in the eighteenth century, he noted “good discipline observed in every particular,” and he described the orchestra as “under the most exact discipline,” with its musicians “under that kind of subordination which is requisite in a servant to a superior.” Clearly, then, musical training was an extension of the Ospedali’s social control. It also had an economic angle: concerts by the orphan girls became a major attraction, and the Ospedali were thus able to turn their musical talents into profit, enriching the institutions and their administrators by eliciting larger donations.
Naples, too, had a conservatorio system, in which, as David Yearsley writes, music “was drummed into thousands of children in a system of forced labor…. There was huge international demand for the fashionable Neapolitan style, and the conservatories fed it.” Employing a training system that was “often cruel,” these “music mills” forged highly trained performers for export across Europe. Their art “belied the inexorable regime of study, discipline, and punishment that lay behind it”: Burney described sweatshop-like conditions, with students practising ten hours a day with only a few days off per year.

Yearsley posits a “dialectic of musical enlightenment: on one side of the split screen was the musical workhouse of the poor orphans, on the other, the courtly chamber filled by the most elegant music played by bewigged and fully-trained instrumentalists.” Enchanting music emerged from conditions of control and exploitation.
The alliance of coercion and beauty was not limited to Europe. The Spanish Conquest of the Americas saw missionaries fanning out across the continent and founding schools that taught music as a core subject. A key aim was to instill in the indigenous population what the Spaniards called policía–order, Christianity, and civilization. Music education thus served as a handmaiden of colonialism.
In nineteenth-century Britain, music education was promoted among the poor as part of a drive for moral and religious improvement. Howard Smither argues that a key motivation was the political protection of the upper and wealthy middle classes. Music was seen as a way of keeping the workers out of taverns, increasing their productivity and decreasing their opportunities to discuss revolutionary ideas. Similarly, David Gramit underlines how nineteenth-century educational reform in Germany reaffirmed the social and economic order; music education proponents “sought to create a disciplined but docile labor pool” and thus promote more efficient capitalism. Grant Olwage explores how the perceived efficacy of music education as social control in Britain made it an obvious tool for disciplining and “civilizing” the black population in its colony in South Africa.
What all these programs had in common was an attempt to order and control social Others. They reveal music education in its guise of disciplinary practice, and thus as profoundly ambiguous. As Michel Foucault argues, discipline is effective and productive, as the high level of performing skill attained by many of these musical trainees attests. But discipline also “imposes unequal, asymmetrical, non-reciprocal relations” (James Johnson), and produces docile, apolitical subjects. Music education thus brings benefits, but they often accrue as much to the educators as the educated (as economic and symbolic capital) and may be accompanied by significant (if hidden) costs.
Training, discipline, profit: these three threads run through the history of music education of social or ethnic Others. Let’s now turn to Venezuela.
El Sistema has been lionised by the international press as a revolution in artistic education and a beacon of social justice. Clearly, then, there has been a failure to connect it to similar music education initiatives stretching back half a millennium and to take account of the darker side of music education. For all the contemporary talk of a “revolutionary social project,” El Sistema offers little that is new (its core ideas were presaged in sixteenth-century Latin America and nineteenth-century Europe), and such programs have historically been reactionary, not revolutionary.
Historical precedents alert us to the possibility that alongside El Sistema’s undoubted productivity lie discipline and profit. And indeed, the Venezuelan program displays a familiar urge to control social Others and benefit from their musical activities. Founder José Antonio Abreu has said: “As an educator, I was thinking more about discipline than about music.” Gustavo Dudamel and the Simón Bolívar Youth Orchestra, meanwhile, have become mainstays of the global music industry, attracting the kind of international praise and wonder that Venetian orphan musicians did centuries earlier, and generating considerable revenue in the process. Like the Neapolitan conservatorios, El Sistema drills young performers intensively to fulfill the desires of audiences across Europe; once again, the musicians’ “naturalness” is celebrated, their unbending training regime overlooked.
To understand El Sistema, we need to remember music education’s two faces. The press and public have fixated on one–“rescuing” and training the poor–and have largely ignored the other: discipline and profit. Only when the second is fully grasped can a proper assessment be made.
Headline image credit: ‘Maestro. Orchestra. Conductor’. Public domain via Pixabay
The post The other side of El Sistema: Music education, discipline, and profit appeared first on OUPblog.










Scholarly reflections on ‘slacktivism’
Whether it’s the use of Facebook in the 2008 US Presidential election or the Ice Bucket Challenge in 2014, there are new forms of activism emerging online. But are all these forms of activism equal? With the inclusion of slacktivism on the Oxford Dictionaries Word of the Year shortlist, we asked a number of scholars for their thoughts on this new word and emerging phenomenon.
* * * * *
“I’m sure slacktivism is meant to criticize the “activity” and the slackers who do it. It was probably made up by real activists who felt they had to draw a line and protect their own credibility. Still, the phenomenon may not be as bad as it seems. There are all those partially reformed slackers out there, they came of age with the Internet, and they’ll never be real activists, right, so isn’t it better that they at least be slacktivists? Also, how many activists are there? But there are more than two billion Internet users worldwide — potentially that’s a lot of slacktivists. So here’s the question occasioned by this year’s contest for Word of the Year: Can two billion slacktivists accomplish more than all of the certified activists? Anyway, I’m a sucker for a blend.”
— Michael Adams, Indiana University at Bloomington, author of Slayer Slang: A Buffy the Vampire Slayer Lexicon, Slang: The People’s Poetry, editor of From Elvish to Klingon: Exploring Invented Languages
* * * * *

“This term is a combination of the words slacker and activist and generally refers to actions, largely on the Internet, to influence policy or politics that require little to no effort. Slacktivism typically is used to criticize behavior that appears to have only a marginal utility, but makes the participants feel better about themselves. Slacktivism typically includes signing Internet petitions, joining a Facebook group, changing your online profile picture to a symbolic image, mass e-mail campaigns, and resending political or policy content through social media. While the term is most often applied to Internet activities, it can also refer to offline activism that also requires little to no effort or commitment, such as wearing a ribbon or political button. The primary concern with slacktivism is that it may occupy or satisfy people with ineffective activities who would otherwise be more engaged participants in more influential forms of activism. However, these critiques may be an oversimplification, as slacktivism activities can have a measurable influence and do not preclude more direct forms of activism.”
— Kevin M. Wagner, Associate Professor of Political Science at Florida Atlantic University and co-author of Tweeting to Power: The Social Media Revolution in American Politics
* * * * *
“A bit of a mouthful, but highly descriptive. People who care about political and social causes are usually comfortable with talking about -isms. Too bad they don’t put more of their time where their hearts are. (The political scientist Robert Putnam talked about this decline of social capital in his book Bowling Alone.)”
— Naomi S. Baron, Professor of Linguistics and Executive Director of the Center for Teaching, Research & Learning at American University in Washington, DC; author of Always On: Language in an Online and Mobile World and the forthcoming Words Onscreen: The Fate of Reading in a Digital World
* * * * *
Headline image credit: Computer Keyboard by Marina Shemes via PublicDomainPictures.net.
The post Scholarly reflections on ‘slacktivism’ appeared first on OUPblog.










The ayes have it
The ayes may have it, but we, poor naysayers, remain in ignorance about the derivation of ay(e) “yes.” I hope to discuss the various forms of assent in December, and we’ll see that the origin of some synonyms of ay(e) is also enigmatic. Perhaps the word does not even deserve the attention lavished on it by linguists, but, as usual in etymology and in much of scholarship, once a question is asked, there is no way to get rid of it. It draws more and more people into the controversy and gains momentum.
The earliest known example of ay(e), at that time spelled as I, goes back to 1576. Shakespeare was born in 1564, which means that he heard aye ~ I most of his life; he used it freely in his plays. How and why did ay(e) come into existence in the second half of the sixteenth century? Guesses vary, with some conclusions looking more realistic than the others.
I should propose that such a word, almost an interjection, originated “on the street” rather than in official parlance. Twenty-five years ago, Professor Rolf Bremmer wrote an article on aye and in passing compared aye and OK. The comparison seems apt. The origin of OK became clear after years of laborious research. Some people are still unconvinced by the results, but the statement one finds in the most recent dictionaries is probably all correct. The word gained fame (or notoriety) during an election campaign, spread from its home, and in the twentieth century, mainly after the Second World War, conquered half of the world. (As late as 1938, an Englishman, in a letter to The Spectator, vented his wrath on OK for “defiling” the English language and on those who dared say that it was “OK to walk in the Zoo on Sunday.”) Ay(e) must have had a similar history: it probably rose from the lower depths, lost its slangy tinge, became conversational, and ended up among the most respectable, even if dead, words in the language, considering its use in voting (“all in favor say aye”—oyez, oyez, oyez). It is reasonable to suggest that by 1576 it had been around for a few decades.
The common opinion has it that ay(e) lacks cognates outside English, but, while examining early sixteenth-century Frisian legal documents, Bremmer found ay, aij, and aey “yes,” a word related to Engl. yea, in the answers of several witnesses. (Incidentally, English etymologist Hensleigh Wedgwood knew about the Frisian form, but today hardly anyone opens even the last of four editions of Wedgwood’s dictionary.) Bremmer considered the following possibilities: (1) Frisian borrowed the formula of assent from English, (2) English borrowed it from Frisian, (3) both borrowed it from a third language, and (4) although aye goes back to an ancient period, it surfaced in both languages around 1600. In his view, only the second option has a semblance of verisimilitude. I will not go over his arguments (the word is obviously not very old, while a “third language” is pure fiction) but say that his conclusion may need modification. Among other things, Bremmer, following the German scholar Hermann Flasdieck, mentioned the chance of a nautical origin. Flasdieck did not elaborate. Bremmer probably thought of borrowing from Frisian-speaking sailors. One can indeed imagine a formula like “Ay, ay, Sir” becoming part of international slang. (The origin of nautical words is often hard to trace: compare my old post on awning.)

Let us now look at how some other scholars tried to deal with ay(e). Their approaches are partly predictable. Since ay(e) was spelled as I, it was natural to try to derive the word from the pronoun. Allegedly, people suddenly began saying “I, I” when they meant “yes, yes.” Objections to this hypothesis have been many. Mine is hidden in the adverb suddenly prefaced to began in the previous sentence. We will see that no one asked what had made the word popular around 1560, and this, I think, is the reason why the origin of aye remains unknown to this day.
Then there is the adverb aye “ever,” and it occurred to some that ay(e) “yes” is the same word (after all, no goes back to the negation n- and Old Engl. a “ever”; the vowel was long, as, for example, in Modern Engl. spa or the family name Haas). Those who have been exposed to several varieties of English know that in many areas Kate, mate, and so forth sound as kite and mite. (So it is now in London, and I remember my futile attempts to explain to a secretary at Cambridge University that the first letter of my name—Anatoly—is an a. Unfortunately, she pronounced the town’s name as Kimebridge and could not make out what I wanted. I still have that ID for I. Liberman.) However short the path from A to I may be, I “yes” never meant “ever, always.” Yet even under the best of circumstances why should an obscure dialectal form of the affirmative take root in the capital and stay in the language? Even in the nineteenth century, Londoners did not say stition for station.
At least two etymologists attempted to trace ay(e) to longer words or whole phrases. Both scholars have good credentials, but their conjectures strike me as less than totally persuasive (to use a polite euphemism), and I have to repeat the same fateful question: What caused the appearance of the enigmatic word in the sixteenth century? It may be worthwhile to reiterate a simple but constantly ignored rule of linguistic reconstruction. Whether we investigate the nature of a sound change, a shift in grammar, or the origin of a word, we have to discover the circumstances in which the process took place. If, let us say, short vowels became long in the thirteenth century, why just then? Certainly not because short vowels tend to strive for upward mobility.

I may also add my traditional rueful comment. Before the recent publication of a bibliography of English etymology it was hard to find even the most important works on the history of any given word. In 1950 Gösta Langenfelt, in a Swedish journal (but he wrote the article in English!), proposed the derivation of ay(e) from the group ah je. In 1954 E. K. C. Varty had a similar idea and put it forward in Notes and Queries. He was unaware of his predecessor. In 1956, Klaus J. Kohler developed Langenfelt’s idea (the most sensible etymology, as he called it). He published his findings in English and then incorporated his idea into a longer work in German. He never discovered Varty’s one-page note. Even the most conscientious etymologists are doomed to roaming in the gloaming. Despite the consensus on the matter in hand among three distinguished authors, none of whom addressed the question of chronology, I keep thinking that ay(e) did not develop from a compound or a word group.
We will disregard the idea that ay is ya or ja, with the sounds in reverse order, or that it is a borrowing from Latin (so Samuel Johnson; his editor Todd questioned this hypothesis), but for the fun of it we may follow the path of Webster’s dictionary: first some vague references to Scandinavian and Celtic, then silence (no etymology in Webster-Mahn (!)), later “perhaps a modification of yea,” and the final splash: “Of uncertain etymology.” Being uncertain is an honest etymologist’s immutable fate.
Image credit: (1) Parliament adopts EU budget for 2011. ©European Parliament/Pietro Naj-Oleari. CC BY-NC-ND 2.0 via European Parliament Flickr. (2) The Aye-Aye (from Trans. of Zool. Soc.). Illustration from “On The Genesis of Species” by St. George Mivart, F.R.S. (1827-1900.) London: Macmillan and Co. 1871. Project Gutenberg. Public domain via Wikimedia Commons.
The post The ayes have it appeared first on OUPblog.










Put the debate about slacktivism to rest
Oxford Dictionaries included slacktivism on its Word of the Year 2014 shortlist, so we invited several experts to comment on this Internet activism phenomenon.
The term slacktivism is based on a question that should never have been asked: are digital activists doing anything worthwhile, or are they mere “slacktivists,” activists who are slacking off?
The word arose from a debate about what value there was to all those people who were willing to click a few buttons to express their outcry over the shooting of Trayvon Martin or participate in the ALS ice-bucket challenge—but do nothing else. While some have hailed this new era of digital activism as a great democratizing force, opening up the political process and giving voice to people in a new way, others have scoffed at its impact. “The revolution will not be tweeted,” Malcolm Gladwell famously wrote.
Asking whether digital activism is a meaningful lever for social change is the wrong question to ask for several reasons.
First, the forms and capabilities of digital activism itself are changing rapidly, so that answers to the question become obsolete almost as fast as the technologies on which they are based.
Second, the types and uses of digital activism are as diverse as traditional forms of activism, ranging tremendously in what they can accomplish. So there is no one-size-fits-all answer to this question. Digital activism, like regular activism, can be both effective and ineffective, both thin and thick.
Third, and perhaps most importantly, focusing on digital activism puts the emphasis on the tools, not the people who use the tools. The success of any change effort has always depended on the capabilities of the people who use whatever tools are at their disposal.
The right question to ask, then, is not “How good are the tools?” but instead “How good are the people and organizations who use those tools?” Lots of organizations, campaigns, and movements all over the world try to get people engaged in activism every day. Some are better than others. Why?
Conventional wisdom might argue that an organization’s ability to engage activists depends on a charismatic leader, a catchy message, or, in this day and age, its ability to leverage big data and technology. Those things all matter. But after spending two years comparing high-engagement organizations to their low-engagement counterparts, I find that what really differentiates the high-engagement organizations is their ability to create transformative experiences for their activists.
While most organizations focus on trying to get more people to do more stuff by making participation as easy as possible, these high-engagement organizations are doing more. Whether they are doing it online or offline, these organizations are carefully engaging people in ways that cultivate the motivational, strategic, and practical capacities they need to engage in further activism—as a result, everything from the kinds of activities they plan, to the way they structure their organizations, to the way they communicate with volunteers, is different. Combined with a hard-nosed focus on numbers, these organizations are thus able to achieve both the breadth and depth of activism that many organizations seek.
So let’s put the debate about slacktivism to rest. Instead, let’s begin asking how we can use technology to create transformative spaces where people can develop their individual and collective agency. If technology can create the kinds of interpersonal, transformative spaces face-to-face organizations have built for years, then we’ll be able to get the depth we want, but on a grander scale than we’ve ever had before.
Headline image credit: Protest Illustration. Public domain via Pixabay.
The post Put the debate about slacktivism to rest appeared first on OUPblog.










Slacktivism as optical illusion
Oxford Dictionaries included slacktivism on its Word of the Year 2014 shortlist, so we invited several experts to comment on this Internet activism phenomenon.
Slacktivism is a portmanteau, bridging slacker and activism. It is usually not intended as a compliment.
The term is born out of frustration with the current state of public discourse: signing an e-petition, retweeting a message, or “liking” something on Facebook seems too easy. When people engage in these simple digital acts around social causes, we wonder: are they fooling themselves into believing they can make a difference? What can these clicks actually accomplish? Do they degrade “real” social activism, or make citizens less likely to take more substantial steps?
But if we examine these digital acts a little more closely, it turns out that slacktivism is a bit of an optical illusion. Simple digital acts of participation can be wispy or powerful. They can be a dead end for social engagement, or Act 1 in a grand narrative of social mobilization. It all depends on the context and the intended purpose of these digital actions, and on how committed, organized groups of citizens make use of them.
Three features are particularly important when deciding whether an act of online participation should be dismissed as “just slacktivism.”
First is what Andrew Chadwick (2013) calls “the hybrid media system.” One major goal of most citizen activism is to attract media attention. Mainstream media still help to set the agenda for the national conversation – whether CNN and USA Today are covering Ebola or the national debt helps to magnify attention to each of those issues. In the pre-digital era, activist groups would stage rallies and send press releases to attract the attention of the media. Today, journalists and their editors often turn to digital media in order to pick out potential stories worth covering. So online petitions, likes, and hashtags can be more than just slacktivism if they are strategically used to attract media attention.
Second is the target of the digital action. All activist tactics – digital or offline – should be viewed within their strategic context. Who is being targeted, and why would the target listen? Marshall Ganz (2010) writes that “Strategy is how we turn what we have into what we need to get what we want.” Digital petitions, by this logic, can be tremendously effective or a complete waste of time. A petition to “stop animal cruelty,” aimed at no one in particular, is guaranteed to make no difference. But a recent wave of online petitions aimed at Boy Scout Troops resulted in the Boy Scouts of America officially changing its position and allowing openly gay youths to participate in the organization. Likewise, when online “slackers” submit millions of online comments to the FCC in support of net neutrality, those comments carry the force of demonstrated public opinion. When online citizens tweet or post their displeasure at corporations, reputation-conscious companies have been known to change their policies and practices.
Third is the organizational context. Some simple acts of digital engagement can indeed leave people less likely to engage in larger-scale activism. In particular, researchers have found that initial acts of token support can relieve psychological pressure that would otherwise push an individual to engage more deeply. But these same acts can also operate as the first step in a “ladder of engagement.” You start by retweeting a news article, and then signing a petition. In the process, you are added to the member rolls of a “netroots” advocacy organization. And that organization then reaches out to you, inviting you to a street protest or a local meeting about the issue. As Hahrie Han (2014) demonstrates, organizations develop activists by building relationships with them over time. These initial acts of “slacktivism” can vanish into nothing, or provide a base for civic mobilization.
The complaints we hear today about “slacktivism” are identical to an earlier generation of complaints about “armchair activism.” Where today we hear that actions performed via the Internet are too simple to make a difference, in the 1970s we heard that actions performed via the mail or the telephone were too simple to make a difference. Then, as now, those complaints were an optical illusion: the power of these activist techniques depends on what angle you observe them from. The medium through which we engage in politics matters less than the networks, relationships, and strategies we employ along the way.
Headline image credit: Large crowd of small symbolic 3d figures, over white. © higyou via iStock.
The post Slacktivism as optical illusion appeared first on OUPblog.










Slacktivism, clicktivism, and “real” social change
Oxford Dictionaries included slacktivism on its Word of the Year 2014 shortlist, so we invited several experts to comment on this Internet activism phenomenon.
Like its corollary clicktivism, slacktivism is a term that unites entrenched technosceptics and romantic revolutionaries from a pre-Internet or, more precisely, a pre-social media age as they admonish younger generations for their lack of commitment to “real” social change or willingness to do “what it takes” to make the world around them a better place.
This view rests on a connection drawn between the mounting evidence that people are spending more and more time online and the perception that political and social movements are no longer what they were. I would agree with both observations.
I would not agree, however, with the widely held assumption that online forms of sociopolitical mobilization, information-exchange, or community-building are either inferior to, or less genuine than, offline varieties. There are good, bad, and indifferent forms of online political engagement just as there are in the offline world; going on a demonstration or signing a paper petition, for example, are not in themselves signs of above-average mobilization. In this sense, then, slacktivism, defined as actions in “support of a political or social cause but regarded as requiring little time or involvement,” existed long before the Internet came of age in the 1990s with the world wide web. And slacktivism, according to this part of the definition, will exist long after the social media platforms that dominate the internet of today have made way for the next generation of goods and services. Disapproval of the way any given generation makes use — or not — of the media and communications of its time will also continue long after the targets of this pejorative term, so-called digital natives, have grown up and started to lament the way their children seem to have become disengaged from the social and political problems of their time in turn. Half-hearted or short-lived forms of political action, empty rhetoric, or fleeting movements for change are neither reducible to, nor synonymous with, any particular technological artefact or system, even a transformative and complex one such as the Internet. This held true for the Internet’s socio-technological precursors such as television, the telephone, radio, and even the printing press.

My distance from this easy dismissal of the way people use the Internet to call power abuses to account at home and abroad, or to share information and get organized by going online, arises from a longstanding concern about the way media pundits (and some parts of academe) look for easy ways to generate headlines or sell books by drawing false dichotomies between our online and our offline lives. This preference for the simple either/or tends to overlook more pressing questions about the changing face and nature of sociopolitical engagement in a domain that is being squeezed from all sides by incumbent political and economic interests. It is tempting, and comforting, to treat online mobilization as suspicious by default, but to do so, as astute observers (not) so long ago already noted (Walter Benjamin and Donna Haraway, for instance), does a disservice to critical analyses of how society and technological change collide and collude with one another in complex, over-determined ways. I would go further and argue that tarring all forms of online activism as slacktivism is a form of myopic thinking that dismisses the ways in which today’s generations communicate online their concerns about the injustices of the world in which they live. It also underestimates the politicizing effect of recent revelations that the Internet, the medium and means through which they find out about their world, is being excessively, if not illegally, data-mined and surveilled by vested governmental and commercial interests.
Assuming that the Internet, admittedly a harbinger of major shifts in the way people access information, communicate with one another, and organize, is the main cause of the supposedly declining levels of civic engagement among the younger generation is to succumb to the triple perils of technological determinism, older-generational myopia, and sloppy thinking. It also overlooks, indeed ignores, the fact that organizing online is a time-consuming, energy-draining, and expensive undertaking. This holds true even if many of the tools and applications people can draw on are offered “free” or are, arguably, relatively easy to use. Sustaining a blog, a website, or a social media account, getting people to sign an e-petition, or deploying email to good effect are activities that require know-how, want-to, and wherewithal. Moreover, mounting any sort of campaign or community project to address a social injustice at home, let alone around the world, cannot be done these days without recourse to the internet.
What has changed, like it or not, is that in Internet-dependent contexts, any sort of serious political or social form of action now has to include an online dimension, and a sustained one at that. This means that additional energies need to be devoted to developing multi-sited and multi-skilled forms of strategic thinking, deployment of human resources, and ways to make those qualities that can inspire and mobilize people to get involved work for the online environment (e.g. how to use micro-blogging idioms well), on the ground (e.g. face-to-face meetings), and in non-digital formats (e.g. in written or physical forms). It is a sign of our age that sociopolitical action needs to know how to combine age-old, pre-digital age techniques to mobilize others with those that can speak in the 24/7, mobile, and user-generated idioms of online solidarity that can engage people close to home as well as those living far away. Huge sociocultural and political power differentials aside, given that people and communities access and use the Internet in many ways at any one time around the world, the effort and commitment required of pre-Internet forms of organizing pale in comparison to those called for in an Internet age.
The post Slacktivism, clicktivism, and “real” social change appeared first on OUPblog.










Does workplace stress play a role in retirement drinking?
Alcohol misuse among the retired population is a phenomenon that has been long recognized by scholars and practitioners. The retirement process is complex, and researchers posit that the pre-retirement workplace can either protect against—or contribute to—alcohol misuse among retirees.
The prevalence of alcohol misuse among older workers is staggering. In the United States, the rate of heavy drinking (i.e., more than seven drinks per week or two drinks on any one occasion) among those aged 65 and older is calculated at 10% for men and 2.5% for women, with some studies estimating the frequency of alcohol misuse among older adults (i.e., age 50 and older) at 16% or higher. Yet another study makes the case that 10% of all alcoholics are over 60. As a point of reference, the incidence of frequent heavy drinking in the US workforce is 9.2% and the rate of alcohol abuse is 5.4%.
Estimates of future problem drinking, and predictions of how far prevalence rates may rise, may be too low, not only because of the aging of the population, but also because of shifting societal and cultural norms. There is evidence that individuals follow relatively stable drinking patterns as they age. If this is the case, the Baby Boomer generation may show a higher prevalence of alcohol problems as they enter later life than their parents and grandparents did. Moreover, some research suggests that the frequency and severity of alcohol misuse may increase in aging populations, especially among individuals with a history of drinking problems.
Recent research has suggested that retirement drinking may be influenced by workplace factors.
Richman, Zlatoper, Zackula, Ehmke, and Rospenda (2006) investigated aversive workplace conditions that could influence drinking behavior among retirees: sexual harassment, generalized workplace abuse, and psychological workload. Their analysis of a longitudinal study of employees at a Midwestern university shows that retirees who had experienced high levels of stress drank more than their counterparts who were still employed (and who were still experiencing a stressful workplace). This pattern also held in comparisons between stressed and non-stressed workers. The study suggests that for those still employed, workplace norms and regulations may inhibit the use of alcohol as a means of self-medication in response to highly stressful experiences; retirement removes the social controls that curtailed drinking while the individual was in the workforce.

Bacharach, Bamberger, Biron, & Horowitz-Rozen (2008) examined the role that positive work conditions might play in the retirement-drinking relationship, positing that pre-retirement job satisfaction might interact with retirement agency to affect retirees’ drinking behavior. Using data from an NIH-funded ten-year study of retirement-eligible and retired workers, the research team found a positive association between “push” perceptions (the sense of being forced out of work) and both the quantity and frequency of drinking (though not drinking problems), and an inverse association between “pull” perceptions (the sense of being drawn toward retirement) and both drinking frequency and drinking problems (though not quantity). The study also found that greater job satisfaction amplified the positive association between “push” perceptions and alcohol consumption, and attenuated the inverse association between “pull” perceptions and unhealthy or problematic drinking. This moderating effect of pre-retirement job valence suggests that people who are most satisfied with their jobs are likely to fare worst in response to the stress of a retirement that is unplanned or undesired. Even when retirement is the result of personal volition, it may still be associated with a sense of loss and negative emotions for which alcohol may serve as a coping mechanism.
Bacharach, Bamberger, Doveh and Cohen (2007) examined how the social availability of alcohol in and around the workplace prior to retirement may have divergent effects on older adult drinking behavior. Bacharach et al. found that problem drinkers—after retiring from a workplace with permissive drinking norms—drank less over the first two years of retirement. This population not only left the workplace, but they also dropped their regular association with coworkers who supported and encouraged drinking behavior. The findings suggest that for those with a history of problem drinking, retirement may be linked to a net decline in the severity of drinking problems.
To assess the degree to which this decline in problem drinking may be attributed to separating from a permissive workplace drinking culture, the team examined shifts in the extent of the problem-drinking cohort’s social support networks during the study period. Findings suggest that the decline in problem drinking severity was apparent among those whose social networks became smaller in retirement. Conversely, for the small number whose social networks expanded in retirement, problem drinking severity increased. The nature of the retirement-problem drinking relationship, at least for baseline problem drinkers, may be contingent upon the social availability of alcohol in the work environment from which they disengage.
While there is a lack of research demonstrating the role of strain as a mediator linking these stressors to shifts in older adults’ drinking behavior, a substantial body of evidence examining the role of stress in the origin and intensification of alcohol use and misuse suggests that strain is likely to serve as the intermediary mechanism. To the extent that strain plays such a mediating role, the same network factors are likely to also operate as vulnerability or protective moderating factors in this second stage of the mediation. As suggested by Bacharach et al. (2007), the impact of disengagement-related strain on older adults’ drinking behavior is likely to vary depending upon whether they exit into a non-work social network with more or less permissive drinking norms than those associated with their workplace or occupation.
The post Does workplace stress play a role in retirement drinking? appeared first on OUPblog.










Give thanks for Chelmsford, the birthplace of the USA
Autumn is here again – in England, the season of mists and mellow fruitfulness, in the US also the season of Thanksgiving. On the fourth Thursday in November, schoolchildren across the country will stage pageants, some playing black-suited Puritans, others Native Americans bedecked with feathers. By tradition, Barack Obama will ‘pardon’ a turkey, but 46 million others will be eaten in a feast complete with corncobs and pumpkin pie. The holiday has a long history: Lincoln fixed the date (amended by Roosevelt in 1941), and Washington made it a national event. Its origins, of course, lay in the Pilgrim Fathers’ first harvest of 1621.
Who now remembers who these intrepid migrants were – not the early ‘founding fathers’ they became, but who they were when they left? The pageant pilgrims are undifferentiated. Who knows the name of Christopher Martin, a merchant from Billericay near Chelmsford in Essex? He took his whole family on the Mayflower, most of whom, including Martin himself, perished in New Plymouth’s first winter. They died Essex folk in a strange land: there was nothing ‘American’ about them. And as for Thanksgiving, well that habit came from the harvest festivals and religious observances of Protestant England. Even pumpkin pie was an English dish, exported then forgotten on the eastern side of the Atlantic.
Towns like Billericay, Chelmsford and Colchester were crucial to American colonization: ordinary places that produced extraordinary people. The trickle of migrants of the 1620s became a flood in the next decade, leading to some remarkable transformations. In 1630 Francis Wainwright was drawing ale and clearing pots in a Chelmsford inn when his master, Alexander Knight, decided to emigrate to Massachusetts. It was an age of austerity, of bad harvests and depression in the cloth industry. Plus those who wanted the Protestant Reformation to go further – Puritans – feared that under Charles I it was slipping backwards. Many thought they would try their luck elsewhere until England’s fortunes were restored, perhaps even that by building a ‘new’ England they could help with this restoration. Wainwright, aged about fourteen, went with Knight, and so entered a world of hardship and danger and wonder.
One May dawn, seven years later, Wainwright was standing by the Mystic River in Connecticut, one of seventy troops waiting to shoot at approaching Pequot warriors. According to an observer, the Englishmen ‘being bereaved of pity, fell upon the work without compassion’, and by dusk 400 Indians lay dead in their ruined encampment. The innkeeper’s apprentice had fired until his ammunition was exhausted, then used his musket as a club. One participant celebrated the victory, remarking that English guns had been so fearsome, it was ‘as though the finger of God had touched both match and flint’. Another rejoiced that providence had made a ‘fiery oven’ of the Pequots’ fort. Wainwright took two native heads home as souvenirs. Unlike many migrants, he stayed in America, proud to be a New Englander, English by birth but made different by experience. He lived a long life in commerce, through many fears and alarms, and died at Salem in 1692 during the white heat of the witch-trials.

The story poses hard historical questions. What is identity, and how does it change? Thanksgiving pageants turn Englishmen into Americans as if by magic; but the reality was more gradual and nuanced. Recently much scholarly energy has been poured into understanding past emotions. We may think our emotions are private, but they leak out all the time; we may even use them to get what we want. Converted into word and deed, emotions leave traces in the historical record. When the Pilgrim William Bradford called the Pequot massacre ‘a sweet sacrifice’, he was not exactly happy but certainly pleased that God’s will had been done.
Puritans are not usually associated with emotion, but they were deeply sensitive to human and divine behaviour, especially in the colonies. Settlers were proud to be God’s chosen people – like Israelites in the wilderness – yet pride brought shame, followed by doubt that God liked them at all. Introspection led to wretchedness, which was cured by the Holy Spirit, and they were back to their old censorious selves. In England, even fellow Puritans thought they’d lost the plot, as did most (non-Puritan) New Englanders. But godly colonists established what historians call an ‘emotional regime’ or ‘emotional community’ in which their tears and thunder were not only acceptable but carried great political authority.
John Winthrop, the leader of the fleet that carried Francis Wainwright to New England, was an intensely emotional man who loved his wife and children almost as much as he loved God. Gaunt, ascetic and tirelessly judgmental, he became Massachusetts Bay Colony’s first governor, driven by dreams of building a ‘city upon a hill’. It didn’t quite work out: Boston grew too quickly, and became diverse and worldly. And not everyone cared for Winthrop’s definition of liberty: freedom to obey him and his personal interpretation of God’s designs. But presidents from Reagan to Obama have been drawn to ‘the city upon the hill’ as an emotionally potent metaphor for the US in its mission to inspire, assist, and police the world.
Winthrop’s feelings, however, came from and were directed at England. His friend Thomas Hooker, ‘the father of Connecticut’, cut his teeth as a clergyman in Chelmsford when Francis Wainwright lived there. Partly thanks to Wainwright, one assumes, he found the town full of drunks, with ‘more profaneness than devotion’. But Hooker ‘quickly cleared streets of this disorder’. The ‘city upon the hill’, then, was not a blueprint for America, but an exemplar to help England reform itself. Indeed, long before the idea was associated with Massachusetts, it related to English towns – notably Colchester – that aspired to be righteous commonwealths in a country many felt was going to the dogs. Revellers did not disappear from Chelmsford and Colchester – try visiting on a Saturday night – but, as preachers and merchants and warriors, its people did sow the seeds from which grew the most powerful nation in the world.
So if you’re celebrating Thanksgiving this year, or you know someone who is, it’s worth remembering that the first colonists to give thanks were not just generic Old World exiles, uniformly dull until America made them special, but living, breathing emotional individuals with hearts and minds rooted in English towns and shires. To them, the New World was not an upgrade on England: it was a space in which to return their beloved country to its former glories.
Featured image credit: Signing of the Constitution, by Thomas P. Rossiter. Public domain via Wikimedia Commons
The post Give thanks for Chelmsford, the birthplace of the USA appeared first on OUPblog.










November 18, 2014
The legitimate fear that months of civil unrest in Ferguson, Missouri will end in rioting
On 9 August 2014, Officer Darren Wilson of the Ferguson, Missouri (a suburb of St. Louis) Police Department shot and killed Michael Brown, an unarmed 18-year-old. Officer Wilson is white and Michael Brown was black, sparking allegations from wide swaths of the local and national black community that Wilson’s shooting of Brown, and the Ferguson Police Department’s reluctance to arrest the officer, are both racially motivated events that smack of an historic trend of black inequality within the US criminal justice system.
The fact that the Ferguson Police Department and city government are predominantly white, while the town is predominantly black, has underscored this distrust. So too have recent events in Los Angeles, New York, Ohio, South Carolina, St. Louis, and other places that suggest a disturbing pattern of white police personnel’s use of excessive force in the beatings or deaths of blacks across the nation. So disturbing, in fact, that this case and the others linked to it not only have inspired an organic, and diverse, crop of youth activists, but also have captured the close attention of President Barack Obama, Attorney General Eric Holder, national civil rights organizations and the national black leadership. Indeed, not one or two, but three concurrent investigations of Officer Wilson’s shooting of Michael Brown are ongoing—one by the St. Louis Police Department and the other two by the FBI and the Justice Department, who are concerned with possible civil rights violations. The case also has a significant international following. The parents of Michael Brown raised this profile recently when they testified in Geneva, Switzerland before the United Nations Committee against Torture. There, they joined a US delegation to plead for support to end police brutality aimed at profiled black youth.
The details of the shooting investigations, each bit eagerly seized by opposing sides (those who support Brown and those who defend Wilson) as they become publicly available, still don’t give a comprehensive view of what actually happened between the officer and the teen, leaving too much to speculation as to whether or not the Ferguson Grand Jury, which has been considering the case since 20 August, will return an indictment, or indictments, against Officer Wilson.

What is known of the incident is that about noon on that Saturday, Michael Brown and a friend, Dorian Johnson, were walking down Canfield Drive in Ferguson when Darren Wilson approached the two in his squad car, telling them to get out of the street and onto the sidewalk. A scuffle ensued between Brown and Wilson within the police car. In his defense, Officer Wilson has stated that Brown attacked him and tried to grab his weapon. Dorian Johnson has countered that Wilson pulled Michael Brown into his car, suggesting that Brown was trying to defend himself from an overly aggressive Wilson. Shots were fired in Wilson’s police car and Brown ran down the street, pursued by Wilson. Autopsy reports indicate that Brown was shot at least six times, four times in his left arm, once through his left eye and once in the top of his head. The latter caused the youth’s death. Michael Brown’s body lay in the street, uncovered, for several hours while the police conducted a preliminary investigation, prompting even more outrage by black onlookers.
Since Michael Brown’s death, protestors from the area and across the nation have occupied the streets of Ferguson, demanding justice for the slain teen and his family. There were nights of confrontation between protestors and police forces early on (the Ferguson Police, the St. Louis Police, the Missouri State Troopers, and the National Guard have all been deployed in Ferguson at some time, and in some capacity, since the shooting), and though there has been some arson, looting, protestor and police violence, and arrests, even of news reporters, the protests generally have been peaceful. Not only police action during these protests, but police equipment as well, has sparked criticism and a growing demand that law enforcement agencies demilitarize. The daily protests have persisted, at times swelling greatly in number, as during a series of “Hands up, Don’t Shoot” events held not just in Ferguson but in many cities nationwide, including Chicago, New York, Washington, D.C., Los Angeles, and Omaha, Nebraska, in August and September. The “hands up” stance protests Brown’s shooting, which some, but not all, witnesses have stated occurred even as Brown held his hands up in a gesture of surrender to Wilson.
Missouri Governor Jay Nixon, and other state and local officials, along with many of the residents of Ferguson, fear that if the Grand Jury does not indict Darren Wilson for Michael Brown’s murder, civil unrest will erupt into violence, producing an event similar to the Los Angeles Riots of 1992. In Los Angeles, large numbers of persons rioted when it seemed that the legal outcomes of two back-to-back criminal cases smacked of black injustice—the acquittal of four white police officers indicted in the assault of black motorist Rodney King, and the no jail-time sentence of a Korean shopkeeper found guilty of killing Latasha Harlins, a black teen. The result was the worst race riot in US history, with more than 50 people killed, the burning of a substantial portion of the ethnic business enclave of Koreatown, and at least a billion dollars in property damage.
Certainly the fear is a legitimate one. The vast majority of US race riots that have centered on black participation have occurred with like conditions as a spark—the community’s belief that a youth or vulnerable person among them has been brutalized with state sanction. The nation has witnessed these events not only in Los Angeles in 1965 and 1992; but also in Harlem in 1935 and 1964; Richmond, California in 1968; San Francisco in 1986; Tampa, Florida in 1967 and 1986; Miami in 1980; Newark, New Jersey in 1967; York, Pennsylvania in 1969; Crown Heights (Brooklyn), New York in 1991; St. Petersburg, Florida in 1996; Cincinnati, Ohio in 2001; Benton Harbor, Michigan in 2003; Oakland, California in 2009 and 2010, and the list goes on. These events all have served as cautionary tales that, unfortunately, have not resulted in either the perception or reality of black equality before the law. It is this legacy that frustrates and frightens Ferguson residents.
The post The legitimate fear that months of civil unrest in Ferguson, Missouri will end in rioting appeared first on OUPblog.











