Neville Morley's Blog

May 24, 2023

Stargazer

It’s all going a bit third-century crisis down at the pond – which is to say that we’re into day 4 of the great Emperor Dragonfly Emergence 2023, and by my count we’ve now had either 35 or 36. A single Anax imperator emerging from the depths of the pond and breaking out of its exuvia (nymph exoskeleton) is a magnificent experience – they do it at night, to avoid predators before they’ve really got the hang of flying, but I’ve seen them crawling up their chosen reeds in the evening, and then sometimes in the morning a straggler is still extending and drying off its wings before taking to the air with a sudden clatter (for insect-level values of ‘clatter’). With seven or eight of them fighting over the sturdiest reeds in the best positions, the occasion loses a certain amount of dignity, especially when latecomers start locking their legs into position not around a plant stem but around the exuvia of a predecessor, itself locked tightly onto a plant stem.

The total numbers are substantial (having checked my records; yes, of course I have records): pond was established 2020, 15-16 emperors emerged in 2021 and maybe 12 or so in 2022 (it was a bit tricky to be certain, as the tail end of their emergence season overlapped with that of the very similar Southern Hawkers). What is really striking is that this year it’s all packed into such a short period of time, rather than a couple every few nights over several weeks as happened before. My reference books (yes, of course I have reference books) tell me that this is actually normal: emperor dragonflies can spend either one or two years as nymphs, lurking at the bottom of the pond and devouring passing beetles and tadpoles – dragonfly nymphs are scary predators – and while the former emerge sporadically, the latter synchronise their emergence. I imagine the Class of 2021 must have almost all graduated by now, but we’ll have to see this evening. It was actually a relief this morning to see one broad-bodied chaser exuvia amidst all the emperors – is this the Diocletian who will restore a bit of order..?

No, the analogy doesn’t work for an instant – perhaps because I’m more focused on wondering about scholarship on classical ideas about dragonflies. It’s not exactly that I’m now planning an article on the subject; I really, really need to get the book on Thucydides, and a couple of other books to which I’m already committed, out of the way before embarking on any side projects – and anyway I have some thoughts on classical depictions of bats and the history of their association with fear and death to work up first…

But I do have an instinctive wish to start researching around something that fascinates me. There is undoubtedly a case for describing me as having a butterfly mind, randomly fluttering from one topic to the next – epitomised, perhaps, by the way I can pop up on a programme about Vergil’s Georgics to ramble enthusiastically about bees. I’m now tempted instead to claim to have a dragonfly mind: zig-zagging around in apparently random directions, but always with ruthless focus in pursuing and seizing a passing topic before moving on to the next thing. And, to mangle the analogy still further, it’s the years spent grubbing around in the mud that provide the basis for present displays of brightly-coloured aerodynamics…

Update: two of today’s emperors developed damaged wings – as sometimes happens, normally if there’s rain as the wings first expand – and, since they couldn’t take off, were eventually eaten by Olga, who has decided to play the role of Persia in this scenario.

Published on May 24, 2023 05:31

May 17, 2023

Out Of Time

I’m writing this on the train back from London on May 10th, after recording an episode of In Our Time with Melvyn Bragg and wonderful colleagues Katharine Earnshaw and Diana Spencer, on the subject of Vergil’s Georgics – yes, there was a reason why I responded to thunderstorms and flooding in the South-West last week with quotes from the storm section of Book 1. I’m posting this on the morning of broadcast – the programme is no longer recorded live on Thursday mornings, which sadly takes away a little of the high-wire thrill of over-caffeinated improvisation, but they can now do wonderful things with editing – so as not to spoil the surprise, and to reduce the amount of time people have to get worked up about quite how far out of my lane I’m wandering this time. With a bit of luck they will now be too busy laughing at my increasingly mannered delivery…

There is a perfectly reasonable explanation for how I ended up in this position which does not involve the suggestion that I am now trying to claim authority in every field of classical studies, honest. The theme which I’d suggested to the IoT team years ago was Roman agronomy more generally, with the idea that I could chat happily about Cato and Varro while others could do the heavy literary lifting on matters Vergilian. The team’s research and planning process did then turn out to involve more questions about the Georgics than I’d expected, but I explained that they’d need to find an actual Vergil expert or two for those, and just talked more about Cato and Varro (and Hesiod). And then the plan for the discussion – yes, there is a plan – gave all those questions to someone else.

To be honest, by this stage I was just relieved no one was asking me to talk about dactylic hexameters; providing a bit of general historical context, summarising the structure of the poem and getting over-excited about bees (as usual) felt like a manageable task. However, the one thing anyone who’s done the programme knows – and I think it’s also detectable in at least some episodes to the casual listener – is that Melvyn Bragg regularly decides to ignore the plan, redistribute questions or indeed ask new ones, and so I’ve had a fascinating couple of days hastily cramming an assortment of recent scholarship, just in case I was asked about something else. Though I would have drawn the line at Lucretius.

The hilarious thing is that, although Katharine Earnshaw is my colleague at Exeter, this was the first time we’d met in person for at least a year (plague, broken leg, broken foot, more plague) – and almost certainly the longest conversation we’ve actually had on matters agronomical since I arrived here, despite sharing these interests. Given how much I’ve enjoyed reacquainting myself with the Georgics – to the point of frustration that we really needed another hour for a proper discussion of their complexities and ambiguities – I’m now very much hoping that Katharine and I can build on the experience, and perhaps even retrospectively justify my being involved in the programme.

Just as Vergil’s future ploughman will uncover the rusted weapons and heroic bones of past civil wars and wonder at them, future listeners will puzzle over the question of why so many prominent Latinists got quite so cross about Morley’s inclusion….

Published on May 17, 2023 08:43

May 16, 2023

New Kid In Town

It’s actually remarkable that ChatGPT has been around for only six months; not so much because of how much it’s developed and transformed our lives in that time, but because of the volume of discussion about whether it’s about to transform our lives, restructure the entire economy, destroy traditional approaches to education etc. – it feels like I’ve been reading this stuff for ages. I’m aware that I’ve contributed to this, and am indeed egging myself on, so to speak, by attending exploratory workshops on the topic that encourage me to reflect further on its potential impact and how we should respond, and then write blog posts about it. Am I ahead of the game in trying to get to grips with this new technology and its implications – or am I getting caught up with the hype, in the way that people were once convinced that MOOCs would sweep away old-fashioned universities or got incredibly excited about the pedagogical potential of Second Life?

There is one very obvious argument for taking this one more seriously. To generalise wildly, previous techno-panics in HE have either been entirely top-down – creation of would-be disruptive new forms of course delivery driven by venture capital and university leadership trying to jump on the bandwagon, such as MOOCs – or tangential – identifying or imagining that The Kids are doing something new, like disappearing into virtual worlds, and trying to harness this to higher education. What’s different with ChatGPT is that it’s bottom-up: students are using it already (15,000 visits to the website using U. of Exeter WiFi in January – and I don’t use the Exeter WiFi for this, so it’s not me) – and they’re using it for their assessments.

They’re not human! They’re here already! You’re next!

There is, I think, a reasonable argument to be made that the biggest threat from ChatGPT in relation to assessment is not the thing itself but the likely reaction to it; namely, either a reversion to traditional unseen in-person exams, to check that the plausible but vacuous bullshit is the student’s own regurgitation of short-term memory, or a resort to technological counter-measures (note: professors saying on social media that they asked ChatGPT to identify whether it had generated a given bit of text is not a great look for higher education). As I’ve said before, if ChatGPT can perform well in the assessment tasks we set, given that it is a generator of plausible but vacuous bullshit with no actual understanding, then the problem lies with our assessment tasks, even if we then try to ChatGPT-proof them.

It’s also potentially an issue with marking criteria and how we use them. I can see the logic of breaking things down into individual components like ‘knowledge’, ‘understanding’ and ‘argument’ to try to explain to students what we’re looking for in their work – but the risk is that, firstly, they start to think of these as separate whereas actually they are interdependent, and secondly we start to use them in the same way, giving an essay credit for having a load of information without any trace of understanding, or rewarding a bold argument even in the absence of any supporting evidence. As I say to my students early in the first year, at university level we take it for granted that you should know the material, it’s what you do with it that counts – but to be honest I don’t entirely follow through on that in my marking, not least because the criteria encourage me not to.

So, we need to think again about exactly what we expect students to learn, and make sure that this is what we assess – and consider how much of this can be generated automatically, and how much this matters. Some of the more positive/optimistic takes on ChatGPT suggest that we could think of it as a tool like a spell-checker; we don’t penalise students for using one of those, and we might reward them for a well-presented piece of work as a result. If ChatGPT helps produce a clear, accurate, well-written version of an analysis based on the student’s own knowledge and understanding, then that’s not a problem unless we’re over-valuing form of expression rather than content.

Obviously it is a problem if a student resorts to ChatGPT to substitute for their own knowledge and understanding, since at that point we would be giving them credit not for the effective use of a tool but for getting some-one/thing to do all the work for them. (How far we can tell the difference is another thing). It is worth thinking about motive here. There is certainly a tendency for some academics to adopt a basic stance of “all/most/many students are lazy/cheating bastards who will seize every chance to get away with what they can”. I tend to go to the opposite extreme of assuming that the ones who resort to plagiarism are generally stressed, panicking, struggling, confused and/or desperate; they may not have done enough work, they may not be up to the task, but if they know they’re doing wrong – not all do – this is in the spirit of not feeling they have any choice, rather than of “heh heh this’ll fool the idiots”. I imagine ChatGPT gets fired up in a similar spirit – but I would really like to have some data on this.

Of course, as with plagiarism, a generated essay isn’t likely to do the student an enormous amount of good, except insofar as their sole goal is to submit something with a chance of passing. Part of the danger of the ChatGPT hype, especially the constant reference to ‘Artificial Intelligence’, is that it might encourage confidence in something that is actually rather crap (cf. claims about bespoke essay-writing services). The workshop I attended yesterday did, perhaps just for reasons of balance, include some discussion of possible positive uses of ChatGPT, for example to produce introductory literature reviews and summaries of complex ideas. Really? The thing has no knowledge, no understanding; it generates text strings that are statistically plausible in terms of word sequence but have no necessary connection to reality. I could use it to generate research ideas by spotting points where it goes seriously off-piste; if you don’t have enough prior knowledge, you’re not going to know that – so selling it as a source of basic information is actually more dangerous than selling it as a high-level random ideas generator.

So, the student turns in the work with a fair chance that it’s so terrible they fail anyway – and on top of that they haven’t learned anything that the assessment was supposed to teach them, or even any lessons from its failure; whatever happens, it’s just file and forget. But of course that is true of most assessments anyway; students and lecturers alike treat them as one-off moments of credentialling, even if the latter pay lip service to higher ideals – outrage at the possibility of ChatGPT involvement is much more about the idea the student has got a mark they don’t “deserve”, rather than about the failures of the learning process and the possibility that the student is now in an even weaker position going forward, whether to further study or to work.

Part of my solution to all this is ‘slow assessment’ – a process of ongoing development, feedback, revision, discussion. Just having a single draft-feedback-revision system in my final-year modules has meant that most students – the ones who engaged with the process and actually come and talk to me – produce substantially better work, and my hope is that direct experience of responding to feedback will be valuable in itself, besides the opportunity to help them understand where and why their arguments aren’t working properly, to highlight really good things and so forth. This could all be developed further; collective discussion of work in class, for example, in lieu of conventional ‘here is a load of information’ presentations. The exercise that is in some ways closest to this in form, the dissertation, is in other ways the furthest removed, insofar as it lacks the structure for regular review; it’s a struggle to get even the best students to engage with meetings or start writing early in the year.

In the case of the dissertation, this is exacerbated by baroque rules about how much draft work we’re allowed to look at, for fear of creating an unmanageable workload if all students asked for it. Proper assessment simply is expensive in terms of time, even if it’s more rewarding (and hunting down plagiarism or non-existent ChatGPT references is both time-consuming and very unrewarding). Part of the reason that ChatGPT feels such a threat is that even relatively progressive, down-with-traditional-exams types like me had got into a reasonable routine for working through scores of assessments quite efficiently, by setting tasks so I can quickly see whether or not the student is doing what I expect. Slow Assessment is more individually focused, and hence takes longer and demands more concentration – and, no, out-sourcing it to ChatGPT is not the answer. ChatGPT is never the answer…

Just to illustrate that final point, one of the activities in yesterday’s workshop was to play around a bit with the thing – I guess they weren’t sure how much experience workshop participants would have had – and I took the opportunity to ask it about my own research. This was, as always, illuminating…

Neville Morley has published numerous articles on a wide range of topics in ancient history and classics. Some of his highly cited articles include:

1. Morley, Neville. “The Ancient Economy and Modern Historiography.” Classical Antiquity 21, no. 2 (2002): 283-324. doi:10.1525/ca.2002.21.2.283.

2. Morley, Neville. “Class Conflict and the Roman Republic.” Historia: Zeitschrift für Alte Geschichte 45, no. 4 (1996): 390-416. doi:10.2307/4436295.

3. Morley, Neville. “Beyond the Polis? Greek Regionalism and the Archaeology of the Greek Countryside.” World Archaeology 29, no. 3 (1997): 360-374. doi:10.1080/00438243.1998.9980392.

4. Morley, Neville. “Thucydides and the Idea of Progress.” Classical Quarterly 57, no. 2 (2007): 375-388. doi:10.1017/s0009838807000504.

5. Morley, Neville. “The Invention of the First Professional Archaeologists in Britain and their Mediterranean Interests.” Antiquity 74, no. 286 (2000): 319-327. doi:10.1017/s0003598x00060155.

These articles were published in some of the top journals in the field of classics and ancient history, such as Classical Antiquity, Historia, World Archaeology, and Antiquity.

Published on May 16, 2023 01:56

April 27, 2023

Everyone Is Everybody Else

And so farewell!* To Göteborg (and its unbelievable range of amazing imperial stouts, above all), to the European Social Science History Conference for another two years, and, slightly abruptly, to my role as one of the co-chairs of the Antiquity network of said conference. I think I’ve been doing that role for about ten years – the fact that it involves only short bursts of activity every other year, rather than anything more sustained, means I keep forgetting. On the one hand, it isn’t a huge task and it does offer the chance to try to gather ancient economic and social historians together to exchange ideas and, occasionally, remind historians of other periods that we exist, plus an excuse for visiting some great European cities; on the other hand, I can imagine that from the outside it might look a bit like an attempt at hanging onto a position of power and influence, gate-keeping and extending patronage etc. Which it never has been, but that still meant it was starting to feel like time to pass things on to someone else. It’s just been a bit quick, from an informal chat with a potential successor one day to realising the next evening that this was my last time co-organising our bit of the conference. The dogs bark; the cavalcade moves on.

I don’t know if anyone has actually attempted it, but if not it could be an interesting research project to analyse the ESSHC as a microcosm of trends in historical studies over the last twenty years. There is some obvious inertia, insofar as the structure of (mostly thematic) ‘networks’ with their allocations of sessions was pretty well set at the beginning – though changes in the size of allocation from one conference to the next may reflect the number and quality of submissions in different areas rather than just the hustling capabilities of the network chairs. Within different networks one might spot trends, as upcoming researchers seize their opportunity – I can’t say I can see any obvious shifts in ancient studies, but certainly there have been some significant ground-breaking panels over the years, giving a sense of new directions in research.

Conversely, when I started attending the conference, my go-to alternative when there weren’t any antiquity sessions – the other bit of the programme I would scrutinise carefully – was the Theory network, which had thought-provoking discussions of different aspects of historiography, methodology and history of scholarship, with a definite postmodernist slant. These days, there’s little sign of anything like that; there are still Theory sessions, but nothing to tempt me to forego sight-seeing during antiquity-free time slots. I’m sure the debates are significant for those engaged with them, but as an outsider I don’t get much of a sense of identity or urgency, of the sorts of issues that demand attention from all other branches of historical study. It’s all a little sad.

I seem to have spent a lot of my time lately in somewhat bleak contemplation of my professional career at half three in the morning – which is to say, as a proportion of my waking hours it’s pretty small, but it does have an out-sized effect on the rest of the day. I suspect this has a lot to do with my still-broken foot, and associated feelings of frustration, uselessness, age and physical decline; the fact that I have to spend so much time waiting around for buses or limping up and down the endless hills that constitute the Exeter campus means I am even further behind with all the things I’m supposed to have done. There’s nothing like availing oneself of the mobility assistance at Heathrow Airport on the way out to Göteborg, entailing long periods of being left in a corner with occasional bursts of being trollied around like a sack of potatoes, to promote feelings of doubt and erosion of self-image.

One of the things that occurred to me in the small hours one morning was that, idiosyncratic as many of my career choices might appear – my list of publications should probably come with its own epitaph, ‘It seemed like a good idea at the time’ – they are perhaps all too obviously reflections of my graduate experience. Economic and social history, and historiography; strident assertion of the primacy of social-science methods over fuzzy inductivism; miscellaneous bits of Theory, especially Weber and Marx; a tendency to obsess about concepts and terminology rather than worrying much about evidence: could I be any more Cambridge?

And then I started to wonder whether this was actually true – not that I am not a random amalgam of those things, because that’s obviously true, but that their lineage might not be quite so obvious to most people. Not so much ‘Cambridge ancient history’ as ‘Cambridge ancient history in the late 80s and early 90s’. At the time, it seemed natural and immutable: we were following in the footsteps of a (frankly terrifying) lineage, Osborne and Rathbone and Morris and Woolf and Alcock and Edwards and von Reden and Hall and all, taught by Cartledge and Garnsey and Hopkins and Snodgrass, looking back to Finley and Jones. How could I not then end up dedicating a chunk of my time over more than a decade to a conference focused on social-scientific approaches to history?

It wasn’t/isn’t just about subject matter and research topics; it’s not even necessarily about the adoption of specific methodologies or theories. It’s more an attitude: of alertness, to concepts, assumptions, forms of analysis, rhetoric – one’s own at least as much as anyone else’s – and of a kind of playfulness, wanting to ask questions more than to offer answers and, if you have to choose, to be interesting rather than necessarily right. And from the outside, I imagine it also looked like a particular sort of arrogance, a sense of being (or aspiring to be) clever in a distinctive manner, not necessarily superior in everything, but certainly superior in the things that really mattered, namely quality of argument and theoretical sophistication. All of which was carefully inculcated through the annual Loxbridge meeting – yes, children, once upon a time the Annual Meeting of Postgraduate Ancient Historians was limited to students from London, Oxford and Cambridge – where we were deployed as proxy forces in the intellectual struggles of our teachers against Those Boring Oxford Empiricists.

It was, shall we say, a special time, at least within this very small and self-obsessed world. It was brought powerfully back to life just last week on Mastodon, when the sociologist Kieran Healy commented on the recent (excellent) essay by Carlos Noreña on the work of Paul Veyne that he (Healy) had bought the translation of Bread and Circuses when it appeared, and had been dismayed to find that substantial sections of the original – the theoretical sections – had not been included on the grounds that English-speaking audiences would have no interest in them. Been there, had that reaction – and glibly concluded that it was bound to happen if you asked someone from Oxford to take charge of such a project. (Note: I have absolutely no idea whose decision it was to abridge the book, but in 1992 the culprit seemed all too obvious).

And now? Not so much. Economic and social history, let alone cultural or gender history, are no longer pitted against traditional political history, but are all seen as different facets of the same enterprise; material and textual approaches are no longer bitter rivals but happy bedfellows; making use of ‘theory’ or adopting ideas from the social sciences are no longer radical gestures and/or the devil’s work, but just the sort of thing one does when pragmatically appropriate, even at Oxford. Indeed, the idea that there is a distinctive ‘Oxford’ approach to ancient history, whether or not defined in terms of how ‘theory’ might be evaluated, seems passé – likewise the notion of a distinctive ‘Cambridge’ approach.

It’s not that there is no controversy, but it seems to have shifted almost entirely into the realm of the public: the wider cultural/political significance of classical antiquity, and how one should respond to this, especially but not only in relation to current ‘culture war’ issues. The debate is less about how one interprets ancient history, and more about how far one orientates the study towards contemporary issues; it’s about the attitude of the researcher more than the implications of their research. Perhaps because the gap between actual academic knowledge of classical antiquity and the assumptions about it within non-academic discourse is so enormous, the two can lead more or less separate lives. We put our energy into arguing about how far it matters that, say, popular conceptions of Roman imperialism or demographic changes in late antiquity are tendentious and problematic, where once we would spend a lot more time arguing about how to understand imperialism or demographic change. It’s not that the contemporary politics of antiquity are not important – but they aren’t the only thing we should be arguing about.

One could argue that the relative absence of controversy in most areas of ancient history – and the absence of clear identities or distinctive methodological positions associated with particular universities – is a positive development, as the discipline has matured and accepted the relevance of a range of approaches rather than dogmatically insisting on a kind of (locally-defined) scholarly purity. It’s a sign of the declining power of the Doktorvater (the patriarchal overtones seem especially relevant in this context) in defining the research of his graduate students, determining their topic and approach; grad students have far more freedom to shape their own studies rather than having to conform to the principles of a ‘school’ in order to survive the experience.

And there is the obvious flattening-out effect of globalisation and the Internet; quite simply, we know much more of what is going on everywhere else, rather than – as in my day – relying on journal reviews (the example that comes to mind is Dominic Rathbone’s reviews in JRS of the monumental multi-volume projects by Italian Marxists in the 1970s and 80s), or indeed on occasional conferences and the ‘Loxbridge’ event to learn how they do things differently elsewhere. Researchers move university and country more often, taking their assumptions with them but also blending them with what they encounter elsewhere. The most powerful ‘networks’ – above all, the networks of Anglophone scholars – become ever more dominant, because they are now more accessible and better able to exert influence beyond their locality, not least in making their assumptions a default.

In terms of conventional globalisation theory, then, we have reflection and reflexivity; as researchers we are now influenced by, and conscious of, ancient history as an amorphous ‘global’ discipline (which conceals the continuing existence of hierarchy and power dynamics), and likely to adopt many of its assumptions, terminology and agenda in order to get to participate. Trying to restore a sense of local academic identity, resisting such homogenisation, is obviously regressive and parochial (and completely pointless) – and would tend to have the effect of reinstating problematic academic hierarchies, forcing students to conform to their supervisors’ prejudices and reducing their prospects in a globalised academic job market.

In retrospect, the Oxford-Cambridge rivalry of my postgraduate years was basically a spat between two local traditions of cheese-making, each convinced that theirs is the only true cheese and their tastes the only reliable arbiter of what makes for a good cheese – and that’s what they trained people to produce. They have now both been exposed to global tastes and the power of the market, and have actually also done a bit of constructive knowledge exchange, learning from one another. Clearly it’s no more than a sign of my indoctrination as an artisan in one specific tradition, before the Internet, that I do feel that something has been lost – that ancient history these days offers a wide choice of cheddar and US-style IPA, but is lacking in Pont l’Évêque and Berliner Weiße.

If I have to think of myself as an outmoded product of antiquated academic practice, it is going to be as a strong, smelly, not-to-everyone’s-taste-but-appreciated-by-a-few-connoisseurs sort of outmoded product…

*Yes, I did start writing this a fortnight ago. It’s been one of those fortnights.

Published on April 27, 2023 03:28

April 6, 2023

Almost, But Not Quite, Entirely Unlike Tea

One of the (relatively few) things I hate about marking student work is the sinking feeling when a suspicion of plagiarism starts to form; the moment when something like an abrupt switch of style or changes in spelling or the wording of a phrase makes you look over the pages you’ve already read and start spotting more such possible indications. Turnitin, as we all know, is moderately useless – it generates lots of false positives (well-formatted bibliography entries and properly referenced quotes) while missing stuff that can be found with a quick google – so this suspicion implies the need to invest a load of time in identifying and checking likely sources and marking up the exercise, while experiencing a general feeling of annoyance and disappointment.

Well, I’ve now found something worse: the suspicion that a substantial portion of an essay may be fake. The alleged quotation from, say, Cicero, that doesn’t actually sound much like Cicero; sometimes with a precise reference that doesn’t lead to anything like the quote, sometimes with an unhelpful reference (Republic 20, anyone? Check 1.20, 2.20, 3.20; check 1.XX (=33) to see if that’s it; check p.20 in the edition cited; check p.20 in any other edition to hand…), sometimes with no reference at all. Google key phrases; google key terms; search sections of the text where the topic might be discussed. All of this in the knowledge that it’s impossible to prove a negative, that there could be a bona fide translation out there that hasn’t been digitised so wouldn’t show up, so negative results don’t prove anything. The one thing to be said is that years of experience in hunting alleged Thucydides quotations has given me some relevant skills, but also a sense of their limits.

The ‘glass half full’ approach to all this is to think that at least the students in question have grasped the importance of engaging with ancient sources and supporting their argument with evidence; they’ve just skipped the usual step of finding some actual passages in the sources that make the relevant point, and have simply made them up.

Or: have had them made up – not as deliberate fakes, because why on earth would anyone do that in this situation, but they’ve outsourced the writing of the essay to someone, or something, that’s indifferent to questions of actual truth. My suspicions in this direction were aroused by a bibliography that included a load of stuff that I hadn’t recommended and didn’t recognise – which is of course usually grounds for praise, that the student has gone off and done their own research.*

Okay, suppose you were marking an essay on how the Romans came to terms with autocracy in their political thinking, and came across the following:

Morley, N. (2010) “Tacitus and the ‘state of exception’: how the institution of the dictatorship legitimised the Principate”, Polis: the journal of Greek and Roman political thought 14: 3-47

Relevant topic; credible journal; not exactly one of Morley’s specialist topics, but he writes all sorts of random stuff, and the implied use of modern political theory is entirely typical; nothing to worry about. Except that of course it’s completely fake; no such article exists. I’ve made up this example (but I really ought to write it some time…) as I’m not sure about the ethics of quoting the actual imaginary references from a dubious student essay, but I can assure you that they were equally convincing at first glance: the sorts of scholars you’d expect to be writing on this topic, publishing where you’d expect them to publish.

The good news is that such fake references are much quicker and easier to verify than the quotes are: check the journal’s archive, check the webpage of the supposed author. If you feel like offering the benefit of the doubt, check the title on Google Scholar. What I find remarkable – the reason I didn’t at first question these entries in the bibliography – is the level of knowledge required to create a plausible fake. You need a sense of the names of people likely to work on such a topic, and of the likely journals they’d publish in, and of the form and content of article titles. If you’ve done that amount of reading, why not just write the bloody essay properly?

Well, obviously one possibility is that the student hasn’t done that reading. It seems plausible if not rather likely that this is a Large Language Model (i.e. an ‘AI’ chatbot) scanning the vast corpus of published words and constructing a plausible-seeming reference on a statistical basis: X is a scholar regularly associated with topics in ancient political thought (not necessarily differentiating between Greek or Roman), Polis is a journal that publishes articles on ancient political thought, this string of numbers is associated with something published in Polis, this string of words resembles the strings of words in quote marks associated with Polis, and so forth.

I was suddenly reminded of the Nutri-Matic machine in Douglas Adams’ The Hitch-Hiker’s Guide to the Galaxy (the original radio series, obviously).

The way it functioned was very interesting. When the Drink button was pressed it made an instant but highly detailed examination of the subject’s taste buds, a spectroscopic analysis of the subject’s metabolism, and then sent tiny experimental signals down the neural pathways to the taste centres of the subject’s brain to see what was likely to go down well. However, no one knew quite why it did this because it invariably delivered a cupful of liquid that was almost, but not quite, entirely unlike tea.

One might apply that judgement equally to the bibliographic entries and to the essay as a whole: looks good, but undrinkable. To extend the metaphor: if as a marker you just glance at an essay, rather than properly tasting/testing it, you might be fooled – but the proper level of scrutiny implies a substantial amount of extra time, and a permanent bad taste in the mouth.

The alternatives, given that ChatGPT has apparently got better at faking references in just a couple of months compared with my last discussion of it, are equally unpalatable. One is a reversion to traditional unseen in-person short-term-memory-regurgitation exams, on the basis that they’re harder to fake even if they don’t actually assess what we’re supposed to be assessing. Another is outsourcing assessment to other ‘AI’ programmes; well, I tried the two best-known ones, ZeroGPT and OpenAI’s own AI classifier, and both estimated only 25% AI-generated, ‘this was probably written by a human’ – NOT flagging up the fake bibliography entries. They look genuine to an LLM; so another LLM isn’t going to quibble. The future will see Chatbots serving up glasses of slurry and other Chatbots confirming that they are indeed perfect cups of tea.**

The obvious risk is that the conversation becomes wholly driven by ‘How do we stop students cheating by using Chatbots?’ and/or ‘How do we set things up so we don’t invest lots of time trying to establish whether or not a Chatbot may have been used?’: wholly defensive, rather than focusing on the best way to assess the skills and knowledge we want to assess. I’ve been using ChatGPT content in classes recently as the basis for helping students develop critical analysis – here is a plausible-looking bit of academic prose, now identify its deficiencies – and at the same time exploring the underlying processes insofar as we can identify them (why is it generating occasional howlers within the usual mass of vague assertions?). This could be a way forward in assessment as well: focus more, at least in the first couple of years of university, on evaluating critical skills (just as we already use source analysis of individual passages to build towards using lots of evidence to support arguments).

At the moment, you could say, we ask history students to focus on producing simulacra of academic arguments; 2000-word imitations of the sorts of stuff they’re reading, but unavoidably thin and watery at best. It’s partly, I guess, a kind of ‘learning by doing’, on the assumption that they will pick up the skills and understanding required for genuine high-level historical analysis by repeatedly trying and failing to put them into practice – rather than focusing on teaching and assessing those ‘component’ skills first. Is there a fear that they will be bored or alienated by being asked to do detailed exercises rather than getting to pontificate about the Causes of the Fall of the Roman Republic in 2000 words? An old-fashioned assumption that, whatever we say in public, history is really all about content and old-fashioned sweeping surveys and narrative – or at any rate, that’s what we want to teach ‘cos it’s more fun?

If ChatGPT can produce plausible, passable equivalents of the sort of ersatz historical analysis we get first- and second-year students to generate, AND those exercises are not wholly reliable means of getting those students to the level of producing genuine historical analysis, then what is the point? Especially if we are expected, for credentialling purposes, to devote serious time to trying to determine whether a given piece is human-generated mediocrity with dubious references or AI-generated?

The basic problem – among the basic problems – is that, while it’s relatively straightforward to determine that quotations, let alone references, are fictional, it isn’t obvious how to demonstrate that the author is human or machine. For the specific essays I’ve been reviewing, we decided in the end that this didn’t matter; they’re the first part of a two-stage assessment, in which students get feedback on their first draft and revise the work for resubmission – and so there would be little point in requiring them to re-do the draft for a capped mark (the usual penalty for academic malpractice short of definitively-proven cheating on a grand scale), which I’d then have to spend time marking, when they can instead be expected to sort out these issues in the revised version, i.e. do what they would have to do anyway but make it very clear that they’d recognised the problem and how to remedy it.

Obviously this doesn’t work for a one-off assessment where the question is whether or not to penalise, depending on whether guilt can be established. Which is perhaps another reason for focusing assessment on the development of actual skills, including learning from feedback, rather than just the award of marks. ‘This is an undrinkable cup of tea; go away and do this to make a better one’, rather than just throwing the cup at the machine. Share and enjoy!

*So long as it’s not another Canadian Masters thesis or US undergraduate thesis, things which appear ever more frequently as students respond to our attempts at setting less generic questions, to encourage independent thinking, by looking up the limited number of things on the Internet whose titles suggest they cover exactly the same ground, regardless of age or credibility…

**Would an LLM flag up this gem, produced by ChatGPT? “Socrates was among those who were accused of opposing the Thirty Tyrants and was forced to drink hemlock in 399 BCE as a result.”

Published on April 06, 2023 06:31

March 24, 2023

The Bots Are Back In Town

I’ve been doing a lot of muting on the Twitter this morning – over eighty separate accounts, all of which tweet out short extracts from that early P.G. Wodehouse story I’ve discussed before, as well as other stuff that I can’t be bothered to look up. For the most part, that’s all they do; whereas the accounts I muted back in the autumn generally sent out images or gifs as well as text, apparently advertising things like betting sites for the World Cup, the vast majority of these tweet just individually meaningless fragments of text (the exceptions are a sub-group that include pictures of anime girls).

On the basis of a cursory survey as I muted them, there are at least three distinct categories. There are those with no avatar, no information, not many followers – although, supposedly, they were created 8 or more years ago (can that be faked?) – that just tweet text. There are the ones that tweet anime girls as well as text, that otherwise have the same characteristics as the first group. And then there’s a bunch that look slightly more like normal accounts, with avatars (mostly NFT apes), a bit of information (frequent mention of #crypto, #nft, #C+programmer and various sports), and follower counts in the tens or scores. But they just tweet out fragments of text, so clearly aren’t real either.

Screenshot of a Twitter account

It’s not trolling – I see these and get annoyed simply because I regularly search for ‘Thucydides’ and find them clogging up the results, but you would have to be searching for some pretty specific terms to encounter them normally. It’s more like a kind of pollution – the green algae bloom that rapidly takes over ponds and streams when the conditions favour it; the digital ecosystem clearly includes some mindless organisms that churn out toxic rubbish in the background, invisible unless you go looking for them. It’s not harmless; there are presumably servers out there, wasting energy in supporting this activity.

Are they a bot reserve army, ready to be summoned by ‘Boost Your Follower Count’ businesses when they need to astroturf an issue or make a dodgy research centre seem legitimate? Are they in fact micro-components of a vast collective entity, whose activity only appears mindless and directionless if you don’t have the bigger picture? Are they the digital equivalent of actinomycetes, breaking down the debris of human culture into compost? Am I the only person who’s aware of this phenomenon, and is it a harbinger of the apocalypse?

Published on March 24, 2023 13:24

March 14, 2023

Rule The World

According to my wife, my falling over and breaking my foot was my body, or perhaps the universe, telling me that I need to slow down and look after myself. I’m not sure how far this is a genuine philosophical position and how far she is grasping at any available argument to try to get me to slow down – she said similar things about the Long COVID that’s drained my energy and intellectual capacity over the last few years – but that could likewise be interpreted either way…

The obvious retort is that, if so, either my body or the universe have got this completely wrong; the last thing that’s going to temper my tendency to overwork is making it harder and slower for me to get done the things that have to be done, even with a strict definition of what those are. If teaching and student consultations take a whole day where once I would have had hours in between to get on with emails and teaching prep, not because I have more commitments but because it takes me longer to get between places and I’m tired out afterwards, then the emails and teaching prep just get bumped to the next day. If it takes me all week to prepare teaching and deal with admin, with no time for anything research-related (reviewing article submissions, for example), then either I do no research or ‘service’ (bad) or I have to find time somewhere else. Making me feel constantly overwhelmed and behind with everything is not the way to make me relax and take things a bit easier; both my body and universe should know that by now.

I would readily concede that I do in fact have a problem, if not multiple problems; the incentive system of academia plus my ingrained work ethic and obsessive tendencies are not a healthy combination. At the end of my life, am I going to regret not publishing an extra article or two rather than spending more time with family and friends? Well, kinda, yeah. Am I willing to be a bit more selfish, and limit my covering for colleagues, supporting students, reviewing articles and the like in order to create more time for writing within a more normal working week? No. I am well aware that something has to give a bit of ground here, either my drive to be an exemplary colleague and dedicated teacher or my intellectual ambitions, but can never decide; in practice, it’s the research and writing, because that is never as urgent, but that is more frustrating in the longer term. We do it to ourselves, we do…

The point of this blog post is not in fact just the usual self-pitying solipsism. It was prompted by an interesting piece on Crooked Timber by Ingrid Robeyns, How To Restore Work-Life Balance in Academia. She makes a series of good points about the importance of work-life balance, the ways in which a culture of overwork has become embedded in universities (in part as a means of making up shortfalls in funding, but also because of skewed incentives), the problem of using competition as a way of allocating resources and so forth. I was particularly struck by one of her early comments, explaining why this matters, between (1) life is more than work and (3) this creates systematic bias against those who cannot overwork, especially those with caring responsibilities:

Second, even if a particular individual would prefer to work more, this should not become the norm. And that is the problem: structural overwork has become totally normalised in academia. It is endemic. That means that it is very difficult to refuse to go along with doing structural overwork, as the typical amount of work expected from an academic can’t be done in 40 hours. If in the current system one sticks to 40 hours, one is very likely to either let down one’s students, or the junior scholars whom one mentors and the service one provides to the field, or else to give up (most of) one’s research time.

Put another way: people like me are part of the problem. It’s not that I want to work more, but I do want to do things that require more time than I’m paid for, and so have always put in the extra hours; until recently, I’ve managed to produce the publications and be a good colleague and try to do my bit for the discipline and make attempts at being a public scholar of some sort. And that has perhaps contributed to a wider normalisation of this level of activity, setting unattainable or at least unhealthy standards for others – and, even though perhaps my body is now having to pay out on a load of cheques that were written years ago, the person most convinced that this level of activity is normal is me. And I should be stopped.

Robeyns includes a range of suggestions for tackling the situation, including the knotty problem that so much of this is about the mostly free choices made by academics, even if those are constrained by norms and expectations. I am struck, however, by how individualised the suggested remedies on this latter issue are – how far they assume that academics, if they can just be persuaded to reflect, will then change their habits. The free rider problem – the people who minimise their non-research commitments, never step forward to help out in an emergency, submit lots of articles but shirk reviewing etc. – is to be solved through general agreement that we all ought to do our fair share and adopt a principle of ‘reciprocity + 1’ (if you submit an article and get two reviews on it, you should do three reviews that year). I can’t help feeling that unless something more is done – maybe not just listing reviewing activities as well as publications on a cv or annual review form, but adding an expectation that serious imbalances will be queried by appointment committees or line managers – then this will have only a marginal effect, probably allowing the worst offenders to do even less as the average well-meaning academic will do slightly more. (The ‘reciprocity + 1’ principle creates slack in the system to compensate for junior colleagues not being invited to do reviews – but there’s nothing to stop the slack being abused by others).

As for the ‘habitual overworker’ problem – we do need a better name – the only suggestion is for such people to take a long walk, or multiple walks, and reflect on their priorities, to consider whether they should make changes to their ambitions and commitments, for their own sakes. Well, I’m not currently capable of taking any long walks… “Just ask yourself this question: assume you continue as you do until you retire, and on the day you retire, you die. Would you be satisfied? Or would you have regrets? If so, it’s time to reconsider your priorities.” But what if I would have deep regrets at not having managed to write all the books I want to write?

What this discussion brought to mind – which is why I’m bothering to write this post – is the Rule of St Benedict. I’ve loved this work ever since I encountered it as a teenager who’d developed a fascination with English monasticism; any inclination to follow such a path in life has been rare and brief, as in lots of ways I am entirely unsuited to it – but it now occurs to me that it might be worth considering that unsuitability further. One way of understanding the Rule is that it is less about how one dedicates one’s life to God, and much more about how one attempts to manage a community of dedicated people – any community and dedicated to any object. It’s not about the problems of the religious life; it’s about the fact that people are people, and getting them to live together harmoniously is hard.

Benedict’s rule, while establishing high standards for behaviour, is constantly tempered by awareness of human weakness and individual needs, and seeks to anticipate the problems these might cause. “Although man’s nature is of itself drawn to feel pity for these two ages, that is, for the old and for children, yet it is fitting that the authority of the Rule should provide for them. Let their weakness therefore be always taken into account, and the rigour of the Rule with regard to food, be by no means kept with them. Let a kind consideration be had for them, and let leave be granted them, to eat before the regular hours” (37). “We think it sufficient for daily refection, both at the sixth and ninth hour, that there be at all seasons two dishes, because of the infirmities of different people; so that he who cannot eat of one, may make his meal of the other” (39). “And although we read ‘that wine is not at all the drink of Monks’, yet, because in these our times, they will not be so persuaded, let us at least agree to this, not to drink to satiety, but sparingly” (40).

On the one hand, Benedict is all too aware of the free rider problem. “And when they rise to the work of God, let them gently encourage one another, because of the excuses of those who are sluggish” (22). “The Brethren are so to serve each other, that no one be excused from the office of the kitchen” (35) – granted, with an exception for those whose work is “of greater profit”, which might look like a get-out clause for the workshy or those who consider service beneath them, but that’s only if the community is big enough to cope, and “profit” is defined by the needs of the community as a whole. “Let them bear patiently with each other’s infirmities, whether of body or of mind. Let them contend with one another in the virtue of obedience. Let no one follow what he thinketh profitable to himself, but rather that which is profitable to another” (72).

The answer is not that everyone needs to be forced to work equally hard, as failure to meet the expected standard may have different causes. “If any one shall be so negligent and slothful as to be either unwilling or unable to meditate or read, let him have some work imposed upon him which he can do, and thus not be idle. To the Brethren who are of weak constitution or in delicate health, such work or art shall be given as shall keep them from idleness, and yet not oppress them with so much labour as to drive them away. Their weakness must be taken into consideration by the Abbot” (48). Still more, “If any hard or impossible commands be enjoined a Brother, let him receive the injunctions of him who biddeth him with all mildness and obedience. But if he shall see that the burthen altogether exceedeth the measure of his strength, let him patiently and in due season state the cause of this inability unto his Superior, without manifesting any pride, resistance, or contradiction” (68).

Pride is a crucial term here – and it relates not only to those who consider themselves to be above certain tasks, but also to those who have contempt for their weaker brethren – as with agricultural work, the essence of monkhood being to live by the labour of one’s own hands: “Yet let all things be done with moderation for the sake of the fainthearted” (48) – and above all to those who seek to go beyond everyone else. This is the counterpart to the free rider problem: the vainglory issue.

Monks are people who renounce worldly things in the service of God; there will always be those who seek to be renouncier than thou. As Benedict notes on the subject of Lenten observance: “In these days, therefore, let us add something over and above to our wonted task, such as private prayers, and abstinence from meat and drink… Nevertheless, let each one acquaint the Abbot with what he offers, and do it at his desire and with his consent; because whatever is done without the permission of the spiritual Father, shall be imputed to presumption and vain glory, and merit no reward” (49). The willingness of some to martyr themselves beyond the norm, to work harder and give up more, is not a sign of virtue but of the opposite, both because of the underlying motives and because of the impact on everyone else.

Now there are obvious problems in transferring the monastic rule to academia: the absolute authority of the Abbot, the strictures against laughter, the emphasis on humility as the core principle of the monk’s behaviour. The idea that you can’t have your own books, or your own special pen. But there is surely enough of a connection, not least because academia is mythologised as the collegial life of seekers after knowledge, even if in practice it’s been thoroughly Taylorised and subjected to market forces. Benedict’s Rule shows that competition was never absent from the old way – but whereas today it is elevated as a core principle, then it was seen as a threat to the collective well-being, whether because it encourages unhealthy emulation or because it creates despair and doubt among the others.

In brief, if you want to do the academic equivalent of whipping yourself with scorpions or sitting on top of a pillar for thirty years, you can go and do that – but you forfeit your membership of the community, because such behaviour is corrosive of collective existence. When I complain to my ‘academic lead’ that I’m not managing to get everything done that I want to, the correct answer is less reassurance that I’ll get my mojo back eventually, and more a stern injunction that I should stop being prideful and vainglorious.

I am actually capable of accepting such injunctions, now and again. Or at least once. I’ve always been frustrated with fixed rules about how much time one should make available to undergraduate dissertation students and/or how much of their work we should read. Surely I can sort this out with individual students, given that most don’t actually take advantage of even the regular amount of support to which they’re entitled? But not every colleague can be firm in the face of unreasonable demands, and maybe some of them are more approachable or less off-putting – and so I have come to accept that we need rules that set an upper limit, that ensure the workload is manageable for everyone, however much I might wish to go further. And if I ignore the rules, that puts pressure on others and sets a terrible example. Benedict would not approve.

Published on March 14, 2023 13:18

March 10, 2023

It Wasn’t Me

I massively pissed off my wife a few nights ago, by going upstairs to the ‘study’ (which doubles as the music room, as well as general storage and nursery for chilli seedlings) to work on my jazz composition homework for twenty minutes or so, and re-emerging just under an hour later. I readily accept that this is not acceptable behaviour, and have agreed to try setting myself an alarm next time – because this was genuinely a matter of losing track of time due to total absorption in the task of trying to get a melodic phrase right. You can almost hear it in your mind, you know you’ll know it when you hear it, but there are so many different things to try adjusting in the hope of getting closer to what it’s supposed to be, not to mention the need to try to save the better versions in case you want to return to them, that suddenly an hour has gone by and you’re not necessarily any closer to success.

Writing can be very similar. One of the reasons I often struggle to get started properly on a piece is that I can get obsessed with trying to get the opening paragraph, or even the opening sentence, perfect – it’s the riff or the hook that is going to pull the audience in and give the whole thing structure, even if I don’t repeat the phrase in the same way as in a musical composition. [thinks: maybe I should try that…] It’s one of the reasons I like blogging, that it feels more like jazz improvisation where a phrase can be ‘good enough’ in the moment rather than needing to be a timeless statement of brilliance. These posts are not quite a stream-of-consciousness splurge, honest, but I do generally feel much less inclined to tinker with them endlessly.

Of course, if the words don’t come, there are alternatives… I’m in the middle of marking student source analysis exercises for my Ancient Tyranny course, and as ever I regularly find that my feedback has to spend quite a lot of time talking about what the author has done with other people’s words. I have yet to encounter anything that looks as if it was constructed by ChatGPT (whose latest triumph is misattributing a quotation to Euripides and mangling the plot of Alcestis in order to justify this), unless they are very clever at adding deliberate spelling and grammar errors at the end; no, these are serious attempts, from people who are still learning. There are the ones who construct entire arguments by stringing together quotations; the ones who have picked up a useful idea and then clearly forgotten where they found it; the ones whose note-taking needs better differentiation between verbatim quotes and their own commentary; the ones who have taken good notes, and know that they need to put things into their own words, but just don’t do a good enough job so that Turnitin all too easily lays bare the components and the composition process…

There are, you could say, many ways of getting this wrong – ‘plagiarism’ is a very broad term, which partly makes sense for student disciplinary processes (and still more as a spectre intended to terrify them into good habits) and partly means we then have to invoke ‘poor academic practice’ instead as a less heinous label for the majority of cases. The obvious reason for this is that there are several different things to get right, and the emphasis on putting things into your own words and properly crediting sources has multiple goals: showing the sources of information and ideas, engaging with a wider range of material rather than just quoting/paraphrasing a single source, demonstrating command of material rather than just parroting, establishing a critical distance from modern ‘authorities’ as a key step in developing one’s own perspective and ideas.

In some moods, I feel that the last point is the most important, that our goal is to give students the drive, skills and confidence to want to develop their own ideas, using the scholarship they read as raw materials and/or inspiration, and to feel embarrassed if all they did was shuffle pre-existing components according to someone else’s plan. You can, I’m sure, tell that I was the sort of child who regarded the advent of ever more complex pre-moulded Lego pieces rather than just basic blocks as a betrayal and symptom of general capitalist decadence…

Of course, I’m conflating two different things; just as the great thing about Lego is that even the most specific-to-this-model part can be turned to other purposes, there is a proud tradition of making art using ‘found objects’ and creating texts out of other texts. I remain a devotee of W.G. Sebald’s beautiful, haunting books, echoing the fragmentation of human consciousness and the European cultural tradition, for all the controversy about whether his use of other people’s writing is acceptable. It was therefore a bit uncomfortable when John Hughes, the Australian novelist accused of serial plagiarism, claimed to have been pursuing a similar artistic approach – albeit having first denied plagiarism and then apologised for it as a mistake, which rather undermines the idea that it was deliberate (in the right way) all along. (Decent summary and discussion here – h/t Yasmin Haskell).

It’s not a defence that works for most academic writing, which has yet to embrace collage as a technique. In the face of such an accusation, if owning up and apologising is rejected as an option, it is necessary for an academic to deny absolutely the existence of plagiarism, either by asserting that the copied texts were never anything more than raw materials requiring no acknowledgement of any ‘author’ or by insisting that the use made of them was acceptable. Perhaps it’s enough to create doubt and debate; one might note the vagueness and multi-faceted nature of ‘plagiarism’, the variations of practice and expectation between different disciplines, the fact that great works of the past were entirely careless about bothering to mention their sources. One might question the motives of the accusers, implying resentment and envy or ideological position, or even suggest the existence of a conspiracy to bring the author down for quite different reasons.

This may be unlikely to convince very many people, but it creates the impression that there are multiple sides to the story rather than the simple tale of a researcher who was careless or lazy, who failed to meet professional standards. At this point, perhaps the primary audience for such excuses is the author themself…

I found myself wondering about the apparently opposite case, of attributing one’s own words to another. Could it be a different way of evading the problem of not having the (right) words – not borrowing the words of someone else, but distancing oneself from the words so it matters less if they aren’t especially good? One forgoes the evocation of one’s own authority, such as it is – but that doesn’t necessarily mean the words have to stand on their own; that would be the case for an anonymous publication, but not if they are ascribed to a different author with his/her own authority – if I attribute my ideas on historiography to Thucydides, or my novel about migrants in the United States to a Guatemalan undocumented person, or my reflections on contemporary Eastern European society to a Jewish lawyer from Minsk.

A borrowed identity as author of my words, rather than borrowed words for my identity as an academic authority; perhaps these are mirror images rather than opposites, if one could imagine the same person doing both.

Part of John Hughes’ attempt at claiming to be the acceptable sort of appropriator of other people’s words was to compare himself to Pierre Menard, author of the Quixote, in Borges’ story – implying that, by reinscribing those words in a new context, he had indeed made them his words. I don’t think that would fly as a defence against an accusation of academic plagiarism either, but it could be rather funny if someone were to try it. You could read Borges rather as implying that there is a perfect arrangement of words to be found, such that Menard could not write Cervantes’ work any differently however hard he tried – but that way lies endless obsessive tinkering with the same sentence, hoping to recognise that perfect arrangement when it turns up.

My current goal is certainly to write better – but also to get things written, to get beyond the anxiety that they’re not good enough. Perhaps I need to write as if it isn’t me writing, as if I were the sort of super-confident, eloquent and authoritative historian I’d like to be – or, to write a draft in that persona, and then ruthlessly plagiarise myself…

Published on March 10, 2023 10:37

February 24, 2023

Oh, Well

What makes for a decent academic legacy? How should one want to be remembered, and by whom? Such a potentially morbid and self-regarding train of thought is not in fact prompted by the fact that I’ve now added a broken foot to the Long COVID, insomnia and constant general tiredness that are making me feel old and useless.

It is partly inspired by having to give advice to assessment-writing students about finding scholarship; not just “no, I’m not going to give you a number for how many modern publications you need to list”, but also “if you cite something from 1934, you do need to be confident that it’s worth citing”. It took me a while to twig why so many students were referencing obscure articles from fifty or more years ago in their work – when I consult something on JSTOR, I now realise, I’m totally focused on the something, and pay no attention at all to the sidebar of ‘if you enjoyed this article, you might also enjoy this random selection of other stuff with vaguely similar words in the title’. But clearly at least some students have taken this to be an authoritative recommendation, and – like the similarly expanding use of undergraduate dissertations from the US – potentially evidence that they’re doing their own research, which is obviously a good motive even if the results are problematic.

I do sometimes wonder whether I should just issue a blanket edict against this sort of thing; my current advice is equivocal, which might well serve to confuse things further. It’s not just that it’s good that they are trying to go beyond the module reading list, which is only ever a selection and is probably reflective of my preferences as much as of the subject; there might indeed be useful stuff in very old publications, and even, conceivably, in an American UG dissertation. The problem is that you already need to know a lot about the topic to be able to evaluate the possibilities of such usefulness, rather than heading straight there because the title apparently fits the assessment task more closely than the titles of more recent stuff.

Where I become really equivocal is in offering some sort of opinion on ‘how old is too old?’ – simply because whatever time limit I suggest immediately brings to mind things that I would then be excluding which really shouldn’t be excluded. Forty years? Seriously, you’re writing off Hopkins’ Conquerors and Slaves? Fifty years? Look, it’s not as if I would actually expect any of them to read Brunt’s Italian Manpower, but…

Which brings me back to the ‘legacy’ issue. It did occur to me that a thirty-year rule would, in the not too distant future, start ruling out my own publications. It is nice to see Metropolis and Hinterland still occasionally cited – but I would readily accept that it now seems to be cited almost dutifully, as relevant but vague context, rather than because anyone is bothering to argue with it. As I tell my students, it’s not that old publications are necessarily wrong, they just become irrelevant as the discussion moves on. After a certain point, I guess one is hoping to be rediscovered by the sort of eager graduate student who is reading everything of any possible relevance to their project, who then decides that Morley made one or two good points that have been unjustly neglected…

There is, perhaps, an alternative. The original idea for this blog post actually came over a year ago, after a student, during the seminar on Pylos and Sphacteria in my Thucydides course, suddenly asked “What’s Grundy’s Well?” In response to my entirely blank expression, he showed me the map included on p.715 of Hammond’s World Classics edition of Thucydides (‘after Rhodes, Thucydides History IV.I-V, 24’ it says), and there, right in the middle of Sphacteria, it is: Grundy’s Well. And so I had to do a bit of research.

‘Grundy’ is of course George Beardoe Grundy, 1861-1948, Oxford ancient historian specialising in military history and battlefield topography with a sideline in Saxon charters. Grundy’s first book, in 1894, had been a study of the topography of the battle of Plataea, and his second, on the Persian Wars, likewise included a substantial focus on the modern landscapes of key events and their relation to ancient descriptions; the importance of actual inspection is emphasised multiple times in his 1948 memoir, Fifty-Five Years at Oxford, which includes a powerful account of the dangers of malaria in many of the regions he visited including Sphacteria.

It’s therefore unsurprising that when he moved on to Thucydides with his 1911 book Thucydides and the History of his Age, his approach included the detailed surveying of places described in order to evaluate whether Thucydides’ account was trustworthy – including whether he may have visited the place, or relied on talking to Spartan captives in Athens, or just made it up (there is a long debate, helpfully summarised in Hornblower’s commentary, about whether Thucydides’ account of the geography of Pylos and Sphacteria, and hence of the military action, is basically fictitious).

Grundy explored the island of Sphacteria and identified a ‘well’, which he took to be the source of water on which the Spartan soldiers relied when besieged (mentioned at 4.26 and 4.31), and tasted the water. A. W. Gomme’s historical commentary (published 1956) discusses the debates around the topography of Sphacteria, and refers to “the well identified by Grundy”, also citing another scholar who suggested that it was a cistern – a tank designed to collect rainwater – rather than a proper well. By the time Peter Rhodes published his text and commentary on Book IV in the 1980s, this had become “Grundy’s Well”, and I think the crucial step from “the well Grundy identified” to “Grundy’s Well” was taken by W.K. Pritchett, Studies in Greek Topography Vol. 1 (1965) – but I don’t have a copy and it’s not online, so I haven’t got round to checking.

There are, I can imagine, far worse ways to be commemorated; the Hammond translation is likely to remain in print for years to come (it suddenly occurs to me that I haven’t checked whether the relevant map in the Landmark Thucydides also includes the well), and the specific debate to which this was a contribution is also unlikely to vanish completely so long as people read Thucydides and worry about his accuracy in different areas (okay, if the political theorists appropriate him completely, maybe not). I wonder if there is a plaque to mark the spot.

Of course, it’s probably not enough to get anyone reading Grundy’s publications again – and, speaking as someone who has read his two books on Thucydides, to be honest anyone who doesn’t isn’t missing a huge amount. (In its time, however, the first volume was both successful and quite radical, at least for Oxford; Grundy noted in his memoir that he was regarded by some as “an impudent heretic”, and that a friend who was one of the Electors told him that the book had cost him the Professorship of Ancient History; he consoled himself with the sales figures, two-fifths of which had been in Germany, where reviews were critical but serious). With one very obvious exception: the hilarious poem with which he opened his final book, Thucydides and the History of his Age Volume II (1948), which I quoted at length in the Preface to my own book on T and the Idea of History. To give just the final stanza:

A work undying was his aim.
He to an endless future spake.
He has made good the proudest claim
That ever writer dared to make.

But in fact Grundy’s academic career included at least one other Easter Egg, so to speak; another thing you would not generally expect from an Oxford ancient historian and which only a few are likely to stumble across. In 1917, he published an essay that represents a genuinely original attempt at presenting ‘political psychology’ as a historical rather than philosophical enterprise. It was inspired by a powerful sense of how far the leaders on all sides in the run-up to war had made decisions based on entirely mistaken conceptions of how other people, en masse, would react; they assumed that theories based on their own situation would be universal – thus, a nation dominated by fear would assume that others would be equally susceptible to it – rather than recognising variation between different peoples and times.

There’s a handy summary of the article by one Floyd W. Rudmin at https://www.humiliationstudies.org/documents/RudminGrundyBio6.pdf, and it’s also mentioned by Ben Earley in The Thucydidean Turn, in his discussion of Alfred Zimmern. A little surprisingly, Ben doesn’t mention the extent to which Grundy’s conception of political psychology drew on Thucydides. This is pretty clear from his opening account of the classical roots of the approach, noting that the Sophists and Plato had recognised the importance of knowledge for political life, and continuing:

Thucydides, one of the greatest thinkers of all times, seems to have held that the actions of men in masses might be calculable by those who knew the history of the past, or, at any rate, that there was a tendency for different bodies of men to act in similar ways in similar circumstances at different times…

He makes it clear that the essential cause of the resemblance lies in human nature itself, and is thus psychological. In the composition of his work he spent most pains on the speeches, that feature of his history in which he depicts the psychological background of the course of the events he has set himself to narrate, and in them he seeks to trace to their sources those springs of emotion which lead men to actions which affect the history of the human race…

The value of history lies from his point of view in the possibilities it affords of forming a judgment of the probabilities of the action of masses of men in certain types of circumstances. The historian must not confine himself to mere narrative, but must be an interpreter of the average soul of humanity. (157)

Most of the article, admittedly, is a general rant about state education, arguing that the German example doesn’t discredit the idea (any more than science is discredited by burglars being able to use a blow-pipe; no, me neither) but simply shows that it needs to be put to better ends. It then moves on to emphasise the differences between Western Europe and the Eastern Mediterranean, this time emphasising how far Western countries had naively assumed that the latter were constitutional monarchies on the lines of themselves.

Basically, it’s an incoherent mess. But you might say that it’s surprising that it’s done at all. And there is something quite attractive in the thought that perhaps, in a century or more’s time, someone might browse through a few of these blog posts or one of my more obscure articles and think, well, I wasn’t expecting that.

Assuming that Morley’s Law of Thucydidean Misquotation hasn’t in fact been enshrined as one of the cardinal principles of the New Internet, obviously. Or that they haven’t named a variety of broad bean after me.

G.B. Grundy (1917) ‘Political psychology: A science which has yet to be created’, in The Nineteenth Century And After, vol.81 iss.479, pp. 155-170.

https://archive.org/details/sim_twentieth-century_1917-01_81_479/page/156/mode/2up

Published on February 24, 2023 07:30

February 17, 2023

(I Don’t Want A) Photograph

Many years ago, with my ‘faculty teaching quality assurance’ hat on, I had to go and observe a lecture, as several students in the class had complained that the lecturer’s presentations were heavily plagiarised. This seemed an entirely bizarre and improbable accusation, but they were insistent and credible – and it did indeed turn out to be the case; it wasn’t just that large chunks of text from scholarly publications were included on the PowerPoint slides without any attribution, but these passages, and associated material that wasn’t posted on the slides, were read out word for word without any indication that these weren’t the lecturer’s own words. I don’t know how the students had first twigged this, but once you have that sort of suspicion, it is incredibly easy just to google a few phrases and identify the source.

Looking back, what I still find astonishing about this episode is not so much that something like it might happen occasionally – I guess we’ve all been in the position of trying to prepare a class at the last minute on a topic that we don’t actually know much about – but that the lecturer’s response, when I arranged a meeting to talk through this clearly serious issue, was something to the effect of: Yes, and? The fact that this was happening week after week was not, as I’d been expecting, the result of them simply not keeping on top of the demands of the job for whatever reason, but the result of them not seeing any issue with the practice.

I had thought it would be enough simply to mention the dread word ‘plagiarism’ and agree that it wouldn’t happen again. Instead, an hour or so later – a pretty frustrating hour of my talking about proper scholarly practice, the need to set the right example for students, and the fact that the students were losing all faith in the lecturer’s knowledge and expertise – we concluded with my insistence that it mustn’t happen again, with no sense that this was being accepted for any reason other than a grudging recognition of my authority to make such a demand.

I still find this baffling, not least because I wasn’t offered any sort of counter-justification, but simply an attitude of ‘How is this a problem?’ against which all my arguments seemed impotent. Obviously it wasn’t that the publications being quoted were regarded as unscholarly and hence not deserving of credit (unlike a blog, say…); was it then that lecturing was not seen as the sort of activity where credit – or indeed originality – was necessary? I would have to admit that my instincts in this area were shaped by the first year of my undergraduate degree, where I simply switched off during lectures that were basically summaries of the core textbook and then just stopped attending as I could cover the material quicker and better by reading it – but at least in those instances the lecturer was actually the author of the core textbook. Reading out someone else’s book? Really?

One thing I find myself wondering at the moment is what would have happened if the lecturer had responded to my admonitions by turning to the law – invoking constructive dismissal or bullying or the like. Improbable? Recent events in the world of medieval manuscripts suggest not. If taken-for-granted norms of scholarly practice, like the avoidance of plagiarism, aren’t actually held by all academics (or at least not in exactly the same way), then it is easy to imagine that equally norm-defying methods might be adopted in response.

And because these are just norms and traditions, the sorts of expectations one acquires in the course of one’s training rather than something explicitly mandated and codified, it’s also easy to imagine how they could be questioned from outside – in a court of law, for example. “So, Professor Morley, are all your lectures completely original in every respect?” “No, I draw on the research and writing of others. But I generally rely on a range of such sources, and I produce a synthesis of them in my own words.” “And does that make them into your ideas?” “Not exactly, but it presents them in a new way, and it also models for students the fact that they should not simply copy other people’s work and present it as their own.” “So, copying a scholar’s work accurately is unacceptable, but producing your own version of it is somehow fine? Is this a sound basis for your defamation of my client..?”

Something can be plagiarism, and hence unacceptable academic practice, without breaking the law on copyright. There is certainly room for debate about how much similarity, or how much inadequate referencing, might be excusable as poor academic practice rather than outright misconduct – I’ve sometimes felt this in reading discussions of the regular ‘outing’ of German politicians’ doctoral dissertations as plagiarised – but there’s generally no doubt that there is, somewhere, a line that shouldn’t be crossed. Similarly, a lecture might draw heavily on a published source, even to the point of using exactly the same examples, and still remain within acceptable bounds as being the lecturer’s considered and assimilated version of the material; not claiming to be original, but still representing their own understanding, rather than just an unacknowledged reproduction.

Might this be a purely or largely Western mindset? I’m conscious of the risk of making some racist/culturalist assumptions while trying not to universalise my own cultural prejudices, but one hears of classes in some Asian countries where the whole point is to convey the contents of an officially-sanctioned textbook, where any originality or personal perspective from either teacher or student is entirely undesirable. That rests on the idea that there is a reliable pre-determined truth to be transferred from book to student with the lecturer as conduit, which is not the usual understanding of university-level humanities classes. I see part of the point of any such class as being exploration of the ways knowledge is constructed; but even if your view of first-year lectures, at least, is that they’re about the presentation of facts, it’s still a big leap to the idea that therefore the lecturer doesn’t need to offer their own account of the facts.

Obviously what this really brings home to me is that, according to the recent training hoop I had to jump through on ‘Handling Difficult Conversations’, I was far too focused on “oh my gods the students are going to sue us and what if this goes public?” and the intended outcome of the conversation (“Stop doing this!”), and too little focused on listening to understand, entering into my interlocutor’s world etc. – and now it’s too late…

Published on February 17, 2023 09:00
