Samir Chopra's Blog
March 2, 2013
Should Free Software Go Into the Public Domain?
I’ve just finished an interesting Twitter conversation with Glyn Moody (author of Rebel Code: Linux and the Open Source Revolution, still one of the best books on the free and open source software phenomenon). Moody has written a very interesting article over at TechDirt, which wonders whether the time has come to put free and open source software into the public domain rather than releasing it under a variety of licenses which rely for their efficacy on copyright law. (Moody’s article finds its provenance in a paper by Clark Asay, who argues that FOSS could be released into the public domain and yet still thrive as a collaborative project.)
My initial response to Moody’s article was skeptical. (Full disclosure: I have not read Asay’s article but will soon do so.) Several years ago, in our book Decoding Liberation: The Promise of Free and Open Source Software, Scott Dexter and I argued for the superiority of FOSS licenses like the GPL over permissive licenses like the BSD because of the worry that the latter made free-riding possible. (Those arguments are still relevant, though I will not repeat them here; please do check out the link.)
Moody addresses this worry by quoting Asay:
if a firm were to take and close a project, they almost certainly would not obtain the free labor that contributors around the world are willing to provide to open-licensed projects. Without that free labor, firms would lose the most significant advantages of an open model of innovation, and the free labor would likely remain loyal to the open version of the project. Firms thus already have incentives to open and contribute as much of their materials as possible, since doing so will attract free labor and trigger innovation in directions that better suit the firm and its strategic direction.
and then goes on to say:
The key point is that the code without the community that creates it is pretty much dead. A company may gain a short-term advantage in taking public domain code and enclosing it, but by refusing to give back its changes, it loses any chance of collaborating with the coders who are writing the future versions. It will have no influence, and no way of raising issues of particular concern that help it with its products. Instead, it will have to keep up the development of its own version of the code single-handed. That’s likely to be costly at best, and may even be impossible except for the very largest companies (Apple is an example of one that has succeeded, basing its Mac OS X operating system on the free BSD version of Unix.)
As I noted in my conversation with Moody, I’m considerably less sanguine than he is about these prospects. I do not doubt that FOSS has made great inroads in the world of software (Moody quotes figures like ’94% of top supercomputers run Linux; 75% of smartphones run Android; tablets next…’). What I do doubt is whether the value of free software is understood at a more conceptual level so that the closing of a formerly open project would be viewed as a bad thing by the developer community (and by users). Moody thinks so, of course, hence our polite disagreement. (I also think new laws will be needed to protect developers from patent infringement claims.)
In any case, I think the argument is an interesting one, especially if one thinks that copyright protection was only required for FOSS because of the onerous copyright regimes it exists within, and that a move to the public domain would become easier in an environment that better understands FOSS’ promise and so would be less tolerant of the closing of a formerly open project (as Apple closed BSD). Again, this will only happen under a different legal regime.
Hopefully, I’ll get the time to read the Asay article and respond to it more thoughtfully sometime soon. In the meantime, comments welcome.


March 1, 2013
Glaucon and the Basic and Advanced Polis, Contd.
Yesterday’s post on Glaucon and the preferred forms of the polis for him and Socrates sparked off an interesting discussion on Facebook with Alex Gourevitch. I’m reproducing it here as Gourevitch’s responses are wonderfully rich and worth responding to carefully.
Here is the sequence of comments on Facebook, followed by my response last.
Alex:
I still think it’s better to be a human being dissatisfied than a pig satisfied.
Samir:
Indeed. I’m just not sure the inhabitants of the basic polis are pigs; that description matches the often rapacious, gluttonous inhabitants of Glaucon’s preferred state.
Alex:
They are pigs because they will eat anything. They are easily satisfied. They lack culture and refinement, which you only have if you are leisured – which is what, I think, the reference to reclining on couches is supposed to be about. It’s not just about having lots of desires to satisfy, but time to reflect on and develop one’s desires. Of course, that requires a social surplus, and someone else to do the work, which is what introduces class relations. We go from being pigs to being wolves. So you must tame the wolves. That is the question of justice, I think, for Plato. But it’s one also defined by the circumstances of justice. Socrates, Glaucon and Adeimantus set up the problem in such a way that the only way to imagine a surplus necessary to sustain leisure and culture is by conquering others, taking their land, and enslaving them. At the time, that may very well have been correct – Aristotle says the same thing about why slaves are necessary (the tripods of Hephaestus, looms spinning themselves). But one can imagine other ways, like machines/technology, so that everyone can have that leisure to develop their tastes and participate in culture. Note, by the way, that the theory of justice that develops out of the original problem as Glaucon and Socrates set it up is an attempt to restore that natural harmony of the ‘healthy state’ but through rational principles. In fact, Glaucon wants to be convinced of the idea that it is better to be perfectly just than perfectly unjust. So even he does not deny that there is something superior in the condition of the pigs to that of living with wolves. Don’t you think?
Samir:
Does Socrates’ description of the basic polis really sound like people who don’t have leisure? Sitting by the fire, drinking wine in moderation, roasting nuts? They aren’t eating just anything. They live in peace to old age too. Perhaps they work into their old age rather than retiring. So what? There is a false opposition set up here. If you don’t grant the opposition that Glaucon sets up, little remains of the desire for the advanced polis, which, as you note, brings war, class conflict, and the problem of justice and law. It is almost as if Glaucon didn’t pay attention to the description Socrates provides. I would agree with him (and you) if the state described by Socrates were indeed pig-like: scrounging for roots, eating dirt, the hardscrabble life from birth to death. But that is not what Socrates has in mind.
Alex:
Very interesting. I see the plausibility of your reading but I think it rests on overstating the hardscrabble life of the pig as the central issue. In Glaucon’s eye the key feature of the pig is not that it scrounges but that it is indiscriminate. I take that to be one of the oldest metaphors about pigs – they eat anything. The connection to leisure is then that in the primitive division of labor of the ‘healthy city’ everyone works, they have an occupation, but there is no leisure, no culture, and indeed no philosophy. It is only the original act of injustice, the primitive accumulation, as it were, that creates the leisured class, sets reflection in motion, and brings about a philosophical attitude towards the human condition. Of course, what we find upon reflection is injustice. And we can retrospectively appreciate what is harmonious and good about the healthy state, but it is still a state of pigs. It is a state of pigs not because they work from dawn until dusk per se but because there is no demand for leisure and culture, and that demand is not there because people are satisfied with what they have. Needs are limited to a ‘natural’ range, to what can be supplied through a very simple division of labor and a few objects. Everyone is happy to do their work and consume what they can. They are indifferent to the limited range of their lives.
Samir:
I’m not sure the text supports the reading you attribute to Glaucon. He listens to Socrates’ description of the basic polis and calls its inhabitants pigs anyway, seemingly without having paid attention to the leisure that is built into it. Your reading, and his, only works if this is ignored. I’m getting stuck on this point, because I’m willing to concede the rest of your points if indeed all that happened in the basic polis was mere adherence to occupations. Thus I don’t see the necessity for the ‘original act of injustice’ or the ‘primitive accumulation’ either. I would also find it strange that Glaucon/you term them pigs when given a description of their working days: Is no reflection possible while at work? Is no reflection possible by being in the moment of one’s daily activities? You identify ‘culture’ with the arts; I think I have a broader reading of culture, one more inclusive of a wider range of human activities, many of which are possible in the basic polis. And thus I don’t buy the ‘limited range’ view of the polis that you have. But perhaps most importantly, it seems to me that we have lost a great deal by preferring a state that includes class conflict and war. There’s something depressingly Nietzschean about this vision, as if war is the inevitable price we must pay for the fine arts.
More importantly, I think there is a fairly convincing argument to be made that Plato finds the basic polis, despite the attention he pays to the advanced polis, a morally superior one. Remember what he terms the life of the philosopher: unconcerned with material acquisition but only with the pursuit of the truth. The basic polis provides this without the temptations of the advanced polis. A frugal life is possible here without evoking our worst instincts; it can give us time for the pursuit of the truth without necessarily owning or consuming the ‘finer things’ that Glaucon thinks are possible in the advanced polis. Indeed, Plato’s philosophers would be unmoved by the material wealth of the advanced polis; the contemplative time provided by the basic polis is enough. The basic polis makes possible a society where laws and government might play a minimal role; it might be the kind of community anarchist political philosophies have in mind. The rudimentary polis can get along without being a state; the advanced polis has to be one. And I find it hard to believe that the state represents an advancement on the basic polis.
Note: My arguments above are not original to me. I read them many years ago in David Melling’s lovely little book on Plato, which I’ve often recommended to my students. I stumbled upon the book again recently and was moved to write yesterday’s and today’s posts in response.


February 28, 2013
Glaucon’s Porcine Preference for the Advanced Polis
I never particularly liked Glaucon. His responses to Socrates’ description, in Plato’s Republic (372a-d), of the basic polis are a good reminder of why.
Socrates quoth:
First of all, then, let us consider what will be the manner of life of men thus provided. Will they not make bread and wine and garments and shoes? And they will build themselves houses and carry on their work in summer for the most part unclad and unshod and in winter clothed and shod sufficiently? And for their nourishment they will provide meal from their barley and flour from their wheat, and kneading and cooking these they will serve noble cakes and loaves on some arrangement of reeds or clean leaves, and, reclined on rustic beds strewn with bryony and myrtle, they will feast with their children, drinking of their wine thereto, garlanded and singing hymns to the gods in pleasant fellowship, not begetting offspring beyond their means lest they fall into poverty or war?
What is Glaucon’s interjection?
No relishes apparently, for the men you describe as feasting.
Socrates recovers from the silliness of this and responds, gamely:
True, I forgot that they will also have relishes—salt, of course, and olives and cheese and onions and greens, the sort of things they boil in the country, they will boil up together. But for dessert we will serve them figs and chickpeas and beans, and they will toast myrtle-berries and acorns before the fire, washing them down with moderate potations and so, living in peace and health, they will probably die in old age and hand on a like life to their offspring.
Glaucon’s response:
If you were founding a city of pigs, Socrates, what other fodder than this would you provide?
The ever-polite Socrates responds:
Why, what would you have, Glaucon?
The real ‘pig’ in all of this, Glaucon, responds:
What is customary; they must recline on couches, I presume, if they are not to be uncomfortable.
Waddaprick. The basic polis sounds pretty nice, especially when you consider that the kind of polis envisaged by Glaucon requires–as he admits a little later in the dialogue (373d-e)–the introduction of the doctor and the soldier. (Healthcare and the Military! Sound like budgetary problems to me.) The first occupation addresses the rash of diseases that will be caused by the ‘richer’ lifestyle of the more advanced polis–Socrates’ argument for the need for doctors in the advanced polis is an interesting anticipation of modern thinking about diseases of affluence. More perniciously, the advanced polis results inevitably in a desire for territorial expansion: the standing army with its budgetary demands and its endless conscriptions, its creation of wars, the scourge of human history, is a function of the mode of organization of the state it defends.
Glaucon disdains the frugal nature of the basic polis, seemingly unaware that the richer polis he has in mind is the one that will actually encourage porcine behavior.
Excerpted from: Plato in Twelve Volumes, Vols. 5 & 6 translated by Paul Shorey. Cambridge, MA, Harvard University Press; London, William Heinemann Ltd. 1969. Available online at the Perseus Digital Library.


February 27, 2013
Professorship and ‘The Perennial Taker of Courses’
In ‘In Greenwich, There Are Many Gravelled Walks,’ Hortense Calisher writes,
Robert was a perennial taker of courses–one of those non-matriculated students of indefinable age and income, some of whom pursued, with monkish zeal and no apparent regard for time, this or that freakishly peripheral research project of their own conception, and others of whom, like Robert, seemed to derive a Ponce de Léon sustenance from the young.
I have special fondness for the non-matriculate; I began my academic career as one, taking two graduate classes in philosophy before I began formal doctoral studies. And before I registered for them, when I informed my mother that I planned to quit my day job eventually to seek a full-time academic career, her immediate, and immensely gratifying, reaction was, ‘That’s great! If you become a professor you take classes for the rest of your life at your university!’ I hadn’t thought that the opportunity to be an endless dilettante, browsing through each semester’s course offerings and picking one, would present itself as the most obvious advantage of a professor’s life, but my mother certainly thought that way.
I haven’t managed to do so. But I did try. After I returned to New York from my post-doctoral work in Sydney, I sat in on Spanish 101. Learn a language, travel, cook–you know, the standard aspirations. I attended quite a few classes, but found it difficult to keep up with homework given my teaching and service duties (and of course, my own academic interests). I didn’t make it to the end of the semester; sometime shortly after the mid-term (in which I got a decent, but not excellent, grade), I dropped out.
A year later, I tried again. This time, having convinced myself that the problem the previous time had been the lack of a formal component to my dabbling, and with an eye on a graduate seminar on the Frankfurt School offered through the History department at the CUNY Graduate Center, I registered, taking advantage of the tuition exemption for employees of the City University.
This time around, things went marginally better. I did most of the readings, attended all the classes, and even wrote a paper on Horkheimer, which was probably quite amateurish, but which was very helpful in making me more familiar with his writings. But again, I found things not entirely to my liking. I was still busy with teaching and service and writing, and the time needed to travel to Manhattan for the seminar and do the readings seemed onerous. (Perhaps I didn’t enjoy the company of graduate students. Too many of them seemed to instantiate dreaded archetypes of that demographic: the hasn’t-done-the-readings-but-will-still-pontificate-on-it and the can’t-shut-up-and-stay-on-point varietals being the most pernicious. I certainly wasn’t deriving any ‘Ponce de Léon sustenance’ from them.)
So that was my last attempt to replicate the non-matriculate days. I became ever busier with my own writing and confined my dilettantism to unguided, unstructured dabbling on my own. And I had found other outlets for it: teaching new classes, revising syllabi for classes taught previously, and blogging being the most prominent among them. Besides, once you’re a full professor, it’s all pretty much dabbling in any case.


February 26, 2013
Walking the City: Random Walks Through Manhattan Streets
In Street Life: Becoming Part of the City, Joseph Mitchell wrote:
What I really like to do is wander aimlessly in the city. I like to walk the streets by day and by night. It is more than a liking, a simple liking–it is an aberration. Every so often, for example, around nine in the morning, I climb out of the subway and head toward the office building in midtown Manhattan in which I work, but on the way a change takes place in me–in effect, I lose my sense of respectability–and when I reach the entrance to the building I walk right past it, as if I had never seen it before. I keep on walking, sometimes only for a couple of hours, but sometimes until deep in the afternoon, and I often wind up a considerable distance away from midtown Manhattan–up in the Bronx Terminal Market maybe, or over on some tumbledown old sugar dock on the Brooklyn riverfront, or out in the weediest part of some weedy old cemetery in Queens. It is never very hard for me to think up some excuse that justifies me in behaving this way…
I lived in Manhattan from 1993 to 2000 and often walked ‘aimlessly in the city’; Manhattan’s layout encouraged such roaming. It felt like a gigantic playground, laid out so as to invite exploration. I moved across the Hudson to 95th Street and West End Avenue in 1993, and soon began walking regularly to and from my classes on 42nd Street (between 5th and 6th Avenues). I wanted to vary my walk, so I chose different methods for changing my routes: sometimes crossing straight over to Broadway and then walking uptown, sometimes heading for Central Park West, sometimes letting the lights regulate my path. The feeling of stumbling onto never-before-explored city blocks never grew old; I often thought of checking them off a list but felt too lazy to do so, trusting that time and my randomizing algorithms would eventually exhaust the possibilities. When I moved to the Lower East Side (5th Street between Avenues A and B) in 1997, I continued walking to 42nd Street, and was able to conduct my explorations while heading uptown. As always, I found storefronts, buildings, street characters, food, and sundry other urban features and residents I would not have found had I stuck exclusively to taking the subway.
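(An aside for the programmers among my readers: here is a toy sketch, in Python, of the kind of ‘randomizing algorithm’ I have in mind. The grid coordinates and the coin-flip rule standing in for the traffic lights are purely illustrative inventions for this sketch, not a record of anything I actually did.)

import random

def random_walk_to_class(start=(95, 11), end=(42, 6), seed=None):
    # Toy Manhattan grid: streets as numbered rows, avenues as numbered columns
    # (West End Avenue loosely treated as '11th Avenue'; Sixth Avenue as column 6).
    # At each corner a coin flip (standing in for the traffic light) decides whether
    # to change street or avenue, unless only one direction still moves us toward
    # the destination.
    rng = random.Random(seed)
    street, avenue = start
    path = [(street, avenue)]
    while (street, avenue) != end:
        can_change_street = street != end[0]
        can_change_avenue = avenue != end[1]
        if can_change_street and can_change_avenue:
            change_street = rng.random() < 0.5
        else:
            change_street = can_change_street
        if change_street:
            street += 1 if street < end[0] else -1
        else:
            avenue += 1 if avenue < end[1] else -1
        path.append((street, avenue))
    return path

# One morning's route, reproducible via the seed:
print(random_walk_to_class(seed=2013)[:10])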
Manhattan encouraged expansive walking. I dreamed up extravagant routes and sometimes acted on these plans. On one such jaunt, I walked from 5th Street to 110th (the northern edge of Central Park), moving from 59th to 110th along Central Park East, turned west, walked south along Central Park West down to 59th again, turned east to Lexington Avenue, walked south till 28th, where I stopped for some Indian food, before heading back home. I planned, too, to walk the entire length of Broadway; I never pulled it off, but I haven’t given up that dream yet.
Walking on Manhattan streets reminded me, as always, that the best way to experience a city is from street level; the pace is right, its features pop into focus, you can stop and stare and sample. A city is made up of streets; walking on them is still how one best finds out what makes it tick.


February 25, 2013
Why The Talking Dead is a Bad Idea
Last night, I declined to watch the Oscars and chose The Walking Dead instead. If you’re going to watch zombies, why not watch a more interesting group of them? Snark aside, I had not seen most of last year’s crop of nominees, other than the mildly diverting Argo, and more to the point, I’ve burned out on the Motion Picture Academy’s annual orgy of self-congratulation. (Last year’s post on the Oscars described the genesis of this gradual turning away, one which started much, much earlier for the Grammys, and is now firmly in place for most awards of a similar kind.)
So, my choices for the evening settled, I turned to AMC. This represents a novelty of sorts for me. My following of television series has been restricted to watching the commercial-free episodes available on Netflix or bittorrent sites. But my hankering for the Grim Grimesmeisters’ Hijinks had grown too acute, so there I was, bracing myself to sit through the barrage of commercials that would inevitably accompany the latest installment of Zombie Apocalypse Bulletins. (I had begun this brave adventure last week, with the second episode of season three.)
The commercials were painful, but far more bothersome was AMC’s show The Talking Dead, which followed the new episode: an hour-long discussion of it with in-studio guests, a studio audience, and a ‘surprise cast character.’ I had stopped watching after fifteen minutes the previous week; this time around, my patience ran out after five.
The problem with The Talking Dead, and with any other show like it, which aims to dissect, discuss and lay bare an ongoing television show and wax ‘analytical’ about it, is that it dispels fantasy all too quickly. The point of watching a show like The Walking Dead (or Breaking Bad, or The Wire) is to enter an alternate reality for a while, to be caught up in its story and characters, to come to believe, if only fleetingly, that the trials and tribulations of those on screen are real. A discussion show blows this imperative out of the water. It reminds us relentlessly that the characters are just actors, often uninteresting people in their non-character personas; that directors, writers, and producers are pulling the strings and are often insufferably pompous; that the locales are studio lots. It connects the artfully constructed parallel universe to ours far too quickly; it raises the hood and peeks at the innards a little too closely. The Walking Dead in particular is supposed to be a grim show; it has little humor (both in the comic book and the series); so the goofiness of The Talking Dead is especially grating.
I realize that I’m taking the on-the-surface silliness of The Talking Dead too seriously, so let me reiterate that the point being made here is a general one: too much inquiry into an ongoing fantasy is a bad idea. The serious fan should stay away; suspend disbelief, watch the show, and when you’re done, keep it that way. Till the next episode.


February 24, 2013
Op-Eds and the Social Context of Science
A few years ago, I taught the third of four special interdisciplinary seminars that students of the CUNY Honors College are required to complete during the course of their degrees. The CHC3 seminar is titled Science and Technology in New York City, a moniker that is open to broad interpretation by any faculty member who teaches it. In my three terms of teaching it, I used it to introduce my students–many of whom were science majors who planned to go on to graduate work in the sciences–to, among other things, the practice of science and the development and deployment of technology in urban spaces. This treatment almost invariably required me to introduce the notion of a social history of science, among whose claims are that science does not operate independently of its social context, that scientists are social and political actors, that scientific laboratories are social and political spaces, not just repositories for scientific equipment, and that scientific theories, ‘advances’ and ‘truths’ bear the mark of historical contingencies and developments. (One of my favorite discussion-inducing examples was to point to the amazing pace of scientific and technological progress in the years from 1939 to 1945 and ask: What could have brought this about?)
If I were teaching that class this semester, I would have brought in Phillip M. Boffey‘s Op-Ed (‘The Next Frontier is Inside Your Brain‘, New York Times, February 23) for a classroom discussion activity. I would have pointed out to my students that the practice of science requires funding, sometimes from private sources, sometimes from governmental ones. This funding does not happen without contestation; it requires justification, because funds are limited and there are invariably more requests for funding than can be satisfied, and sometimes because there is skepticism about the scientific worth of the work proposed. So the practice of science has a rhetorical edge to it; its practitioners–and those who believe in the value of their work–must convince, persuade, and argue. They must establish the worth of what they do to the society that plays host to them.
Boffey’s Op-Ed, then, would have served as a classic example of this aspect of the practice of science. It aims to build public support for research projects in neuroscience, because, as Boffey notes at the very outset:
The Obama administration is planning a multiyear research effort to produce an “activity map” that would show in unprecedented detail the workings of the human brain, the most complex organ in the body. It is a breathtaking goal at a time when Washington, hobbled by partisan gridlock and deficit worries, seems unable to launch any major new programs.
This effort — if sufficiently financed — could develop new tools and techniques that would lead to a much deeper understanding of how the brain works. [link in original]
And then Boffey is off and running. For Congressmen need to be convinced; perhaps petitions will have to be signed; perhaps other competitors who also hope to be ‘sufficiently financed’ need to be shown to be less urgent. And what better place to present these arguments than the nation’s media outlets, perhaps its most prominent newspaper?
The scientist as polemicist is one of the many roles a scientist may be called on to play in the course of his work. Sometimes his work may be done, in part, by those who have already been persuaded by him. Boffey’s arguments, his language, his framing of the importance of the forthcoming legislation would, I think, all serve to show my imagined students this very important component of the practice of science.


February 23, 2013
Walking, Head Down, on a Damp and Grey Day: How Virtuous It Is
On days like this, many residents of the US eastern seaboard are apt to question their decision to ever inhabit these spaces. The temperature is in the thirties (that’s just a couple of degrees above freezing for all the folks living in Celsius-land); a steady, persistent drizzle is falling; and the most familiar color of all here on the East Coast, grey, has been used to paint, yet again, New York’s urban landscapes. Many of us will stay indoors today, but those who venture out will find that that experience brings its own reward, one which I suspect underwrites the tolerance that long-term East Coasters have for this benighted clime. Which is that walking, head down, through near-freezing temperatures while water drips off your hat, beanie, jacket or whatever–because you know, many New Yorkers, like Pacific Northwesterners, disdain umbrellas when rain of this intensity is falling–is prone to provoking an acute sense of virtuousness in oneself.
Why would that be? For one thing, the mere fact of being outdoors puts you on the side of the Spartans. You have disdained comfort, the domestic hearth, and have ventured forth boldly. Not for you the safety of the familiar, the quotidian. No, suffused with the spirit of the intrepid, you have dared to look into your closet, laced and buttoned up, and sallied out. And once outdoors, the physical particulars of the day are conducive to a very distinctive mode of daydreaming.
As you walk, head bowed, grimly determined to make it through and past the damp and cold, you enter a zone similar to that entered by many who persistently engage with the uncomfortable: the seemingly impossible barriers your task once raised start to melt away, leaving you with the pleasing possibility that your abilities have the magical effect of making life more tractable. This is gratifying in the extreme.
But even more importantly, walking in bad weather forces a mode of concentration upon us that is increasingly hard to find and sustain in our normal, constantly-interrupted, notified, pinged, paged, and remindered existence: for the span of time that the walk lasts, it’s just you and the execrable weather. And when things are that intimate, when using the smartphone might not be, you know, all that smart, why not just retreat a little bit into the ever more unfamiliar space of introspection?
I suspect we often find these ventures into that space pleasurable, that we enjoy our retreats into these rare moments of solitude. Thoughts move a little differently there; they are not so easily displaced by external stimuli. Because, let’s face it, on an East Coast day like this, who wants to look about and around, and stop and stare? Better to press on.
And that pressing on is really the clincher, I think. Nothing quite makes you imagine yourself as the relentless, courageous explorer like a walk in really, really shitty weather.
And yes, I did go out today.


February 22, 2013
‘If It’s Dead, Kill It’: The Second Compendium of the Walking Dead
Last year, I discovered The Walking Dead (the television series and the comic book). Like most fans of the television series, I’m all caught up now with the second half of the third season. Given the disappointing nature of the first two episodes of the second half, I’m glad that I have something else to take care of my Walking Dead jonesing: the massive second compendium of the comic book (Compendium Two, Image Comics, 2013), which collects issues 49 through 96. (The series is up to issue 108 by now, so it will be a while before the third compendium is released; in terms of tracking the relationship between the comic book and the television series, the third season is right about where the first compendium ends.)
I’ve written on this blog before about the relationship between the comic book and the television series so I will not get into that again. Rather, reading the second Compendium has provided me an opportunity to make some educated guesses about where the show might be going, and even more interestingly, to examine the particular vision the creators of the comic book have about the post-zombie-apocalypse world.
Most prominently, it is clear the most interesting conflicts in the zombie world are not with the dead but with the living. While zombies are deadly, and require vigilance, violence and nous to keep at bay, the human survivors are more insidious and harder to combat. Allusions to Hobbesian states of nature and methods to alleviate them are never too far from the surface in the comic book, especially in the two Woodbury-like developments encountered in the second compendium. People are prickly, selfish, angry, paranoid, greedy, and all of the rest; it turns out that in a world ruled by zombies those qualities are merely enhanced, not ameliorated. For the most part, this is what gives the comic book (and the television series) its edginess: there is almost always perpetual conflict between those who have survived. As in the first compendium, there is grotesque violence directed at humans, even as we note that acts of violence directed against the dead have now become mild amusements. And this is what makes the zombie world just so bothersome: there is no getting away from plain folks. Hell really is other people. (The second compendium also, finally, starts to allude to what really would be the biggest problem of all: an inconsistent and fast-dwindling food supply.)
There is internal conflict too. Rick Grimes continues to be (literally) haunted by his memories, as do other characters in a variety of ways. And there is a great deal of mourning, painful introspection and just second-guessing, for the numbers of the dead continue to pile up, each death generating its own profuse regret and bitterness. Indeed, if you’ve survived, you’re traumatized and will act out that trauma in one way or another. This makes some episodes in the compendium a little tedious, as reading them approximates listening in on a therapy session. Which should remind us: the busiest service providers in a zombie world would be grief counselors and psychotherapists. The Walking Dead are not just the zombies; they are the living too.


February 21, 2013
The Mad Men Are Serious Downers
I’m only three episodes deep into Mad Men, and I’m already struck by how grim the show is. There’s misogyny, sexism, racial and ethnic prejudice, sexual prudery (of a kind), depressing suburban life, loveless marriages, loveless affairs, rigid gender roles, corporate language, the vapidity of advertising, and smoking indoors. And alcohol, lots of it. Mainly martinis and scotch, consumed at all hours of the day, in offices and homes, and during kids’ birthday parties. (I’m not sure if I’ve missed out on anything; I’m sure fans will correct me if I have.)
In using ‘grim’ as a description for the show–which I intend to keep watching for the time being just because it is morbidly fascinating–I do not mean to look past the stylish dressing, the carefully designed interiors, the loving caresses of the whisky and martini glasses, the nostalgia for a time when boys could be boys, white folk could be white folk, and women knew just how to be women, that apparently captivate so many of the show’s fans. Rather, I find that adjective appropriate because despite the apparent cheeriness and cleverness of the office banter, the endless drinking and dining in fashionable Manhattan restaurants, and the freedom to drink in one’s office, no one in the show seems to have had the most minuscule ration of any kind of happiness doled out to them. This is one serious downer of a show.
This should not be entirely surprising. Advertising consumer products requires the careful manufacture and sale of a fantasy, one underwritten by a corporate imperative. What Mad Men does quite well, whether deliberately or not, is to depict participation in that fantasy-mongering as an ultimately soulless, dispiriting enterprise. After all, if you’re shoveling it all day and all night, wouldn’t you find your life a serious drag? Once this is realized, the near-constant drinking suddenly becomes much more understandable; who wouldn’t need a few stiff ones to navigate through the lives these folks lead? Pour me a large one, please.
The dispiriting effect of Madison Avenue is not restricted to the office and the boardroom; it spreads out into homes and suburbs too. As an advertising account executive, if you spend one-third of your life talking in platitudes, and spinning yards and yards of not particularly clever mumbo-jumbo, there is a good chance you’ll bring home that contagious emptiness with you and let it infect everyone and anyone around you. Resuming drinking at home seems like a good way to deal with these domestic blues.
The show’s writing is clever in parts, and the pretty displays of archaic behaviors and attitudes are certainly generative of the morbid fascination I mentioned above. For the time being, I will plough on, hoping that the Mad Folk don’t harsh my mellow too severely in the weeks to come.
Note: I read Daniel Mendelsohn’s memorable review of Mad Men a while ago, long before I had seen a single episode of the show. I intend to reread it once I’m a couple of seasons deep.

