Samir Chopra's Blog
July 26, 2014
The Difficulty of the Memoir
As my About page indicates, I am currently working on “a memoirish examination of the politics of cricket fandom” (contracted to Temple University Press, for the series Sporting, edited by Amy Bass). Writing it has proven harder than I thought.
I began writing the book late in 2001 and had a hundred-thousand-word draft ready late in 2004. I wrote with little guile, wanting to get my memories committed to paper, organizing them into nothing more sophisticated than a simple linear narrative. First this happened, then this, and so on. I organized the material in the only way I knew: by chunking it into simple temporal segments. I gave the draft to a couple of readers, and then forgot about it because I had other writing projects at hand.
Five years later, I submitted my draft to a couple of trade publishers. One sent me a rejection, the other never replied. I then sent it to an editor recommended to me by an acquaintance, and she rejected it too. I then sat on the book for another couple of years before making contact with Amy and sending it to her. She liked the project, and after a full review process at the press, I signed a contract.
And then I returned to work on a nine-year-old draft. Unsurprisingly, I found a great deal of material I did not like. More importantly, I soon ran into a greater difficulty: it is hard to tell a coherent story about yourself – especially for public consumption.
We are the central characters of our lives. The stories we tell ourselves about ourselves are subject to constant, ongoing revision; we are good at forgetting, suppressing, and embellishing the little details that make them up. (By our actions and our pronouncements we are also spinning one version of this story for everyone else.) This closeness of the narrative and its constantly shifting nature mean that writing about it was always going to be challenging.
And how. I frequently find myself quite puzzled by the character in the story I am writing. I don’t fully understand him and would like to make him more comprehensible. But doing so, perhaps by greater confessional revelation or forensic detail, is not as straightforward as it seems. We have forgotten a great deal, and we often remember incorrectly. And sometimes, in an attempt to make more palatable the unvarnished truth, we might introduce incoherence elsewhere in the narrative structure–there is a thread that binds, and it can snap if stressed too much. It is all too easy to second-guess oneself: What do I really need to tell the reader? Was this a good idea to begin with? We might construct a too-sanitized picture of ourselves, suddenly struck by timidity at the thought of exposure. Lastly, we sometimes sense that we have layers and layers of complex detail that need unpacking; a really coherent story about ourselves, one that we often take hundreds of hours to recount in a therapist’s office, might simply be too much for the written page; writing it sounds like a lifetime’s labors. And it would be tedious in any case, of little interest to anyone but ourselves.
I am not yet close to solving these challenges; I expect to write that dreaded email asking for an extension–beyond the summer, to the end of the year–all too soon.


July 25, 2014
Isaac Bashevis Singer on A Rabbi’s Crisis
In Isaac Bashevis Singer‘s “I Place My Reliance on No Man” (collected with other short stories in Short Friday) Rabbi Jonathan Danziger goes to pray in his synagogue one Monday morning. As he prays, he encounters a crisis:
When the rabbi came to the words, ‘I place my reliance on no man,’ he stopped. The words stuck in his throat.
For the first time he realized that he was lying. No one relied on people more than he. The whole town gave him orders, he depended on everyone. Anyone could do him harm. Today it happened in Yampol, tomorrow it would happen in Yavrov. He, the rabbi, was slave to every powerful man in the community. He must hope for gifts, for favors, and must always seek supporters. The rabbi began to examine the other worshippers. Not one of them needed allies. No one else worried about who might be for or against him. No one cared a penny for the tales of rumormongers. ‘Then what’s the use of lying?’ the rabbi thought. ‘Whom am I cheating? The Almighty?’ The rabbi shuddered and covered his face in shame….Suddenly, something inside the rabbi laughed. He lifted his hand as if swearing an oath. A long-forgotten joy came over him, and he felt an unexpected determination. In one moment everything became clear to him…
Rabbi Jonathan Danziger then asks one of the congregants, Shloime Meyer, if he can work for him, picking fruits in his orchard. He will no longer serve as rabbi. His mind is made up. That life is behind him.
As the story ends, the rabbi wonders:
Why did you wait for so long? Couldn’t you see from the start that one cannot serve God and man at the same time?
Danziger might have imagined that as rabbi he would spend his days studying the scriptures, engaging in learned debates about their interpretations, dispensing sage advice to the perplexed, and being respected and admired for his great learning and moral rectitude. Instead, his certifications met with disfavor and disapproval, and his parishioners found a veritable litany of complaints to level against him. He might have contemplated a life spent in contemplation of the sacred, but instead he found himself immersed in the profane.
Rabbi Danziger’s resolution of his crisis is perhaps novel, but his crisis is not. He has come to realize, like all too many of us, that our exalted visions of our work and our life are sadly incongruent with the actual lived reality of our lives. (The What People Think I Do/What I Really Do meme often captures this quite well.) Our levels of awareness about this fact can vary. Some rabbis might be just as immersed as Danziger in the all too worldly goings-on about them, but might disregard this evidence in favor of holding to their preconceived notions of their imagined life. Such illusions might be desirable too. The mundane realities of life sometimes require, as a palliative of sorts, some elaborate storytelling about what we have let ourselves in for. But only if they do not create the kind of painful dissonance that finally forced Danziger to put down the holy scrolls and head for the orchards. The maintenance and sustenance of that inner discord can be more damaging than the price paid for a life left behind. In those cases, it might be better to seek the kind of reconceived life that Danziger sought.


July 24, 2014
Christopher Hitchens: Pro-War, Anti-Death Penalty
A few days ago, Corey Robin wondered on his Facebook status:
Something I never understood about Christopher Hitchens: how such a fervent opponent of the death penalty could be such an avid supporter of war.
Supporters of the death penalty, of course, are notoriously fond of war (they also tend to be ‘pro-life’ in the debate on abortion). But why would a ‘fervent’ opponent of state-sanctioned murder be an ‘avid’ supporter of war, another form, one might say, of state-sanctioned murder?
The answer may, I think, be found in the kind of fascination war exerted over Hitchens. He did not think of it as merely an instrument of politics, one wielded to bring about very specific political objectives. Rather, it held him in a kind of aesthetically inflected thrall: he found it beautiful, stirring, exciting. Many, like Hitchens, are entranced by the beautiful images that war furnishes for our imagination; evidence for this claim can be found in the large number of coffee-table books that purport to be illustrated histories of war. These images need not be just those of exploding munitions and ruined buildings; war utilizes weaponry and men, and photographic and artistic depictions of these, utilized and engaged in combat (or waiting to be), are among our most iconic representations. Gleaming aircraft, sleek, water-plowing battleships, smoothly recoiling guns, men (and now women) in svelte uniforms, buttoned up, hard and unforgiving. It’s hard to resist the appeal of these. War provides many visual horrors, of course, but these are all too often swamped by the aforementioned cavalcade. (I’m leaving aside, for now, the enduring place that war holds in our imagination as a zone for the establishment of masculine credentials and brotherhood.)
The death penalty, in sharp and instructive contrast, is almost uniformly grubby and sordid. It is underwritten by retribution, an ignoble business at best; it is wrapped up in tedious layers of penal codes, legal wrangling, and procedural disputes; it happens quietly and grimly, away from the public eye, the punishment that dare not speak its name. All associated with it are diminished; the condemned have lost their human dignity well before they ascend the gallows, the jailers and clergymen and executioners appear merely as bureaucratic functionaries, executing–no pun intended–with nary a trace of flair or style, the bookish orders laid out in the court document sanctioning the killing. There is no glamour, no sheen, no gleaming edges in the death penalty. It is dull, dull, dull. Especially in this guillotine-free age.
If the death penalty could have been lifted, somehow, out of the unappealing morass of state bureaucracy, judicial procedure, and clumsy modes of execution, if it could somehow have brought with it some of the frisson that war provides, then I do not doubt that Hitchens would have been all for it.
Note: One should also not forget that Hitchens considered himself a contrarian. Perhaps his opposition to the death penalty was formed at a time when public support for it ran high; his support of the Iraq War was probably viewed by him as a gleeful flipping of the bird to his former mates on the Left.


July 23, 2014
Freedom in the Absence of Social Convention
In reviewing Arturo Fontaine‘s La Vida Doble, “a harrowing examination of violence during the Pinochet period,” whose heroine is Lorena, “a female terrorist who is tortured, changes sides, and becomes a torturer herself”, David Gallagher writes:
But why in fact do good fathers and meek husbands and generous lovers undertake such cruel torture? Here Lorena sees the torturer as someone who becomes isolated from any sort of moral standard while granted absolute impunity for what he does, no matter how vile. In the glib manner of a French student of the Sixties, she speculates about two opposing views of what happens when social conventions have no effect. One is that you recover the innocence of the noble savage. The other—the relevant one in this case—is that you revert to a state of primal savagery. Because there are no limits, she tells the “novelist,” an inner monster springs to life, one we all potentially harbor. Once there is no possibility of punishment, “the monster we carry within us, the beast that grows fat on human flesh, is unleashed within the good father or the daughter of a good family.”
Notice that Lorena establishes a dichotomy–there are only two modes of behavior possible when social conventions cease to constrain us. But we might speculate too–perhaps for the benefit of a future novelist–that the absence of social conventions might result in a new kind of freedom, one in which, rather than revert to the two states described above, human beings experiment with finding new orderings of moral and ethical values. Certainly, it isn’t clear why these “two opposing views” are the only possibilities open to human beings, why our options in the face of the absence of social conventions would be so limited.
The “two opposing views” that Gallagher refers to are influential, of course, but that might be due to a lack of imagination on the part of those speculating about a convention-free world. In the absence of convention it would also seem just as likely that rather than being innocent savages or beasts, we might merely be utterly confused and bereft, content to experiment with modes of behavior and interaction that might provide some guidance for how to proceed in this newly ordered world.
The “lack of imagination” I refer to above is an almost inevitable consequence, of course, of a deeply essentialist view of human nature, one committed to the idea that the visible human persona consists of two layers: an abiding, enduring, inner self temporarily covered by a thin epidermis of social convention. But a more existential view would suggest that when social conventions are removed, we have no way of saying what will remain. Perhaps the new being that will emerge will delight in alternating between innocence and bestiality, perhaps it will develop ever more complex characteristics, perhaps it will grow in dimensions–moral, psychological, and emotional–that we cannot yet fathom, gripped as we are by conventional modes of thought. When we think of how constraining social conventions–fundamentally and broadly understood–are, such speculation should not strike us as too outlandish.


July 22, 2014
Making the Abstract Concrete
A few weeks ago, I posted the following quip as my Facebook status:
You don’t really get _Civilization and its Discontents_ till you bring up a child.
And then, a week or so later:
Apropos of my recent comment that you don’t really get _Civilization and its Discontents_ till you raise a child: I don’t think you really get Quine’s inscrutability of reference thesis till you start to shepherd a child through the early language acquisition phase.
There is a more general point to be made here, of course: that seemingly abstract academic theories spring sharply into focus when they are viewed through the lens of personal, emotionally tinged experiences. And child-rearing is perfectly designed to provide visceral contact with their truths.
Consider, then, my first example above. The child’s first contacts with the civilization that is its host come via its parents, those responsible not just for feeding, bathing, clothing, and otherwise protecting it, but also, all too soon, for inculcating it into the ways of the world. It has to be warned–in an appropriately modified tone of voice–not to bite and scratch, or harm itself; it has to be restrained–again, sometimes for its own safety, sometimes for that of others; it has to be corrected in countless ways, diverted from its own path and guided into trajectories deemed more appropriate for its development. And so as I noted:
Sometimes I’m saddened terribly; something wild and primeval is being constantly tamed, molded, channeled, impressed on. Too essentialist, I know, and not existential enough, but still….
This channeling, this impressing, continues as the child comes into contact with others besides parents, of course, but it is the parent who has most proximal contact with the changes wrought in the child, and is thus most likely to be affected in turn by them. The changes in one’s child can produce some melancholy as we realize the coming to be, and passing away, of different identities; while we happily welcome the growing child into the community of language speakers and concept-wielders, we might regret too, just for a bit, the absence of the babyish bundle, all coo and gurgle, that was once ours to hold tight and close.
And then again, as a friend of mine noted in response to the last quote above:
Yeah, but I’m glad they stop smearing their feces on the wall.


July 21, 2014
Dreams of the “Undiscovered Country”
Hamlet suggested that “What dreams may come / When we have shuffled off this mortal coil / Must give us pause” and that “The dread of something after death / The undiscovered country, from whose bourn / No traveler returns, puzzles the will.”
The eternally indecisive Danish prince was right, of course: many, if not all, of us have wondered what lies in store for us after death. The more certain among the materialistically minded reassure themselves that oblivion awaits, a blankness and a void like that of the deepest sleep, like the kind that was our lot before we were ejected into this world naked and helpless and conscious. Others–convinced of the claims of some of the world’s great religions–speculate that eternal torment or pleasures of some form lie in store. And perhaps yet others, stranded at some indeterminate point between these viewpoints of spiritualism and materialism, fret that our knowledge of the relationship of consciousness to the material body is limited and that states of being that we have no epistemic access to, and thus no conception of currently, might be our postmortem fate.
Such uncertainty, of course, is an invitation to the very anxiety referred to by Hamlet: Perhaps our consciousness–in some shape or form–might survive the destruction of our corporeal self; if so, what form would it exist in? What states would persist? Would we–perish the thought–remain locked into some endlessly painful or terrifying state of being? One did not have to believe in divinely dispensed heavens or hells to believe that the riddles of existence might have facets to them painful or pleasurable to the remnants of a once thriving consciousness. (You could call this kind of thinking a holdover of a theistic or eschatological way of thinking.)
At times in the past, I sometimes found myself in precisely such a state of mind and found that my greatest fears amounted to two kinds of states. The first was one in which I felt as if smothered by an impenetrable darkness that lay suffocatingly over me, and which could not be pushed away; my movements were restricted by an all-enveloping black veil. I would be conscious of this darkness but unable to move, unable to illuminate it; it was a sensory deprivation tank of sorts but one in which I could sense and see the darkness pressing in on me. In the second kind of state, I imagined myself–without any sense of corporeal being–to be suspended in a realm that can best be analogized with the space we can imagine lying between those imposing maps of gigantic galactic clusters: endlessly expansive and relentlessly empty.
I found both these allusive suggestions of a postmortem persistence of some fragment of consciousness chilling. (In the second case, almost literally so.)
These lost their grip on my imagination when I realized that in both cases, they reflected deeply held phobias and anxieties of a sort. The first was the fear of being buried alive (those childhood tales of immurement had left a mark) and the second was the fear of being lost or left alone (yup, the childhood impress again.)
I had merely transferred my fears from the here and now to the hereafter–so vivid were they that I imagined them persisting endlessly, even after death.


July 20, 2014
Evicted From The Twenty-Twenty Club
In 1998, I learned I no longer had twenty-twenty vision. This knowledge did not come to me suddenly. On a couple of occasions at work–on the open-plan office floor of an online brokerage–I noticed I could not clearly read the lettering on the ticker-tape that ran across some of the large monitors that hung from the ceilings. And then, a little later, more decisively, out for a walk one night with a girlfriend, I was brought up short by her ability to read street numbers and names off signs well before I could. What, I wondered, was going on? An optometrist quickly put me in the know: I was ever so slightly myopic in both eyes, with the left just a little worse.
I come from a family of pilots; twenty-twenty vision ran in my family. We did not wear glasses. Well, actually, hang on a second. Toward the end of his flying career, my father developed a cataract in one eye, and in the middle of his own flying career, my brother was diagnosed with mild myopia (he continued to fly with prescription glasses). Perhaps developing mild myopia at the age of thirty-one was not so surprising.
It was still shattering news though. For weeks after my diagnosis, I moped around, unable to drag myself to the local opticians to order a pair of eyeglasses. It was, I realized, coming after a brutal ankle injury a few years before, another disruption of the pristine ordering of my body. My third-degree sprain had left my ankle permanently weakened and unstable, and now this myopia meant a central sensory organ had undergone another irreversible decline. First, locomotion was affected, and then that which guided locomotion. I was no longer whole; I was flawed, damaged somehow. I did not think I possessed bodily perfection before, but I did not consider myself–extremely fortunately–to be laboring under any manner of handicap. Now, the flaws were piling up, radically transforming a self-image already ragged at all too many edges. The radical decline promised me as a gift for chronological advancement had commenced.
The day I finally, reluctantly, picked up my prescription glasses and tried them on, I was bemused by the way the world snapped into focus. How long had I not noticed these innumerable blurrings that were now removed, made distinct? The gradual decline had been sneaky and insidious, a hidden fifth column doing its dirty work in my optical corridors. I was overcome by an intense longing for days gone by–when I could watch movies, or distant sunsets, or navigate darkened streets without an ugly prosthetic device sitting on my nose. I was no longer human; I was a cyborg of sorts.
Sixteen years on, of course, I have accepted my altered and corrected vision–in a fashion. I carry my glasses everywhere, though I only put them on when needed. I still envy those in the twenty-twenty club, of course. And on occasion, I still remember the rising tide of panic that swamped me when the optometrist leaned over and softly said, “How long have your eyes been like this?”


July 19, 2014
No Atheists in Foxholes, My Ass
Here is vignette #7 from Ernest Hemingway‘s In Our Time:
While the bombardment was knocking the trench to pieces at Fossalta, he lay very flat and sweated and prayed oh jesus christ get me out of here. Dear jesus please get me out. Christ please please please christ. If you’ll only keep me from getting killed I’ll do anything you say. I believe in you and I’ll tell everyone in the world that you are the only one that matters. Please please dear jesus.
No atheists in foxholes, indeed.
This little bon mot, intended to deflate the pretensions of skeptics and disbelievers, has a long and dishonorable history; it is often trotted out, a triumphant smirk spreading across the countenance of the faithful as they surmise they have homed in on the Achilles heel of the atheist. The atheist stands indicted: he is merely a fair-weather disbeliever. When the chips are down, he will duck for cover under the shelter provided by the Good Lord, just like the rest of us. (There is another, crafty, way to interpret it, of course: that only believers go to war. But I don’t, ahem, believe that.)
I wonder if the faithful ever stop to think–I know, silly question–about how awful an argument for faith this is. It suggests that our true believing nature will be revealed when shells are cascading down around us, when, in short, we are possessed by extreme fear, anxiety, and panic.
But why would anyone imagine that a psychological state riven by such extreme sensations and affects is one in which we will rationally come to hold beliefs? One might as well just say that in these states, we witness the breakdown of rational decision-making and belief formation, that the beliefs held by those in foxholes are forced upon them by their circumstances.
Similar arguments are made in other domains, and they are just as silly. Consider, for instance, a familiar claim made about reversions to states of nature–as in post-apocalyptic scenarios:
[A] standard moral associated with post-apocalyptic cinema or literature–one proclaimed with varying degrees of explicitness–is, ‘This is what humans would be like if the pre-political, pre-social “state of nature” were to be restored, if laws, the restraints of conventional morality, and all forms of social and political organization were removed’….The apocalypse thus acts as a pretense shredder, showing our supposed social, cultural and moral sophistication to be shallow and superficial, a fair-weather orientation that is only maintained by the force and rule of the law and the comfort of the good times. So long as no desperation is called for none will be shown. But the seven deadly sins will be on ample display once those conditions no longer hold true.
But:
There is an alternative moral to be drawn of course: that the human nature revealed to us in these depictions of an apocalypse’s aftermath is not the ‘true’, ‘real’ or ‘natural’ one at all. Instead what is shown in post-apocalyptic art are traumatized human beings whose responses–to their environment, to each other–are pathological precisely because of the nature of the changes undergone. The death, disease and pestilence of the apocalypse, for one. Post-apocalyptic visions are thus indeed revelatory, not because they show us how we were ‘before’ we ‘became civilized’ but because they show what our response would be to the dramatic, traumatic loss of our political and social orders.
To conclude, let me complete my excerpting of the vignette above:
The shelling moved further up the line. We went to work on the trench and in the morning the sun came up and the day was hot and muggy and cheerful and quiet. The next night back at Mestre he did not tell the girl he went upstairs with at the Villa Rossa about Jesus. And he never told anybody.
Note: Italics and capitalization of Hemingway vignette as in original 1986 Scribner Classic edition, p. 67.


July 18, 2014
Noam Chomsky, My Palestinian Student, and a Gift
A few years ago, at Brooklyn College, I taught a class on the formal theory of computation. We covered the usual topics: finite state automata, context-free grammars, Turing machines, computational complexity. As we worked through the theory of context-free grammars, I introduced my students to the concept of their Chomsky normal forms. As a quick preliminary, I noted that this form was due to Noam Chomsky, “the MIT linguist well-known for his seminal work on the formal theory of linguistics.” I paused, and then went on, “Interestingly enough, Professor Chomsky is equally well-known for his radical political views and activism, especially regarding American foreign policy, Israel etc.” These quick remarks made, I went on to the business of production rules, non-terminal symbols etc.
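(A brief technical aside for those who did not take that class: a context-free grammar is in Chomsky normal form when every production rewrites a non-terminal either as exactly two non-terminals or as a single terminal symbol. A purely illustrative example, a grammar for strings made of one or more a’s followed by an equal number of b’s, looks like this in that form: S → AX | AB, X → SB, A → a, B → b. Any context-free grammar can be converted into an equivalent one of this shape, give or take the empty string, which is what makes the form so convenient for proofs and for parsing algorithms like CYK.)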
Once class ended, I walked back to my office and began my usual post-class activities: checking email, drinking left-over coffee etc. As I did so, there was a knock on the door. A student from my class stood there. I had seen him in class before, but he had never spoken up. Now, he did so. He introduced himself with an Arab name. (I’m embarrassed to say I do not remember his name.) Then, he spoke again, “Professor, I just wanted to thank you. You brought up Chomsky in class, but you didn’t just say he was a linguist. You talked about his politics too.” Surprised, I said, “Well, I didn’t say that much. Just a quick note really.” My student, though, would have none of it: “Well, professor, too many other professors would simply not mention that aspect of him, as if it was an embarrassment. As a Palestinian, it made me really happy to hear you bring him up.” I didn’t quite know what to make of this, so I thanked him for his kind words. We then chatted for a bit about his background, his family, and that was that. (I remember asking him what passport he carried, a question that always fascinates me when it comes to the modern world’s stateless.)
As the semester went by, my student and I only spoke a few more times. He was unfailingly polite and courteous, and diligent with his work. We might have talked once more about Israel and Palestine, perhaps when some Middle East crisis de l’année had occurred. Finally, the semester wound down; I assigned the students their final exams, graded them, handed in their grades. Shortly thereafter, one day as I worked in my office, there was a knock on the door again. Once again, it was my student. In his hand, he held what looked like a gift-wrapped item. He thanked me for the class and then handed over the package, saying “This is for you, just a thank-you.” I was a little nonplussed and tried to decline, but again, he was persistent, pressing it into my hands, saying it was just a trifle. Finally, I thanked him and opened my gift. It was a copy of Avi Shlaim’s The Iron Wall: Israel and the Arab World. He went on, “I think you might find this interesting.” I agreed.
I lost contact with my student shortly thereafter; I often wonder where he is now. I often wondered, too, what he must have felt like, unable, all too often, because of the settings he found himself in, to say what was on his mind; I wondered how much he had heard that he couldn’t respond to; I wondered how limited he must have felt his various avenues of expression to be if the mere mention of Chomsky’s activism by a professor in a classroom had felt like an affirmation of a kind.


July 17, 2014
The Asymmetric Fallout of Operation Protective Edge
‘Collateral damage‘ and ‘friendly fire‘ seem to be two euphemisms with which we–as a civilization–are doomed to be persistently reacquainted. Especially if war continues to retain its popularity as an instrument of foreign policy or even law and order maintenance.
Which brings me, of course, to Israel, Gaza, and Hamas. Cycle-of-violence narratives are wearisome, and the Israeli-Palestinian one is no exception. Now, again, there is violence against Israeli citizens, and then violent retaliation, which, as on too many previous occasions, kills innocent men, women, and children. The discourse triggered by this latest eructation in the Middle East unsurprisingly follows a familiar pattern. Here are the Israeli talking points: Israel is locked in an existential battle for its survival by any means necessary; Hamas is committed to the destruction of Israel; no nation can tolerate indiscriminate violence directed against its citizens; Hamas uses civilians as ‘human shields’; the Israel Defense Forces (IDF) does the best it can, scrupulously limiting harm to civilians; the real blame rests on Hamas. The public relations disaster this latest episode seemingly engenders–accusations of war crimes and the use of disproportionate force, the gory images of dead children, the gap between Israeli pronouncements and their actions–means little to Israel’s minders: they know news and commentary flowing out of Gaza does little to convince or sway anyone. Most minds are already made up on the Israel-Palestine ‘conflict.’
Which is not good news for the Israelis, but it’s worse for the Palestinians. Certainly, there is ample worldwide rhetorical support for their cause, but the material circumstances of their being and the imbalance in the reckoning of Israeli and Palestinian resources–the former backed up by the not inconsiderable economic and military might of the US, and by its reluctance to exert its diplomatic will to bring a halt to the fighting–mean this conflict, and the others like it that will follow, will weaken the Palestinian cause further. If Hamas’ hope is that by firing rockets–remarkably poorly directed and carrying little explosive punch–into Israel, it will provoke Israel into the kinds of actions that will increase Palestinian resentment and find more recruits–worldwide–for its cause, then it has reckoned accurately but perhaps not too wisely. (Its refusal of a ceasefire shows further lack of clear thinking.) There is diminishing support for the Palestinians in Israel, especially among those formerly undecided; Israelis themselves–as the retaliatory lynching of Palestinians and social media evidence demonstrate–are becoming increasingly radicalized and descending into a rhetorical space marked by bloodcurdling calls for genocidal acts against Palestinians. Indeed, they may even count on criticism of Israel as provoking a useful defensiveness and circling of the wagons.
The radicalization of resistance to Israel does not have the same implications for Israel as the radicalization of Israel has for Palestine. Once–perhaps in some mythical time–Israeli liberal and progressive factions could be counted on to mount some rhetorical and active resistance to that nation’s actions against the West Bank and Gaza; now, those same groups have shrunk and have ceded the discursive space to those of Netanyahu’s ilk. For two parties locked in war, extremist tendencies in the polity of the more powerful one can only have worse consequences for the other.
A pox has already fallen on both houses; one bears the brunt just a little more.

