Miles Watson's Blog: ANTAGONY: BECAUSE EVERYONE IS ENTITLED TO MY OPINION
October 8, 2017
Scenes from the Class War
In my life I've had many jobs, from suit-and-tie to business casual to “just don't wear shorts.” But with the possible exception of the tough-guy street clothes I donned when conducting field visits as a parole officer, I could never have been mistaken for a member of what we generally call “the working class.” I never wore mechanic's overalls, or a reflective vest, or a shirt with my name stitched to it. I never sat in the back of a pickup truck on the way to work or carried a lunch pail or needed an extra seat on the Metro for my hardhat and tools. The space I occupied, even when I was earning wages low enough to be considered poor, never mind working class, was in fact a middle-class space. But this statement requires a bit of clarification.
Like most Americans, I have generally associated social class with money (or at least its absence). The government does it, too; there are very sharp lines drawn between “poor,” “working poor,” “lower middle class,” “middle class,” etc., and these lines are drawn by virtue of yearly income. But as George Orwell so brilliantly pointed out in his book The Road to Wigan Pier, class is often a state of mind and not merely a matter of income. In the England of the 1930s, he pointed out, a greengrocer and a naval officer had approximately equal incomes, but no one, not even the grocer, would have made the claim they were of the same social standing: the grocer, even if he possessed a superior income, was understood to be part of the “lower classes” while a naval officer, even one deeply in debt, with no land or title, stood well above him on the social register simply by virtue of his commission.
America's class system has similar nuances, at the middle-class level anyway, but ours tend to be weighted in favor of expectation – specifically, the sort of life we expect to lead, the amount of money we expect to have, and the way we expect to be treated by others in social situations. My own roots in regard to class are worth noting. My maternal grandfather was raised in a Catholic orphanage and at the age of fifteen was already a combat veteran, having lied about his age to join the U.S. Navy during WWI. He ended up becoming quite wealthy in the Roaring 20s, but the Depression ruined him and he spent decades slowly re-entering the adoptive class from which he'd been expelled. When he died prematurely in 1956, he was still considerably short of his earlier success, and it was only due to his wise investments that my grandmother and her daughter, my mom, were able to survive in comfortable circumstances. My paternal grandfather, on the other hand, was an itinerant electrician whose house in Chicago, which he shared with his wife and three children, was scarcely larger than the studio apartment I now somewhat claustrophobically inhabit. His children -- my father and uncle and aunt -- had to cut their own paths to the middle class via the power of hard work and scholarships. Thus, prosperity in my family tree was a relatively new phenomenon when I arrived on the scene in 1972. Yet knowing no other mode of existence, I possessed a middle-class outlook which I maintained – or which maintained itself, by a species of momentum – not merely through childhood and my teenage and college years, but through the long periods of poverty and near-destitution which came afterward. Part of this outlook took the form of expecting that the lifestyle into which I had been born would continue when I left the parental/collegiate nest and struck out on my own. I never thought about it consciously, but my general attitude was that somehow, even with an entry-level job, there would be no backsliding in my lifestyle or outlook. Indeed, even when I was near-destitute, averaging a negative bank balance the week before any pay period, coming home to a mailbox full of duns, eating spaghetti twice a day and going to bed hungry, everything about me remained fundamentally middle-class: dress, manners, speech, worldview, personal habits, even my choice of friends. I may have been poor, but I never considered myself part of the poor or of the working class, and neither did anyone else. Though I ate in truck-stop diners and drank in blue-collar bars, I was never mistaken, even accidentally, for someone who “worked” (meaning sweated) for a living. People knew at a single glance that I had been to college, that I preferred reading to television, that I'd never be able to change a tire by myself and that I'd probably use “perhaps” in a sentence if given half a chance. And this very correct classification carried with it both privilege and burden.
On the privilege side, I knew that I would be accorded greater respect, and often deferred to, by people in the service economy – waiters, countermen, cashiers, shopwalkers. Police officers would treat me with more civility than a prole or a poor man, and those I stopped to ask for the time or directions would be more accommodating. Poor women often regarded me as more appealing than men of their own class, whereas middle-class ladies would regard me as one of their own. Even rich girls would find me unthreatening, if nothing else. On the job front, I cut a better profile in interviews than someone who might not have been as articulate or as assured in a professional setting; therefore I could and did land work for which I was unqualified, purely on the basis of polish I had acquired as a result of my semi-genteel upbringing.
On the burden side, I was also despised by the very same class from which, in purely financial terms, I was actually indistinguishable. Some men wanted to start fights with me for no other reason than my speech-patterns, while some women went out of their way to let me know they considered me weak and effeminate for the same reason. Mechanics, plumbers and electricians – even tow-truck drivers – took one look at me and saw dollar signs, knowing in their hearts that I'd not know the difference between a cylinder head and a pile of jelly donuts. Homeless people who stared right through working-class pedestrians made beelines for me, often very aggressively, assuming I had money to spare (I vividly remember one homeless man turning away from me in disgust when I offered him a handful of silver change: clearly my clothes and manners had told him “cash only.”) The general consensus among working-class or poor people I encountered seemed to be that I was spoiled, soft, arrogant, condescending and in possession of an education which had no practical value and was therefore completely useless. And they were not entirely wrong. Class prejudices may be odious and immoral, but they exist, and like most generalizations and stereotypes, they are grounded in reality.
Life is full of strange and unexpected shifts of fortune, however, and recently an event transpired which allowed me a perspective on social class which I'd previously been denied. After a number of years working in Hollywood making video game trailers -- a nerd's dream-job, and a lazy man's as well -- I returned to the world of make-up effects. Now, when I refer to "make-up effects" I am not referring to beauty makeup, nor to special effects, nor prop-making, nor visual effects, but to the craft (and in some cases the art) of manufacturing foam-latex and silicone appliances for movies and television shows. Every zombie, every demon, every monster, every badly wounded person, dead dog or cadaver seen in a movie or a TV show is constructed, wholly or in part, from foam latex (rubber) or silicone. And the process of making these things is about as messy and laborious as anything can be.
For starters, both foam latex and silicone are messy as hell to deal with. The former, in its liquid state, has the consistency of warm cake batter or icing, and has an especial affinity for sticking to clothing and arm-hair -- indeed, many "foam runners" shave their arms to eliminate the pain of having to remove the latex from their skin. The latter, in its liquid state, is probably the most aesthetically disgusting substance you will ever encounter, somewhere between seminal fluid and snot. Throw in everything else we have to work with -- acetone, 99% alcohol, stearic acid, plaster of Paris, paste wax, etc., etc. -- and you have a perfect storm of dust, talcum powder, liquid and goo flying about all day long. This shit ends up not merely on clothing but in your hair, your ears, your nostrils, and sometimes in much more embarrassing areas as well. Any day in the summer when I take fewer than three showers is remarkable in itself.
Now, it so happens that both latex and silicone pieces are formed in molds, and these molds are often enormous and extremely heavy, since the positives are sometimes fashioned out of stone. The molds are held together by bolts, and when they are opened after baking, they have to be opened using both power drills and crowbars -- a task which requires both the raw animal strength of a lumberjack and the gentle dexterity of an eye surgeon. Likewise, every piece of equipment in the shop has to be cleaned after it is used, which means whisks, bowls, foam guns (picture an enormous syringe, the size of a shotgun), mold straps, etc., have to be scrubbed several times a day: if you have any form of arthritis or tendonitis, this repetitious strain on your hands will introduce you to levels of pain you weren't previously familiar with. In some cases, large foam pieces like cowls or full-body suits have to be washed and then wrung dry in a mangler which would look quite at home in a museum dedicated to the Spanish Inquisition. And all of this, and much more, is done over and over again in conditions of deafening noise, blistering heat (have you ever been inside a walk-in oven?), suffocating dust and time constraints which would shame a mayfly. During shooting season, when we are running pieces for up to four television shows at a time, not to mention the occasional film, it is not uncommon to remain on one's feet for eight hours a day, in a ceaseless blur of exhausting movement: lifting, carrying, bending, kneeling, crawling, straining. (There have been times after work when I was too physically tired to drive home and had to rest for ten or fifteen minutes in my car before turning over the engine.) The point I'm trying to make is that what I do is very messy, and that in my work-clothes, I cannot be mistaken for anything but a member of the working class. My normal ensemble goes as follows:
Hot weather: Ball cap, sunglasses, bandanna (the Maryland State flag), sleeveless t-shirt encrusted with four or five different colors of old foam latex, spattered with paste wax, dusty with talcum powder and stearic acid and shop dust. Work belt, similarly dirty, with heavy work gloves jammed in one pocket and latex gloves jammed in another. Cargo pants or jeans, out at one knee, filthy, and covered at the knees by thick pads whose logos have long since been rubbed away by hours of kneeling on the shop floor. Sneakers or boots, so layered in old foam the actual shape of the footwear is sometimes tough to determine.
Cold weather: All of the aforementioned, with a wool “burglar's cap” instead of a baseball hat, and a work-ruined khaki jacket with a broken zipper and ink stains all down the front.
In addition to this I usually have earbuds either screwed into the wells of my ears or dangling from the edge of my bandanna. Occasionally a pair of safety glasses or some large tool, like an industrial box-cutter, swings from my belt. And of course I seldom bother to shave during the week, so I'm carrying a three-to-five day growth of beard as well. And this is the condition in which I both arrive at work at seven in the morning and leave at four o'clock in the afternoon, which means that any errands I run on the way to or from work find me wandering about in my dirty MUFX gear. In the months since I've been doing this, I've noticed a difference -- sometimes subtle, sometimes profound -- in the way the world reacts to me.
For starters, when I walk down the street, or into a grocery or convenience store, the first thing I encounter is a decided sense of solidarity with other members of the blue-collar brigade. The guys stocking shelves at Ralph's, the cashiers at 7/11, the men in diners on break from road-construction projects, the Teamsters and production assistants on film shoots, all of these people who used to look through me as if I simply weren't there now offer a single, sincere-looking nod of commiseration. The landscapers and gardeners of Los Angeles, inevitably Mexicans or Mexican-Americans, have also welcomed me into their fraternity. They sweat for a living, and they can see that I do, too: barriers presented by race, ethnicity and age all seem to dissolve in that single nod, because regardless of the color, hue or smoothness of one's skin, the sweat which pours from it looks exactly the same. This sense of group camaraderie manifests in small but profound ways. A few weeks ago I was walking to the door of the 7/11 across the way from my house, and a tough-looking biker dude of about fifty reached the door a half-second ahead of me. This is precisely the sort of guy who, when I was a college student, government worker or entertainment industry flunky, used to regard me with hostility and contempt. I daresay if I had been in my slacker street clothes, he would have slammed the door in my face. But noticing my rig, my dirtiness, and my air of physical exhaustion, he seemed to recognize me as one of his own...and he held the door for me and let me enter first. The gesture was trivial but his glance seemed to say, "I get it, man; I know what it's like, and I can relate."
Looking so clearly like a WORKING MAN can have humorous side-effects as well. When I stroll into the bank to pay my rent, I notice a certain physical wariness from the well-dressed middle and upper-class people around me. They stand a little further away -- probably because of the dirt -- but also because they cannot relate to me, or rather they think they can't. I am that home-grown alien species, the workman, who is not supposed to arrive unless summoned, speak unless spoken to, or leave without permission. At the same time, they also tend to act slightly intimidated by my presence, as if I might be prone to punching someone in the face rather than simply saying "excuse me" when I want them to move. The slight but obvious discomfort they seem to feel, being stuck in a social situation (meaning a line) with someone they would not ordinarily associate with, and the often condescending or patronizing manner they assume should they try to pass the time, always make me laugh a little up the sleeve I'd have, if only I were wearing one.
On top of this, I have often observed the loudmouth, trouble-making sorts you sometimes encounter on the streets of a major city give me no trouble at all when I'm in my workman's gear. They don't even make eye contact as we pass. It's true the blue-collar set is generally more ready, more willing and more able to kick ass if called upon to do so, and perhaps they understand this: or it may be they're simply savvy enough not to pick fights with a man who has two ridiculously sharp knives clanking on his belt. In any event, nobody -- not drunks, not homeless people, not smart-ass teenagers -- thinks it's worth their while to bother me, and this comes as a pleasant change from the days when I was a visibly middle-class guy in a neighborhood of the working poor, and couldn't go to the corner store without wondering if I was going to have to blast someone out of their socks.
From the standpoint of the sexes, I've noticed quite a bit of difference in the way I'm regarded by both men and women of all kinds. Men of higher social standing and better dress seem slightly intimidated and uncomfortable when I walk into a room with sweat glistening and tools all a-jangle. Somehow their masculinity is compromised by my mere presence, even when I am physically smaller or less muscular than they. This can be traced to basic middle and upper-middle class insecurity about blue-collar workers in general, for at the core of every male member of the MC and UMC lies a feeling of inadequacy when confronted by someone who can actually do things. You may have a degree from Stanford and a master's from Harvard, you may speak six languages and play the cello like a madman, you may be in perfect physical condition and clock six figures and know everything about international relations, but if you can't change a fuse, unclog a drain, shingle a roof, use a lathe, plane a piece of wood, handle an angle grinder, or tinker with an engine, deep inside you don't really feel like you have anything between your legs. The look of pathetic helplessness, of inadequacy and impotence, that any white-collar male wears when standing beside an auto-mechanic, tow-truck driver or plumber hard at work is gross evidence of this. The white collar wants to take refuge in social superiority, in money, in the power of his intellect, but every time he attempts a sneer he is reminded that none of these things has any value in a crisis. The very things which make him such a commodity at work, or in the dating scene, become liabilities when anything really practical needs to be done. What's more, awareness of his own physical softness often plagues him. In comparison with that callused, work-hardened, oil-smeared dude mucking about under his sink, he feels almost effeminate. Thus the grotesque sight of a man making $300,000 a year trying to make casual small talk and act buddy-buddy with a journeyman plumber who makes a tenth of that -- not because he feels any sense of kinship, but because his own inadequacies drive him to prove he can be "one of the guys," too.
In regard to women, I've noticed changes of a different sort. If you take the work-version of myself out of a place like a bank, whose very nature tends to remind everyone of their social status, and place him in an area where there is more natural commingling, like a grocery store, reactions are more positive than I was expecting. There is less overt snubbing and more frank curiosity, possibly because -- so studies have told us -- women are genetically programmed to appraise and judge men based on their ability to protect and provide: and while a blue-collar man might not be able to provide much monetarily, his physical toughness and his practical ability around the house somewhat compensate for this. I've also noticed a little more respect from my female neighbors when they see me trudging up my driveway in the afternoon, exhausted and filthy, tool-belt flung over one shoulder, work boots a-clattering on the concrete. Since, even in the 21st century, women probably perform an outsize percentage of physical, practical tasks around the home in comparison with their husbands, I'm convinced that they have a greater innate appreciation for those whose jobs are of a generally physical nature -- a generalization, to be sure, but I think a fairly accurate one.
Perhaps the most profound change I noticed, however, comes not from without but from within. When I was working in video games, while the job itself was easy, the hours were extremely long, even brutal. Hundred-hour weeks were not unheard of, and as I once wrote in this very blog, there was a time I went 30 straight days without a single day off. I used to drive down to Hollywood at nine AM in such a state of mental torpidity that I could scarcely operate the controls of my car, and often I wouldn't return until 3:30 the next morning. Yet one hot summer day, as I was sulkily plodding down Hollywood Way toward Warner Bros., I noticed on my left a crew of hardhats hard at work hammering shingles into a black rooftop. Even at that hour the heat of the sun was murderous, and the roofers, clad in their hats, neck bandannas, long-sleeved shirts and carpenter's jeans, looked as miserable as humans can look. I thought, "If you're feeling sorry for yourself, just imagine what their day is going to be like." I don't pretend that my present work is fully as arduous, or as dangerous, as roofing or construction, but it is hard enough for me, and it has served to remind me that despite the long periods of poverty and near-poverty I have faced in life, I never before really grasped what it meant to be of any other social class but the one I was born into. My middle-class attitudes, which carried me through life, sometimes buoying me up and other times dragging me down, are finally peeling away. I have, at the age of forty-five, finally managed to grasp that being born into a fairly comfortable set of circumstances does not entitle me to live within those circumstances for the rest of my life. Indeed, the gravitational tendency in capitalism is always toward poverty -- it is easy to be poor and to stay poor, but reaching the middle or upper middle class is damned difficult, and staying there, even if that class is your point of origin, is harder still. The scaled-down style in which I live, which I intended to be only temporary when I moved from Los Angeles to Burbank four years ago, has lately assumed a more permanent character. My material expectations have changed, possibly for the worse, but at the same time, I have freed myself of a dangerous delusion. I know now that in order for me to have what my parents earned, momentum is not enough. A sense of entitlement is not enough. Even my education is not enough. Only work -- real work -- will suffice. And as Orwell once said, that at least is a beginning.
September 18, 2017
THE LAST SHOT OF THE CIVIL WAR
Going backwards in time is a strange and rewarding thing. It's strange because it is impossible, and yet in certain circumstances you can in fact do it. It's rewarding because – unlike nostalgia, another form of time-travel – it allows you to be in the past but not of it. And I assure you, the distinction has a difference. I realized this only moments after setting foot on Manassas Battlefield Park.
Among Civil War battlefields, of which I'm told there are more than ten thousand, great and small, Manassas, which is also known as Bull Run, has a unique distinction. It was the scene of not one but two major battles, both of tremendous importance, fought almost exactly one year apart. The first, on July 21, 1861, was also the first real contest of the war – the kickoff, one might say, of a ball game that was eventually to kill 600,000 people. The second, on August 28 – 30, 1862, much larger and bloodier, is believed by some historians to have contributed measurably to eventual Southern defeat, even though – like the first – it ended in Confederate victory.
This is dry fact. But the reality is very different, very immediate. When you are driving southwest on Virginia Route 28 you are visibly part of the Twenty-first Century. A smooth ribbon of asphalt stretches before you. The windows of towering office buildings glitter. Traffic helicopters prowl overhead. Nature is not really part of the equation: even the sky is blocked off by an endless series of hulking traffic signs. Turn onto Grant Avenue in Manassas, however, and everything begins to change. The clock ceases its relentless march forward and begins to spin back, faster and faster still, until, just a short while later, you glide up the access road into the Battlefield Park. And suddenly it's not 2017 anymore. It's the middle of the nineteenth century. Notwithstanding the few cars in the parking lot, which seem not to matter at all, or even to exist (they take on the quality of heat mirages; there but not there), you have arrived in another era. Green fields roll and roll into the distance, crisscrossed with wooden rail fences. Rows of cannons, their bronze muzzles gone turquoise with years, sit in the near-exact firing positions they occupied over 150 years ago. A lone farmhouse, made of stone, sits solemnly before a three-grave cemetery shrouded in iron. In the distance, woods press hard against the lighter green of the meadows. As far as the eye can see there is nothing of modern technology to blight the landscape, not even – this is the impression the place gives, even if it isn't the actual truth – power lines or aeroplanes droning overhead. Not so much as the distant shimmer of a skyscraper. Nothing except the chirp of crickets and the buzz of the occasional fly. There are a few statues, it is true, but old Stonewall Jackson looks quite at home sitting on his horse watching the battle which gave him his nickname. As for the monuments, some of them were built by the soldiers who fought here themselves. If they don't belong, nothing does.
When the first fight took place – it's called “First Manassas” by Southerners and “First Bull Run” by Northerners, after the habit of the former to name battles after the nearest town, while the latter dubbed them after the nearest watercourse – the two halves of the country had been at war for about three months. Neither side, however, started the war with anything that could be properly called an army, and it had taken that long for the two combatants to manufacture them. Those armies, Union and Confederate, marched into battle full of confidence and ignorance, led by men who had never led real armies before, officered by men who in many cases had bought their commissions and knew nothing about soldiering, and manned by boys who thought the war would be a grand single-day adventure. They wore uniforms in every conceivable color and type, used tactics that hadn't changed since Napoleon's day, and carried a dizzying array of flags – state flags, regimental flags, national flags – which looked damnably similar and were confused with each other once the fighting commenced. And they were shadowed by crowds which had come by coach and carriage from as far as Washington and Richmond – often carrying champagne and picnic baskets – in hopes of being eyewitnesses to a historical event. They certainly saw one, but not the one they had envisioned, for First Manassas had all the grace of a barroom brawl.
For openers, neither the Union nor the Confederacy could properly control the forces they had marched into the field, and it was little wonder: before the war, when America was still a united country, the total strength of its Army was just under 28,000 men. At First Manassas, each side boasted armies larger than that – 35,000 for the Stars and Stripes, 32,000 for the Stars and Bars. Most of those soldiers were raw recruits who had volunteered for exactly 90 days of military service, and were soldiers in name only. But when the commander-in-chief of the Union army, Winfield Scott, pointed this out to President Lincoln, the president replied, in his usual, quotable way: “I know you are green, but they are green also. You are all green alike.” He was right. Once the fight commenced, soldiers marched in the wrong directions, threw away their equipment when it got too heavy, fired on their own men by mistake and sometimes just as accidentally refrained from firing on the enemy. Of the 67,000 total troops under blue or gray command that day, only half actually made it to the battle, and of the participants, quite a few took to their heels when the shooting started, unable to withstand the noise, terror and confusion. Indeed, when the fight was over, neither the Union army, which fled back to Washington, beaten, nor the rebel, which stood victorious, was fit for further battle, or even a coherent force. This was not due to losses, for by the standards of the war which followed, the casualties were light: the Union suffered just under three thousand, the Confederacy just under two. (The total number of dead was less than one thousand.) No, what wrecked both armies – what prevented the Confederates from marching on and taking the now-helpless city of Washington, and perhaps ending the Civil War right there – was the shattering effect of battle itself. Like two steam locomotives striking head-on, it scarcely mattered which gave and which got; neither was capable of further movement.
More than men or armies were blown to bits that day, however. So too were the arrogant, romantic delusions maintained by many on both sides: after First Manassas, few clung to the belief that modern war was glorious, or that the enemy would give up easily, or that the conflict could be won by inexperienced generals leading mobs of raw recruits. Both sides dusted themselves off and prepared for a long, gruesome struggle.
They got one. A year later the fighting had spread to the Far West, the Gulf Coast and even to the Atlantic Ocean, but neither side had been able to win a truly decisive battle. In the East, the close proximity of the Union and Confederate capitals – at Washington, D.C. and Richmond, respectively – had locked the armies there into a continuous push-and-shove, with each trying to menace the other's while protecting their own. President Lincoln had become deeply frustrated with Winfield Scott's replacement as commander-in-chief, General George McClellan, and had entrusted command of the Union forces in northern Virginia to a hitherto successful Western general named John Pope, whose sole virtue seemed to be his aggression (when warned by his cabinet that Pope was a liar and a braggart, Lincoln replied, “I know Pope's family from Illinois. They're all liars and braggarts. I don't see as to why being a liar and a braggart would disqualify a man from being a good general.”) Thus Pope. On the other hand, the Confederate President, Jefferson Davis, had also found, in Robert E. Lee, a leader for whom aggression came as naturally as breathing; but Lee's reputation as a human being was considerably better. At the opening of hostilities, Lee, who had opposed secession and openly stated he would have abolished slavery if he thought it would prevent civil war, was actually offered command of the Union army. But his loyalty to Virginia – to his “country,” as he called it – was greater than his loyalty to the United States. Thus Lee.
When they met, in the summer of 1862, the war had lost much of its earlier, amateurish character. Incompetent officers were fewer, and many of the soldiers had become march-hardened and battle-ready, accustomed to the grim rigors of campaigning in dust, mud, rain and snow. Equipment was better and more uniform, and communications and supply, even on the impoverished Southern side, much improved. The armies, too, had grown enormously: the Union marched 77,000 men to Second Manassas (more than Napoleon had at Waterloo), while the Confederacy could count some 50,000 in its ranks. What had not changed for the better, at least for one side, was the difficulty the Union always seemed to have – did, in fact, have until two years later – in bringing to bear all the forces it had on hand: one in five of the blue soldiers who crossed Bull Run was never engaged. Nevertheless, Pope had a numerical advantage of about twelve thousand men, and if numbers alone decided battles, he certainly would have won. But then again, if numbers decided battles, the entire war really would have ended the first time around, at Manassas.
(It is an embarrassing truth of history -- embarrassing to Northerners, anyway -- that the South operated at a disadvantage in almost every great conflict of the war, yet somehow managed not only to hold out for four bloody years, but on several occasions to come dangerously close to victory. If the war itself were a movie, the tag-line for the Confederacy would have been, "Always outnumbered, always outgunned." There were very definite reasons for this. In 1861, the North numbered some 21 million people, while the South could count only 9 million, one-third of whom were slaves. Therefore the Union had a much larger pool of manpower upon which to draw for its armies, and could generally replace its losses rapidly, while the South suffered from a permanent shortage of men. Likewise, the South, being agrarian, had only a fraction of the North's industrial power and a totally inadequate railroad network: on paper, the Confederates were beaten even before the war began. But wars are not fought on paper any more than they are fought with numbers, and Lee believed that if he harnessed the splendid morale of his troops to his own superior generalship, he could find a way to destroy the blue adversary.)
Like many battles, Second Manassas turned on a series of mistakes, thus adding some credence to the Russian axiom that wars are not won by the most competent army, but by the least incompetent one. Lee had difficulty controlling his willful chief subordinates, Stonewall Jackson and James Longstreet, while Pope – who was "hated by everyone in his command from his immediate staff to his generals to the youngest drummer boy" – managed not to notice 25,000 enemy soldiers marching up to attack his flank at the height of the battle, a blunder which proved fatal. The flank was crushed, and Pope was extremely lucky that he had enough resolute soldiers in his command to prevent a total rout. Nevertheless, the defeat was complete and humiliating. A Union general summed up the feelings of many when he wrote bitterly of the aftermath: “A splendid army almost demoralized, millions of public property given up or destroyed, thousands of lives of our best men sacrificed for no purpose.”
Lee's triumph was great and did much to bolster his burgeoning reputation as a military genius, as well as his army's reputation for near-invincibility in the face of stacked odds. But history teaches us that victory can be as dangerous, in its own subtle way, as defeat. Emboldened by his win, Lee proposed to Jefferson Davis an invasion of Maryland, during which he intended to draw out and annihilate the Union Army; unfortunately for him, General McClellan -- restored to command by Lincoln after Second Manassas -- got hold of these plans and ambushed Lee near Sharpsburg, MD, on Antietam Creek. The resulting battle was the bloodiest single day in American history, and while Lee gave better than he got in terms of casualties, he had fewer men to lose and was able to escape total destruction only because McClellan, once again, proved too cautious in his pursuit.
Antietam (or Sharpsburg, depending on which side of the Mason-Dixon Line you live on) was a direct consequence of Second Manassas, and the consequences kept coming. The withdrawal of Lee's bloodied army back into Virginia emboldened Lincoln to sign the Emancipation Proclamation, which, though it had no actual, immediate effect on slavery (since it only “freed” the slaves in Confederate-controlled territory), had massive political effect. Prior to Sharpsburg, the strategic goal of the United States was to restore “the Union as it was” – in other words, crush the rebellion but leave the institution of slavery intact. After Sharpsburg, it became clear that it was not possible to wage war only on secessionists; the cause of secession, and the keystone of the Southern economy, slavery, had to be smashed as well. The Proclamation had another, added benefit for Lincoln: it went a long way toward destroying any hope Jefferson Davis had of obtaining foreign recognition of the Confederacy. The European powers were dependent on the South for cotton, but very reluctant to endorse slavery. So long as the Union was unwilling to dismantle the institution, Europe could climb into bed with the Confederacy with a clear conscience, for there was little to choose between the two sides from a moral perspective; but once the elimination of slavery became part and parcel of the Union's war aim, it became politically impossible to side with an aspiring nation which kept millions in chains.
The Confederacy finally toppled in 1865. In that same year, just before mustering out of the Grand Army of the Republic of the United States, a group of Union soldiers assembled at the Manassas battlefields to commemorate what had happened there. Using muscles toughened by years of conflict, they erected a simple monument of brick and stone to their fallen comrades from both battles -- "the patriots," they called them, and truer words were never spoken. Adorning the monument were five heavy artillery shells drawn from their own ammunition caissons. A photo of the commemoration ceremony, on display at the Battlefield Park today, shows the soldiers crowded unsmiling around the monument, clad in their distinctive uniforms and looking every bit as tough as the shells. In 1975, the National Park Service conducted a restoration of the monument, which was beginning to show its age. Their engineers discovered, to their astonishment, that the shrapnel shells mounted on the weather-beaten brickwork remained live despite the intervening 110 years. Bomb experts were called in to defuse them, but during the process one of them exploded. And to my knowledge this was the last shot fired in the Civil War.
Among Civil War battlefields, of which I'm told there are more than ten thousand, great and small, Manassas, which is also known as Bull Run, has a unique distinction. It was the scene of not one but two major battles, both of tremendous importance, fought almost exactly one year apart. The first, on July 21, 1861, was also the first real contest of the war – the kickoff, one might say, of a ball game that was eventually to kill 600,000 people. The second, on August 28 – 30, 1862, much larger and bloodier, is believed by some historians to have contributed measurably to eventual Southern defeat, even though – like the first – it ended in Confederate victory.
This is dry fact. But the reality is very different, very immediate. When you are driving southwest on Virginia Route 28 you are visibly part of the Twenty-first Century. A smooth ribbon of asphalt stretches before you. The windows of towering office buildings glitter. Traffic helicopters prowl overhead. Nature is not really part of the equation: even the sky is blocked off by an endless series of hulking traffic signs. Turn onto Grant Avenue in Manassas, however, and everything begins to change. The clock ceases its relentless march forward and begins to spin back, faster and faster still, until, just a short while later, you glide up the access road into the Battlefield Park. And suddenly it's not 2017 anymore. It's the middle of the nineteenth century. Notwithstanding the few cars in the parking lot, which seem not to matter at all, or even to exist (they take on the quality of heat mirages; there but not there), you have arrived in another era. Green fields roll and roll into the distance, crisscrossed with wooden rail fences. Rows of cannons, their bronze muzzles gone turquoise with years, sit in the near-exact firing positions they occupied over 150 years ago. A lone farmhouse, made of stone, sits solemnly before a three-grave cemetery shrouded in iron. In the distance, woods press hard against the lighter green of the meadows. As far as the eye can see there is nothing of modern technology to blight the landscape, not even – this is the impression the place gives, even if it isn't the actual truth – power lines or aeroplanes droning overhead. Not so much as the distant shimmer of a skyscraper. Nothing except the chirp of crickets and the buzz of the occasional fly. There are a few statues, it is true, but old Stonewall Jackson looks quite at home sitting on his horse watching the battle which gave him his nickname. As for the monuments, some of them were built by the soldiers who fought here themselves. If they don't belong, nothing does.
When the first fight took place – it's called “First Manassas” by Southerners and “First Bull Run” by Northerners, after the habit of the former to name battles after the nearest town, while the latter dubbed them after the nearest watercourse – the two halves of the country had been at war for about three months. Neither side, however, started the war with anything that could be properly called an army, and it had taken that long for the two combatants to manufacture them. Those armies, Union and Confederate, marched into battle full of confidence and ignorance, led by men who had never led real armies before, officered by men who in many cases owed their commissions to wealth or political connection and knew nothing about soldiering, and manned by boys who thought the war would be a grand single-day adventure. They wore uniforms of every conceivable color and type, used tactics that hadn't changed since Napoleon's day, and carried a dizzying array of flags – state flags, regimental flags, national flags – which looked damnably similar and were confused with each other once the fighting commenced. And they were shadowed by crowds which had come by coach and carriage from as far as Washington and Richmond – often carrying champagne and picnic baskets – in hopes of becoming eyewitnesses to a historical event. They certainly saw one, but not the one they had envisioned, for First Manassas had all the grace of a barroom brawl.
For openers, neither the Union nor the Confederacy could properly control the forces they had marched into the field, and it was little wonder: before the war, when America was still a united country, the total strength of its Army was just under 28,000 men. At First Manassas, each side boasted an army larger than that – 35,000 for the Stars and Stripes, 32,000 for the Stars and Bars. Most of those soldiers were raw recruits who had volunteered for exactly 90 days of military service, and were soldiers in name only. But when the general-in-chief of the Union army, Winfield Scott, had pointed this out to President Lincoln, the president had replied, in his usual, quotable way: “I know you are green, but they are green also. You are all green alike.” He was right. Once the fight commenced, soldiers marched in the wrong directions, threw away their equipment when it got too heavy, fired on their own men by mistake and sometimes just as accidentally refrained from firing on the enemy. Of the 67,000 total troops under blue or gray command that day, only half actually made it to the battle, and of the participants, quite a few took to their heels when the shooting started, unable to withstand the noise, terror and confusion. Indeed, when the fight was over, neither the Union army, which fled back to Washington, beaten, nor the rebel army, which stood victorious, was fit for further battle or even a coherent force. This was not due to losses, for by the standards of the war which followed, the casualties were light: the Union suffered just under three thousand, the Confederacy just under two thousand. (The total number of dead was less than one thousand.) No, what wrecked both armies – what prevented the Confederates from marching on and taking the now-helpless city of Washington, and perhaps ending the Civil War right there – was the shattering effect of battle itself. Like two steam locomotives striking head-on, it scarcely mattered which gave and which got; neither was capable of further movement.
More than men or armies were blown to bits that day, however. So too were the arrogant, romantic delusions maintained by many on both sides: after First Manassas, few clung to the belief that modern war was glorious, or that the enemy would give up easily, or that the conflict could be won by inexperienced generals leading mobs of raw recruits. Both sides dusted themselves off and prepared for a long, gruesome struggle.
They got one. A year later the fighting had spread to the Far West, the Gulf Coast and even the Atlantic Ocean, but neither side had been able to win a truly decisive battle. In the East, the close proximity of the Union and Confederate capitals – at Washington, D.C. and Richmond, respectively – had locked the armies there into a continuous push-and-shove, with each trying to menace the other's capital while protecting its own. President Lincoln had become deeply frustrated with Winfield Scott's replacement as general-in-chief, General George McClellan, and had entrusted command of the Union forces in northern Virginia to a hitherto successful Western general named John Pope, whose sole virtue seemed to be his aggression. (When warned by his cabinet that Pope was a liar and a braggart, Lincoln replied, “I know Pope's family from Illinois. They're all liars and braggarts. I don't see as to why being a liar and a braggart would disqualify a man from being a good general.”) Thus Pope. On the other hand, the Confederate President, Jefferson Davis, had also found, in Robert E. Lee, a leader for whom aggression came as naturally as breathing; but Lee's reputation as a human being was considerably better. At the opening of hostilities, Lee, who had opposed secession and openly stated he would have abolished slavery if he thought it would prevent civil war, had actually been offered command of the Union army. But his loyalty to Virginia – to his “country,” as he called it – was greater than his loyalty to the United States. Thus Lee.
When they met, in the summer of 1862, the war had lost much of its earlier, amateurish character. Incompetent officers were fewer, and many of the soldiers had become march-hardened and battle-ready, accustomed to the grim rigors of campaigning in dust, mud, rain and snow. Equipment was better and more uniform, and communications and supply, even on the impoverished Southern side, much improved. The armies, too, had grown enormously: the Union marched 77,000 men to Second Manassas (more than Napoleon had at Waterloo), while the Confederacy could count some 50,000 in its ranks. What had not changed for the better, at least for one side, was the difficulty the Union always seemed to have – did, in fact, have until two years later – in bringing to bear all the forces it had on hand: one in five of the blue soldiers who crossed Bull Run was never engaged. Nevertheless, Pope had a numerical advantage of about twelve thousand men, and if numbers alone decided battles, he certainly would have won. But then again, if numbers decided battles, the entire war really would have ended the first time around, at Manassas.
(It is an embarrassing truth of history -- embarrassing to Northerners, anyway -- that the South operated at a disadvantage in almost every great conflict of the war, yet somehow managed not only to hold out for four bloody years, but on several occasions to come dangerously close to victory. If the war itself were a movie, the tag-line for the Confederacy would have been, "Always outnumbered, always outgunned." There were very definite reasons for this. In 1861, the North numbered some 21 million people, while the South could count only 9 million, one-third of whom were slaves. The Union therefore had a much larger pool of manpower from which to draw its armies, and could generally replace its losses rapidly, while the South suffered from a permanent shortage of men. Likewise, the South, being agrarian, had only a fraction of the North's industrial power and a totally inadequate railroad network: on paper, they were beaten even before the war began. But wars are not fought on paper any more than they are fought with numbers, and Lee believed that if he harnessed the splendid morale of his troops to his own superior generalship, he could find a way to destroy the blue adversary.)
Like many battles, Second Manassas turned on a series of mistakes, thus adding some credence to the Russian axiom that wars are not won by the most competent army, but by the least incompetent one. Lee had difficulty controlling his wilful chief subordinates, Stonewall Jackson and James Longstreet, while Pope – who was "hated by everyone in his command from his immediate staff to his generals to the youngest drummer boy" – managed not to notice 25,000 enemy soldiers marching up to attack his flank at the height of the battle, a blunder which proved fatal. The flank was crushed, and Pope was extremely lucky that he had enough resolute soldiers in his command to prevent a total rout. Nevertheless, the defeat was complete and humiliating. A Union general summed up the feelings of many when he wrote bitterly of the aftermath: “A splendid army almost demoralized, millions of public property given up or destroyed, thousands of lives of our best men sacrificed for no purpose.”
Lee's triumph was great and did much to bolster his burgeoning reputation as a military genius, as well as his army's reputation for near-invincibility in the face of stacked odds. But history teaches us that victory can be as dangerous, in its own subtle way, as defeat. Emboldened by his win, Lee proposed to Jefferson Davis an invasion of Maryland, during which he intended to draw out and annihilate the Union Army; unfortunately for him, General McClellan -- restored to command by Lincoln after Second Manassas -- got hold of these plans and intercepted Lee near Sharpsburg, MD, on Antietam Creek. The resulting battle was the bloodiest single day in American history, and while Lee gave better than he got in terms of casualties, he had fewer men to lose and was able to escape total destruction only because McClellan, once again, proved too cautious in his pursuit.
Antietam (or Sharpsburg, depending on which side of the Mason-Dixon Line you live on) was a direct consequence of Second Manassas, and the consequences kept coming. The withdrawal of Lee's bloodied army back into Virginia emboldened Lincoln to sign the Emancipation Proclamation, which, though it had no actual, immediate effect on slavery (since it only “freed” the slaves in Confederate-controlled territory), had massive political effect. Prior to Sharpsburg, the strategic goal of the United States was to restore “the Union as it was” – in other words, crush the rebellion but leave the institution of slavery intact. After Sharpsburg, it became clear that it was not possible to wage war only on secessionists; the cause of secession, and the keystone of the Southern economy, slavery, had to be smashed as well. The Proclamation had another benefit for Lincoln: it went a long way toward destroying any hope Jefferson Davis had of obtaining foreign recognition of the Confederacy. The European powers were dependent on the South for cotton, but very reluctant to endorse slavery. So long as the Union had been unwilling to dismantle the institution, Europe could climb into bed with the Confederacy with a clear conscience, for there was little to choose between the two sides from a moral perspective; but once the elimination of slavery became part and parcel of the Union's war aims, it became politically impossible to side with an aspiring nation which kept millions in chains.
The Confederacy finally toppled in 1865. In that same year, just before mustering out of federal service, a group of Union soldiers assembled at the Manassas battlefields to commemorate what had happened there. Using muscles toughened by years of conflict, they erected a simple monument of brick and stone to their fallen comrades from both battles -- "the patriots," they called them, and truer words were never spoken. Adorning the monument were five heavy artillery shells drawn from their own ammunition caissons. A photo of the commemoration ceremony, on display at the Battlefield Park today, shows the soldiers crowded unsmiling around the monument, clad in their distinctive uniforms and looking every bit as tough as the shells. In 1975, the National Park Service conducted a restoration of the monument, which was beginning to show its age. Its engineers discovered, to their astonishment, that the shrapnel shells mounted on the weather-beaten brickwork remained live despite the intervening 110 years. Bomb experts were called in to defuse them, but during the process one of them exploded. And to my knowledge this was the last shot fired in the Civil War.
September 2, 2017
Meatless and Sober in America: A (sort of) Horror Story
The first thing you notice when you quit drinking, even temporarily, is how much it affects the people around you. Not yourself, necessarily; just the people who normally interact with you.
"What, you're not drinking?" They say, worry-bafflement lines furrowing their brows. "Are you on medication or something?"
That's one response. Another is contemptuous amusement. "How long you think that's gonna last?" -- always delivered with a sneer. Then there are those who equate drinking with masculinity and therefore assume your testicles fell off when you were doing jumping jacks at the gym the previous day. (I won't tell you what those people say, but it rhymes with "hag.") What it all amounts to is that the change you have made changes the way they deal with you. Some are suspicious, as if they feel threatened. Others are angry, as if you've suddenly passed harsh judgement on their own lifestyle. Some utterly lose interest in you. "Give me a call when you're off the wagon," is a common response, as if we are entering a voluntary and mutual suspension of friendship.
A few others are encouraging. "Good for you!" they exclaim, and, perhaps being nondrinkers themselves, immediately find you more approachable and interesting than they did yesterday, when your conversation was littered with talk about beer, bars and hangovers. But you've switched sides! Turned coat! Jumped ship! You're playing for the other team now! Suddenly they see you as if for the first time, and like what they see. I've bumped friendly acquaintances up to full-fledged friends merely by eschewing the grape for a few months.
Another thing you notice, if you quit drinking longer than temporarily, or permanently, or even vastly reduce your drinking to the occasional beer or glass of wine, is how it affects your own behavior. The time formerly spent in bars or pubs or dissipating oneself in front of a TV with a beer in hand is now free to do -- what? The first weeks of total sobriety were a clumsy attempt to answer that question. I found I exercised more, having more energy; and, needing ways to interact with people that didn't involve staring at them over the rim of a shot glass, I took my exercise more socially. Instead of the gym with earbuds wedged in place, swimming or yoga classes with a friend. Instead of hiking solo, hiking with pals who, being sober or nearly so, weren't hung over on Saturday mornings and could do a hard eight miles in the Verdugo Mountains. Instead of taking my laptop to the sleazy biker bar across the street to do my writing between sips of beer, I took it to the library in the park, which was certainly quieter and had fewer people in it who would stab you in the femoral artery with a screwdriver if you looked cross-eyed at them.
I also started doing things, like going to the movies, which do not require alcohol as part of the social ritual. (True, they require soda, popcorn and candy, all things I shouldn't be consuming, but to hell with it -- I'm not a fucking monk, people.) In this way I was able to re-discover my passion for cinema, which had waned in recent years. But it did also put me in the position of having to drive home from the theater through Hollywood, and the crowds of drunks who always seem to be having a better time than me. Certainly they were wearing less clothing.
It changed my behavior in more subtle ways as well. I live across the street from said dive bar and from two stores that sell alcohol; also down the block from another bar (which poses as a restaurant, but nobody is fooled). I used to frequent all these places to satisfy my thirst for beer and Irish whiskey. Now I look at them the same way I regard vacuum-cleaner repair stores: as places I have absolutely no fucking interest in setting foot in. It makes my neighborhood feel more closed-off than it did before, changes the way I move around. When, on my nightly walk, I pass by yet another liquor store on Burbank Boulevard, which is very brightly lit at a rather dark part of the street and thus resembles a huge pinball machine, I am now struck not by temptation but by how vulgar the place looks. I used to nip in there on impulse, sometimes to peruse the aisles, sometimes to buy a half-pint of the True to curb my restlessness or ease my boredom or take the edge off some frustration I was experiencing. But not anymore. I just walk by, uninterested and vaguely disgusted, because, goddamn it, I never liked pinball. Also because the guys walking out always do so with a furtive air, as if they've just left a strip joint. Did I look like that, when I slunk out with a bottle of Jameson in my hand?
It goes yet further. Because I no longer drink very much, I also tend to avoid situations where drinking is the center of activity. Like barbecues or parties thrown by folks I know who are heavy drinkers. Everyone is thrusting beer and wine and booze on me the moment I walk in, and I don't want it, and I get tired of explaining why. Once, I simply walked around with someone's abandoned half-empty beer for hours, and that convinced people I was one of the brotherhood, but I sure did get tired of holding that warm, sweaty can. I got just as tired of watching people dissolve into slurring, red-eyed drunks who kept touching their noses to make sure they were still there. Suddenly I understood how all my non-drinking friends had felt in college, enduring every manner of low-farce buffoonery for years while simultaneously being told how "lame" they were for not ending the night in a puddle of vomit.
Of course there are times when I do miss the drinking life a little, and occasions -- three this year, to be exact -- when I have guzzled myself blue. Each incident was one which merited exiting the wagon. The first was when I found out my novel Cage Life had won Best Indie Book of 2016. The second was at a wake in Hollywood for the makeup effects artist Elvis Jones, who had died in Central America while on location. The last was on my birthday. (The morning after each binge was a reminder of why I decided to cut back so drastically in the first place.) I can't promise there won't be more of these one-off debaucheries, but by and large I think there will be fewer. I've discovered, or rather re-discovered, what life is like on the soberer side of things, where alcohol is consumed like slices of pie -- one at a sitting -- and not like Doritos, devoured until the supply is exhausted. And I kind of like it. But I can't say it has been easy. No, strike that: I can say it's been easy, but I can't say it's been convenient. Because there are a thousand things that constantly conspire to annoy you and try to make you renege on your pledge -- not so much devils on your shoulder as devils in your path, jabbing you with their tiny pitchforks and calling you a fucking f-----t.
It's the same thing when you go vegetarian. Even as a simple experiment, which my present vegetarianism is, sticking to one's guns is a gigantic pain in the ass. You begin to grasp the universality of meat, the way it pervades every aspect of American culture, in ways far more nuanced than alcohol does. Take my experience at the airport yesterday, for example. I arrived at Dulles at quarter past one for a four o'clock flight back to LAX. After the usual security procedures and so forth, I was at my gate by two. That left two hungry hours before departure, so I perused the vast array of eateries outside my gate. They were, in order: a hamburger joint, a hot dog stand, a cheese-steak shack, a pub that specialized in burgers and wings, a coffee/salad/sandwich shop, and an Asian restaurant. I zeroed in on the Asian place only to find that of their ten entrees, ten had meat in them, so unless I wanted to buy ten vegetable spring rolls at $4.95 a pair, I was out of luck. I backpedaled to the coffee shop, but the only meatless thing they had there was the coffee -- which, for all I know, had fucking bacon in it, Homer Simpson style. At last, about 100 yards from my gate, I found a pizza place which offered six types of craft pizzas, one of which didn't have meat on it. In addition to being largely on the wagon and experimenting with sort-of vegetarianism, I'm also on a low-sodium diet, so eating a footlong pizza was probably a stupid idea, but by that point I was too hungry to care.
Since I was out of town for a solid week, I had nothing in the fridge to eat, and so naturally, this morning, I went to the diner down the street for breakfast. This diner has a menu that weighs about 3.5 pounds and is as thick as my first novel. In that menu, among all those hundreds of choices, are about ten options that are vegetarian-friendly, and if you cut out the dinner salads you have about six. If you're a vegan or the type of vegetarian who doesn't eat eggs, you're basically fucked. Luckily I do eat eggs, but it's still pretty tedious to see a menu the size of a Tolstoy volume and realize your basic choices are iceberg lettuce and Saltines. But never mind the diner. Many of my favorite meals at my favorite restaurants are now, if you'll pardon the pun, off the table. The shepherd's pie at the Buchanan Arms? Done. A cheeseburger and a milkshake from In-N-Out? Nope. The salt and pepper chicken at Tender Greens? Sorry. The very act of "grabbing a sandwich from somewhere" has become problematic bordering on impossible.
How do real vegetarians do it? Even in Los Angeles, where vegetarians and vegans are commonplace, everything and everyone seems out to make life as difficult as possible for you. What isn't meat-based or garnished with meat is often cooked in beef or chicken stock or some other damned thing that once mooed or cackled. And if you're trying to avoid dairy as well, forget about it. You may as well stay home and invoke the Dark Arts to try to make tofu taste like something other than wet Play-Doh. Shit, even a 7-Eleven is a veritable death-trap of lurking dairy. The raisins have yogurt. The chocolate has milk. The muffins have -- well, I don't know what the muffins have, but I know my vegan friends can't eat them. And God help you if you get invited to a friend's house for dinner. The poor sod who thought it would be a good idea to have you over will soon regret it when they realize they have to cook you a whole separate meal. But it gets worse! Just try going to a sporting event sober and meatless and dairy-free -- just fucking try it! You'll be shelling peanuts inside five minutes, because that's the only goddamn thing you can eat -- assuming, of course, the oil fits your diet. (My friend Tracy tried to order sweet potato fries the other day, only to be told the oil isn't "vegan-friendly.") Another friend of mine, Lindsey, was reduced to eating Boston baked beans one day at the beach because the only alternatives were fried fish and hot dogs. I know there are times in life when we feel the system is rigged against us and it's just self-pity talking, but this time the conspiracy theory is a fact: the system really is rigged! If it isn't a cow or doesn't come out of one, America doesn't want you to eat it. And if it isn't loaded with alcohol or sugar or caffeine, America doesn't want you to drink it. Remember when Oprah Winfrey got sued by Texas cattle barons for "slandering beef"? That shit wasn't an Onion article, it actually happened! The barons didn't win, of course, but the fact that they were even able to bring the lawsuit to trial ought to tell you something. Beef and booze are big business. You avoid them both at your peril and at your serious inconvenience.
Now, before you raise your moral guard, don't bother. I'm not going to adopt the horrible habit of denouncing something now that I've (mostly) given it up. I've known many people who've found Jesus, or stopped using drugs, or embraced veganism or fitness, who have become intolerable prigs and born-again preachers who live to tell you you're going straight to hell, spiritually or physically, if you don't follow in their footsteps. They are nothing but pains in the ass and I have no intention of emulating them, for to do so would involve staggering hypocrisy on my part. God knows I got a lot out of drinking and probably even more out of eating Tyrannosaur-sized portions of red meat, pork, fowl and fish for nearly all of my life. I suppose I'm just curious how the other half lives, and how radical changes in diet will affect my weight and general health now that I'm (gasp!) a middle-aged man. But now that beef jerky and cheese pie and inch-thick T-bones are no longer a part of my life -- at least for now -- and alcohol has become a sort of dessert-treat rather than a staple of my diet, I'm discovering what generations of other people, including my non-drinking vegetarian father, stumbled upon before me: it's not so much that you are what you eat, as you are what you don't.
"What, you're not drinking?" They say, worry-bafflement lines furrowing their brows. "Are you on medication or something?"
That's one response. Another is contemptuous amusement. "How long you think that's gonna last?" -- always delivered with a sneer. Then there are those who equate drinking with masculinity and therefore assume your testicles fell off when you were doing jumping jacks at the gym the previous day. (I won't tell you what those people say, but it rhymes with "hag.") What it all amounts to is that the change you have made changes the way they deal with you. Some are suspicious, as if they feel threatened. Others are angry, as if you've suddenly passed harsh judgement on their own lifestyle. Some utterly lose interest in you. "Give me a call when you're off the wagon," is a common response, as if we are entering a voluntary and mutual suspension of friendship.
A few others are encouraging. "Good for you!" They exclaim, and, perhaps being nondrinkers themselves, immediately find you more approachable and interesting than they did yesterday, when your conversation was littered with talk about beer, bars and hangovers. But you've switched sides! Turned coat! Jumped ship! You're playing for the other team now! Suddenly they see you as if for the first time, and like what they see. I've literally bumped friendly acquaintances to full-fledged friends merely by eschewing the grape for a few months.
Another thing you notice, if you quit drinking longer than temporarily, or permanently, or even vastly reduce your drinking to the occasional beer or glass of wine, is how it effects your own behavior. The time formerly spent in bars or pubs or dissipating oneself in front of a TV with a beer in hand is now free to do -- what? The first weeks of total sobriety were a clumsy attempt to answer that question. I found I exercised more, having more energy; and also needing ways to interact with people that didn't involve staring at them from over the rim of a shot glass, took my exercise more socially. Instead of the gym with earbuds wedged in place, swimming or yoga classes with a friend. Instead of hiking solo, hiking with pals who, being sober or nearly so, weren't hung over on Saturday mornings and could do a hard eight miles in the Verdugo Mountains. Instead of taking my laptop to the sleazy biker bar across the street to do my writing between sips of beer, I took it to the library in the park, which was certainly quieter and had less people in it who would stab you in the femoral artery with a screwdriver if you looked cross-eyed at them.
I also started doing things, like going to the movies, which do not require alcohol as part of the social ritual. (True, they require soda, popcorn and candy, all things I shouldn't be consuming, but to hell with it -- I'm not a fucking monk, people.) In this way I was able to re-discover my passion for cinema, which had waned in recent years. But it did also put me in the position of having to drive home from the theater through Hollywood, and the crowds of drunks who always seem to be having a better time than me. Certainly they were wearing less clothing.
It changed my behavior in more subtle ways as well. I live across the street from said dive bar and from two stores that sell alcohol; also down the block from another bar (which poses as a restaurant, but nobody is fooled). I used to frequent all these places to satisfy my thirst for beer and Irish whiskey. Now I look at them the same way I regard vacuum-cleaner repair stores: as places I have absolutely no fucking interest in setting foot in. It makes my neighborhood more closed-off than it did before, changes the way I move around. When, on my nightly walk, I pass by yet another liquor store on Burbank Boulevard, which is very brightly lit at a rather dark part of the street and thus resembles a huge pinball machine, I am now struck not by temptation but how vulgar the place looks. I used to nip in there on impulse, sometimes to peruse the isles, sometimes to buy a half-pint of the True to curb my restlessness or ease my boredom or take the edge off some frustration I was experiencing. But not anymore. I just walk by, disinterested and vaguely disgusted, because, goddamn it, I never liked pinball. Also because the guys walking out always do so with a furtive air, as if they've just left a strip joint. Did I look like that, when I slunk out with a bottle of Jameson in my hand?
It goes yet further. Because I no longer drink very much, I also tend to avoid situations where drinking is the center of activity. Like barbecues or parties thrown by folks I know who are heavy drinkers. Everyone is thrusting beer and wine and booze on me the moment I walk in, and I don't want it, and I get tired of explaining why. Once, I simply walked around with someone's abandoned half-empty beer for hours, and that convinced people I was one of the brotherhood, but I sure did get tired of holding that warm, sweaty can. So did watching people dissolve into slurring, red-eyed drunks who kept touching their noses to make sure they were still there. Suddenly I understood how all my non-drinking friends had felt in college, enduring every manner of low-farce buffoonery for years while simultaneously being told how "lame" they were for not ending the night in a puddle of vomit.
Of course there are times when I do miss the drinking life a little, and occasions -- three this year to be exact -- when I have guzzled myself blue. Each incident was one which merited exiting the wagon. The first was when I found out my novel Cage Life had won Best Indie Book of 2016. The second was at a wake in Hollywood for the make up effects artist Elvis Jones, who had died in Central America while on location. The last was on my birthday. (The morning after each binge was a reminder of why I decided to cut back so drastically in the first place.) I can't promise there won't be more of these one-off debaucheries, but by and large I think there will be less. I've discovered, or rather re-discovered, what life is like on the soberer side of things, where alcohol is consumed like slices of pie -- one at a sitting -- and not like Doritos, devoured until the supply is exhausted. And I kind of like it. But I can't say it has been easy. No, strike that: I can say it's been easy, but I can't say it's been convenient. Because there are a thousand things that constantly conspire to annoy you and try to make you renege on your pledge -- not so much devils on your shoulder as devils in your path, jabbing you with their tiny pitchforks and calling you a fucking f-----t.
It's the same thing when you go vegetarian. Even as a simple experiment, which my present vegetarianism is, sticking to one's guns is a gigantic pain in the ass. You begin to grasp the universality of meat, the way it pervades every aspect of American culture, in ways far more nuanced than alcohol does. Take my experience at the airport yesterday, for example. I arrived at Dulles at quarter past one o'clock for a four o'clock flight back to LAX. After the usual security procedures and so forth, I was at my gate by two. That left two hungry hours before departure, so I perused the vast array of eateries outside my gate. They were, in order: A hamburger joint, a hot dog stand, a cheese-steak shack, a pub that specialized in burgers and wings, a coffee/salad/sandwich shop, and an Asian restaurant. I zeroed in on the Asian place only to find that of their ten entrees, ten had meat in them, so unless I wanted to buy ten vegetable spring rolls at $4.95 a pair, I was out of luck. I backpedaled to the coffee shop, but the only meatless thing they had there was the coffee -- which, for all I know, had fucking bacon in it, Homer Simpson style. At last, about 100 yards from my gate, I found a pizza place which offered six types of craft pizzas, one of which didn't have meat on it. In addition to being largely on the wagon and experimenting in sort-of vegetarianism, I'm also on a low-sodium diet, so eating a footlong pizza was probably a stupid idea, but by that point I was too hungry to care.
Since I was out of town for a solid week, I had nothing in the fridge to eat, and so naturally, this morning, I went to the diner down the street for breakfast. This diner has a menu that weighs about 3.5 pounds and is as thick as my first novel. In that menu, among all those hundreds of choices, are about ten options that are vegetarian-friendly, and if you cut out the dinner salads you have about six. If you're a vegan or the type of vegetarian who doesn't eat eggs, you're basically fucked. Luckily I do eat eggs, but it's still pretty tedious to see a menu the size of a Tolstoy volume and realize your basic choices are iceberg lettuce and Saltines. But never mind the diner. Many of my favorite meals at my favorite restaurants are now, if you'll pardon the pun, off the table. The Shepherd's pie at the Buchanan Arms? Done. A cheeseburger and a milkshake from In 'n Out? Nope. The salt and pepper chicken at Tender Green's? Sorry. The very act of "grabbing a sandwich from somewhere" has become problematic bordering on impossible.
How do real vegetarians do it? Even in Los Angeles, where vegetarians and vegans are common stuff, everything and everyone seems out to make life as difficult as possible for you. What isn't meat-based or garnished with meat is often cooked in beef or chicken stock or some other damned thing that once mooed or cackled. And if you're trying to avoid dairy as well, forget about it. You may as well stay home and invoke the Dark Arts to try to make tofu taste like something other than wet Play-Doh. Shit, even a 7/11 is a veritable death-trap of lurking dairy. The raisins have yogurt. The chocolate has milk. The muffins have -- well, I don't know what the muffins have, but I know my vegan friends can't eat them. And God help you if you get invited to a friend's house for dinner. The poor sod who thought it would be a good idea to have you over will soon regret it when they realize they have to cook you a whole separate meal. But it gets worse! Just try going to a sporting event sober and meatless and dairy-free -- just fucking try it! You'll be shelling peanuts inside five minutes, because that's the only goddamn thing you can eat -- assuming, of course, the oil fits your diet. (My friend Tracy tried to order sweet potato fries the other day, only to be told the oil isn't "vegan-friendly.") Another friend of mine, Lindsey, was reduced to eating Boston baked beans one day at the beach because the only alternatives were fried fish and hot dogs. I know there are times in life we feel the system is rigged against us and it's just self-pity talking, but this time the conspiracy theory is a fact: the system really is rigged! If it isn't a cow or doesn't come out of one, America doesn't want you to eat it. And if it isn't loaded with alcohol or sugar or caffeine, America doesn't want you to drink it. Remember when Oprah Winfrey got sued by Texas cattle barons for "slandering beef?" That shit wasn't an Onion article, it actually happened! The barons didn't win, of course, but the fact that they were even able to bring the lawsuit to trial ought to tell you something. Beef and booze are big business. You avoid them both at your peril and at your serious inconvenience.
Now, before you raise your moral guard, don't bother. I'm not going to adopt the horrible habit of denouncing something now that I've (mostly) given it up. I've known many people who've found Jesus, or stopped using drugs, or embraced veganism or fitness, who have become intolerable prigs and born-again preachers who live to tell you you're going straight to hell, spiritually or physically, if you don't follow in their footsteps. They are nothing but pains in the ass and I have no intention of emulating them, for to do so would involve staggering hypocrisy on my part. God knows I got a lot out of drinking and probably even more out of eating Tyrannosaur-sized portions of red meat, pork, fowl and fish for nearly all of my life. I suppose I'm just curious how the other half lives, and how radical changes in diet will effect my weight and general health now that I'm (gasp!) a middle-aged man. But now that beef jerky and cheese pie and inch-thick T-bones are no longer a part of my life -- at least for now -- and alcohol has become a sort of dessert-treat rather than a staple of my diet, I'm discovering what generations of other people, including my non-drinking vegetarian father, stumbled upon before me: it's not so much that you are what you eat, as you are what you don't.
August 19, 2017
Monsters in Charlottesville
At the end of A NIGHTMARE ON ELM STREET, the embattled heroine Nancy, who has lost everyone she loves to the murderous revenant Freddy Krueger, confronts the killer in her bedroom. Throughout the film, Nancy has done everything possible, offensively and defensively, to defeat the homicidal maniac, and seems to have finally obliterated him. But wise Nancy is no horror-movie dupe. She stands over the empty bed, host to so many terrible nightmares, and begins to speak.
Nancy: I know you're there, Krueger.
Freddy: (emerging from the bed) You think you was gonna get away from me?
Nancy: I know you too well now, Freddy.
Freddy: And now you die.
Nancy: It's too late, Krueger. I know the secret now. This is just a dream, too. You're not alive. The whole thing is a dream. I want my mother and friends again.
Freddy: You what?
Nancy: I take back every bit of energy I ever gave you. You're nothing. You're shit.
Nancy contemptuously turns her back on him. Enraged, Freddy hurls himself at his teenage nemesis... and disintegrates, screaming, into nothingness.
The idea of robbing a monster of its power by simply ignoring it is an old one in both mythology and storytelling, and implies a peculiar sort of relationship, in which the monster is dependent upon its victims not merely for sustenance, but for its very existence. To some degree this is what differentiates a monster, which is imaginary, from a predator, which is real; wolves do not give a damn if caribou believe in them, because what a wolf needs to live is meat, not acknowledgement. In this sense the monster, who requires you to participate in your own murder, is actually weaker than the predator, who can kill you whether you believe in him or not.
Fifteen years ago, I was living in York, Pennsylvania, working for the District Attorney's Office as a pre-sentence investigator. It so happened that my apartment, which stood across from the courthouse in which I worked, was also on the same block as the city library, and one day a fellow named Matt Hale, who ran an organization called the World Church of the Creator, booked the speaking room of that library to give an address about the beliefs of his church. Some time after he did so, it was discovered that Hale's “church” was a white supremacist organization of some sort, and a debate arose in town about whether the library should allow him to go ahead with his plans. To me the debate was baffling. Hale was an unknown figure representing a tiny fringe group which was based in another state. It was unlikely that his lecture would be attended by more than a dozen people, and when it was over, he would get back in his car and return to Ohio, or wherever he came from, without the vast majority of York City ever having known he was there. What point was there in debating his freedom of speech, which was a natural human right enshrined in the First Amendment, and inalienable? And more importantly, what was to be gained by giving him attention? Even within the smallish and badly disorganized neo-Nazi community in America the man was a nobody, so why make him into a somebody by talking about him? The most effective weapon against the Hales of the world, I argued, was simply to ignore them, for in the absence of a large body of people who subscribed to their beliefs, the only way they could gain a sense of power was to bring attention to themselves -- to make it seem they were more important than they actually were.
And the sad fact of life is that it is much easier to get attention by evoking fear, anger and hatred than it is through demonstrations of love or logic. A burning cross will always draw a larger audience than an episode of Cosmos and never mind if three-quarters of the people surrounding the cross are only there to put it out.
My arguments did not sit very well with most people, who delighted in reminding me that Hitler had once been an obscure political figure at the head of a fringe party with no money and only a tiny following. They yawned through my counter-arguments that the Nazi movement had at its center a brilliant and dynamic leader (which Hale was not), that it had strong support within segments of the German military, intellectual, and industrial classes (which Hale did not), that it numbered a fairish group of war heroes, scientists and other prominent citizens in its ranks (which Hale's group didn't), that it was superbly organized and empowered by political and economic conditions which were unique to the Germany of the 1920s and 30s (which Hale wasn't), and – most of all – that it existed in a racially homogeneous country that spoke a single language (which America most decidedly isn't). But even if we accepted the Hitler comparison as valid on its face, we would have to admit that Hitler only achieved national prominence in Germany by using provocative tactics to garner attention: he knew that it is far more desirable for a politician to be hated than to be ignored, and indeed, his mortal enemies, the Social Democrats and Communists, both played directly into his hands in this regard. Both had private armies, and both unleashed those armies on Hitler's men. Yet the more they attacked the Nazis in the press, in the streets and in the beer halls, the more the police, the press and the man on the street in Germany became aware that they existed: in essence, they provided Hitler not only with free publicity, but with legions of potential supporters who otherwise might never have heard of him -- including wealthy industrialists with fat checkbooks. By acting as if he and his followers were a national menace when they were merely a smallish regional party wracked by infighting and a perpetual shortage of cash, they helped make him a national menace. In effect, they helped transform a monster, who was merely frightening, into a predator, who was actually dangerous. More dangerous, as it turned out, than they were.
My secondary argument was no more successful than my first. Hale was all anyone wanted to talk about, and a number of people that I knew boasted that if he showed up, they'd march in protest outside the library, and if push came to shove, well, they'd push and they'd shove too, and possibly throw a few rocks. When I stated emphatically that this was precisely what Hale wanted -- to be taken seriously, to be viewed as a threat -- I was looked at with impatience and pity, as a teacher might regard a well-intentioned but particularly stupid pupil. Clearly I didn't get it.
As it happened, my fears came true. This debate spilled into the local papers, the local TV news, the regional news, and finally, the national. Protestors bused in from all over the country to stand outside my window and shout obscenities at a man they hadn't heard of a week before. Neo-Nazi and skinhead groups, who also probably had no idea who Hale was before the news had informed them, did the same, though their obscenities were pointed in a different direction. Reporters and photographers also arrived by the seeming trainload. On top of all this, every police officer in the city, as well as every deputy in the sheriff's department and numerous troops of State Police, showed up to maintain order – so many cops, in fact, that the City of York spent its entire yearly law-enforcement budget in a single day. By the time Hale showed up to give his lecture, the streets surrounding the library were so packed with humanity you couldn't see the asphalt, and I was told I'd need a special pass, issued by the city, to cross through the police lines to get to my own apartment. In the end, about 25 people were arrested as fights broke out between the more militant members of each faction, but Hale himself was neither seen by these people nor harmed by them; he came and went, like Elvis, through the library's back door. I wasn't physically present for his exit, but I'm told he was well pleased by the events of the day, and why the hell not? He had gotten precisely what he'd wanted and had zero chance of otherwise obtaining, i.e. national prominence, courtesy of a group of people who probably would have killed him if they'd had the chance.
It is sometimes said that the only thing necessary for the triumph of evil is for good people to do nothing. In reality, sometimes the only thing necessary for the triumph of evil is for good men to do something stupid, like give evil a megaphone. And a great deal of knowing when to hold 'em and when to fold 'em comes from determining whether what you see before you is a dangerous predator, or a mere monster.
Since the candidacy of Donald Trump began to gather steam, there has been a wave of extremist sentiment in this country which I would not earlier have believed possible in this day and age – even though I flatter myself that I was more sensitive than most of my peers to one side of it, the "rightist" side. But the anger is hardly limited to what we loosely call the political right. It is just as violent on the left, though spurred by a different set of grievances, and that collective fury has manifested nowhere so pointedly as in what used to be called “political discourse.” It has become permissible, and even chic, for people to say things in public settings which would have been absolutely unimaginable just ten or fifteen years ago. And I don't simply mean from the collection of ill-educated and grammatically challenged fools that inhabit the internet like fleas on a junkyard dog: I mean from educated people, professional types, even politicians of high standing. Phraseology which was once banished to the “outer darkness” of political thought has re-entered the mainstream. People are waving communist banners and Nazi flags. They are openly calling for violent revolution and the assassination of political figures they dislike. They are questioning the humanity – not just the character or intelligence but the actual humanity – of people of different racial types, religions, and political affiliations. They are training themselves to see virtue only in those that agree with them completely, and villainy in anyone who demurs, even slightly. More and more we are witnessing a seeming normalization of radical thought. Scrolling through Facebook and Twitter feeds, one would think the America of 2017 is the America of 1775 or 1861 -- a powder keg waiting to explode into bloody violence. But is that perception reality, like wolves scratching at the door? Or is it merely a fear-induced phantom, like The Man Under The Bed or The Thing In The Closet that terrorized us as children?
I flatter myself that I have a fairly wide circle of acquaintances who cover an enormous geographical area, and the majority of people I speak with privately, regardless of race, ethnicity, religious belief, economic stratum, or political affiliation, are manifestly not radical. They might be Tea Party conservatives or Bernie Sanders-style democratic socialists, they might be sullen Libertarians or self-righteous Greens, but at heart they are ordinary Americans who want to live ordinary American lives, free of the violence and hatred extremism brings. Yet universal, or nearly universal, among them is a sense that things have gotten out of hand, that we are heading in a direction no one wants to go, but nonetheless heading there, and even increasing the pace at which we move. The overall feeling people exude is one of resignation, of helplessness. Somehow “they” (whoever “they” are) have gotten their hands on the levers of power, their fingers on the emotional and physical triggers; somehow “we” have become pawns in “their” game and lost our ability to set our own course. My decidedly unscientific but very interesting sampling of our populace leads me to conclude that it is not so much that there are more extremists in this country now than before Trump (or Obama), but merely that the extremists we have are getting louder. And part of the reason they are getting louder is that we are listening to them, and worse, reacting to them by beginning the slow but sure process of taking sides. We are, in effect, granting them power over us by believing that they are stronger and more numerous than they are – than we are.
If you are familiar with Tarot cards, you know that the “Devil” card features that least popular angel holding a naked man and woman in captivity with a chain. According to the official explanation, however, “They appear to be held here, against their will, but on closer observation, the chains around their necks are loose and could be easily removed. This symbolizes that bondage to the Devil is ultimately a voluntary matter which consciousness can release.” In other words, the Devil has no power but that which we give him, and what is the Devil, anyway, but the ultimate monster?
The perception that we are drifting toward doom, that huge armies of extremists -- Antifa on one side, the Klan on the other -- are gathering like fantasy-novel armies in the wings, ready to do apocalyptic battle, is just that -- a perception. But we feed into it when we believe it is real, and even worse than real, inevitable. Because fear creates anger, and anger creates violence, and once violent action is taken it no longer matters if the fear was justified. The monster becomes the predator, which can only be destroyed by violence. There is, however, a flip side to this coin. That which we summon into existence by our belief can sometimes be dispelled by withdrawing that belief. It is possible to restore ownership of our country's discourse to ourselves, to re-marginalize the motley collection of nuts, bullies, loudmouths and psychotics who have hijacked our politics, and to do it without falling into the trap of fighting them physically. We've done it before. In 1925, membership of the Ku Klux Klan stood at somewhere between three and six million people, and the group had enormous political and social influence in the South and Midwest. Now it numbers around 4,500 people, or about as many as Starfleet, a single Star Trek fan club. But what is crucial to understand is that the KKK was broken almost completely without violence -- without violence from its opponents, anyway. A combination of clerical denunciation, newspaper exposés, education campaigns carried out by the NAACP, and later, a tireless effort by the FBI to destabilize the organization, shattered this once-mighty predator into a cartoon monster, not much more frightening than a Frankenstein night-light. Had the Klan been attacked violently, by armed mobs, I daresay things would have turned out quite differently -- many who sat on the fence, sympathetic but previously unwilling to join its ranks, would have seen such attacks as mere prelude to attacks on themselves. But by occupying the moral high ground, the enemies of the Klan left it nowhere to go, no one to appeal to. And the more brutality the Klan used in retaliation, the more it was disgraced, exposed, and shown for what it was. In time even many of the worst bigots wanted nothing to do with it. The fate of the KKK was no longer viewed as a bellwether of the white race.
It may seem as if contradicting myself here, speaking on the one hand of how it is possible to dispel negativity by ignoring it, while at the same time pointing out how taking action can be effective; but again, it is important to know what action to take and when to take it, as well as the difference between a predator and a monster. One requires positive action on your part, the other may not. If you own sheep, you must guard against wolves. If you have children, you need not arm yourself against the Boogeyman. He can be destroyed through other means. The seeming powerlessness of the great masses of ordinary, moderate American people is not a physical reality: it is a perception created by specific incidents and experiences, mostly secondhand and communicated and propagated by fear-mongers in the news and social media. It can be overcome in large part simply by grasping that it is not real. In contrast, the rise of political groups which seem hostile to your interest cannot be dealt with by simply wishing they would go away. Organized activity is required. But organized does not necessarily mean violent. It is more difficult to use one's head than one's fist, but more often than not, the head gets better results.
The rise of extremism in any form is damnably tricky; we must be on guard against it, but it cannot be destroyed by attempting to destroy it violently; this only makes the monster more powerful. Just as Nancy gave power to Freddy by believing in him, so we give power to these tiny fringe movements, composed mostly of morons, voyeurs and the dubiously sane, whose sole virtue is the ability to get attention and engender feelings of fear which are totally disproportionate to the actual level of menace they represent.
I have news for you: the vast majority of people on the political right are not neo-Nazis or white nationalists or sympathetic to the Klan. Likewise, the vast majority of people on the left are not communists or anarchists or secretly beholden to the U.N. They are normal, ordinary Americans. They work for a living. They raise children. They drink coffee, watch television, scroll through their news feeds, complain about how bad their football team is. They drink beer on the Fourth of July and buy things they don't need at Christmas. They may disagree with each other on abortion, taxation, welfare, immigration, gun control and whatever else you care to name, but they don't want to use violence to impose their point of view on their neighbors. Sarcasm, maybe, but not violence, because they know that the ballot box is a better place to fight than the streets. It is only when they believe that they are being physically threatened that they begin to make threats themselves. But most of these threats are illusory. They come from a tiny fraction of the population who have no following and no real prospect of getting into power, even tangentially. The best the troublemakers can generally do is to stir up trouble on social media by provoking people into reacting to them and thus making them seem more important than they are. They are good at it, I grant you, but it takes two to tango.
The events in Charlottesville, Virginia, followed an attempted march by right-wing extremists which was met – predictably and foolishly – with an even larger counter demonstration, itself populated to some degree by extremists of the opposing side. Those events led, directly or indirectly, to three deaths and several dozen injuries: in other words, to any given Saturday in Chicago. Yet every news outlet in the country, as well as all forms of social media, are acting as if the rebels just fired on Fort Sumter. People are jabbering about a “Civil War 2.0” -- as if the preconditions for such a conflict have actually been met. Richard Spenser, a once-obscure former alt-right magazine editor, has become a national figure, rather like a mouse which runs in front of a searchlight and appears, in shadow form, to be a giant rat. Even good old David Duke has been plucked from the ash-heap of 1980s politics, dusted off, and pointed before the cameras; if I may hit you with another metaphor, rather like a rotted old muppet being manipulated by your mean old uncle with the glass eye and the taste for Jew jokes. Yet lost in all the coverage, hype, anger and fear-mongering is the fact that the total number of people involved in each march was actually pathetically small.The tally of “neo-Nazis, white nationalists and Klansmen” who made up the initial march is estimated at two or three hundred, while their opposition probably numbered a few thousand – not enough, even in combination, to fill a minor league baseball stadium. Truth be told, the number of rightists fanatical enough to brave stones, tear gas and police dogs just so they can wave a Nazi flag in public is very, very small, and the number of people angry enough to leave their homes and travel long distances to confront them, risking arrest or injury to do so, is not much larger. Just as every hockey team possesses only one or two players who can properly be called goons, even the more extreme ends of the political spectrum possess only small groups of violently active people from among their ranks. Incidents like that which took place Charlottesville are tragic mainly because they are unnecessary. Nothing that happened there had to occur; it occurred because a small group of frightened, angry people holding one point of view decided to hold a rally, and thus provoked a somewhat larger group of equally frightened, angry people holding the opposing point of view to show up in protest. The actual psychological motives for such confrontations are always interesting, and almost never what you might expect. (Where, after all, was this level of left-wing outrage when the Klan held its yearly gathering at Stone Mountain, each and every year Obama held office?) The truth is that in all those noisy, curse-laden face-offs between opponents and supporters of, for example, abortion, have you ever seen someone make an epiphany face, throw down their placard and exclaim, to the person spitting insults and threats at them, “My God! You're right! You've destroyed my argument and changed my entire point of view!” ? Of course you haven't. The purpose of political confrontation in the post-MLK era is almost never to educate or persuade, but to attack, verbally or physically, those with different beliefs. And such attacks never accomplish anything, except to harden your opponent's stance. 
It was for precisely this reason that Martin Luther King adopted Gandhi's tactic of "satyagraha" during his fight for black liberation; by renouncing violence and aggressive rhetoric, he occupied the moral high ground, aroused sympathy and respect, and -- perhaps most importantly of all -- turned many people around to his way of thinking. But not one person will leave Charlottesville with a different point of view than had when they arrived; they will simply feel their existing emotions of anger and hatred more deeply. They are feeding into a cycle of violence which can only escalate. They are giving the monster its power, and in so doing, tacitly agreeing to become its next victims.
The military philosopher Clausewitz once wrote that "the mistakes of a single hour, made early in a campaign, often cannot be rectified later on even by weeks of sustained effort." In other words, what you do at the beginning of a fight -- when the clay is wet, so to speak -- is often far more important than any actions taken later, when that clay has hardened. We are at that crucial time now. A few pimply monsters groan at us from the dark, trying to fool us that they have substance, and numbers, and can tear us to bits when we sleep. But it isn't true unless we we make it so.
I am not an alarmist, but neither am I clanging a cow bell and croaking "All is well!" when the city is on fire.
It's for damn sure there are predators in American political and economic life today who need to be called out for what they are, confronted, and yes, if necessary, fought (who they are and how they should be combated is a subject for another time). But it is equally important that we differentiate the predators from the mere monsters, the cartoon villains hiding in closets and lurking under beds, who have precisely as much power as we grant them and no more.
Nancy: I know you're there, Krueger.
Freddy: (emerging from the bed) You think you was gonna get away from me?
Nancy: I know you too well now, Freddy.
Freddy: And now you die.
Nancy: It's too late, Krueger. I know the secret now. This is just a dream, too. You're not alive. The whole thing is a dream. I want my mother and friends again.
Freddy: You what?
Nancy: I take back every bit of energy I ever gave you. You're nothing. You're shit.
Nancy contemptuously turns her back on him. Enraged, Freddy hurls himself at his teenage nemesis... and disintegrates, screaming, into nothingness.
The idea of robbing a monster of its power by simply ignoring it is an old one in both mythology and storytelling, and implies a peculiar sort of relationship, in which the monster is dependent upon its victims not merely for sustenance, but for its very existence. To some degree this is what differentiates a monster, which is imaginary, from a predator, which is real; wolves do not give a damn if caribou believe in them, because what a wolf needs to live is meat, not acknowledgement. In this sense the monster, who requires you to participate in your own murder, is actually weaker than the predator, who can kill you whether you believe in him or not.
Fifteen years ago, I was living in York, Pennsylvania, working for the District Attorney's Office as a pre-sentence investigator. It so happened that my apartment, which stood across from the courthouse in which I worked, was also on the same block as the city library, and one day a fellow named Matt Hale, who ran an organization called the World Church of the Creator, booked the speaking room of that library to give an address about the beliefs of his church. Some time after he did so, it was discovered that Hale's “church” was a white supremacist organization of some sort, and a debate arose in town about whether the library should allow him to go ahead with his plans. To me the debate was baffling. Hale was an unknown figure representing a tiny fringe group which was based in another state. It was unlikely that his lecture would be attended by more than a dozen people, and when it was over, he would get back in his car and return to Ohio, or wherever he came from, without the vast majority of York City ever having known he was there. What point was there in debating his freedom of speech, which was a natural human right enshrined in the First Amendment, and inalienable? And more importantly, what was to be gained by giving him attention? Even within the smallish and badly disorganized neo-Nazi community in America the man was a nobody, so why make him into a somebody by talking about him? The most effective weapon against the Hales of the world, I argued, was simply to ignore them, for in the absence of a large body of people who subscribed to their beliefs, the only way they could gain a sense of power was to bring attention to themselves -- to make themselves seem more important than they actually were.
And the sad fact of life is that it is much easier to get attention by evoking fear, anger and hatred than it is through demonstrations of love or logic. A burning cross will always draw a larger audience than an episode of Cosmos, and never mind if three-quarters of the people surrounding the cross are only there to put it out.
My arguments did not sit very well with most people, who delighted in reminding me that Hitler had once been an obscure political figure at the head of a fringe party with no money and only a tiny following. They yawned through my counter-arguments that the Nazi movement had at its center a brilliant and dynamic leader (which Hale was not), that it had strong support within segments of the German military, intellectual, and industrial classes (which Hale did not), that it numbered a fairish group of war heroes, scientists and other prominent citizens in its ranks (which Hale's group didn't), that it was superbly organized and empowered by political and economic conditions which were unique to Germany of the 1920s and 30s (which Hale wasn't), and – most of all – that it existed in a racially homogeneous country that spoke a single language (which America most decidedly isn't). But even if we accepted the Hitler comparison as valid on its face, we would have to admit that Hitler only achieved national prominence in Germany by using provocative tactics to garner attention: he knew that it is far more desirable for a politician to be hated than to be ignored, and indeed, his mortal enemies, the Social Democrats and Communists, both played directly into his hands in this regard. Both had private armies, and both unleashed those armies on Hitler's men. Yet the more they attacked the Nazis in the press, in the streets and in the beer halls, the more the police, press and man on the street in Germany became aware that they existed: in essence, they provided Hitler not only with free publicity, but legions of potential supporters who otherwise might never have heard of him -- including wealthy industrialists with fat checkbooks. By acting as if he and his followers were a national menace when they were merely a smallish regional party wracked by infighting and a perpetual shortage of cash, they helped make him a national menace. In effect, they helped transform a monster, who was merely frightening, into a predator, who was actually dangerous. More dangerous, as it turned out, than they were.
My secondary argument was no more successful than my first. Hale was all anyone wanted to talk about, and a number of people that I knew boasted that if he showed up, they'd march in protest outside the library, and if push came to shove, well, they'd push and they'd shove too, and possibly throw a few rocks. When I stated emphatically that this was precisely what Hale wanted -- to be taken seriously, to be viewed as a threat -- I was looked at with impatience and pity, as a teacher might regard a well-intentioned but particularly stupid pupil. Clearly I didn't get it.
As it happened, my fears came true. This debate spilled into the local papers, the local TV news, the regional news, and finally, the national. Protestors bused in from all over the country to stand outside my window and shout obscenities at a man they hadn't heard of a week before. Neo-Nazi and skinhead groups, who also probably had no idea who Hale was before the news had informed them, did the same, though their obscenities were pointed in a different direction. Reporters and photographers also arrived by the seeming trainload. On top of all this, every police officer in the city, as well as every deputy in the sheriff's department and numerous troops of State Police, showed up to maintain order – so many cops, in fact, that the City of York spent its entire yearly law-enforcement budget in a single day. By the time Hale showed up to give his lecture, the streets surrounding the library were so packed with humanity you couldn't see the asphalt, and I was told I'd need a special pass, issued by the city, to cross through the police lines to get to my own apartment. In the end, about 25 people were arrested as fights broke out between the more militant of each faction, but Hale himself was neither seen by these people nor harmed by them; he came and went, like Elvis, through the library's back door. I wasn't physically present for his exit, but I'm told he was well pleased by the events of the day, and why the hell not? He had gotten precisely what he'd wanted and had zero chance of otherwise obtaining, i.e., national prominence, courtesy of a group of people who probably would have killed him if they'd had the chance.
It is sometimes said that the only thing necessary for the triumph of evil is for good people to do nothing. In reality, sometimes the only thing necessary for the triumph of evil is for good men to do something stupid, like give evil a megaphone. And a great deal of knowing when to hold 'em and when to fold 'em comes from determining whether what you see before you is a dangerous predator, or a mere monster.
Since the candidacy of Donald Trump began to gather steam, there has been a wave of extremist sentiment in this country which I would not earlier have believed possible in this day and age – even though I flatter myself that I was more sensitive than most of my peers to one side of it, the "rightist" side. But the anger is hardly limited to what we loosely call the political right. It is just as violent on the left, though spurred by a different set of grievances, and that collective fury has manifested nowhere so pointedly as in what used to be called “political discourse.” It has become permissible, and even chic, for people to say things in public settings which would have been absolutely unimaginable just ten or fifteen years ago. And I don't simply mean by the collection of ill-educated and grammatically challenged fools that inhabit the internet like fleas on a junkyard dog: I mean from educated people, professional types, even politicians of high standing. Phraseology which was once banished to the “outer darkness” of political thought has re-entered the mainstream. People are waving communist banners and Nazi flags. They are openly calling for violent revolution and the assassination of political figures they dislike. They are questioning the humanity – not just the character or intelligence but the actual humanity – of people of different racial types, religions, and political affiliations. They are training themselves to see virtue only in those that agree with them completely, and villainy in anyone who demurs, even slightly. More and more we are witnessing a seeming normalization of radical thought. Scrolling through Facebook and Twitter feeds, one would think America of 2017 is America of 1775 or 1861 -- a powder keg waiting to explode into bloody violence. But is that perception reality, like wolves scratching at the door? Or is it merely a fear-induced phantom, like The Man Under The Bed or The Thing In The Closet that terrorized us as children?
I flatter myself that I have a fairly wide circle of acquaintances who cover an enormous geographical area, and the majority of people I speak with privately, regardless of race, ethnicity, religious belief, economic stratum, or political affiliation, are manifestly not radical. They might be Tea Party conservatives or Bernie Sanders-style democratic socialists, they might be sullen Libertarians or self-righteous Greens, but at heart they are ordinary Americans who want to live ordinary American lives, free of the violence and hatred extremism brings. Yet universal, or nearly universal, among them is a sense that things have gotten out of hand, that we are heading in a direction no one wants to go, but nonetheless heading there, and even increasing the pace at which we move. The overall feeling people exude is one of resignation, of helplessness. Somehow “they” (whoever “they” are) have gotten their hands on the levers of power, their fingers on the emotional and physical triggers; somehow “we” have become pawns in “their” game and lost our ability to set our own course. My decidedly unscientific but very interesting sampling of our populace leads me to conclude that it is not so much that there are more extremists in this country now than before Trump (or Obama), but merely that the extremists we have are getting louder. And part of the reason they are getting louder is that we are listening to them, and worse, reacting to them by beginning the slow but sure process of taking sides. We are, in effect, granting them power over us by believing that they are stronger and more numerous than they are – than we are.
If you are familiar with Tarot cards, you know that the “Devil” card features that least popular angel holding a naked man and woman in captivity with a chain. According to the official explanation, however, “They appear to be held here, against their will, but on closer observation, the chains around their necks are loose and could be easily removed. This symbolizes that bondage to the Devil is ultimately a voluntary matter which consciousness can release.” In other words, the Devil has no power but that which we give him, and what is the Devil, anyway, but the ultimate monster?
The perception that we are drifting toward doom, that huge armies of extremists -- Antifa on one side, the Klan on the other -- are gathering like fantasy-novel armies in the wings, ready to do apocalyptic battle, is just that -- a perception. But we feed into it when we believe it is real, and even worse than real, inevitable. Because fear creates anger, and anger creates violence, and once violent action is taken it no longer matters if the fear was justified. The monster becomes the predator, which can only be destroyed by violence. There is, however, a flip side to this coin. That which we summon into existence by our belief can sometimes be dispelled by withdrawing that belief. It is possible to restore ownership of our country's discourse to ourselves, and it is possible to re-marginalize the motley collection of nuts, bullies, loudmouths and psychotics who have hijacked our politics and our national discourse, and to do it without falling into the trap of fighting them physically. We've done it before. In 1925, membership of the Ku Klux Klan stood at somewhere between three and six million people, and the group had enormous political and social influence in the South and Midwest. Now it numbers around 4,500 people, or roughly as many as Starfleet, a single Star Trek fan club. But what is crucial to understand is that the KKK was broken almost completely without violence -- without violence from its opponents, anyway. A combination of clerical denunciation, newspaper exposés, education campaigns carried out by the NAACP, and later, a tireless effort by the FBI to destabilize the organization, shattered this once-mighty predator into a cartoon monster, not much more frightening than a Frankenstein night-light. Had the Klan been attacked violently, by armed mobs, I daresay things would have turned out quite differently -- many who sat on the fence, sympathetic but previously unwilling to join its ranks, would have seen such attacks as mere prelude to attacks on themselves. But by occupying the moral high ground, the enemies of the Klan left it nowhere to go, no one to appeal to. And the more brutality the Klan used in retaliation, the more it was disgraced, exposed, and shown for what it was. In time even many of the worst bigots wanted nothing to do with it. The fate of the KKK was no longer viewed as a bellwether of the white race.
It may seem as if I am contradicting myself here, speaking on the one hand of how it is possible to dispel negativity by ignoring it, while at the same time pointing out how taking action can be effective; but again, it is important to know what action to take and when to take it, as well as the difference between a predator and a monster. One requires positive action on your part, the other may not. If you own sheep, you must guard against wolves. If you have children, you need not arm yourself against the Boogeyman. He can be destroyed through other means. The seeming powerlessness of the great masses of ordinary, moderate American people is not a physical reality: it is a perception created by specific incidents and experiences, mostly secondhand and communicated and propagated by fear-mongers in the news and social media. It can be overcome in large part simply by grasping that it is not real. In contrast, the rise of political groups which seem hostile to your interests cannot be dealt with by simply wishing they would go away. Organized activity is required. But organized does not necessarily mean violent. It is more difficult to use one's head than one's fist, but more often than not, the head gets better results.
The rise of extremism in any form is damnably tricky; we must be on guard against it, but it cannot be destroyed by attempting to destroy it violently; this only makes the monster more powerful. Just as Nancy gave power to Freddy by believing in him, so we give power to these tiny fringe movements, composed mostly of morons, voyeurs and the dubiously sane, whose sole virtue is the ability to get attention and engender feelings of fear which are totally disproportionate to the actual level of menace they represent.
I have news for you: the vast majority of people on the political right are not neo-Nazis or white nationalists or sympathetic to the Klan. Likewise, the vast majority of people on the left are not communists or anarchists or secretly beholden to the U.N. They are normal, ordinary Americans. They work for a living. They raise children. They drink coffee, watch television, scroll through their news feeds, complain about how bad their football team is. They drink beer on the Fourth of July and buy things they don't need at Christmas. They may disagree with each other on abortion, taxation, welfare, immigration, gun control and whatever else you care to name, but they don't want to use violence to impose their point of view on their neighbors. Sarcasm, maybe, but not violence, because they know that the ballot box is a better place to fight than the streets. It is only when they believe that they are being physically threatened that they begin to make threats themselves. But most of these threats are illusory. They come from a tiny fraction of the population who have no following and no real prospect of getting into power, even tangentially. The best the troublemakers can generally do is to stir up trouble on social media by provoking people into reacting to them and thus making them seem more important than they are. They are good at it, I grant you, but it takes two to tango.
The events in Charlottesville, Virginia, followed an attempted march by right-wing extremists which was met – predictably and foolishly – with an even larger counter-demonstration, itself populated to some degree by extremists of the opposing side. Those events led, directly or indirectly, to three deaths and several dozen injuries: in other words, to any given Saturday in Chicago. Yet every news outlet in the country, as well as every form of social media, is acting as if the rebels just fired on Fort Sumter. People are jabbering about a “Civil War 2.0” -- as if the preconditions for such a conflict have actually been met. Richard Spencer, a once-obscure alt-right magazine editor, has become a national figure, rather like a mouse which runs in front of a searchlight and appears, in shadow form, to be a giant rat. Even good old David Duke has been plucked from the ash-heap of 1980s politics, dusted off, and pointed before the cameras; if I may hit you with another metaphor, rather like a rotted old muppet being manipulated by your mean old uncle with the glass eye and the taste for Jew jokes. Yet lost in all the coverage, hype, anger and fear-mongering is the fact that the total number of people involved on each side was actually pathetically small. The tally of “neo-Nazis, white nationalists and Klansmen” who made up the initial march is estimated at two or three hundred, while their opposition probably numbered a few thousand – not enough, even in combination, to fill a minor league baseball stadium. Truth be told, the number of rightists fanatical enough to brave stones, tear gas and police dogs just so they can wave a Nazi flag in public is very, very small, and the number of people angry enough to leave their homes and travel long distances to confront them, risking arrest or injury to do so, is not much larger. Just as every hockey team possesses only one or two players who can properly be called goons, even the more extreme ends of the political spectrum field only small groups of violently active people from among their ranks. Incidents like the one that took place in Charlottesville are tragic mainly because they are unnecessary. Nothing that happened there had to occur; it occurred because a small group of frightened, angry people holding one point of view decided to hold a rally, and thus provoked a somewhat larger group of equally frightened, angry people holding the opposing point of view to show up in protest. The actual psychological motives for such confrontations are always interesting, and almost never what you might expect. (Where, after all, was this level of left-wing outrage when the Klan held its yearly gathering at Stone Mountain, each and every year Obama held office?) The truth is that in all those noisy, curse-laden face-offs between opponents and supporters of, for example, abortion, have you ever seen someone make an epiphany face, throw down their placard and exclaim, to the person spitting insults and threats at them, “My God! You're right! You've destroyed my argument and changed my entire point of view!”? Of course you haven't. The purpose of political confrontation in the post-MLK era is almost never to educate or persuade, but to attack, verbally or physically, those with different beliefs. And such attacks never accomplish anything, except to harden your opponent's stance.
It was for precisely this reason that Martin Luther King adopted Gandhi's tactic of "satyagraha" during his fight for black liberation; by renouncing violence and aggressive rhetoric, he occupied the moral high ground, aroused sympathy and respect, and -- perhaps most importantly of all -- turned many people around to his way of thinking. But not one person will leave Charlottesville with a different point of view than they had when they arrived; they will simply feel their existing emotions of anger and hatred more deeply. They are feeding into a cycle of violence which can only escalate. They are giving the monster its power, and in so doing, tacitly agreeing to become its next victims.
The military philosopher Clausewitz once wrote that "the mistakes of a single hour, made early in a campaign, often cannot be rectified later on even by weeks of sustained effort." In other words, what you do at the beginning of a fight -- when the clay is wet, so to speak -- is often far more important than any actions taken later, when that clay has hardened. We are at that crucial time now. A few pimply monsters groan at us from the dark, trying to fool us that they have substance, and numbers, and can tear us to bits when we sleep. But it isn't true unless we make it so.
I am not an alarmist, but neither am I clanging a cow bell and croaking "All is well!" when the city is on fire.
It's for damn sure there are predators in American political and economic life today who need to be called out for what they are, confronted, and yes, if necessary, fought (who they are and how they should be combated is a subject for another time). But it is equally important that we differentiate the predators from the mere monsters, the cartoon villains hiding in closets and lurking under beds, who have precisely as much power as we grant them and no more.
Published on August 19, 2017 19:00
July 30, 2017
Bullies, Hypocrites, and Your Place in Hell
A week or so ago, I woke up to the news that Chester Bennington had hanged himself. If the name doesn't exactly ring a bell with you, I freely admit it didn't ring any with me, either -- or wouldn't have, had his name not been smeared all over the news for about a month previously. Bennington had been the lead singer for Linkin Park, a rap-metal band which had blown up the airwaves in the 2000s with a string of mega-hits but faded from relevance in more recent years, though commercially they remained quite successful. Interestingly, Bennington committed suicide just two months after the suicide of his close friend and fellow singer Chris Cornell, who'd fronted for Soundgarden, Audioslave and Temple of the Dog.
In the weeks preceding Bennington's death, he had appeared on the pop-culture radar once more, but this time for all the wrong reasons. Linkin Park had released an album called "One More Light" which was violently attacked by critics and fans of the band alike. The oldest and tiredest, but also the hardest-hitting, charge which can be rendered against musicians is that they have "sold out," and "One More Light" was viewed by many as a sell-out album, an attempt by fading rockers to go mainstream in order to recapture their cultural relevance and, presumably, replenish their bank accounts. These attacks deeply angered Bennington, who lashed out in interviews with comments like, "if you’re gonna be the person who says, ‘They made a marketing decision to make this kind of record to make money,’ you can fucking meet me outside and I will punch you in your fucking mouth, because that is the wrong fucking answer.”
It was the fan backlash to these and other comments which brought Bennington back into my personal awareness. My various social media news feeds were a torrent of abuse heaped on Bennington for everything from his physical appearance to the sound of his voice to the type of music he made to the supposed decline in its quality; but the worst of it was reserved for the fact that he had struck back at his critics. Evidently he had been growing increasingly angry and frustrated that fans and critics could not let go of their memories of Linkin Park's first two albums, "Hybrid Theory" and "Meteora," the former of which was released 17 years ago. It is hardly unique for a band to get lost in the shadow of its own early work -- it happens to authors, too, not to mention artists, actors and directors -- but in Bennington's case the burden of his early mega-success seemed to grow heavier over the course of time. This was due largely to the fact that each subsequent album was criticized for not being an exact copy of the originals. In ordinary human lives, the past is supposed to fade with time and not grow more vivid, but with artists the exact opposite often obtains; futures become bleak, presents irrelevant, and only the past seems to matter.
All public figures are subject to criticism and ridicule by that same public, especially in the age of the internet, and Bennington was no exception. What was exceptional in his case was the intensity and the viciousness of the attacks. It was not merely that he had become a prisoner of his own early success, condemned for failing to repeat his formula exactly; he was also attacked even more violently when he tried to reinvent himself into something completely new. As my grandmother would have said, "they got him coming and going," for whichever direction he turned, the critics were waiting. He was like a boy beset by multiple bullies, all pushing and shoving at once. And in the last weeks of his life, the bullying increased to a savage intensity. I couldn't lift my phone or turn on my PC without seeing threads and articles devoted not merely to bashing Bennington's new sound or defiant attitude, but to people specifying how much they hated him personally, how much they wanted to beat up "that skinny little shit." What surprised me was not the ferocity of the trolling -- the internet is an ugly place -- but the fact that so much of it came from people who admitted they were, or had been, devoted fans of his band. Some people evidently found no contradiction in boasting that they considered two or three of Linkin Park's albums masterpieces they had "played to death and still listened to," yet, in the same sentence, wishing they could take Bennington up on his challenge to "meet them outside." He was laughed at, ridiculed, insulted, threatened, dismissed, negated as a musician and a human being, and all because he had expressed crude but understandable frustration with being subjected to same.
I don't profess to know precisely what drove Bennington, a father of six children, to hang himself. He was depressed by the suicide of his close friend Cornell (he died on what would have been Cornell's birthday), and obviously upset by the firestorm of abuse he'd endured following the release of "One More Light," but I imagine the cause lay deeper within himself. Artists are often very troubled souls, thin-skinned and beset by demons and doubts, prone to overthinking everything and often given to gloomy outlooks on life. They come into the world both blessed and burdened, and the burden often outweighs the blessing, at least within their own minds. It's seldom a shock to me when such a person chooses to take their own life, or commits default suicide by drinking or drugging themselves into oblivion. Many struggle on the edge of that cliff for years or decades or their entire adult lives, and no one knows the struggle is even taking place until it ends, tragically, with a shotgun blast or a length of rope. What I do know is that the difference between life and death is often found at the central point of their spiritual corrosion -- that is, the point where they are most damaged, most weakened, most susceptible to attack. And that area, like as not, was either created or expanded by bullying. For once a person has been bullied -- I don't mean once or twice, but over a long period of time, and usually at a vulnerable age -- they never completely recover from the experience. They will never be able to endure taunts and mean-spirited ridicule with the same equanimity as a self-assured person who was not bullied during their formative years. Like Achilles' heel, their vulnerability is both built-in and permanent, and no amount of success, fame or material wealth will make it go away.
When Bennington killed himself, the reaction on the Net was as effusive and passionate as it had been in the month leading up to his death, with the exception that all the sentiment was turned on its head. Many of the very same people (I recognized their profile pictures and internet handles) who had called Bennington a "skinny, talentless little faggot" and pleaded for a chance to "beat the shit out of him" just weeks before, now painted cyberspace with weeping emojis and long, heartfelt paeans to his greatness. They expressed dismay and horror at his death and said things like, "RIP brother," "thoughts and prayers for your family" or the classic, "I didn't like the guy but I never wanted this!" (This begs the question of what, precisely, they did want when they wrote things like, "Fuck you and your shitty music. Die!").
I found the hypocrisy of all this affected grief to be quite disgusting, and for more than the obvious reason that it is just exactly that. The truth is I understand Chester Bennington uncomfortably well. I too am the creative, "artistic" type, and I also know what it's like to be bullied -- thoroughly, mercilessly and inventively, over a period of years. I know the deep and abiding scars such bullying leaves, as well as the brimming reservoir of anger which can run over either as depression or violence or both. And, perhaps worst of all, I know what it is like to see the bullies of yesteryear turn around in the present day with friendly smiles, pretending or perhaps even believing that they did you no wrong. It was fascinating, though by no means encouraging, to run into people at my high school reunions who were among the most sadistic and enthusiastic bullies I've ever encountered, and discover how completely they had forgotten their behavior -- if, indeed, they had ever acknowledged it in the first place. In one instance I was asked by a former classmate how "X" was doing. This classmate had viciously bullied "X" all through junior high school and into ninth and perhaps tenth grade, sometimes physically attacking him, and doing so despite knowing "X" had tragically lost his father in an accident; yet as a grown man he seemed to regard "X" with genuine affection, as if they had been buddies who'd shared many a sophomoric hi-jink. The truly awful thing about the conversation was that his solicitous questions as to "X's" status and well-being carried no trace of irony or malice; they actually seemed sincere. Another former classmate, now the mother of a large family, held forth at some length about her liberal principles and how much she hated Donald Trump because of his long record of bullying behavior: yet she herself was one of the queen bee bitches of my formative years, a Cordelia Chase-type ringleader who made cruel sport of the awkward, the unattractive, the unathletic and the poor.
Don't mistake me. I am not an advocate of holding grudges; I believe in second chances, and I have seen people change remarkably over the course of not terribly long lifetimes. I don't believe that someone who was a jerkoff at sixteen must necessarily be one at twenty-eight or forty-five, but I do insist that he or she at least take some responsibility, and accept some accountability, for the wrongs they have done. I have far more respect for an unregenerate scumbag who freely admits all of his crimes and outrages and openly plots to commit more, than I have for someone who spent their youth tormenting others and now pretends that none of it ever happened. This latter type reminds me of the line in Shakespeare about "remembering with advantages." Doubtless a lot of former bullies need to re-write history within their own minds so as not to despise themselves in later life, and I'm sure bullies who have procreated feel an even more urgent need to "remember with advantages" their own school days, lest they wake up in the middle of the night trembling at the thought that someone like their younger selves might take an unhealthy interest in their own children. When Chester Bennington was alive, many took distinct pleasure in harassing him online, and when he committed suicide, many of those same people affected (virtual) tears of sympathy. In some cases, the tears were probably not even affected. One thing is for certain: in all the hundreds of comments and posts I scrolled through, not one expressed any remorse or discomfort for past posts raining abuse on the now-dead singer. Not one person said, "Gee, I kinda feel bad I trashed him so hard -- maybe he read some of that and it got to him." To even acknowledge the possibility, you see, would be to acknowledge the responsibility that accompanies free speech. Technically speaking, and legally, there is nothing to prevent one person from verbally bullying another; it is a question of morals, of right conduct, or more simply put, of not being an asshole. But again, to accept responsibility for something means acknowledging that you did it in the first place, and judging by some of the ex-bullies I mentioned above, that is precisely what these people train themselves to avoid. They want, simultaneously, to act like shitheels and think of themselves as good people, to vent their sadism and to deny ever having been sadistic. Like participants in a riot, they wake up the next morning and go about their business as if nothing happened. And I suppose, from their point of view, nothing did. But it is worth remembering that in Dante's Inferno, hell is composed of nine concentric circles, with the least odious sinners in the first circle and the worst in the ninth.
The eighth is for hypocrites.
Published on July 30, 2017 19:39
July 18, 2017
Death and Gravity
These days a man makes you something
And you never see his face
But there is no hiding place.
-- Don Henley
One of the strangest things about working in the entertainment industry is the way your life tends to intersect, if only momentarily and trivially, with the lives of people whose work you like and admire. For your whole life, or at least a goodly portion of it, you are a fan of X., and then – bam! – he or she is standing next to you at a party or on set or in line to get coffee somewhere. A word or two of conversation follows, which they won't remember in fifteen minutes but which you will never forget. But this intertwining of lives is sometimes even subtler than that: often you work with them without meeting them at all; movies and even TV shows employ huge numbers of people, and the percentage who physically interact with the cast is actually quite small. I worked on something like nineteen episodes of Heroes, and my one ambition was to meet Hayden Panettiere, but the closest I came was at the wrap party for the show's final season, when I may – or may not – have glimpsed her from across a crowded, neon-lighted room jammed with insensible drunks. Moments like this, silly as they are, go a long way towards compensating for all the bullshit I have to put up with living in Los Angeles – the traffic, the noise, the smog, the exorbitant cost of living. At the same time, these incidents remind me of the tremendous power one human being can exert over another without necessarily knowing they exist. In a sense they are like planets whose gravity affects the course and habits of the smaller objects around them.
In the last week, four men of film and television whose lives tangled with mine – though not one of them knew it – have passed away: George Romero, Martin Landau, Nelsan Ellis and Trevor Baxter. I want to stress that of the four, the only one I physically encountered was Landau, and him only by virtue of being in the same theater (and later, in the same parking lot); I never spoke with any of them, and none of them ever knew I existed nor would have had any reason to care had they known. Yet in different ways, each played a role in my life that is worthy of note and made me mourn their passing as if I had actually known them.
George Romero is, of course, famous for his classic 1968 horror film, Night of the Living Dead, which spawned a number of sequels and a host of imitators: indeed, one could say it set the standard for all zombie films which followed, right up to today's Walking Dead. But never having had a real affinity for zombie movies, I can't say Romero was much on my radar over the years – with one notable exception. When I was a college sophomore, I attended a reading by three horror authors collectively known as "The Splat Pack" (John Skipp and Craig Spector are the two I can remember; the third's name unfortunately escapes me). These gentlemen knew Romero quite well, and told a half-humorous, half-sad story about how the director, during the shooting of Dawn of the Dead, had been abruptly informed that his budget was being cut by two-thirds. The four of them were sitting in a diner in Pittsburgh – the three authors in full zombie makeup, for they had been cast as extras – and Romero, crestfallen over what he saw as the ruination of his movie, began to rail bitterly about movie producers – their miserliness, treacherousness and general stupidity. At one point he blurted, "A producer with an idea is like a 90-year-old man with an erection. He's so excited that he's got one he doesn't give a damn where he puts it!" I heard this story in 1992, and while I found it hilarious and have repeated it innumerable times, I had no idea how true it was until I moved to Los Angeles fifteen years later. There are, of course, many movie producers who are authentic geniuses – Gale Anne Hurd comes to mind – but in the main, you could put the lot of them on a cruise ship, sink it over the deepest part of the Pacific, and nobody would notice. Or care if they did. So thank you, George, not only for the horror, but for the laughs -- and for the wisdom.
Martin Landau hardly needs an introduction from me, any more than Romero did; he began acting in the 1950s and continued to work right up to his death, winning an Oscar along the way. I know he was working because when I attended the talk he gave at the Egyptian Theater in January of this year, he had come directly from the Actors Studio... where he had just sat in on 27 auditions. (When I am 89 years old, if I have the energy to watch 27 actors audition and then go attend a screening of one of my films and be interviewed and conduct a Q & A afterward, you can kiss my octogenarian ass.) He was crusty, quick-witted, and most importantly, full of passion for his craft – he mentioned a film he had just completed with Paul Sorvino as being "the best work he'd ever done," which was a bold statement from a man who had been in North by Northwest, Crimes and Misdemeanors and Ed Wood. He told humorous stories about some of the directors he'd worked with, like Alfred Hitchcock and Woody Allen, and gave sage advice about acting. I left the theater feeling grateful that I'd had a chance to meet this icon (even if he did star in the most reviled sci-fi show of my childhood, Space: 1999). Scarcely two months later, I was at a screening of a surprisingly good low-budget horror movie called Terror on Hallow's Eve, when I ran into his daughter, Juliet. Juliet Landau is best known for playing the hell out of her role as the batshit-crazy vampire Drusilla on Buffy the Vampire Slayer and its spinoff, Angel. Being the daughter of Hollywood royalty, she could perhaps be forgiven for being a snob, but no, not a bit of it – I've seldom met anyone in the business who was friendlier or more willing to engage in conversation with a complete stranger, simply because the stranger was a friend of a friend. (In fact, I got a chance to tell her the story of how, when I was working at the now-defunct Optic Nerve Studios in 2009, I dropped the mold of her face they had taken for her vampire makeup onto my foot.) The warmth she showed to everyone at the screening seemed to me to speak very well of her dad and how she was raised. So thank you, Martin, for inspiring me with such great performances (his turn in Ed Wood is truly a masterpiece), and for showing that not every famous actor makes the raising of his children a lesser priority than his craft.
Nelsan Ellis? Never met him. Never saw him, either, unless I did without realizing it. And yet we have a connection of sorts, too, and that connection is True Blood, in which he played a motormouth transvestite cook who instantly became one of the show's most popular characters. It so happens that my first-ever onscreen credit came from an episode of this show, which I worked on intermittently during 2011 and 2012. In fact, my labor on True Blood, trivial as it was, put money in my pocket at a time when I was desperate for same; it also allowed me my first on-set and on-location work. I will never forget the three days I spent in the desert in Palmdale, toiling sleeplessly in blazing heat and, at night, sub-freezing cold, watching a small army of crew work their way through a shot list that would have killed Stanley Kubrick. For a guy with a passionate interest in TV and film, being inside the process was quite literally a dream come true – the reason I moved to Los Angeles in the first place. Of course, when one works on a show, one tends to get the full 411 on the cast – especially their failings as human beings. But nobody ever had a bad word to say about Nelsan Ellis. He was by all accounts not only immensely talented, upbeat and friendly, but the sort of actor who electrifies even jaded professionals. There was a rumor, which I have subsequently discovered to be true, that Ellis, who was a playwright as well as an actor, was the only cast member allowed to improvise his dialogue – no mean feat, that. So I would like to thank you, Nelsan, for delivering the kind of performance that kept ratings up, fans happy...and me from being evicted.
Now we come to Trevor Baxter. Doubtless his name means nothing to you, for in addition to not being famous, he was not even American (gasp!). Trevor Baxter was an English character actor of the sort that abound in that country – skillful but unobtrusive, the sort who delivers such seamless, seemingly effortless performances that he never gets his full due as a thespian, if only because you completely forget he's acting at all. I first "met" Trevor as a boy of about eight or nine, when my mother introduced me to Doctor Who. At that time – we're talking 1979 or 1980 – Doctor Who was not the international powerhouse it has become since it was rebooted some years ago. Far from it: not one in a hundred Americans had ever heard of the show, and it was available only on PBS or local stations in re-run format, with weak transmitters and incomplete episode libraries. Nevertheless, from the very first episode I encountered, I was hooked, and never so much as when I watched the six-part serial "The Talons of Weng-Chiang." This story, originally broadcast in 1977, found the Doctor and his beautiful but savage companion, Leela, in foggy Victorian London. Writer Robert Holmes had devised a fiendishly wonderful plot inspired by the Jack the Ripper killings, the story of Dracula, the Phantom of the Opera, and Sherlock Holmes. Trevor Baxter played Dr. Litefoot, a police pathologist who teams up with the Doctor to defeat the time-traveling war criminal Magnus Greel. Baxter's character, clearly based on Dr. Watson from the Sherlock Holmes universe, is a gem to behold; the quintessential English gentleman of the era, he is a bit prissy and a bit snobby, and has a pathetic tendency to get knocked unconscious, but is loyal as a hound and fearless in the clutch. I was delighted by his performance, and by the performance of his "wingman" in the story, pompous theater owner Henry Gordon Jago (played by Christopher Benjamin), who is as cowardly and verbose as Litefoot is brave and restrained. Even as an adult I thought it a shame that Baxter and Benjamin could not reunite to reprise these wonderful "one-off" characters, so you can imagine my joy when, in 2009, Big Finish Productions began a series of radio plays called Jago & Litefoot, starring these two great actors in their old roles. Between '09 and '16 no fewer than thirteen series (seasons) were produced, totaling 52 episodes, with a number of stand-alone stories and crossovers as well. Baxter shocked me by slipping into a role he had played only once, in 1977, as easily as one might slip into an old leather jacket. Listening to his assured, almost unchanged voice, it was almost impossible to believe that decades had passed – indeed, it made me feel like a kid again. So thank you for that, Trevor, and for reminding me what a skilled actor can do when given the right material – namely, bring happiness to someone (to millions of someones) he never met.
Published on July 18, 2017 21:08
July 2, 2017
Gone Too Soon: 6 TV Shows That Shoulda Lasted Longer
A debuting television show is a lot like a turkey in November. It has a chance of getting a pardon, but the odds are pretty crappy. Every year, "pilot season" whips every studio in Hollywood into a frenzy of activity: for a time the airwaves are positively flooded with the brainchildren of what seem like hundreds of different writers and producers, all hoping they've produced the next ten-year mega-hit.
Inevitably, however, the cold mathematics of the ratings game comes into play, and many of these children are taken out to the shed and put to the axe, often before they've had time to toddle along for more than a few episodes. In some cases this is absolutely as it should be. Hollywood has churned out a lot, and I mean a lot, of shit over the years, some of it so horrendously bad you wonder how the fuck it got green-lighted in the first place. I know: a few years ago I worked on the pilot for a "Wonder Woman" TV series, and despite costing $12 million to produce, it couldn't even find a buyer and has never been aired, for the simple reason that it didn't deserve to see daylight. But along with all the unwanted orphans that can't find a network buyer, and the no-hopers whose cancellation amounted to mere mercy-killing, there are a number of shows whose demise was not only premature, but arguably tragic. They represent the great "what-ifs" and "might have beens" of television history.
Whenever I look at lists of shows considered to have been wrongly cancelled before they got a second season, I always see Firefly, Freaks and Geeks, and My So-Called Life in the top ten. Brisco County, Jr. is usually somewhere in the rankings as well, along with the highly influential Kolchak: The Night Stalker, and a few others, some dating back decades, others as recent as a year or two ago. There seems to be a rough consensus among critics about which shows fall into the "gone way too soon" category, and I am neither prepared nor equipped to dispute that consensus. I do, however, have a few picks of my own that I'd like to share with you. Please note that these are not series which were merely canceled or taken off the air before their time (a separate category); they are series which were canceled either during or after the conclusion of their inaugural season, which lends them a special pathos – and a place in my heart.
Kindred: The Embraced (1996). Produced by Aaron Spelling and E. Duke Vincent, this toothy prime-time soap opera concerned the doings of five vampire clans based in San Francisco, ruled over with some difficulty by a prince named Julian Luna (Mark Frankel). In addition to contending with all sorts of grief from the clans, Luna makes the mistake of falling for a beautiful female reporter who obviously can't be let in on his secret identity, while at the same time fending off the attentions of a revenge-obsessed cop played by C. Thomas Howell, who wants to dust Julian for ordering the death of his (vampire) girlfriend in the pilot. Fans of Spelling's shows will recognize all of his trademarks here – period fashion, gallons of hair gel, extraordinarily beautiful actors who nevertheless look slightly freakish, and lots of soapy melodrama. At its worst, this show was embarrassingly bad: the writing, and therefore the acting, were all over the place, Howell was dreadfully miscast, and the vampire makeup on the "Nosferatu" clan looked like something you'd wear for Halloween...when you were twelve. Nonetheless, I mourned the cancellation of this show, for though it only lasted eight episodes, it had such a fabulous premise that it couldn't help but improve from week to week (and indeed, those eight episodes tell a nearly complete story that resolves most of the plot lines, making it satisfying to watch as a kind of unofficial mini-series). Never mind a second season: I'd have been content if this one had simply been allowed to complete its first. Unfortunately, the series' too-handsome-to-be-human star was killed in a motorcycle accident shortly after its cancellation, preventing any possible reunion, and in any case "Kindred" died such a quick death that it has only a small cult following and is unlikely to be tapped for a reboot. (Interestingly, Spelling was to try another supernaturally-themed show set in San Francisco just two years later, and scored a big hit with "Charmed.")
Tales of the Gold Monkey (1982). On this list you will see shows which died because of low ratings, unrealistic expectations, bad network decisions or insoluble budget problems. But "Tales of the Gold Monkey" may be one of the only series in TV history whose epitaph reads, "Died of a Pissing Contest." In the early eighties, the runaway success of "Raiders of the Lost Ark" led executives at ABC to wonder how they could cash in on the craze for old-school, fedora-and-bullwhip-style adventure. Well, it so happened that veteran producer Donald P. Bellisario had just such a script on hand, and before you could say, "Indiana Jones," "Tales of the Gold Monkey" was born. Set on the fictional South Seas island of Bora Gora in the 1930s, the show followed the adventures of an ex-Flying Tiger named Jake Cutter (Stephen Collins) who makes a meager and very dangerous living flying cargo in his twin-engined seaplane. Jake has a sidekick with a drinking problem, a would-be girlfriend who is actually an American spy, a priest who is actually a German spy, and a sexy nemesis who works for the Japanese. Also a one-eyed dog with whom he has a running disagreement. Though similar in tone and feel to the low-budget cliffhanger adventure movies of the 30s, 40s and 50s, "Monkey" is really more similar to its stablemate "Magnum, P.I." in terms of structure – there is narration (provided by Jake), self-deprecating humor to leaven all the action, and a great deal of emphasis on the interplay between the characters. Of course, even by the standards of the 1980s this show was cheesy in the extreme and many of the plots were preposterous (I remember, as a small boy, shouting with laughter at the sight of a samurai fighting a carnivorous monkey), but it had a kind of innocent charm that made it emotionally irresistible. Indeed, "Tales" was expected to be on the airwaves for years, but Bellisario clashed with studio brass over the content and direction of the series, and ultimately the executives decided to pull the plug after the first season rather than put up with him. This grieved me as a boy of ten, and it grieves me now. Cheesy and preposterous is just my game.
Alien Nation (1990). Turning successful movies into successful TV shows is problematic at best, and it's no wonder that for every "M*A*S*H," you get ten insta-cancels like "Gung Ho," "Clueless," and "My Big Fat Greek Life." The odds of "Alien Nation" being any good were even longer, because the movie upon which it was based wasn't any damn good to begin with. The 1988 studio picture was a classic case of a wasted premise, to wit: a huge population of humanoid aliens, marooned on earth when their spaceship crashes in the Nevada desert, have been incorporated into American society as immigrants. These immigrants, originally bred to be slaves, are stronger and more adaptable than humans, but have all sorts of weird eccentricities and emotional baggage and often bear the brunt of xenophobia and racism from their human hosts. Unfortunately, instead of jumping into this amusing and fascinating world like a gleeful anthropologist, the film elected to settle for second-rate buddy-cop drama. You can therefore be forgiven for having less than zero expectations about the spinoff. And in fact the pilot episode was full of 80s-era silliness and shlock. (I found it particularly hard to get past the elongated, spotted heads of the aliens when they were delivering really dramatic dialogue.) Like "Kindred," however, it got better as it went along. For starters, it made the main human character, an LAPD detective played by Gary Graham, something of a bigot, and then forced him, as he becomes increasingly close to his "newcomer" partner, to confront his own bigotry. More broadly, however, the show tackled such very earth-like subjects as immigration, cultural assimilation, inter-racial relationships, and so forth, by using the aliens as metaphors for any minority group you care to name; and it did this without idealizing them. The newcomers are as flawed and fucked-up as humans, and therefore people we can relate to. This show was cancelled after its first season not because of ratings, but because the Fox Network, then in its infancy, couldn't afford to produce a second – a cruel fate when one considers the season finale was a cliffhanger. Eventually a compromise was reached whereby a series of five two-hour TV movies (all with the original cast) continued and more or less resolved the story-lines, but I can't help but wish this cheesy but daring and likeable show had been allowed to live out its life in the manner originally intended: as a weekly episodic.
Blade: The Series (2006). Those of you who know me may be surprised by this choice, since I am hardly an unqualified fan of the "Blade" movie trilogy; but like the "Alien Nation" series, which dove head-first into the premise hinted at by its parent film, thus staying true to its roots while establishing its own identity, "Blade: The Series" was determined to be its own bad self from the first frame of the pilot to the conclusion of its thirteenth and final episode. "BTS" is the story of Krista Starr, a tough-yet-sexy Iraq veteran hellbent on discovering who murdered her troubled brother Zach. Her investigation leads to a wealthy Detroit socialite named Marcus van Sciver, who just happens to also be absolute ruler of the local vampire clan. Trying to kill van Sciver, she encounters Blade, a human-vampire hybrid who lives to exterminate everything with pointy teeth. The two enter into an uneasy, peril-fraught alliance to bring down van Sciver, which becomes all the more perilous when the ubervamp takes a shine to Krista and "turns" her. Krista struggles to adjust to life as a bloodsucker and a double agent while simultaneously fighting unwanted romantic feelings for the charming Marcus. This series was dark, brutal and relentlessly violent, not to mention explicit almost to the point of gratuitousness with both its gore and its sex, and oftentimes it was difficult to find a sympathetic character anywhere. Blade, who speaks in a growl and has three facial expressions (sneer, snarl, glare), is a borderline psychopath and not always easy to root for – he doesn't just kill vampires, he kills their human servants ("familiars") and often does it with a great deal of sadism – in one sequence, he impales a nude blonde familiar with the remark, "Be a good pet and stay," and then proceeds to cut the eye out of a second familiar, quipping, "Don't worry, I only need one." The truth is that Blade is more than a bit of a bully – somewhere to the right of Wolverine and only just short of The Punisher in terms of his obsessive, pitiless fanaticism. But moral flaws aside, this show was a beautiful aesthetic experience – costume and set design, lighting, cinematography, writing, and most of the acting were all executed at the level of a mid-budget feature film. The series ended with a cliffhanger of sorts after thirteen episodes, and a second season was regarded as a fait accompli, but it was unexpectedly canceled, probably due more to production costs than its rather modest ratings. Too bad. The world of "Blade" was rich and complex and deserved more time and more exploration.
The Lone Gunmen (2001). The only true spinoff of "The X-Files" ("Millennium" probably doesn't count) lasted half a season and didn't leave much of a legacy, but if you follow this somewhat unlikely TV series from inception to conclusion you won't be sorry. "Gunmen" is the story of Frohike, Langly and Byers, three recurring characters on "The X-Files" whose passion for conspiracy theories and technological acumen make them useful to Mulder and Scully in their investigations of the paranormal. Well, "TLG" follows the antics of these plucky but irascible nerds as they chase down stories for their conspiracy-theorist weekly, The Lone Gunmen. It is similar to "X-Files" in its visual aesthetic and the quality of its overall production, and features a number of crossovers, including appearances by David Duchovny and Mitch Pileggi, but the nature of the characters lends itself more to farce than drama. The Gunmen are dorky, quarrelsome, prone to bumbling, and nearly always broke – the shittiness of their "surveillance vehicle" (a broken-down old van) is a running gag. What's more, they've got an "intern" named Jimmy Bond who is a disaster-prone moron, and a sexy friend-cum-nemesis named Yves Adele Harlow, who is continuously tripping them up (when not providing reluctant help, that is). All in all this show was somewhat less than the sum of its parts, for the actors who portrayed the Gunmen, while funny and likeable enough, lacked the force and charisma to really carry a series; or so it seemed until the cliffhanger ending, which left me hungering for more and wishing like hell the fucking thing hadn't been canceled after all. Fortunately (or not, depending on your point of view), the story was brought to a very decisive end post-cancellation, by virtue of a reappearance of the five main characters on an episode of "The X-Files." Sometimes you have to lose a series before you realize that you wish it had gotten a longer lease on life, and "The Lone Gunmen" was such a show. Not perfect, not even great, but containing the seeds of possible near-greatness within it. Perhaps it didn't deserve a second season, but it sure as hell deserved to finish its first one.
Battlestar Galactica (1979). If you weren't a kid in the late 70s, you simply have no idea how much anticipation and excitement "Battlestar Galactica" inspired before its debut. Back in those days we only had three networks, which meant that during prime time, you had precisely three choices on your dial at any given moment: ABC, NBC and CBS. Just three shows from which to choose your next hour of scripted entertainment. That's almost unimaginable now, but it was the hard reality of TV-land back then, and "Battlestar Galactica" had more buzz behind it than any show I can recall before or since. ABC had sunk a fortune into this concept, which was intended to cash in on both the popularity of "Star Wars" and the legacy of "Star Trek," while retaining the glamour then associated with the television mini-series. The idea behind "Galactica" was straightforward: following a treacherous sneak-attack, the human race is driven from its many colony worlds and reduced to passenger status on a single fugitive fleet of rickety spaceships protected by a lone military craft, the Battlestar Galactica. Pursued by the evil robotic Cylons as they flee across the universe looking for a legendary lost colony (the Earth!), the humans, led by Commander Adama (Lorne Greene), face all kinds of perils from both within and without, essentially hopping from one disaster to another, and all the while juggling romantic relationships and family melodrama. "Galactica" was the very definition of lavish, with enormous sets, beautiful costumes, a gigantic cast, much location shooting and all sorts of special effects: even the credit sequence was opulent. Unfortunately, it suffered from poor continuity, bad writing, cheesy acting, and surprisingly unimaginative plots (mostly transparent ripoffs of popular Hollywood movies); and the ratings, which had been very high at the start, began to decline as the season wore on. After its conclusion, the executives at ABC crunched some numbers and decided "Galactica" hadn't returned on their huge investment, so they pulled the plug. Their decision was unfortunate for two reasons. First, the quality of the series had improved dramatically down the home stretch: the last four or five episodes are really quite good, and proof of Dirk (Starbuck) Benedict's assertion that the first season of any show is simply a quest to "find its spine." Second, the decision to cancel eventually led to the abysmal "Galactica 1980," a rather cynical attempt to lure back the show's audience with a low-budget spinoff using mostly different actors in a different setting. The attempt failed, "Galactica 1980" was quickly canceled and even more quickly forgotten, and any hope of reviving the parent series proper faded away with it. Of course, many years later, the series got a "reboot" on the SyFy network to much critical acclaim, but I cannot watch the last few episodes of the classic "Galactica's" first season without lamenting the absence of a second. I truly feel that the producers had "found the spine" and were moving on to bigger and better things. I only wish they'd had the chance to showcase them.
Of course there are many television series which avoided this list by managing to gasp out a second season, or part of one, before they got the chop, and still others which managed three, four or even five seasons yet still ended prematurely. I could fill pages with shows that, however long they ran, still exited the stage before I wanted them to. But there is and always will be a special pathos about promising TV shows which die in or after their freshman seasons. A lucky series shows us what can be done; the luckless only what might have been.
Published on July 02, 2017 10:44
June 26, 2017
All or Nothing
Only a Sith deals in absolutes.
-- Obi-Wan Kenobi
One of the many problems we face today as a nation is the impossibility of taking a stand on anything without immediately being tagged with a particular political allegiance. The environment, abortion, illegal immigration, health care, gay rights, gun control – it scarcely matters. Say you are in favor of this or against that; that you believe in this or are skeptical of that; that you support this, or oppose that; and you will automatically be branded as belonging to either the "liberal" or the "conservative" camp. This applies just as strongly to political figures and TV pundits themselves. Criticize a Republican and you must be a Democrat; criticize a Democrat and you must be a Republican. Express dislike for Israel's foreign or domestic policies and you are an anti-Semite; attack those of Saudi Arabia and you are an Islamophobe. There is no wiggle room, no in-between, no gray area. Anyone, anywhere, regardless of age, race, ethnicity, sex, or creed can be reduced to a one-word identity and defined down to his or her smallest particulars simply by voicing an opinion on a single issue or person.
Setting aside the terrible mental inflexibility of such a reflex – and it really is a reflex, for it is an automatic response not subject to any conscious thought – the process of judging people in their entirety based on a lone viewpoint on a lone issue is symptomatic of a much deeper problem in our political life, which is the refusal to think. It seems to me that the vast majority of Americans have lost the ability to come to a position on anything without being told what to think beforehand by "trusted sources." But the sources a person trusts merely mirror his or her own political prejudices: no one, or very few people at any rate, seems to believe any longer in the concept of objective truth, the idea that a fact is a fact no matter how you spin it, and is immune to disbelief. After all, we do live, physically, in an objective world: disbelieving in the rain will not keep you dry. Why should this simple rule of life fail to apply to the political world?
Two phrases which have sprung up within the last year – "alternative facts" and "fake news" – both reflect the present relationship Americans have with reality. The first term is terrifyingly Orwellian because it presents facts as having no more validity than opinions. It is also eerily reminiscent of the Nazi idea that there was no such thing as science per se, only science specific to racial types, and of the Communist idea that literature and even art were valid only if they had been produced by Communists or those sympathetic to them. In each case the validity of the concept at hand, science or art, was judged on ideology alone. If this sort of thinking doesn't frighten you, consider the ramifications of a choose-your-own-reality scenario. Consider that all of human civilization, going back 6,000 years, is essentially an accumulation of knowledge, which is to say, a vast piling-up of theories and facts over time. Obviously some of that knowledge was lost or suppressed, but each successive generation since the end of the Dark Ages has added to the total body of knowledge in every field; together they form links in a chain, or steps in a staircase, or rungs on a ladder – however you want to visualize it. The present link, step or rung upon which our society rests exists only because of the innumerable others which precede it. And for every one of these which is sturdy enough to stand the test of time, there are dozens if not hundreds which were ultimately shattered by the crucible-like process of scientific examination. For example, the Roman Church, following Aristotle, long maintained that the speed at which an object fell was determined by its weight. This belief persisted for hundreds of years, until Galileo (so the story goes) dropped a marble and a lead ball off the Tower of Pisa to demonstrate that both objects fell at the same speed. Galileo would later face the Inquisition for his scientific heresies, but eventually the simplicity of his method became inarguable: anyone could pick up two objects of different weight, let them go, and see for himself that Galileo's claim was true. And today nobody – no sane person – would argue otherwise, because a fact has more weight than an opinion, or for that matter, a theory. A provable fact is an objective thing. Like the aforementioned rain, it exists whether you believe in it or not.
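The physics, incidentally, bears Galileo out, though the tidy proof below leans on Newton's later formalism, which Galileo himself never had – treat it as a textbook sketch rather than anything from the Pisa story. Gravity pulls on a body in proportion to its mass, but a heavier body is proportionally harder to accelerate, and the two effects cancel exactly:
\[
F = ma, \qquad F_{\text{gravity}} = mg \;\Longrightarrow\; ma = mg \;\Longrightarrow\; a = g \approx 9.8\ \mathrm{m/s^2}.
\]
The mass drops out entirely, so (air resistance aside) the marble and the lead ball strike the ground together, no matter how much either weighs.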
“Fake news” has a more legitimate origin than “alternative facts.” Everyone has always known that the press often gets its facts mixed up – the novelist Lawrence Sanders once quipped that the worst insult an American could throw at another was, “You believe what you read in the papers?” As I have explained before in this blog, the accuracy of information is inversely proportional to the speed at which it is disseminated, and newspapers disseminated information with very great speed – until the last few decades it was common for the bigger papers to have both a morning and an evening edition, which requires a lot of fast writing. On top of this, the steady decline of journalism over the last 30-odd years, inspired by and coupled with the rise of the internet (a much faster way of getting information, and thereby a much faster way of getting the wrong information), as well as the increasing domination of corporations over the news media, has led to a profusion of stories which were deliberately exaggerated, twisted, under-reported, over-reported, or otherwise changed by the media outlets that reported them. The distrust people have of the news is, I'm afraid, somewhat justified. Nevertheless, actual fake news – meaning news which is simply lies, as opposed to stories which are not entirely accurate, or which have been slanted politically in some way – is only as influential as it is today because Americans have largely lost the power of critical thinking. They are not able, or not willing, to discriminate between, say, The National Enquirer and The New York Times in terms of veracity. What's more, as with “alternative facts,” they are no longer willing to accept that any news article could be accurate if it disagrees with their opinion. As before, the power of opinion – of belief – is elevated over the power of the objective fact, or at least the legitimate process by which objective facts are discovered.
Carl Sagan once said, in some exasperation, that people should stop asking whether he “believed” in UFOs or astrology or ghosts, and ask instead whether there was scientific evidence that lent itself to a conclusion on any of these things. He was trying to point out the absurdity of asking a scientist whether he or she “believed,” when the whole process of scientific thought was designed to make the idea of “belief” unnecessary. In science, first you theorize, and then, using the empirical method, you try to prove or disprove the theory. You then report your findings and submit them to other scientists for scrutiny, knowing they will test your hypothesis and repeat your methods to see if they can obtain the same results. It's not a question of “belief”; it's a question of fact.
At the same time, and somewhat ironically, it has long been scientifically established that facts are no match for belief. Most people accept science only insofar as it fails to contradict their opinions. No one today would argue that man cannot fly, but many otherwise intelligent people argue, with no scientific basis and in spite of incontrovertible evidence to the contrary, that vaccines do more harm to society than good. Others maintain that the age of the earth is a few thousand years and not a day more, because this is what it says in the Bible or some other religious text. They reject radiocarbon dating and every other means by which the age of the Earth has been roughly established, yet accept immediately any scientific theory or law which does not conflict with their faith. They pick and choose what they believe, rather as if they were at a sort of buffet table, with no sense of hypocrisy or contradiction. In both cases, the mind has taken the easier course. It has abandoned the ability to come to its own conclusions and lets religious faith, or racial-ethnic bigotry, or political belief, short-circuit the thinking process.
When one looks at the political situation in America today, one sees that thinking is at a minimum, while the speed at which people arrive at sweeping conclusions is at a maximum. If I express disgust for Donald Trump as a human being, I must be a Democrat or a “liberal.” If I express more or less equal disgust for Hillary Clinton, I must be a Republican or a “conservative.” If I take a single viewpoint contrary to the general trend of political thought in a party, then I am 100% in the camp of the opposing party. In no instance am I allowed any deviation at all. The slightest disagreement, even on a minor issue, amounts to total apostasy, total rejection. It is noteworthy that both the Communists and, to a lesser extent, the Nazis embraced this sort of absolutist thinking completely. Communists were permitted only to repeat the exact party line; any disagreement, even on minor points, and one was branded a “right-wing deviationist” and shot. The Nazis, whose ideology was more nebulous, also shot a few of their deviationists, though generally they merely threatened or disgraced them – stints in concentration camps were quite effective for ironing out differences of dogma. At the moment, in America, neither of the major parties nor the government itself possesses the power to shoot us for disagreeing with them, but I am by no means assured they would not do so if the power were theirs. The Founding Fathers understood better than any humans before – or, sadly, since – the temptation of libido dominandi, which the dictionary defines as “the will to power; the desire to dominate; the lust for government.” They knew that a certain breed of man will always attempt to force his opinions and belief-systems onto others, with violence if necessary, and to organize them for his own selfish ends. The entire Constitution was written to restrain the power of government over the individual – in other words, to protect us from our own government. Unfortunately, it was beyond the power of the Founders to protect us from ourselves. They did not anticipate a generation of Americans who were unwilling to think, but perfectly willing to take action anyway.
In my Facebook feed, I am often confronted by videos in which an interviewer asks people of a certain political persuasion if they agree with certain statements made by a political figure of the same persuasion. A typical question would be, “Hillary Clinton says such and such about the national debt – do you think she's right?” Invariably the interviewee, in this case a Democrat, gushes with enthusiastic approval, and just as invariably, it is revealed at the end of the interview that the actual quotation came from Donald Trump, or George W. Bush, or some other right-wing figure. These sorts of tricks are also played on Republicans with equal success, but the political loyalties of those questioned are not important. What is important is the fact that in each case, the response was conditioned not by what was said but by who they believed said it. It is noteworthy that these videos are always exhibited for the purpose of making the ordinary Democrat or Republican look stupid; but their effect on me is not amusement but horror. One of the slogans of the Nazi Party was, “The Führer is always right.” In America today, anyone on your side is always right, and everyone on anyone else's side is always wrong. The actual opinions they hold are irrelevant. Party loyalty has become tribal. But if this is true, it also raises a question: if Americans are so ignorant of their own political ideology as to be unable to recognize its exact opposite, why are they so fanatically loyal to it? If a person can't tell you why he's a Democrat, or a Republican (or a Democratic Socialist or a Green or a Libertarian), why does he cling so tightly to that identity?
It is my personal belief that political affiliations have become entirely emotional in origin. In the vast majority of cases, one does not coldly and soberly judge the various parties on their merits and then, on the basis of that analysis, make a rational decision to join one or the other. No, one comes to political identity the same way one generally comes to religion or love: through subjective feelings which have nothing whatsoever to do with logic. Exactly why the emotions of one person are triggered by the slogans and symbols of the “right” and those of another by the slogans and symbols of the “left” remains a mystery, but the key point is that the triggers exist. Humans began as aggressive, ritualistic, territorial animals with a strong hostility to strangers; as civilization emerged they maintained all these characteristics while transferring their tribal loyalties to the nation-state; and for whatever reason, they have now transferred that loyalty to the political parties which exist within those states. But loyalty, like love, is a feeling, and as I've stated above, subjective feelings – beliefs – are far more powerful than facts or logic. Only this can explain why people who can't even summarize their own party platform will simultaneously insist that it is superior to that of the opposing party, and then resist all arguments to the contrary, no matter how well-reasoned. Indeed, the very fact that it is possible to systematically demolish someone's arguments for being a Democrat or a Republican (or what have you) without in any way affecting their devotion to that body is absolute proof of this.
If you doubt me here, I ask you to perform the following experiment: try, on the same day, to convince a friend that computer X is a better bargain than computer Y, and also that he should not love his abusive, irresponsible, alcoholic mother. In the first case, he is likely to respond positively to a purely intellectual argument, because there are no emotions – no loyalties – at stake. But when it comes to his mother, reason goes out the window, and only loyalty remains. Muster 1,000 perfectly valid reasons why he should not love his mom, and his subjective feelings toward her will not change one iota. What's more, he will probably hate you for trying to change them. In one case, truth has power; in the other, it is utterly powerless.
It seems to me that primitive instincts served mankind well for much of his formative development. The ability to make snap judgments based on sudden emotional stimulus was key to survival on the ancient savanna – run or fight, kill or show mercy, listen or pick up the club. Hostility to strangers was a survival mechanism, as were territoriality and periodic aggression. Ritual helped create traditions which cemented bonds within the tribe, and submission to a strong leader eliminated argument and reduced discord, allowing a group of thirty to move as one. Back then, too much thinking could get you killed. But in this age, when the turn of two keys can release enough nuclear missiles to turn the planet into a lifeless, radioactive cinder hanging in space, we can no longer afford knee-jerk responses to threatening stimuli. The balance we've established over this earth is too delicate, too fragile to sustain for much longer a population which has the power to kill but refuses to engage the power to think. The absolutism of our ancestors has no place in the nuclear era. It is not too much but too little thinking which will doom us.
George Orwell spent most of his literary career worrying about the decline of objective truth, the increasing unwillingness of human beings to think for themselves. He foresaw that this unwillingness would sooner or later lead to an actual inability; that the brain, like the muscles, must be exercised with critical thinking or else it will fall into the flabby and detestable habit of not thinking at all, but simply reacting, reflexively, to emotional stimulus. I'm sorry to see we have already arrived at this point, or at least at its outermost edge. Whether we draw back into sanity or proceed into the abyss which has consumed other great societies depends entirely on whether we continue to let ourselves be ruled by our passions, or governed by our thoughts.
---Obi-Wan Kenobi
Published on June 26, 2017 10:03
June 14, 2017
The Other D-Day
For a warlike people, Americans are woefully ignorant of their own military history. Indeed, if pressed, the ordinary American could probably not name five major battles this country has fought since 1776. But there are a few which manage to shine through the fog of inexplicable indifference we have toward our wars, and of this select number, one shines brighter than all the rest. That battle is known colloquially as D-Day.
In military parlance, "D-Day" simply means the day a particular operation is scheduled to commence. There was a D-Day for every amphibious invasion and offensive conducted by the U.S. military in WW2 -- and God knows there were plenty of those, scattered across nearly the whole surface of the planet. In the minds of our people, however, D-Day means only one operation, one attack -- that carried out on June 6, 1944 upon the beaches of Normandy.
On that day, eight Allied divisions and supporting commando forces landed by sea or by air in France, which had been under German occupation since the summer of 1940. These divisions, whose total numerical strength was 150,000 -- though the combat elements made up only a smallish percentage of that -- were transported by the mightiest naval armada ever assembled, some 3,427 ships of every conceivable type, from battleships to tugs, and supported by the most powerful air force ever to take flight -- 3,100 bombers and 5,000 fighters. Of course, the eight divisions which hit the beaches were only the tip of the spear; the Allies had a staggering 78 more divisions in England, with more than 50,000 vehicles, all waiting to cross into France when and if the beaches were taken.
Opposing them on those beaches were some six or seven German divisions (50,000 troops) scattered along the coast in defensive fortifications, and supported -- in theory -- by about 225 aircraft (the German air force in France, not strong to begin with, had been almost completely destroyed in the weeks leading up to D-Day via relentless attacks on German airfields). These 50,000 were not alone, of course; the total German strength in France was actually 59 divisions (about 850,000 men), but these had to cover the entire nation of France, including the Franco-Spanish border and the Mediterranean coast, and many were of low quality, under-strength, and lacking in motor transport. Hitler had long viewed France as a sort of gigantic "rest and training area" for his army; now his "rest and training army" had to fight what the Western Allies viewed as "the decisive battle of the war."
The outcome of D-Day is, of course, well known. Despite suffering 10,000 casualties in the first 24 hours, the Allies were able to establish a beach-head in France, and from that point were able to begin packing it with men, guns, tanks, and supplies on a breathtaking scale -- two million men and two million tons of supplies were brought over in the first sixty days of the campaign alone. And though they were unable to break out of Normandy until early August, the Allies' success on D-Day made the outcome of the Western campaign a foregone conclusion: the Germans simply could not supply enough men, guns and tanks to crush the beach-heads or even contain them indefinitely. When the dam finally burst, the flood of Allied military power swept all the way to Germany, whose western border was breached for the first time in September.
Thus D-Day. So long as America exists, and perhaps even afterward, it is a day which will live in human memory. And rightly so, for it was the greatest logistical feat in military history, a monstrously large operation involving literally millions of people and uncountable amounts of equipment, and calibrated as finely as a Swiss watch. Yet great as it was, it remains only half of one of the greatest stories ever told -- and, in the minds and arguments of some, the lesser half. There was another D-Day, another "decisive battle of the war," and to this day Americans remain almost entirely ignorant of its existence. And since we are just days away from its 73rd anniversary, now seems an opportune moment to discuss it.
Above I stated that Germany failed in the Normandy campaign because its army in France was too small for the task it was supposed to achieve; also because, once the landings had succeeded, Hitler was unable to counter the Allied build-up with one of his own -- unable, so to speak, to wall off the Allied armies in Normandy like the unfortunate victim in "The Cask of Amontillado." Yet when one looks at the fighting strength of the German army in June of 1944, one sees that it mustered 287 divisions of all types. Thus less than a quarter of Hitler's strength was in France when the first Allied paratrooper jumped into Normandy in the early morning of June 6. This raises a question: if the landings, which everyone knew were coming, were to be "the decisive battle of the war," why didn't Hitler put more military muscle in France to meet them?
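(The arithmetic here is stark. Taking the figures above -- 59 divisions stationed in France out of 287 in the entire German order of battle --

\[
\frac{59}{287} \approx 0.21,
\]

roughly one division in five stood in the West to meet what both sides called the decisive battle.)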
The reason for this is quite simple. The vast majority of Hitler's army was in Russia, and had been since June 22, 1941, when he threw three-quarters of it -- some 3.5 million men, along with the majority of his air force, artillery and tank fleet -- into the task of annihilating the Soviet Union. Hitler had hoped to smash the Red colossus in one quick, savage campaign, and thus free his eastern borders from the perpetual threat posed by Stalin -- the one man in Europe arguably greedier and more ruthless than Hitler himself. But the attack ultimately bogged down deep in the Russian hinterland, casualties mounted, and the Red Army slowly and inexorably began to push the Germans backward. To paraphrase a Ukrainian historian, Hitler was like a python who had bitten deep into his prey, only to discover that he could neither swallow the victim whole nor withdraw his fangs.
By the summer of 1944, the German army in the U.S.S.R. was badly battered and covering far more territory than it could comfortably hold; it was also outnumbered and outgunned both on the ground and in the air. Nevertheless, it remained a dangerous and cohesive fighting force comprising 150 German divisions (2.46 million men), which held an immense front running from Finland down to Leningrad (now St. Petersburg), all the way to the Black Sea just west of Odessa, on the Rumanian border. This force was assembled into four army groups, the most powerful of which was Army Group Center, consisting of four armies or roughly 800,000 men, with about 1,300 tanks, 10,000 guns and 1,000 aircraft. The Soviets had long viewed the destruction of Army Group Center as the key to the entire Eastern war, and had twice tried to destroy it, but had never been able to achieve this goal. So in April of '44, when weather conditions made attack impossible for several months, they took the respite to assemble a gigantic force for the express purpose of bringing about this long-awaited annihilation. They knew the Western Allies were due to land in France in early or mid-June, and Stalin had promised an all-out attack to coincide; however, he also wanted the attack to come on the precise anniversary of the German invasion of the U.S.S.R. -- June 22.
The total strength of the Red Army at this time was 6.4 million men. Of these, Stalin's generals massed 1.67 million for an attack code-named BAGRATION, after a Russian hero of the Napoleonic wars. These soldiers were backed by 5,800 tanks, 39,000 guns and around 7,000 aircraft of all types. Like their Western counterparts, they benefited from elaborate deception operations designed to fool the Germans into thinking the attack would come elsewhere. And on the morning of June 22, 1944 -- precisely three years since Hitler had launched his invasion -- they blitzed the semicircular German line along a massive front roughly 660 miles in length.
Like the Normandy campaign, BAGRATION, later known as "The Battle of White Russia," lasted almost exactly two months. Unlike Normandy, this battle developed with astonishing rapidity. In the sixty days in question, the Red Army finally drove the hated German invader off Soviet soil, pushing as far west as Warsaw, which had been in Hitler's hands since 1939. More died in that sixty-day period, however, than Hitler's dream of establishing an empire in the East. Army Group Center, which had once penetrated to the very suburbs of Moscow, had been wiped off the face of the earth. Four strong German armies were virtually or totally destroyed, along with nearly all their arms and equipment. No fewer than 28 German divisions ceased to exist -- the historian David Glantz estimated the casualties at somewhere between 400,000 and 450,000 men -- and a staggering 31 German generals were captured, some seventeen of whom later joined the "National Committee for a Free Germany," a Soviet puppet organization composed of German turncoats who denounced Hitler and Nazism in radio broadcasts and put their signatures to Soviet leaflets encouraging German soldiers to desert. Hitler, looking at a map, described the area the Army Group had occupied as "nothing but a hole." The fact is, despite the D-Day landings, on the morning of June 22, 1944, the German dictator could boast that, in spite of all setbacks, his armies still ruled most of Europe. By August 22, many of those armies had been destroyed, and much of Europe liberated -- or conquered -- by Allied troops. Germany was caught like a hazelnut between the jaws of a giant cracker; Roosevelt held one handle and Stalin the other. The war still had nine long and almost unbelievably bloody months to go, but the outcome could no longer be doubted. Nazi Germany was finished. Not surprisingly, the Russians maintain to this day that White Russia, and not Normandy, was "the decisive battle of the war."
There is much evidence to support this claim. The Western Allies put 150,000 men into France on D-Day and in the days immediately following. The Soviets employed more than ten times this many to liberate White Russia (modern-day Belarus). Of course, both of these victories had to be paid for with oceans of human blood. The Normandy battles cost the Western Allies 226,386 casualties, including 72,000 killed and missing. In that same period, the Red Army reported a staggering 770,888 total losses (killed, wounded, missing, sick) on the front of Army Group Center alone. This is roughly three-quarters of the total number of casualties suffered by the United States in the whole of World War 2, on all fronts.
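(Once more the arithmetic deserves a pause. The figures above give 1,670,000 Soviet soldiers committed to BAGRATION against the 150,000 the Western Allies landed on D-Day; and taking the commonly cited total of roughly 1.07 million U.S. battle casualties for all of WW2 -- about 405,000 dead and some 671,000 wounded -- the comparisons work out to

\[
\frac{1{,}670{,}000}{150{,}000} \approx 11, \qquad \frac{770{,}888}{1{,}070{,}000} \approx 0.72:
\]

eleven Soviet soldiers for every man put ashore in Normandy, and a single two-month battle on a single front costing nearly three-quarters of what the entire American war cost in blood.)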
But as badly as the Allies had suffered, the Germans had been even more seriously damaged. Entire army groups had been wiped out. Irreplaceable officers and men killed or captured. Incalculable amounts of planes, guns, tanks and equipment lost. And any hope of military victory -- or even a negotiated settlement of the war -- finished for good.
We Americans take justified pride in the contribution our country made to Allied victory in WW2. We carried the Pacific war on our backs, took over the air war in Europe when it seemed hopeless, and it was our industrial might -- an endless flood of guns, planes, tanks, equipment, food and raw materials -- which kept both Britain and Soviet Russia fighting when they were at their lowest points between 1940 and 1942. Yet it would be a mistake to believe, or rather to continue to choose to believe, that "the decisive battle of the war" was fought and won by us alone, or even by the Western Allies acting in concert. Victory in WW2 cannot be discussed without giving a prominent place, and perhaps even the place of honor, to the Soviet Union. It was their soldiers who took most of the casualties, and who fought, tied down and ultimately destroyed three-quarters of the German army. It is true that those soldiers served a regime almost morally identical to Nazism, and that said regime was guilty of terrible military aggression against its neighbors and innumerable crimes against humanity; but in the final analysis this does not change the fact that it was only in concert with their efforts that "the decisive battle of the war" was fought...and won.
Something to ponder as another June 22 approaches.
Published on June 14, 2017 22:59
May 19, 2017
Homesick in Hollywood
When I started freelancing in Hollywood some years ago, I signed so many nondisclosure agreements that I felt like I was joining the CIA. As a result of these agreements I can't really talk about what I do, which means I can't even offer you a good excuse for why I haven't written anything for this blog in the last few weeks. So instead, I'm just going to talk about how ridiculously homesick I am. Bear with me. It's possible that this is not as silly as it sounds.
I moved to Los Angeles almost ten years ago, and I thought that whatever homesickness I might experience for the East would be shallow and short-lived. (After all, if I hadn't wanted to leave, I wouldn't have come here in the first place.) And, by and large, I was proven correct. Except for a certain silly nostalgia for snow, which I certainly didn't experience when I was shoveling it out of my parents' driveway, I rapidly became so immersed in learning the geography and folkways of my new city that there was no time to feel mushy about Maryland, where I grew up, or Pennsylvania, where I settled after college. But a decade is a long time, even if it doesn't necessarily feel that way, and lately I have begun to wonder if it is really possible to transplant yourself into a culture and a climate so fundamentally different from the one in which you were raised.
I once read a book about a Soviet defector to the United States. While cataloguing his reasons for defection, he referenced an instance in which the defense minister of the USSR was scheduled to visit his airbase. This being Siberia, the road between the landing strip and the base was barren and depressing, and the minister was supposedly “a lover of nature and all its verdancy.” The base commander, obviously hoping to score points with the boss, ordered his entire regiment to cease all military training and spend several months digging up trees and replanting them along the roadside. Everyone knew replanting trees in the middle of the summer was impossible, but the trees were nevertheless wrenched from their homes in the forest, transported by truck to the barren road, and sunk into deep pits. Not surprisingly, within a week all of them were dead, so the commander had another hundred or so dug up and put in their place. They too withered and died, and after several more sweaty and fruitless attempts, the commander, “finally realizing the laws of Nature would not bend even for communists,” ordered the deceased trees painted green. When the driver detailed to whisk the defense minister from the runway to the base pointed out that the old man would certainly not be taken in by this charade, the commander said, “Oh hell, just drive fast – he'll never notice.”
As it happened the minister didn't notice, because he never arrived; matters of state forced him to cancel the visit. The dead, green-painted trees were left to rot along the road and the regiment washed the dirt from its collective hands and finally resumed its military duties. Unfortunately, after spending months as unwilling gardeners, the pilots were so out of practice at flying that several of them soon crashed, including one who hit a passing city bus while attempting a takeoff, decapitating some 20 passengers.
I never think about this incident without realizing that a large part of the commander's problem stemmed from the fact that he did not understand the importance of roots. Pulling a living thing – be it a tree or a fighter pilot – out of the environment in which it (or he) grew, and trying to graft it somewhere else, is an unnatural act, and unless it is carried off with great care it will inevitably lead to failure and disaster.
I lived thirty years of my life in the East, but if I remember my basic biology correctly, there is no longer one particle of my being which remains from that time. Every hair follicle, every skin cell, my toenails, even my scars – everything which presently comprises me has formed here in California. So I suppose that, and my driver's license, make me a Californian, and more specifically an Angeleno. Certainly I've picked up many of the local habits. I see more movies, eat healthier, and complain more about traffic than I ever did back East. I'm also far more physically active: last year I went hiking 99 times and swimming 95, to say nothing of time spent in the gym, and yes, I even occasionally do yoga. I no longer refer to highways as highways but as freeways, and I do not identify them in the Eastern manner, i.e. by their numbers alone; instead I use the prefix “the,” as in, “You take the 134 to the 405 to the 101.” I drink bottled water instead of tap, and at one point I had not one but two fedora hats, not because they were stylish but because they kept the sun off both my face and my neck. I probably don't go more than a few months without seeing a famous or semi-famous actor (the last was Kiefer Sutherland, who passed by my table at a restaurant singing the lyrics to “Stand By Me” under his breath – no kidding), and I even work in the entertainment industry myself, which truly cements the cliché.
And make no mistake. Grumbling and grousing aside, I really do enjoy living in California. I can go swimming beneath palm trees in February, drive down the street to hike any number of hills and mountains, visit an infinity of beaches, see my favorite movies in 100 year-old theaters, and my favorite actors, in person, in playhouses so intimate you can hear them breathing between lines. Sporting events and concerts are almost easier to go to than to avoid; music and food festivals run year round; and my inner nerd can satisfying his deepest longings by visiting the shooting locations of the television shows I grew up watching. Museums of every type abound, there are so many restaurants in this town that even visiting every one in your own immediate neighborhood is impossible, and there are things you can do here that you can only do here and pretty much nowhere else on earth, like sit in the audience during the taping of a TV show. Being bored here is essentially a personal choice.
And yet.
When I think about that Soviet defector – Viktor Belenko was his name – I am always reminded of the fact that when he finally made his way into the U.S., he literally thought he'd discovered paradise. Private automobiles, television sets, transistor radios, garbage disposals, comfortable clothing, the freedom to travel from one state to another without a passport, the freedom to denounce the government without fearing arrest or execution – all of it was new to him. (It took him months to overcome his belief that the supermarkets he saw on every street were not fabrications created by the CIA.) He realized that he actually had the chance, in America, to do everything he'd ever wanted to do with his life. To become the person he'd always wanted to be. Yet at one point, about a year or two after he defected, and despite the near-certainty that if he did so he would be tortured and shot, he very nearly returned to the USSR of his own free will.
It turns out that, like a tree uprooted from the forest or an animal imprisoned in a zoo, a human being can stand only so much time away from what he considers to be his natural environment without suffering severe spiritual and psychological consequences. In Belenko's case, he longed to hear and speak Russian again, to hear Russian music, to eat Russian food, to wander the muddy, cobblestoned, smoke-tinged streets of Russian villages, and to sit in leaky, drafty railway stations jammed with peasants and soldiers who looked and spoke and thought just exactly like him, singing Russian folks songs as they waited for trains that might never arrive. “He was hearing and being drawn not only by the call of the Mother Country,” his biographer wrote. “But the Call of the Wild.”
Don't get me wrong. I am not in any way comparing the East to Soviet Russia, or myself to the heroic Belenko. I'm just saying that when I think about small-town living in Pennsylvania, or suburban existence in Maryland, one of the first things that hits me is the lack of diversion relative to my present circumstances. Even where I grew up, which was a stone's throw from both D.C. and Northern Virginia, the options now seem extremely limited in comparison. If I were to pack my things and move back there today, I can honestly say I don't know what the hell I'd do with myself on weekends, especially in the wintertime. I'm certain restlessness and boredom would bedevil me worse even than the loneliness which one must always endure when showing up to a place where one has few readily available friends. Nor would the transition back to the East coast way of doing things be easy. Easterners are punctual; Californians perpetually tardy. Easterners walk and talk fast; Californians amble and ramble. Easterners confront quickly; Californians tend to shrug things off. Easterners are conditioned and toughened by seasonal changes; Californians, who have no seasons, live in a kind of endless summer where winter never pays for spring. The Easterner is formal; the Californian hopelessly casual. I'm honestly not sure if I could withstand twenty degree temperatures again, or days of freezing rain in November, or the horrible necessity of scraping ice from a car windshield while wondering if the battery is dead. After Cali, I don't know if the political monomania I grew up with inside the Beltway would be endurable now, any more than the ill-informed, frighteningly opinionated talk I constantly heard in the bars and diners of York, Pennsylvania. And I do know that sinking back into the whole rigamarole of suits, ties and dress shoes after a decade of T-shirts and jeans would feel like joining the army at the age of forty-four. Like Belenko, I'm probably better off where I am than where I used to be.
And yet.
Lately I have been experiencing the Call of the Wild myself. Or perhaps just the call of the East.
You know the sound, or rather the almost complete absence of sound, which occurs very late at night during a snowstorm? The streetlamps are haloed with moisture; snowflakes flicker through the glaring light. Rooftops, cars, front lawns, the street, all blanketed in unbroken, virgin white, and there is no movement anywhere save that of the snow, which you can only see against the light, and no sound save that of your boots and your breathing and maybe your heartbeat. It's been ten years – ten years! – since I didn't hear that sound.
You know what else I miss hearing? A few days after the blizzard, when the snow finally starts to melt, and you wake up to the sound of icicles dripping and drain-pipes gurgling and car-wheels spinning in the slush. There's nothing quite like taking a walk when it's forty degrees and every gutter is full of melting snow and every drain-pipe is spitting a steady stream of water that sparkles like liquid crystal in the sun. I can't describe the sound but you'd know it if you heard it and I haven't heard it in way, way too goddamned long.
There are sensations, too, that only exist in the East in winter. Like the way a bowl of hot soup feels percolating in your belly after you just spent two hours shoveling snow and scattering salt and cursing so that your breath smoked like a chimney. And speaking of chimneys, in the East they are actually put to use. Every Easterner who grew up with a fireplace knows the complex rituals involved in building and maintaining a good fire – gathering the kindling in the yard, buying the bundles from that joint down the road, saving that one log with the rope handles to burn last, because you enjoy watching rope burn. Listening to the squeak and crackle and hiss and pop. Watching as if hypnotized the intricate patterns of the flame. And the old newspaper you use to start the blaze, and the warm ashes beneath the grate the next morning, and the tedious job of cleaning them out when they finally go cold.
And then there are the summers. I just called California land of endless summer, and by and large it is, but there are Western summers and there are Eastern ones. In the West, summer is a monolith – every day is the same, sunny and dry and so, so hot. In the East summer is like a jewel with a hundred facets. There are days when the humidity closes around you like an iron maiden, when the mosquitoes and gnats hum fit to make your eyes water, when your skin sticks to the car seat and happiness is a glass of Coke with six ice cubes, or the can of beer that's wedged so far back in the vegetable crisper it has frost around the ring-tab. I don't miss those, but I miss other things – thunder, for example. I've heard thunder maybe four times in the 3,424 days I've lived out here, and I can't remember the last time I saw any lightning. And what about thunderstorms – the rain coming down in buckets out of nowhere, sometimes while the sun is still shining? There's nothing quite like watching a thunderstorm through a window, or better yet, from beneath a portico or an awning somewhere. The streets empty so fast you'd think the raindrops were shrapnel, the birds and insects stop their music, the trees and bushes lash about, the gutters overflow, and that lovely scent blows in through the screens. You know the one I mean. It's the smell of rain, but also of green things growing, of wet earth. Of life. And there is something that happens in the early evenings after a thunderstorm – a sort of smokey hush falls over the land, except that the smoke is mist, curling over the grass and the trunks of the trees, lending a pearlescent finish to the grass. If you've never sat on a porch in the evening after a thunderstorm, watching the darkness gather as the cicadas slowly burr back to life and somebody's grandfather or great-uncle draws on their pipe, making soft orange light among the shadows, you haven't lived. And what about fireflies? A huge wood-fringed field alight with the greenish-yellow flame of innumerable fireflies, calling to each other silently as night comes on? Do you know I have seen fireflies only once in the last decade, and it was during a weekend trip to my old graduate school in Pennsylvania? What the fuck kind of place doesn't have fireflies?
Then there is fall, the season with two faces. The first half of it is so beautiful as to defy description. Like the time you were driving in West Virginia and saw the Appalachians when the leaves were turning? Shades and intensities of green shading to yellow-gold and then to copper and finally that dark wine-red, immensities of leaves rippling over the horizon, and that somehow spicy smell blowing in along with the wind through the open car windows? Or the nights in high school you'd go to the football games with the lights glaring and the crowd howling and the air so cool and crisp you could practically cut a slice for yourself to take home? Or the times you'd look up and see the sky filled with honking, cackling, cawing birds from horizon to horizon, thousands of them all flying south like some immense fleet of bombing planes? That was the first face. The second was early darkness and bare trees and days and days ice-cold rain, rain so that you thought it would never stop raining, and the back of your collar and the cuffs of your jeans always wet, and before the fireplace there was a row of shoes and boots that never seemed to dry? But you enjoyed the way the logs hissed and popped and crackled as they burned on the grate, and the gold shadows on the darkened walls, and your homework spread like a deck of cards across the floor, and Judy Collins on the radio, and that was fall, the two faces together, and you miss them both.
And Spring. Christ, Spring. There is no spring in California. We have 200 days of Summer and then we have this runty thing called not-Summer which is not really a season but the scrapings and leftovers of one, a kind of shabu-shabu that can't rightly be called a season. No ice-cold morning that turns blazing hot by noon and then near-freezing again by sunset, with sunshine one minute and rain the next and always the gusty fragrant air that smells simultaneously of melting snow and flowering plants. No feeling of waking up with the air playing the curtains above your head and the birds singing lustily and your body responding in the ancient manner to the irresistible urge to procreate or die trying. What is spring to me? Spring is sprinklers huffing and chuffing and bare, muddy feet and the first time you hear the tinkling tune of the ice cream truck. Spring is the that first whiff of woodsmoke from the neighbor's barbeque, the first time you hear the rip-roar of the neighbor's lawnmower and think, shit, it's that time again. Spring is when the cicadas come out of their years-long sleep by the billions and set up a racket so deafening you don't even hear it after the first night. Spring is when the air has a taste and the taste is good, though sometimes it makes your teeth ache, just like that ice cream.
Of course by now you think I'm sentimentalizing, nostalgifying, looking at the past, or rather the region, with rose-colored glasses. Perhaps I am. My last winter in Pennsylvania, the temperature hovered around seven degrees for weeks at a time. That well and truly sucked. And so do chapped lips and cracked hands and dandruff, and icy-cold wet socks, and shoveling snow and scraping ice off side-view mirrors; and so do weeks of endless, discouraging drizzle. Gaining fifteen pounds every winter blows goats. So does that choking, paralyzing humidity that comes with summer, and so do gnats, and so do mosquitoes (especially those huge striped bastards that take about a thimbleful of blood out of you and contrive to ruin every picnic). Getting stung by a yellow jacket is no fun either, and wearing a tight collar and a tie when your neck is raw from a blunt razor and it's ninety-six degrees outside is not much better. Poison ivy and poison oak and poison sumac were less fun than a trip to the dentist, and the power outages that inevitably followed the really big thunderstorms – especially when it was 93 degrees outside – were somewhat less than joyous occasions. Few sailors can swear the way I did on a succession of mornings back East when the cold murdered my car battery, and I never have to worry about that anymore. Nor do I have to sweat that horrible, tight-lipped political correctness as it exists in the workplace back East -- here in L.A. we have discussions around the ole water cooler that would give any East Coast H.R. director a fucking heart attack, and nobody bats an eyelash. The truth is that if I remain here I will never again have to purchase a bag of salt to scatter on the front steps, or handle a snow shovel, or buy a rake, or blow hot air down a frozen pipe. I'm 14 miles from the beach, three and half from the Hollywood sign. Sometimes, in airports, when people ask me where I live, I puff up and respond, “L.A.” as if daring them to top me. In many ways this is a good place, a good home, a good life. I tell myself that every day. I tell myself I'm happy here, and that it would be foolish to leave. And yet....
And yet I can't help wonder if, after almost ten years, my roots aren't starting to wither up and die in the strange soil of SoCal. If the call of the East isn't starting to drown out all the sober, logic, reasonable arguments I have to stay here in the West. If it wouldn't do my heart some good to shovel some snow, swat a few mosquitoes, and stand out in the back yard in my bare feet in the middle of a thunderstorm, listening to the sound of the rain.
I moved to Los Angeles almost ten years ago, and I thought that whatever homesickness I might experience for the East would be shallow and short-lived. (After all, if I hadn't wanted to leave, I wouldn't have come here in the first place.) And, by and large, I was proven correct. Except for a certain silly nostalgia for snow, which I certainly didn't experience when I was shoveling it out of my parents' driveway, I rapidly became so immersed in learning the geography and folkways of my new city that there was no time to feel mushy about Maryland, where I grew up, or Pennsylvania, where I settled after college. But a decade is a long time, even if it doesn't necessarily feel that way, and lately I have begun to wonder if it is really possible to transplant yourself into a culture and a climate so fundamentally different from the one in which you were raised.
I once read a book about a Soviet defector to the United States. While cataloguing his reasons for defection, he referenced an instance in which the defense minister of the USSR was scheduled to visit his airbase. This being in Siberia, the road between the landing strip and the base was barren and depressing, and the minister was supposedly “a lover of nature and all its verdancy.” The base commander, obviously hoping to score points with the boss, ordered his entire regiment to cease all military training and spend several months digging up trees and replanting them along the roadside. Everyone knew replanting trees in the middle of the summer was impossible, but the trees were nevertheless wrenched from their homes in the forest, transported by trucks to the barren road, and sunk into deep pits. Not surprisingly, within a week all of them were dead, so the commander had another hundred or so dug up and put in their place. They too withered and died, and after several more sweaty and fruitless attempts, the commander, “finally realizing the laws of Nature would not bend even for communists,” ordered the deceased trees painted green. When the driver detailed to whisk the defense minister from the runway to the base pointed out that the old man would certainly not be taken in by this charade, the commander said, “Oh hell, just drive fast – he'll never notice.”
As it happened the minister didn't notice, because he never arrived; matters of state forced him to cancel the visit. The dead, green-painted trees were left to rot along the road and the regiment washed the dirt from its collective hands and finally resumed its military duties. Unfortunately, after spending months as unwilling gardeners, the pilots were so out of practice at flying that several of them soon crashed, including one who hit a passing city bus while attempting a takeoff, decapitating some 20 passengers.
I never think about this incident without realizing that a large part of the commander's problem stemmed from the fact that he did not understand the importance of roots. Pulling a living thing – be it a tree or a fighter pilot – out of the environment from which it (or he) grew and trying to graft it somewhere else is an unnatural act, and unless it is carried off with great care it will inevitably lead to failure and disaster.
I lived thirty years of my life in the East, but if I remember my basic biology correctly, there is no longer one particle of my being which remains from that time. Every hair follicle, every skin cell, my toenails, even my scars – everything which presently comprises me has formed here in California. So I suppose that, and my driver's license, make me a Californian and, more specifically, an Angeleno. Certainly I've picked up many of the local habits. I see more movies, eat healthier, and complain more about traffic than I ever did back East. I'm also far more physically active: last year I went hiking 99 times and swimming 95, to say nothing of time spent in the gym, and yes, I even occasionally do yoga. I no longer refer to highways as highways but as freeways, and I do not identify them in the Eastern manner, i.e. by their numbers; instead I prefix them with “the,” as in, “You take the 134 to the 405 to the 101.” I drink bottled water instead of tap, and at one point I had not one but two fedora hats, not because they were stylish but because they kept the sun off both my face and my neck. I probably don't go more than a few months without seeing a famous or semi-famous actor (the last was Kiefer Sutherland, who passed by my table at a restaurant singing the lyrics to “Stand By Me” under his breath – no kidding) and I even work in the entertainment industry myself, which truly cements the cliché.
And make no mistake: grumbling and grousing aside, I really do enjoy living in California. I can go swimming beneath palm trees in February, drive down the street to hike any number of hills and mountains, visit an infinity of beaches, see my favorite movies in 100-year-old theaters, and my favorite actors, in person, in playhouses so intimate you can hear them breathing between lines. Sporting events and concerts are almost easier to go to than to avoid; music and food festivals run year round; and my inner nerd can satisfy his deepest longings by visiting the shooting locations of the television shows I grew up watching. Museums of every type abound, there are so many restaurants in this town that even visiting every one in your own immediate neighborhood is impossible, and there are things you can do here that you can do pretty much nowhere else on earth, like sit in the audience during the taping of a TV show. Being bored here is essentially a personal choice.
And yet.
When I think about that Soviet defector – Viktor Belenko was his name – I am always reminded of the fact that when he finally made his way into the U.S., he thought he'd discovered paradise. Private automobiles, television sets, transistor radios, garbage disposals, comfortable clothing, the freedom to travel from one state to another without a passport, the freedom to denounce the government without fearing arrest or execution – all of it was new to him. (It took him months to overcome his suspicion that the supermarkets he saw on every street were fabrications created by the CIA.) He realized that he actually had the chance, in America, to do everything he'd ever wanted to do with his life. To become the person he'd always wanted to be. Yet at one point, a year or two after he defected, and despite the near-certainty that he would be tortured and shot if he returned, he very nearly went back to the USSR of his own free will.
It turns out that, like a tree uprooted from the forest or an animal imprisoned in a zoo, a human being can stand only so much time away from what he considers to be his natural environment without suffering severe spiritual and psychological consequences. In Belenko's case, he longed to hear and speak Russian again, to hear Russian music, to eat Russian food, to wander the muddy, cobblestoned, smoke-tinged streets of Russian villages, and to sit in leaky, drafty railway stations jammed with peasants and soldiers who looked and spoke and thought just exactly like him, singing Russian folk songs as they waited for trains that might never arrive. “He was hearing and being drawn not only by the call of the Mother Country,” his biographer wrote, “but the Call of the Wild.”
Don't get me wrong. I am not in any way comparing the East to Soviet Russia, or myself to the heroic Belenko. I'm just saying that when I think about small-town living in Pennsylvania, or suburban existence in Maryland, one of the first things that hits me is the lack of diversion relative to my present circumstances. Even where I grew up, which was a stone's throw from both D.C. and Northern Virginia, the options now seem extremely limited in comparison. If I were to pack my things and move back there today, I can honestly say I don't know what the hell I'd do with myself on weekends, especially in the wintertime. I'm certain restlessness and boredom would bedevil me even worse than the loneliness one must always endure when showing up to a place where one has few readily available friends. Nor would the transition back to the East Coast way of doing things be easy. Easterners are punctual; Californians perpetually tardy. Easterners walk and talk fast; Californians amble and ramble. Easterners confront quickly; Californians tend to shrug things off. Easterners are conditioned and toughened by seasonal changes; Californians, who have no seasons, live in a kind of endless summer where winter never pays for spring. The Easterner is formal; the Californian hopelessly casual. I'm honestly not sure I could withstand twenty-degree temperatures again, or days of freezing rain in November, or the horrible necessity of scraping ice from a car windshield while wondering if the battery is dead. After Cali, I don't know if the political monomania I grew up with inside the Beltway would be endurable now, any more than the ill-informed, frighteningly opinionated talk I constantly heard in the bars and diners of York, Pennsylvania. And I do know that sinking back into the whole rigmarole of suits, ties and dress shoes after a decade of T-shirts and jeans would feel like joining the army at the age of forty-four. Like Belenko, I'm probably better off where I am than where I used to be.
And yet.
Lately I have been experiencing the Call of the Wild myself. Or perhaps just the call of the East.
You know the sound, or rather the almost complete absence of sound, which occurs very late at night during a snowstorm? The streetlamps are haloed with moisture; snowflakes flicker through the glaring light. Rooftops, cars, front lawns, the street, all blanketed in unbroken, virgin white, and there is no movement anywhere save that of the snow, which you can only see against the light, and no sound save that of your boots and your breathing and maybe your heartbeat. It's been ten years – ten years! – since I last heard that sound.
You know what else I miss hearing? A few days after the blizzard, when the snow finally starts to melt, and you wake up to the sound of icicles dripping and drain-pipes gurgling and car-wheels spinning in the slush. There's nothing quite like taking a walk when it's forty degrees and every gutter is full of melting snow and every drain-pipe is spitting a steady stream of water that sparkles like liquid crystal in the sun. I can't describe the sound but you'd know it if you heard it and I haven't heard it in way, way too goddamned long.
There are sensations, too, that only exist in the East in winter. Like the way a bowl of hot soup feels percolating in your belly after you just spent two hours shoveling snow and scattering salt and cursing so that your breath smoked like a chimney. And speaking of chimneys, in the East they are actually put to use. Every Easterner who grew up with a fireplace knows the complex rituals involved in building and maintaining a good fire – gathering the kindling in the yard, buying the bundles from that joint down the road, saving that one log with the rope handles to burn last, because you enjoy watching rope burn. Listening to the squeak and crackle and hiss and pop. Watching as if hypnotized the intricate patterns of the flame. And the old newspaper you use to start the blaze, and the warm ashes beneath the grate the next morning, and the tedious job of cleaning them out when they finally go cold.
And then there are the summers. I just called California the land of endless summer, and by and large it is, but there are Western summers and there are Eastern ones. In the West, summer is a monolith – every day is the same, sunny and dry and so, so hot. In the East, summer is like a jewel with a hundred facets. There are days when the humidity closes around you like an iron maiden, when the mosquitoes and gnats hum fit to make your eyes water, when your skin sticks to the car seat and happiness is a glass of Coke with six ice cubes, or the can of beer that's wedged so far back in the vegetable crisper it has frost around the ring-tab. I don't miss those, but I miss other things – thunder, for example. I've heard thunder maybe four times in the 3,424 days I've lived out here, and I can't remember the last time I saw any lightning. And what about thunderstorms – the rain coming down in buckets out of nowhere, sometimes while the sun is still shining? There's nothing quite like watching a thunderstorm through a window, or better yet, from beneath a portico or an awning somewhere. The streets empty so fast you'd think the raindrops were shrapnel, the birds and insects stop their music, the trees and bushes lash about, the gutters overflow, and that lovely scent blows in through the screens. You know the one I mean. It's the smell of rain, but also of green things growing, of wet earth. Of life. And there is something that happens in the early evenings after a thunderstorm – a sort of smoky hush falls over the land, except that the smoke is mist, curling over the grass and the trunks of the trees, lending everything a pearlescent finish. If you've never sat on a porch in the evening after a thunderstorm, watching the darkness gather as the cicadas slowly burr back to life and somebody's grandfather or great-uncle draws on his pipe, making soft orange light among the shadows, you haven't lived. And what about fireflies? A huge wood-fringed field alight with the greenish-yellow flame of innumerable fireflies, calling to each other silently as night comes on? Do you know I have seen fireflies only once in the last decade, and that was during a weekend trip to my old graduate school in Pennsylvania? What the fuck kind of place doesn't have fireflies?
Then there is fall, the season with two faces. The first half of it is so beautiful as to defy description. Like the time you were driving in West Virginia and saw the Appalachians when the leaves were turning? Shades and intensities of green shading to yellow-gold and then to copper and finally that dark wine-red, immensities of leaves rippling over the horizon, and that somehow spicy smell blowing in along with the wind through the open car windows? Or the nights in high school when you'd go to the football games with the lights glaring and the crowd howling and the air so cool and crisp you could practically cut a slice for yourself to take home? Or the times you'd look up and see the sky filled with honking, cackling, cawing birds from horizon to horizon, thousands of them all flying south like some immense fleet of bombing planes? That was the first face. The second was early darkness and bare trees and days and days of ice-cold rain, rain so steady you thought it would never stop, and the back of your collar and the cuffs of your jeans always wet, and before the fireplace a row of shoes and boots that never seemed to dry. But you enjoyed the way the logs hissed and popped and crackled as they burned on the grate, and the gold shadows on the darkened walls, and your homework spread like a deck of cards across the floor, and Judy Collins on the radio, and that was fall, the two faces together, and you miss them both.
And Spring. Christ, Spring. There is no spring in California. We have 200 days of Summer and then we have this runty thing called not-Summer, which is not really a season but the scrapings and leftovers of one, a kind of shabu-shabu. No ice-cold morning that turns blazing hot by noon and then near-freezing again by sunset, with sunshine one minute and rain the next and always the gusty fragrant air that smells simultaneously of melting snow and flowering plants. No feeling of waking up with the air playing the curtains above your head and the birds singing lustily and your body responding in the ancient manner to the irresistible urge to procreate or die trying. What is spring to me? Spring is sprinklers huffing and chuffing and bare, muddy feet and the first time you hear the tinkling tune of the ice cream truck. Spring is that first whiff of woodsmoke from the neighbor's barbeque, the first time you hear the rip-roar of the neighbor's lawnmower and think, shit, it's that time again. Spring is when the cicadas come out of their years-long sleep by the billions and set up a racket so deafening you don't even hear it after the first night. Spring is when the air has a taste and the taste is good, though sometimes it makes your teeth ache, just like that ice cream.
Of course by now you think I'm sentimentalizing, nostalgifying, looking at the past, or rather the region, through rose-colored glasses. Perhaps I am. My last winter in Pennsylvania, the temperature hovered around seven degrees for weeks at a time. That well and truly sucked. And so do chapped lips and cracked hands and dandruff, and icy-cold wet socks, and shoveling snow and scraping ice off side-view mirrors; and so do weeks of endless, discouraging drizzle. Gaining fifteen pounds every winter blows goats. So does that choking, paralyzing humidity that comes with summer, and so do gnats, and so do mosquitoes (especially those huge striped bastards that take about a thimbleful of blood out of you and contrive to ruin every picnic). Getting stung by a yellow jacket is no fun either, and wearing a tight collar and a tie when your neck is raw from a blunt razor and it's ninety-six degrees outside is not much better. Poison ivy and poison oak and poison sumac were less fun than a trip to the dentist, and the power outages that inevitably followed the really big thunderstorms – especially when it was 93 degrees outside – were somewhat less than joyous occasions. Few sailors can swear the way I did on a succession of mornings back East when the cold murdered my car battery, and I never have to worry about that anymore. Nor do I have to sweat that horrible, tight-lipped political correctness as it exists in the workplace back East – here in L.A. we have discussions around the ole water cooler that would give any East Coast H.R. director a fucking heart attack, and nobody bats an eyelash. The truth is that if I remain here I will never again have to purchase a bag of salt to scatter on the front steps, or handle a snow shovel, or buy a rake, or blow hot air down a frozen pipe. I'm 14 miles from the beach, three and a half from the Hollywood sign. Sometimes, in airports, when people ask me where I live, I puff up and respond, “L.A.,” as if daring them to top me. In many ways this is a good place, a good home, a good life. I tell myself that every day. I tell myself I'm happy here, and that it would be foolish to leave. And yet....
And yet I can't help wondering if, after almost ten years, my roots aren't starting to wither up and die in the strange soil of SoCal. If the call of the East isn't starting to drown out all the sober, logical, reasonable arguments I have for staying here in the West. If it wouldn't do my heart some good to shovel some snow, swat a few mosquitoes, and stand out in the back yard in my bare feet in the middle of a thunderstorm, listening to the sound of the rain.
Published on May 19, 2017 21:56