David Lee Holcomb's Blog, page 3
September 17, 2023
Good, gooder, goodest.
Way back in 1770 the French philosopher, historian, and poet Voltaire wrote that “Perfect is the Enemy of Good.”1 He was quoting an Italian proverb, which was itself probably derived from the Greeks or the Etruscans or somebody, but we’ll go with Voltaire because he said so many wonderful things and deserves all the credit he can get.
This statement, “Perfect is the Enemy of Good,” seems troubling at first glance. Shouldn’t we strive for perfection, even if we know that we — flawed beasts that we are — can never achieve it? According to yet another poet, Robert Browning, “…a man’s reach should exceed his grasp.”2
So which way do we roll? Browning is telling us that we should try to do impossible things because even when we fail we will have pushed ourselves higher than we would normally go. Voltaire is saying that by insisting on an impossible perfection we miss out on doing things that might not be perfect but are nonetheless good, both doable and worth doing. It’s not hard to agree with both of these statements, even though they seem mutually exclusive.
What if we accept the idea that both can be true, both valid, but in an imperfect way?
Aiming higher than we can realistically go is about aspiration rather than actual achievement. After all, the whole point of Browning’s statement is that you’re making an attempt that is destined to fail, knowing that it’s going to fail, but trying it anyway, in the hope of benefiting from the mere attempt. Voltaire’s contribution here is telling us to look at what we do manage to achieve, even when we fail to reach perfection, and to recognize the value of that accomplishment.
As is so often the case, I’m sure by now you’re all listening to what I’m saying and thinking: “Is he going somewhere with this?” Well, yes. I am.
We’re there now, in fact.
Unless you’ve been on Mars for the last few weeks, you know that I’ve just self-published my second novel. Over the last few months, this epic has been edited and proofread by two trained human beings (not including myself) and has been chewed over thoroughly by Grammarly’s AI. Whether or not you think that what I’ve written is any good, I can, at least, assert that this text is free of errors.3
Yesterday, ten copies of the paperback arrived on my doorstep. I won’t try to tell you that this is anything like seeing your newborn child for the first time — but that’s exactly what it was like. I opened the box and pulled out one of the precious books. The text, the formatting… every page was perfect.
Unfortunately, this perfect work of art was wrapped in a cover whose lettering was slightly off-center.
Cue the heavy music in a minor key, the roaring of the storm, the crashing of the waves as the ship strikes the rocks. Cue the massive depression, waiting in the wings just for moments like this.
Mind you, the books weren’t hideous. Just because the baby’s ears stick out doesn’t mean he isn’t worthy of love. They just weren’t perfect. My reach had exceeded my grasp, and I had achieved something good, but not perfect, and I was devastated.
During the course of the evening, I corrected the problem — which may or may not have ever been visible to anyone but me in the first place — and arranged to send that first batch of books away to the great pulping machine in the sky. By this time tomorrow, everything will be fine. Probably still not perfect, but good.
I could say that the crisis has been averted, but there never really was a crisis except in my own overheated imagination. My desire to create something perfect had blinded me to the fact that I'd made something good. Now it will be even more good, but still not perfect, because if there weren't a visible flaw, I'd imagine one anyway. I wouldn't know perfection if it ran up and bit me.
The moral of this story? If you can’t get to perfect — and none of us can, not really — then learn to accept good when the delivery guy drops it on your doorstep. Lighten up, for Pete’s sake!
. . .
1 M. de Voltaire (François-Marie Arouet), Questions sur l’Encyclopédie, 1770-1772
2 Robert Browning, “Andrea del Sarto”, from the collection Men and Women, 1855
3 A nod to the readers of my first novel, The Bone Doll, whose first printing was chock full of typos. That situation has since been remedied, but it still makes my stomach hurt whenever I think about it.
August 22, 2021
Prisoners in the museum
The classical Greek conception of the afterlife was not a particularly attractive one.
In Homer’s universe, the vast majority of the dead — those not singled out by the gods for special treatment — did not wake up in some bright city of jasper and chalcedony. There were no beautiful houris, no songs, no drinking with old comrades, no dancing in fields of asphodel. Death meant a transition from the daylight world to a gray twilight, a cavern of ashes and dust, populated by muttering shadows. The Homeric dead retained their identities only through the living, sustained in the memories of those left behind. As those memories faded, or the people who had known them in life themselves died off, the dead reflected that loss, becoming more and more vague, insubstantial, losing all individual selfhood. The one thing that could provide a moment’s respite in this slide into oblivion was blood. The blood of the living, freely given, would restore a shade’s identity and memory, at least for a short time.
Pretty grim, right? We are talking about death, after all, the big D, the final darkness, the end of life, so to expect sunshine and roses and platoons of beautiful virgins does seem a bit naive.
If you’re one of those folks who believes that you will, upon the death of your body, rise up to enjoy dancing and singing and partying for all eternity with your ancestors back to Adam and Eve, I’m not here to rain on your parade. We all look for consolation where we can. What I really want you to think about, looking out at those gray multitudes in the Greek afterlife vibrating to the last fading echoes of selfhood, is the concept of identity.
* * *
What got me started on this track was the singing of the cicadas in my yard. (Bear with me: This’ll make sense in a bit.)
It’s August here in northwest Arkansas, and the cicadas are in full frenzy up in the trees, advertising their brief passions with a chorus of shrilling, chittering, and buzzing. The different groups punctuate their declarations of love and territory with occasional silences. These lacunae are usually easy to overlook, since one instrument dropping out doesn’t always have that much impact on the overall symphony.
Every now and then, however, everybody goes silent at the same moment. It’s a surreal feeling, that sudden jerking away of the curtain of sound, as if the pounding and hissing of the surf stopped without warning, or the ticking of the grandfather’s clock in the hall ceased in the middle of lunch. There’s a sensation of vertigo, of missing a step on the way down the stairs. A silence you can almost touch.
* * *
In 1946, Greek poet and diplomat George Seferis wrote a poem he titled “Thrush”, about a Greek vessel sunk by the Germans during WWII off the island of Poros. Seferis, awarded the Nobel Prize for Literature in 1963, is widely considered to have defined Hellenic culture in the postwar era. For me, his contribution is less specific: I find that Seferis' writings speak to me, about as non-Greek a man as one can find, about identity; about who and what we are in the time and place we occupy.
I don’t have to be Greek to share Seferis’ questions about what defines us. I’m a Southerner, born into a military family in Montgomery, Alabama, during the opening years of the Civil Rights movement, raised in a small, white, conservative farm town in the Appalachian foothills. I’m also a gay man, politically and socially liberal, a xenophile, bilingual, somewhat of an intellectual. How do I reconcile those two sides of my life? Who I am and how I got here seem mutually exclusive. How can I bring the past and the present together without destroying them both?
This, I feel, is the true central theme of Seferis' poetry. The poet, a Greek, was heir to a culture that reached back to Homer and Herodotus, Pericles and Plato, and yet he lived in a time when his fellow Greeks were roiled in a bloody civil war between two opposing ideological views of what the country should look like in the aftermath of the Nazi occupation. The shooting stopped in 1949, but the bitterness smoldered on for decades, until Greece fell under the control of a brutal military dictatorship in 1967. Seferis, a career diplomat, lived every day immersed in the petty squabbling, the violence, the slavish adherence to irrational beliefs, yet, for all that, he was an educated man, a man who knew that his people had once been so much more.
Past and present? Past or present? Where was the middle ground between the Parthenon and Odysseus and the funeral oration of Pericles on the one hand, and the tortures, disappearances, petty internal power struggles and political dirty tricks of Seferis’ own lifetime? This was the poet’s dilemma, and one that informed and energized much of his writing.
One aspect of my personality that I neglected to mention a couple of paragraphs back is that I'm a coward. Faced with the contradictions of my existence early on, I fled my hometown at the age of nineteen, and have returned during the last four and a half decades only when absolutely necessary. To preserve my present, I repudiate my past. A stronger, braver man would have found a way to reconcile the two without sacrificing personal integrity. Unfortunately, I'm not that stronger, braver man. I did what I had to do with the character and resources I had at my disposal, meager though they might have been.
As Americans, we face this issue more than most. George Seferis belonged to a cultural tradition that could be traced back over millennia. As an American, the descendant of immigrants, born in a country less than two centuries old, I don’t have that foundation to build upon. Quite often, we deal with this by creating fictional “heritage”, based less on historical fact than on wishful thinking, sanitizing our history and our ancestors to justify an exaggerated view of our own stature. (This is, of course, not unique to us. After all, the Greece of the Iliad and the Odyssey was hardly an accurate reflection of reality.)
Reinventing ourselves as what Seferis once referred to as “men without ancestors” is one solution, allowing us to start fresh, to set the past aside and focus on the future. A nice idea, but as I know from my own experience, it’s never that easy: the ghosts are always with us, muttering and begging, pleading for a few drops of blood now and then, demanding to be remembered. The opposite approach, living in the glorified past, is also a non-starter. An identity built on moonshine and fairy dust won’t hold up for long against the very real vicissitudes of day-to-day life in a world created from the dirt and blood and bones of a very real history.
What to do?
For one thing, don’t look to me for answers. All I have is questions. And yet … Seferis offers us a glimpse of something: he doesn’t propose a return to a glorious past that never existed, nor does he suggest leaving the ancestors to wither and fade, forgotten in the dark. He promises anxiety, and fear, and the possibility of hope and change, and maybe a moment of silence in which to draw the line between past and future.
I leave you with the last few lines of Seferis’ “Thrush”:
“… I’m not speaking to you about things past, I’m speaking about love;
adorn your hair with the sun’s thorns,
dark girl;
the heart of the Scorpion has set,
the tyrant in man has fled,
and all the daughters of the sea, Nereids, Graeae,
hurry toward the shimmering of the rising goddess:
whoever has never loved will love,
in the light;
in a large house with many windows open
running from room to room, not knowing from where to look out first,
because the pine trees will vanish, and the mirrored mountains, and the chirping of birds
the sea will empty, shattered glass, from north and south
your eyes will empty of the light of day
the way the cicadas all together suddenly fall silent.”
George Seferis, “Thrush”, from Collected Poems, translated, edited, and introduced by Edmund Keeley and Philip Sherrard. Copyright © 1995 by George Seferis. The entire poem can be found at the Poetry Foundation website: https://www.poetryfoundation.org/poems/51358/thrush.
January 19, 2021
In the Mood
Way back during my turbulent twenties – about the time Nancy Reagan was tossing out Rosalynn’s White House china, and Mount St Helens was tossing its summit into low earth orbit – I had a friend.
We’re going to call this friend “Carl,” mainly because that’s his name, and when I try to use pseudonyms I lose track of who’s who from one paragraph to the next. Carl was a director of theatrical productions, and possessed a wealth of interesting – if occasionally impenetrable – epigrams with which he informed and edified his actors. In the course of a friendship that lasted many years (and continues to this day, thanks to the internet) I managed to retain two important and enduring lessons from Carl’s store of wisdom:
A) that cultural sophistication is something you evolve over time, not something you can pick up by watching a lot of public television, and
B) that “mood” spelled backwards is “doom”.
For a twenty-two-year-old farm boy from rural Alabama, the barriers to comprehension were high. It didn't take me long to understand that inserting chunks of Eliot or Auden into every conversation never fools anybody, but the significance of the mood/doom connection went right over my head.
*
In the golden age of classical Greek drama, it was not unusual for actors to appear with their faces hidden behind large masks, each designed to display an emotional state – love, hate, anger, joy, whatever the story required. The identity of the actor behind each mask was unimportant; he was only a single individual, after all, and the Greek dramatists sought to address issues and concepts that were universal. When Aeschylus wrote about the relationship between Electra and her mother, he wasn’t just dealing with one woman’s story of rage and revenge. He was attempting to connect with those emotions in everyone, using a framework of narrative and characters already familiar to much of his audience.
I like Ian McKellen, but I don’t pay ten bucks to watch Mrs McKellen’s baby boy romp around a big screen. What I’m paying to see is Gandalf, or Magneto, or King Lear. I want characters who are also ideas, bigger than any one person. I don’t actually know Ian McKellen personally, but I feel that I know King Lear – or at least what he represents – and that’s what I’m after: the universal, the tragic, something I can connect to within myself even if I’m nothing like Ian McKellen.
This dialectic is one that every artist must address: how to describe an interior landscape without being so specific that nobody outside one’s own head can possibly find a way around in it. Hence the masks. If I want you to experience my joy, standing up in front of you giggling uncontrollably for an hour is probably not going to do the trick. Likewise with tragedy or pain. Even if I really just want to roll around on the floor in a swamp of tears and snot tearing at my hair and howling, that kind of display will at best inspire pity, and at worst terror, in a spectator. I have to find those emotions in you and call them out, not just stand there showing you how they affect me. I want you, the audience, the viewer, to feel something, but if I succumb to that feeling myself, I lose the ability to communicate with you effectively.
“Mood” equates to “doom”. It begins to make sense.
A painter, like any other artist, has something to say. Even if the message is just about a color (Yves Klein) or about paint itself (Jackson Pollock) or about the beauty of the square (Kazimir Malevich), there is a message, and there's an emotional need to convey that message — but the more intensely felt the message is, the more profound the risk of merely displaying the emotion, rather than communicating it.
If I’m too obscure when I put a painting out in front of people, too abstract, I may not be providing enough for a viewer to grab. A perfectly blank white canvas is a beautiful thing, but how is anybody supposed to intuit my emotional and intellectual concerns from several square feet of nothingness?
But what about the opposite? What if I make my message absolutely billboard clear and explicit, leaving nothing to the imagination? I will have made a statement, certainly, but will I have left enough room for the viewer to engage with me? After all, a conversation with someone who does nothing but shout one phrase over and over is not likely to be very emotionally fulfilling.
Somehow, I have to provide enough information to invite the spectator into the conversation, but leave enough unsaid that he or she can draw upon his or her own experience to complete the thought in a way that allows us both to take ownership of the exchange. I want the viewer to experience the emotion — the “mood” — but internally, not as something that I’ve smacked him over the head with.
* * *
August 25, 2019
A Likely Story
“It was a likely story. But then, all of his stories were likely.”
– Margaret Atwood, The Penelopiad
In a somewhat pointless exchange on Facebook recently (but aren’t they all, usually?) a friend-of-a-friend, struggling to defend against a criticism of current US President Donald Trump, trotted out the “birther” trope: the assertion that Barack Obama was actually born in Africa.
Her conviction is supported by a widely circulated image of an alleged birth certificate labeled “The Republic of Kenya” and dated August 1961. The simple historical fact that the Republic of Kenya only came into existence in December of 1964, more than three years after the date on the certificate, is not a deterrent to this woman's belief in the absolute integrity of the document. She has harnessed her wagon to that particular mule; that the animal is dead and decaying bothers her not in the least. It's her mule, and she plans to keep lashing away at it until the race has been run.
Meanwhile …
In a throwaway segment on “Good Morning America” a few days ago, television presenter Lara Spencer listed the activities in which Prince George, future King of England, would be participating as he began the new school year. One of those activities was ballet, a fact that Ms Spencer seemed to find amusing – amusing not in a “let’s be happy with this child” kind of way, but in a “let’s all make fun of this little sissy” kind of way.
Ms Spencer, whose credentials as a journalist include such highlights as a stint on “Antiques Roadshow”, and the host slot on “Flea Market Flip”, implied that the young Prince would lose his interest in dance very quickly, because people like her would be making fun of him for it.
We all have our own sacred cows, ideas that are so deeply embedded in our psyches that we are willing to go to any lengths, make any sacrifice, to defend them. Unlike concepts that are patently stupid, like Holocaust denial or trickle-down economics or Adam Sandler movies, these are so intrinsic to our worldview that they are usually invisible to us. Examining them objectively is like trying to see the back of one’s own head. When they cause us to do harm, it’s not because we mean to hurt anyone: Ms Spencer, in laughing at the young prince’s interests, did not intend malice toward the boy but rather was basing her comments on a stereotype, then using that same stereotype to justify her comments.
“We’re making fun of him for his interest in ballet because we are convinced that he won’t enjoy it because people like me will make fun of him for it.” Makes perfect sense, right?
It should be mentioned that, had Barack Obama in fact been born in the East African colony, he would not have been the only US President born under the British flag. George Washington, Thomas Jefferson, James Madison, James Monroe, and William Henry Harrison were all born in the British Colony of Virginia; John Adams and John Quincy Adams were born in the Massachusetts Bay Colony; and Andrew Jackson was born in the Carolina Colony. The first President actually born in the United States of America was Number Eight, Martin Van Buren, born in New York in 1782.
For the birther, it’s obvious that Barack Obama did not belong in the White House. She believes this because her background, her socialization, her identity all tell her that a non-white person cannot be President of the United States. However, phrasing the issue in that explicit way conflicts with her conscious self-image, which says that “I don’t have a racist bone in my body,” so she has to find a likely story that will mesh the very real revulsion she feels at the spectacle of a black man in a position of power with her view of herself as an educated and unbiased judge of persons and events. Toss in the (to her) unusual name, and the acknowledged fact that Mr Obama’s father came from the place that would eventually become the Republic of Kenya, and the solution is obvious: She can tell herself that her objection isn’t that Barack Obama isn’t white enough, it’s that he isn’t American enough.
“I don’t care whether the man is black or white. I’m just saying that the only way a black man could have been elected President is if there was a complex multinational conspiracy at all levels of government and society to put him there illegitimately. It has nothing to do with race.”
*
I like to consider myself pretty reasonable. My worldview is not based on animist superstition or the even more bizarre pronouncements of Jenny McCarthy or Franklin Graham, but on science and observation. This is what I want to believe about myself. This is my “likely story”.
When a paper wasp from the colony living above my front door lands on my arm, I don’t dance and scream and flail. I just stand there and wait for her to get bored and move on. Paper wasps (genus Polistes) have incredibly painful stings, so the visitor represents a very real threat, but I am sufficiently rational that I can remain calm and avoid confrontation. On the other hand, if one of the absolutely harmless camel crickets that infest my basement jumps onto my shoe, I go flying out the door and across the yard, hopping and squealing like a three-year-old at a pool party. I explain this behavior in a variety of ways: the creatures are close relatives of the cockroach, they are slimy to the touch, they feed on the kind of nasty detritus that one finds in a hundred-year-old dirt-floor basement, and so on. None of my “logical” explanations are at all convincing, but I have to try, because otherwise I’d have to accept that I’m behaving in a completely ridiculous, irrational way, out of unthinking fear, and that’s uncomfortable for me.
*
The genetic differences between a “black” American and a “white” American are often no greater than the distinctions between two people in a single family. Race is an artificial social construct that has no biological basis. The very definition of “black” or “white” is ambiguous. For many Americans a blue-eyed blond with one Nigerian grandparent can’t be considered white, while for others having a white mother meant that Barack Obama was about as black as Tilda Swinton. The terms of the argument are so deeply flawed that the argument itself can’t be anything but meaningless – yet, here we are.
Likewise, the idea that Prince George deserves a certain amount of ridicule for enjoying ballet derives from two completely valueless premises: one, that people will assume that he is gay, and two, that being considered gay will justify his being ridiculed. Both of these assertions have weight only because the people using them to support their beliefs give them that weight.
“My argument is valid because it is based on premises that are valid because the argument I’m making that is based on those premises is valid …”
*
In the end, the birther lady on Facebook slunk away, outraged that nobody bothered to challenge her, to give her a forum to vent some spleen, but instead just treated her like a doddering relative appearing unwanted at the dinner table: “Bless her heart, she doesn't know what she's saying, poor old thing …” Lara Spencer, meanwhile, published a non-apology on Instagram, accompanied by a picture of a lovely, but quite empty, landscape. From a more self-aware individual, one could interpret that as a bit of self-deprecating humor, but … well, it's probably just the first picture she found that wasn't a selfie.
Barack Obama continues to bask in very high popularity numbers, and presumably Prince George will have a good time learning his pliés. I wish him well in the struggle he will face to find an identity for himself in the goldfish bowl in which he and his family live their lives.
Me? I’m doing the best I can with what I’ve got. Camel crickets still give me the heebie-jeebies, and I’m still no better at examining the back of my own head than anybody else, but at least I try to remember that it’s there.
* * *
July 16, 2018
Tire Tracks on the Putting Green
I’m not what you would call a fan of Donald Trump.
To be honest, I doubt if I would waste a good cup of coffee to extinguish a brushfire in his comb-over. At the same time, watching the video clips of our President lurching along in front of the 92-year-old Queen Elizabeth II at the inspection of the Queen’s Guard during his recent visit to the UK, my principal response was not disgust, or embarrassment, or outrage, or any of the other sentiments that seemed appropriate, but — strangely enough — sympathy.
Wait! Don’t hit me again: I can explain.
When a giraffe or a bluebird or an armadillo is born, he hits the ground with most of the guidelines for future interactions with other giraffes, bluebirds, or armadillos already programmed into his little brain. Over a period of days or weeks the adult animals in his life will provide updates and security patches, but the basic outline is pre-installed, and the little beast will know from the beginning how to behave in almost any situation that might arise involving others of his own kind.
We humans, on the other hand, resemble large pink sea-cucumbers at birth, squirmy tubes that take in food at one end and produce copious volumes of excrement at the other, sometimes simultaneously. We have a few basic reflexive actions programmed in, mostly having to do with moment-to-moment survival, such as sucking and grasping and a tendency to scream blue murder if confronted with the risk of falling or abandonment, but our social skills at that stage are, at best, incomplete.
This unfinished quality, which biologists call altriciality, leaves us incredibly vulnerable for the first few years of our lives, but in return it makes us something of a blank slate, capable of being trained to suit our specific environment. A human infant born to a naked cave-dwelling stone-age family requires a suite of skills and responses that are very different from those that might appertain to a silver-spoon baby in Westchester County coming home to a Swedish nanny and a trust fund. If humans started life with a one-size-fits-all set of internal guidelines like those of the giraffe, they would be equally limited in their ability to adapt, to spread, and to diversify. There are no giraffes living wild and proliferating in Greenland, or the Gobi desert, or Patagonia, or Chicago, or low Earth orbit. For better or for worse, there are humans in all those places.
There is a downside to this system, however, apart from the incredible challenges of keeping a human child alive and healthy long enough for her to survive on her own. What happens if, for some reason, the necessary programming is not all there when needed? Or if the programming is flawed, or outdated, or specific to a set of conditions that are not those that the child is actually experiencing?
Bear with me for a moment while I digress still further …
My social life involves occasionally showing up at other people’s houses for drinks, or food, or conversation, or some combination of the three. A specific time is usually indicated:
“Come over about five.”
“We’ll be having dinner at seven-thirty.”
“Let’s get together at six.”
I am, unfortunately, that terror of every host, the person who interprets the invitation absolutely literally. If you say 5:00, I’ll be coming up your front steps at 4:58 and then dithering on the doorstep for a minute and forty-five seconds before ringing the bell.
I’m not stupid: I know that my punctuality is not quite acceptable, but I simply don’t know how to make the adjustment. Does 5:00 mean 5:12? Does it mean 5:32? Is 5:05 too early? Is 6:00 too late? Other people seem to simply know what is intended, they show up at strange and patternless intervals over the entire course of the evening and it’s right. I’m doing precisely as instructed and it’s wrong.
This is an example of training that was correct under one set of circumstances, but which has not translated to a new milieu. My family was military, deeply conservative in its values. Punctuality was drilled into me from the first delivery-room butt-slap, and reinforced – with additional butt-slaps when required – over subsequent years. We were not social, we didn’t go to other people’s houses for drinks or tiny sandwiches, or ask those people to visit ours. Everything worked according to a set of strict rules, and if we didn’t know the rules for a thing, then we didn’t do that thing.
I am very much aware of my ineptitude in areas like this, and I am grateful to the friends who tolerate it with such good grace. I show up at the wrong times, I say the wrong things, I read the wrong books, I listen to the wrong music … But as Popeye would say, “I yam what I yam, and that’s all what I yam.” I’d upgrade my programming if I could, but it’s too deeply ingrained, and the new patterns are too vague, too uncertain, to overwrite it.
Now back to London.
Looking at the President’s face, his body language, his ill-fitting suit, his spray-on tan, flopping necktie and awkward, shambling gait, I saw the same arrogance and self-absorption that everyone else saw, but beneath that I also saw a man who had simply never been taught how to be nice, how to behave in social settings, how to be courteous to an old woman whose whole life has been spent bound up in rigid protocol and an elaborate and unbending system of rules governing her every waking moment. I saw a man whose parents had trained him relentlessly to eat or be eaten, to do unto others before they could do unto you – to never ask, to never admit, to never back down. To never, ever, let someone else get ahead of you.
In 1969, Canadian educator Laurence J. Peter published a book describing what he called “The Peter Principle.” His thesis stated, essentially, that in a hierarchy, individuals rise to the level of their own incompetence – meaning that you do well, you master your craft, you get promoted, you climb the ladder … until you climb beyond your ability to perform, at which point your movement stops, and you settle in at that level, unwilling to backtrack, but unable to function where you are or to move forward, trapped and miserable.
Maybe I’m projecting, but Mr. Trump looked pathetic to me. He looked like someone who was tragically out of his element, ignorant of even those simple “Yes, ma’am/No ma’am/After you, ma’am” kind of rules that most people take for granted, and that can ease so many awkward situations. He was a big mangy mongrel hound at the Westminster Kennel Club, shedding all over the shih-tzus, expressing his anxiety in aggression and excessive barking.
He didn’t belong.
Life is easy when you can just yell at everybody, demand respect – or at least a reasonable imitation of it – because you’re the Boss, and (as Mr. Trump is so fond of pointing out) the Boss always gets to do whatever he wants. But what happens when you come up against an Angela Merkel, or a Queen Elizabeth, or a Barack Obama – people who always seem to know exactly which fork to use, and where to stand, and when to bow? People who have read all the right books and can quote all the right philosophers? If you’re me, you apologize, you ask for help – but I was trained to do that when necessary. What if you’re a man like our President?
Then, perhaps, you just stumble along – miserable, breeding misery, waiting for the first opportunity that comes along to make someone else feel even worse.
January 29, 2017
Paddling Point Nemo
I like to think that I’m a pretty easy-going sort of person.
I have strong opinions about a lot of things, but they don’t get in the way of my being able to talk to just about anybody, about just about anything, and I try to be courteous to, and considerate of, the people I deal with in my day-to-day life – regardless of who they are, and who I am. Sometimes I succeed, sometimes I fail, but I think it’s important to give it my best shot.
I’m not afraid of the middle ground. I spend a lot of my time there. I’m not religious, but I keep the Bible and the Koran on my desk, and I’ve read ’em both cover to cover. I don’t have kids, but I generally like the little monsters, and I have great respect for the people who dedicate themselves to raising them. I’m a pacifist who studies military history, and whose parents were both Air Force veterans. The music I love best is that of composers like Martinu and Poulenc but I enthusiastically join in with our neighborhood music group every weekend noodling around on songs by Creedence Clearwater Revival and the Eagles.
I live with cats, but hey – I like dogs, too.
The thing about the middle ground, though, is that it’s defined by the two extremes. When you’re meeting someone halfway between West Palm Beach and Miami, you’ll be lunching somewhere around Fort Lauderdale. The Floridian Diner may not have been the first choice for either of you, but it’ll do, and you’ll each have traveled about the same distance, made the same sacrifice in time and gas and convenience. You might both have preferred something closer to home, more familiar, but the compromise distributes the disruption evenly between you, and you’ll be equally comfortable and equally uncomfortable. No winners, no losers, but everybody gets lunch.
If you’re meeting someone halfway between West Palm Beach and Shanghai, on the other hand, you’re going to end up treading water mid-Pacific. You’ll get wet, and the sharks will be the only ones dining.
Point Nemo, at coordinates 48°52.6′ south, 123°23.6′ west, in the Pacific Ocean, is the point on the planet furthest from any land. In a very real sense, it’s the “halfway point” between everywhere and everywhere else. The name “Nemo”, used by Jules Verne for the mysterious submarine captain in his 1870 novel “Twenty Thousand Leagues Under the Sea”, is from the Latin: it means “nobody”. And that is very likely exactly who has ever actually visited Point Nemo: nobody.
Suppose we’re having a discussion about healthcare reform. I advocate for a single-payer system, equal care for all citizens, regardless of income. You prefer a market-driven approach. I’m concerned that your system will favor the interests of stockholders and investors over those of patients, ultimately excluding all but the affluent. You’re worried that my system will end up stifling innovation and crushing patients under a burden of bureaucratic inefficiency.
We can’t both be right, obviously, but are we both wrong?
We can each present rational cases, based on real-world data, to support our respective views. At the same time, because our objections are specific and reasoned, they are also addressable. I can look for ways to introduce market forces into my universal system, counterbalancing the inertia and inefficiency of government bureaucracy. You can accept some regulatory oversight in your free-market approach, guaranteeing equitable treatment regardless of income. Neither of us gets everything the way we want it, but we each leave the table with something. We meet halfway. We lunch on Las Olas Boulevard, and we both get a decent meal.
Let’s try another one.
I’m Emmett Louis Till, and I’m down from Chicago visiting relatives in Money, Mississippi. I’m fourteen years old, and I’m Black. You’re Roy Bryant and J.W. Milam, and you really, really hate Black people. Not for any particular reason, but just because you are who you are, and they aren’t.
I’m from the big city, and I’ve never been away from home before. I like wearing a tie and a grownup hat and going out and strutting my bad self down Main Street and buying little doodads to take back to my mom in Chi-town. You were born and raised in the Mississippi delta, and have never been anywhere else, or wanted to go. You think all Black people should be forced to live as slaves, or livestock, or not be permitted to live at all.
The middle ground? Perhaps Milam and Bryant could have crushed only one of Emmett’s testicles and only halfway gouged out his eye before halfway strangling him with a length of barbed wire, halfway shooting him, and throwing his body halfway into the Tallahatchie river. That would be about halfway between leaving him alone and doing what they ultimately did do, right? Emmett may or may not have whistled at Bryant’s lovely twenty-one-year-old wife in a store (Carolyn Bryant, now 82, has recently admitted that Emmett did nothing to trigger the retribution), but the attack was apparently motivated mostly by the desire to inflict as much suffering as possible on a young boy, simply because he was Black and all-too-visible; the victim was small, vulnerable, an out-of-towner, a target of opportunity. Emmett wanted to enjoy the summer vacation and then go home to his mom; Bryant and Milam wanted to torture and kill. So, again, how should Emmett have met Bryant and Milam halfway?
Any time you have more than one person in a room, you’re going to have differences. I’m tall, you’re not. I pronounce the word for the sister of one of my parents “ahnt”, while you make it sound exactly like the name of a tiny insect. I have a beard because I think it makes me look scholarly, you think beards are nothing but crumb-catcher bibs for messy eaters of a certain age. Despite these differences, we still get along, because we have made some rational decisions about what really matters and what can be overlooked in the interest of consensus and coexistence.
Looking for the middle ground in a disagreement makes sense if both of us are working from a rational, logically defensible point of view — but if one of us is clinging to a position that is irrational, that defies all reasonable evidence, or that denies the very humanity of the other, even perhaps that person’s right to exist at all, compromise ceases to be a possibility. After all, if what you really want, when all is said and done, is to obliterate me and everyone like me, I can’t become halfway dead in order to meet you in that middle ground.
*
I don’t like conflict. I let people overcharge me in stores, I allow other drivers to cut me off in traffic, I accept condescension and phony “tolerance” from people who see me as something only marginally different from a criminal or a lunatic, all because I prefer to avoid confrontation wherever I can.
At the same time, while I deeply appreciate the good intentions of those who say: “Must we fight? Can’t we meet halfway?” I know that the halfway point between violent, irrational hatred and ordinary human dignity is still nothing but deep water and sharks, and it’s not a place I ever want to be.
* * *
January 27, 2017
Really and truly.
Many years ago, during a visit to my family in my hometown of Boaz, Alabama, I got the notion to prepare a really fabulous meal for everybody.
On the face of it, this would seem like a nice gesture, but don’t fool yourself. I was thirty years old, and my snobbery knew no limits. I was from Boaz, but not of Boaz; I had gone away and become part of a wider world, and a fancy meal was just another way to prove my superiority. (I suppose all escapees from small towns go through that phase somewhere down the line. We’re Truman Capote or Andy Warhol: We go away for a few years, then come back to visit, proudly bearing suitcases full of Robert Rauschenberg and Igor Stravinsky and W. H. Auden and chicken recipes in Italian.)
At that time there were two grocery stores of any size and scope in the town, a Piggly Wiggly and an A&P. Since our family had patronized the Piggly Wiggly since time immemorial, that’s where I went to gather the materials for the feast I was planning. Spinach. Chicken breasts. Feta. Nutmeg (and something to grate it with). Butter. Balsamic vinegar.
My scheme was, of course, doomed from the beginning. The month was December. Spinach was available only as little green bricks packed in torn cardboard, crusted with ice that smelled faintly of cat urine. The chicken breasts were gray and exhausted, having been frozen and thawed more often than Great Bear Lake. The only cheese available – apart from Kraft “cheese food products” – was a rubbery orange material that claimed to have been manufactured with, but not of, real milk. There was no butter, only margarine. Nutmeg was there, yes, but pre-ground in a tiny red-capped plastic jar, with a sell-by date some three years before my visit. Vinegar was limited to cider and distilled white, in half-gallon jugs.
I made do, but I also made a fuss. After all, my true purpose – had I been willing to admit it – was to display my superior savoir-faire before the benighted locals, and this could be served just as easily by a spectacular failure that spotlighted the shortcomings of the local supermarket as by a successful dining experience.
Predictably, dinner was a flop, nobody enjoyed themselves – and I found myself even more frustrated and unhappy than I had been at the beginning of the process.
In retrospect, the problem is easy to see. I was desperately anxious to prove that I was so ill-adapted to the pond that had spawned me not because there was something wrong with me, but because I was in fact not a fish or a frog at all. I was a rabbit or a raccoon, not damaged or inadequate, just a critter meant for a completely different environment.
I refused to look honestly at what I was doing and why, and as a result went to a lot of trouble and expense only to make myself and everyone around me miserable.
*
Back in 1546, English poet and playwright John Heywood noted that “There are none so blind as those who will not see.” Heywood understood the difference between ignorance derived from a lack of information, and stupidity, in which the individual has the facts in front of him but chooses to deny them out of weakness, or laziness, or to protect prejudices or comfortable misconceptions.
Philosophers and theologians have struggled since the dawn of time with the question of objective truth: Is anything true in a universal, abstract sense, or is all information contingent, dependent on our perceptions and our ability to process the data? In 1637 René Descartes decided that nothing could be trusted but the fact that we were asking the question in the first place. While his “cogito ergo sum” makes a great bumper sticker, it unfortunately doesn’t give us much to work with. Gravity happens: if I step off the edge of the roof, I’m going to slam into the ground a split-second later. I can refuse to accept the existence of gravity as an objective truth, but I’m still going to bust my head every time I perform the experiment. We have to lay down some basic ground rules and agree that some things are “real”, regardless of ideology, or we don’t survive.
In the science of decision theory, there is a pair of principles called “minimax” and “maximin.” Here, absolutes are irrelevant. The goal is to minimize the maximum possible loss of a decision (minimax), or to maximize the minimum possible gain (maximin). In his “Pensées,” Blaise Pascal (1623-1662) stated his argument for believing in God in exactly these terms: “If I bet that God DOES exist, and he does, I win everything, and if I lose, I lose nothing. If I bet that God DOES NOT exist, and I win, I win nothing, but if I lose? I lose everything.” Truth becomes a question of calculation.
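Pascal's wager maps neatly onto a maximin calculation. Here is a minimal sketch in Python; the payoff numbers are my own illustrative stand-ins for “everything” and “nothing,” not anything Pascal wrote:

    # Pascal's wager as a maximin decision: choose the action whose
    # worst-case payoff is the least bad. The payoff values below are
    # illustrative stand-ins, not anything from Pascal himself.
    payoffs = {
        "bet God exists": {"God exists": float("inf"), "God doesn't": 0},
        "bet God doesn't": {"God exists": float("-inf"), "God doesn't": 0},
    }

    def maximin(actions):
        # For each action, find its worst possible outcome, then pick
        # the action whose worst case is the best of the bunch.
        return max(actions, key=lambda a: min(actions[a].values()))

    print(maximin(payoffs))  # prints: bet God exists

Run it, and the wager comes out just the way Pascal said it must: betting on belief is the only move whose worst case costs nothing.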
We all want certain things to be true, and others to be false. We can, to some degree, even behave according to those desires. Believing that Santa Claus lives in a vast factory complex at the north pole, churning out billions of brand-name consumer items that he will then distribute – at no cost to anyone, anywhere – during a single twenty-four hour period each December … Well, it’s a lovely idea, and I think we’d all like to be able to embrace it, but if we plan our holiday budgeting around that premise there’s going to be trouble.
*
Pretending that six hundred thousand people is a vastly larger crowd than one-point-eight million people is not “alternative facts”, it’s just foolishness, especially when the issue in question is not even of any real importance. “Spinning” information – presenting data in such a way as to support a particular objective — is a tried and true component of our politics, our marketing, and our advertising, and always will be, but even when we’re selling toothpaste or movie tickets or smartphones – or inaugural crowds, or border fences, or oil pipelines – we have to be able to discern the objective reality for ourselves. We can lie to everyone else, but the essence of a useful lie is that the liar knows that it’s a lie, and can act on the basis of the truth, regardless of what sort of fiction he or she is promoting to the crowd. I may convince you that gravity is a hoax, and that it’s perfectly safe for you to walk off the roof of a four-story building, but that scheme only works as long as I know that I’m lying to you. If I believe my own “alternative facts”, and I walk off the roof myself, then the game is over.
My relationship to my origins is still pretty complicated, but I have, over time, come to realize that the best thing for everybody involved is to simply face the facts. It’s useless to try to manage that relationship on the basis of what I think should be true, or what I wish were true, or what I have believed in the past to be true. I need to try to assess the facts, dispassionately and objectively, to the best of my ability. If I want to make things work, I have to be honest about what I’m trying to do, and about what the circumstances are – honest with others, yes, but most importantly, honest with myself.
Voltaire pointed out in 1733 that “The interest I have in believing a thing is not a proof of the existence of that thing.” We may not all agree as to what the facts are, but we can agree that the facts matter, that there is such a thing as objective reality, and that we should refer to it when we make the important decisions for ourselves and for those who depend on us. As soon as we start telling ourselves that the truth is nothing but an ideological construct, to be invented or destroyed or ignored at will, we’re only one step away from that long dive off the roof.
* * *
December 6, 2016
Elaine, let’s get the hell out of here.
I don’t like country music. The yodeling vocals, the whining guitars, the relentlessly predictable lyrics about faithless babes, abusive bubbas, pickup trucks, disreputable nightspots in the middle of nowhere … An hour of this, and a visitor from another planet would marvel that everything south of the Mason-Dixon line had not long since slid off into the Gulf of Mexico, crushed into slurry under the weight of all that drama and all those tears.
“Wait just a gosh-darned minute!” I hear someone shouting from the back row. “Yes, a lot of country music is like that, but it’s not all the same. You’re being unfair.”
As a matter of fact, you are absolutely correct, ma’am. I am being grossly unfair. Although the tropes that I’ve mentioned are common enough to have birthed the stereotype of the cowboy-hatted men and big-haired women that make up such a large part of the country music image, they are by no means the whole story. Isn’t it possible to loathe Porter Wagoner but love Willie Nelson? What do Jerry Jeff Walker and the Dixie Chicks really have in common except their Texas origins? Is Patsy Cline “country”? Is Kenny Rogers? Celine Dion has that breast-beating, sobbing delivery down to a science, but would anybody really put her on the same shelf as Tammy Wynette? Why is “Blue Bayou” a rock-n-roll ballad for Roy Orbison, a pop song when Linda Ronstadt sings it, but country when Martina McBride takes it on?
Elaine de Kooning once recalled a party where she and another painter, Joan Mitchell, were asked, “What do you WOMEN artists think … ?” Mitchell interrupted, “Elaine, let’s get the hell out of here.” Mitchell, de Kooning, and other female artists of their generation suffered mightily under that characterization by gender, which made it so easy for the male-dominated world of critics and collectors to dismiss them en masse, classifying them as nothing but muses or bedmates of the “real” artists, which is to say, of course, the men. Labels. Categories. Fences made of words.
In a previous life I lived in Dallas, Texas, home for some years to a Tower Records where I could drop in and pick up a handful of CDs a couple of times a month. The store was carefully organized by genre: Country, World Music, Jazz, Pop/Rock, Classical (in the basement), Soundtracks, Children’s Music, and so on.
Even a casual perusal of the arrangement, however, betrayed serious shortcomings.
Take, for instance, the classic 1964 album Getz/Gilberto, with Philly native Stan Getz, Brazilian bossa nova greats Joao Gilberto and Antonio Carlos Jobim, and vocals in both Portuguese and English by Gilberto’s German/Brazilian wife Astrud. Where did this music belong? Was this “Jazz”? Getz was, after all, a well-known tenor sax player in the New York jazz scene, and the album was recorded on Verve, a jazz-oriented label, in that city. Or was it “World Music”, as Gilberto and Jobim were already becoming legends in Brazil? Or maybe it was “Latin”, a category that embraced everything from mariachi to Andean flutes to Italian pop songs recorded in Madrid? All of the above? None?
According to music licensing service ASCAP, the most-recorded song in the history of copyrighted music is the aria “Summertime”, which appears a couple of times in Gershwin’s Porgy and Bess. ASCAP lists more than 25,000 different recordings of “Summertime”, by artists ranging from Billie Holiday and Sam Cooke to Janis Joplin and The Fun Boy Three. Operatic aria? Jazz standard? Pop classic? What difference, really, does it make?
Here’s another one for you: The first opera ever written by and about Americans was Porgy and Bess, with music by Jewish New Yorker George Gershwin and text by his brother Ira and poet DuBose Heyward. The work deals with love and death in Catfish Row, a dockside tenement in South Carolina; the characters are the children and grandchildren of slaves, and the style of the music is drawn from black worksongs, gospel, and other mostly African-American music forms. Critics for decades have wrestled with finding a convenient niche for this work: do we lump it in with The Barber of Seville and Wagner’s Ring Cycle, or do we call it jazz and stick it on the shelf between Ella Fitzgerald and Herbie Hancock? Is the music black, white, New York, South Carolina, jazz, pop, classical, lowbrow, highbrow … where the hell does it go?
Categories are the darlings of marketers, but the bane of creators. Nobel laureate Doris Lessing’s five-volume Canopus in Argos: Archives is a vast and detailed analysis of a series of different social structures on several different planets, viewed over a span of millennia – nothing at all like her intimate, semi-autobiographical novels about life in mid-twentieth-century southern Africa. Neither fish nor fowl, Lessing is impossible to place in any one category, but equally impossible to ignore. Charles Dodgson, better known to us as Lewis Carroll, the author of the immortal Alice’s Adventures in Wonderland and Through the Looking-Glass, was also the author of a comic epic poem, a textbook on an abstruse branch of mathematical logic, and a satire of Victorian English society disguised as a story about fairies. Is he a children’s book author, a poet, a mathematician, or a social commentator? Where do we put him, for crying out loud?
*
Let’s go back to the statement I began this essay with: “I don’t like country music.”
What I’m really saying is that because I don’t like certain music or musicians that happen to be classified within a certain (completely arbitrary) category, I can justify throwing out everybody else who might happen to end up in that same category without bothering to listen to them first. Since I don’t care for Travis Tritt, I can walk past that entire section of the record store without so much as glancing at what else is being offered. It’s like staying away from New York City because you once had a bad meal at a Greek restaurant in the East Village.
We like organizing things, sorting everything – and everybody – into structures that allow us to rely on generalizations to determine our attitudes and our behavior, without requiring us to examine the component parts on their own individual merits. “Country”, “Jazz”, “Classical”, “Grunge”, “Rap” … With a single word we can accept or dismiss vast swathes of creative effort. No muss, no fuss, no need to invest a lot of time listening to anything unfamiliar.
Why not take this a step further, and add a few more labels to our shelves: “Abstract”, “Impressionist”, “Minimalist”, “Pop”? Or how about “Mystery”, “Poetry”, “Sci-fi”, “Thriller”? Or maybe still a few more: “Liberal”, “Trumpster”, “Intellectual”, “Evangelical”? Neat little drawers, each with its own label. So convenient.
The attractions of this approach are undeniable. Everything is so simple when you can reduce the entire messy, random circus of human existence to just a few convenient tags, and walk right by the awkward bits without even turning your head.
* * *
December 4, 2016
Calculating the value of pie.
Of all the obnoxious and unpopular universals we have to deal with – gravity, conservation of momentum, the ratio of the circumference of a circle to its diameter, the speed of light in a vacuum, the way coffee never tastes as good as it smells – the one that seems to be the hardest for most of us to accept is entropy.
Just when we think we’ve gotten a handle on things, figured out how to survive, how to be happy, how to get through the day, we discover that the universe has marched on and the situation has changed. Suddenly all the systems and workarounds that we rely upon to keep us sane no longer work the way we expect them to. The rules have changed on us. Loved ones die, things break down, the places that are important to us become strange and different. “For no reason!” we insist, red-faced and frustrated, but in fact there is a reason: simple entropy.
I own a car that is now entering into its sixteenth year of life. I don’t drive it much, and I take care of it to the best of my (admittedly limited) ability, but nobody’s ever going to mistake it for a new vehicle. The headliner is pulling loose, the paint is dinged, the driver’s-side window no longer goes up and down: entropy. Even if I had shrink-wrapped the car sixteen years ago and stored it in a climate-controlled bunker in the desert, it would still not be the same car it was when it first rolled off the VW assembly line in Puebla, Mexico. Plastics deteriorate, fabrics sag and pull, the same chemical and mechanical processes that created the materials and parts continue long after the papers are signed and the keys handed over, turning gaskets into ash, warping delicate fixtures, and disabling sensitive electronics.
One of the most important features of entropy is its adherence to what is known as “the arrow of time”. This is to say that entropy, unlike any other measurable quantity in our universe, only works one way: things break down with the passing of time, going from more structured, more organized, to less. A muffin, a Maserati, or a man will, given enough time, be reduced to component atoms, and the carbon in an oatmeal muffin is absolutely identical to, and interchangeable with, the carbon in my red blood cells. That carbon will not spontaneously reorganize itself into a bird or a pot roast, not without the expenditure of enormous energy and even more time — during which everything else is still sliding into oblivion.
At absolute zero, -459.67 degrees Fahrenheit (-273.15 degrees Celsius, zero Kelvin and Rankine), everything stops. All activity in the sub-atomic world of electrons and protons ceases, and matter becomes inert and unchanging. This is, however – like the perfect marriage or consumer-friendly air travel – an imaginary state. In the real universe, nothing ever actually reaches absolute zero. Even in deepest space, beyond the light of any star, the background radiation left over from the Big Bang keeps everything percolating away at just under three degrees Kelvin. Things slow down Out There, but they don’t stop. Here, in the world of light and air and heat that sustains us, entropy churns along at a pretty frantic pace. We can irradiate our tomatoes until they glow in the dark, persecute termites and mildew and dry rot with all the passion and inventiveness at our disposal, but in the end, the leftover pasta sauce goes furry and green, the shower curtain has to be replaced every August, and the tires on that bicycle you haven’t taken out of the garage since the Reagan administration crumble away to nothing.
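If you want to check my arithmetic on those temperature scales, the conversions are simple enough:

$$T_K = T_C + 273.15, \qquad T_F = \tfrac{9}{5}\,T_C + 32$$

so zero Kelvin is −273.15 degrees Celsius, and (9/5)(−273.15) + 32 = −459.67 degrees Fahrenheit. Rankine is just the Fahrenheit ladder with its bottom rung moved down to absolute zero, which is why zero Kelvin and zero Rankine name the same cold.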
*
Make a pie on Sunday, and then eat a slice of it every day thereafter. At some point you will discover that the dish is empty, and there’s no more pie. This is irritating, but it’s not the fault of immigrants, or healthcare reform, or political correctness. It’s just that pie is finite, you ate all your pie, and sooner or later you have to either make a new pie or find something else to snack on. You have to change. You have to do something different. No rhetoric, no rallies, no ranting on cable news is going to make that pie last forever. The universe moves on. Things are consumed, becoming something else. Life happens.
I wish I still had the hair and teeth and knees I had at twenty. I wish there were still places on Earth that were represented on the maps by big glamorous empty areas marked “Terra Incognita” and “Here there be dragons.” I wish a new Chrysler Imperial cost $1,500, and doctors made house calls. I wish I could read The Haunting of Hill House for the first time, again and again and again.
I wish a lot of things, but the universe really doesn’t give a damn what I wish – the universe has much more important things to do.
So, what are my options? Obviously, pretending that entropy just isn’t happening is not very helpful. Nor is simply throwing up my hands and locking myself into the basement to wait for everything to grind to its messy and inevitable end. Punish the Jews, or the Muslims, or the gays, or the poor, or the people in the big fancy house down the street for the fact that my pie didn’t last as long as I had hoped it would? None of these things is going to make the tiniest bit of difference in the end. I’ll just be making life more difficult for people who are probably no more to blame for my bad knees and thinning hair than the Queen of Sheba. Things are going to change. Tomorrow will never be exactly like yesterday. It’s nobody’s fault, it’s just the way the universe works. I can learn to deal with it, change with it, or I can shoot myself in the head before entropy has a chance to wind things up for me. My choice.
For the moment, however, here we are. I’m still going on and on about all sorts of things, and you’ve actually managed to stay with me all the way to here. So sit with me for a bit longer. We’ll share some of my pie.
* * *
December 1, 2016
Bonfire of the Vanities
During my survey of the art news this week I happened upon a provocative headline from the Daily Beast: “Why Artist Gerhard Richter Destroys His Own Art”. The title of the article is a bit misleading: the writer asks the question but she does not actually attempt to answer it. Instead she merely elaborates on the fact that Mr. Richter has destroyed a considerable number of his own paintings over the years. She did, however, get me thinking about artists and their emotional relationship to the products of their craft — because I, too, often feel the desire to haul a big load of my artwork out into the yard and set it on fire.
Although most Americans know the phrase “Bonfire of the Vanities” as the title of a 1987 novel by Tom Wolfe, it actually comes to us originally from an event in 1497, when the Dominican priest Girolamo Savonarola and his followers collected books, art, musical instruments — anything that might tempt the faithful to the sin of vanity — and burned them in the town square of Florence, Italy. A passionate reformer, Savonarola alienated everyone from the Pope to the powerful de Medici family and eventually ended as the star attraction at yet another bonfire, when he was hanged and his body burned in that same town square.
For me, personally, the urge to destroy has nothing to do with what I think of the quality of the work. It encompasses good pieces, bad pieces, even pieces I love; any product of my hands and mind can suddenly cry out to be included in the auto-da-fé. The urge has more to do with the way the products of creative effort can slowly accumulate into a kind of crust, cutting off air and light, stifling new ideas.
*
William Faulkner once advised his fellow writers to “Kill your darlings.” The Nobel laureate was speaking about the risks of becoming so emotionally invested in certain characters or situations that the work as a whole becomes nothing more than a tribute to those “darlings”, devoid of interest to anyone outside the author’s own head and heart. (After all, listening to someone singing the praises of his own offspring, while endearing in small doses, can pale rapidly when no other topic is ever permitted to intrude.) This can apply to a visual artist as well: The artist finds a technique or a subject that works well, that gets the results that she craves, and then slowly allows everything else to atrophy. Innovation, risk, and experimentation are lost, and after everyone has become sated with the confections she’s been providing, she realizes to her dismay that she’s forgotten how to do anything else.
As with so much in art, there are no hard and fast rules. Some artists have repeated themselves endlessly, and yet remained endlessly fresh and relevant. Rembrandt’s self-portraits, Degas’ dancers, Modigliani’s mistresses, the collages of Hannah Höch or the little theatres of Joseph Cornell. All of these tap into a vein of creativity that could not be exhausted in a year, a decade, or even a lifetime. Others, like Salvador Dalí and Andy Warhol, having successfully made important statements about art and life, then proceeded to repeat those same pronouncements ad nauseam, until only death could save their bedraggled artistic reputations.
Jackson Pollock, Mark Rothko and Nicolas de Staël, upon reaching a level of success that most artists can only dream of, each woke up one day to realize that he had become little more than a machine for turning out lucrative and popular Pollocks, Rothkos, and de Staëls. The creative landscape is littered with the corpses of careers that died a slow and ugly death as artists found themselves paralyzed by a moment of success, the reports of their activities gradually moving from ARTnews, the New York Times Book Review, or Variety to the supermarket tabloids and the police scanner.
In 1950 Dutch-born American artist Willem de Kooning, after decades of poverty and obscurity, produced a painting titled “Excavation” that catapulted him overnight to the pinnacle of the New York art scene. Influential critic Clement Greenberg praised “Excavation” as one of the greatest paintings ever produced in America. Collectors began snatching up works that a year before they wouldn’t have accepted as gifts. The artist had arrived.
De Kooning never produced another painting even remotely akin to “Excavation.” In fact, he turned away from abstraction completely and began working on “Woman I”, the first of what would become a series of savage and terrifying explorations of the female form. A horrified Greenberg condemned the new work, and de Kooning once again slid — for a time, at least — back into the shadows. In retrospect, we can see what a courageous act this was. With “Excavation” de Kooning achieved fame, but then, rather than allowing that moment of success to define him forever, he simply descended back into the mines to dig for new treasures.
Like de Kooning, Richter has been both acclaimed and ridiculed, but he has never allowed himself the luxury of becoming “the man who paints Richters”. Instead, he continually reinvents himself, a strategy that has allowed him to become financially and critically successful while still remaining artistically relevant. Occasionally destroying valuable artwork is part of that process of reinvention.
Richter himself has expressed mixed emotions about his periodic purges. He speaks of some of the lost works with regret, yet he does not question the need for the cull. His ruminations evoke the Hindu tradition of Shiva, the Destroyer, who destroys not out of malice but impersonally, arbitrarily, to make room for the ongoing work of Brahma, the Creator. Push and pull, constant movement between the two poles.
*
The market value of the works that Richter is known to have obliterated is estimated at somewhere around $65 million. My bonfire of the vanities would encompass little more than a few hundred dollars’ worth of paint and plywood. Still, it is strangely comforting to know that sometimes the cat and the king may both warm themselves at the same blaze.
* * *