Miles Watson's Blog: ANTAGONY: BECAUSE EVERYONE IS ENTITLED TO MY OPINION
March 19, 2017
CAGE LIFE takes Top Honors for 2016
I'm pleased to announce that my debut novel, CAGE LIFE, has received the 2016 "Book of the Year" Award from the online British magazine Zealot Script. This is the second honor the book has received since its release almost exactly a year ago. According to the citation, "This award is presented to a specially selected work that we have judged to be the best independently published book of the year." CAGE LIFE had already been named a runner-up in Shelf Unbound magazine's Best Indie Book Awards for the same year, but this time it took home the gold.
The boys at Zealot Script asked me for a statement to go along with the announcement of the award, which appeared today. I thought I would share it with you in lieu of the blog I was going to release this week, which was about the trials and travails of being an independent author:
They say the greatest trick the devil ever pulled is convincing the world he doesn't exist. Well, the greatest trick an indie author ever pulled is convincing anyone they do exist. Every day, literally every hour, is a struggle, not only against all the other independents out there, but against a publishing system that reserves all of the marketing oxygen for itself. It's a constant war, not against indifference, but against complete unawareness. The goal is to sell books, of course, but before you can sell, people have to know what you're selling – indeed, they have to know there's something being sold. When I released CAGE LIFE early last year, I quickly realized that all the agonizing struggles that had come beforehand were mere prologue: the actual battle, marketing the damn thing, had yet to be fought. I knew my work existed; I just had to inform the rest of the world. And hope that they gave a damn.
Winning Zealot Script's “Book of the Year” for 2016 was a beautiful bolt from the blue. After a year in the market, CAGE LIFE had garnered good reviews, sold more than a few copies, and even picked up a small award from a literary magazine, but had yet to achieve the breakthrough moment that I desired. Trying to make myself heard amidst all the other authors trying to make themselves heard was a tougher assignment than I had feared, and I was becoming a little discouraged. Coming home to the Zealot Script trophy was precisely what I needed at the precise moment I needed it the most. And how often does that happen in life?
I know that the focus of Zealot Script is on science fiction and fantasy, genres I read but have yet to tackle as a writer, but I like to think that although CAGE LIFE is a crime novel, it was heavily influenced, at least in terms of the style in which I wrote it, by authors as diverse as Frank Herbert and Ursula K. Le Guin, whom I read avidly growing up, and still read today. Indeed, one of the reasons I chose to publish the novel independently was that I became deeply frustrated with the traditional publishing world's emphasis on branding (read: constraining) authors, not merely to a particular genre, but to a particular style of delivery. Mine was supposedly too “literary” for the world of the noirish thriller. I rejected that conclusion, and I'm delighted that Pete Richmond did, too. Indeed, I am very grateful to Mr. Richmond for taking the initiative to reach out to me at a time when I was struggling to get interviews, to the staff of Zealot Script for selecting my novel for the award, and to anyone and everyone in the readership who gives it a chance. Because that is really all any indie author can ever ask for: a chance.
March and April are going to be big months for the novel, as I am running a series of advertising campaigns designed to promote awareness of the book and its sequel, KNUCKLE DOWN, which, I'm pleased to say, many readers have told me is the superior work. Whether that is true or not, I'm pleased to be taking my writing to the next level, because, having mastered every aspect of obscurity and poverty, I find myself more than ready to try conquering the concept of unlimited wealth and international fame. Or just paying the bills on time.
Published on March 19, 2017 11:31
March 12, 2017
HOW JOURNALISTS CAN GET IT RIGHT: FIXING THE NEWS, PART 2
So far I have been speaking in broad terms about how we, the readership, can work to counteract some of the worst abuses of journalism. But these are only temporary, stop-gap measures: a jury-rigged filter to keep the worst nonsense from seeping into our brains. The real fix lies deep in the heart of the profession itself. In the previous article I made six specific accusations against the press, and having done so (and found the press guilty, I might add, in my self-appointed role as judge), it falls upon me to offer suggestions as to how to rehabilitate this most vital aspect of our democracy: not only a free but an objective and professional press.
In regard to the first two, and last two, accusations, that American journalists don't understand their own profession of journalism, are astonishingly ignorant of the country in which they live, and tend to “report” only what they want to happen and not what is manifestly happening, etc., etc., the fixes are actually quite simple, if not necessarily fast-acting or easy to execute. Beginning immediately, every student studying journalism should be required by his professors to study, at some length, the principles and ideas of every popular political ideology in the United States, and be able to state those principles, with objectivity and accuracy, back to the professor, as well as write about them in great detail. This is hardly an unreasonable demand: Dr. Joseph Goebbels, Hitler's Minister of Propaganda and in some ways a more fanatical Nazi than Hitler himself, once astonished and amused guests at a formal reception before WW2 by three times mounting a table and delivering, extemporaneously, seemingly heartfelt speeches in favor of monarchism, communism and democracy. Goebbels' point, seemingly, was that he could make eloquent arguments in favor of all rival systems of government without having the slightest belief in them. My point, in referencing this, is that Americans no longer seem to possess the courage to explore or try to understand the viewpoints and motivations of those they disagree with -- a key element not only to arguing intelligently with them, but to overcoming them in the political arena. The victory of Trump and "Brexit," the departure of the United Kingdom from the European Union, are both, in my mind, direct extensions of this willful ignorance born of over-sensitivity. The decayed, degraded, degenerate modern journalist (himself only an extension of a decayed, degraded and degenerate modern society), having shielded himself from the true motivations of those he disagrees with, is shocked when huge masses of people disagree with him. To many of our young people, and indeed, many of the professors who teach those young people, the very act of grasping, without exaggerations or distortions, the principles of one's political foes is deemed "offensive" and "micro-aggressive" (cue violin), when in fact it is simply educational. The old saying "Knowledge is power" has been replaced by the Oceanic credo of 1984: "Ignorance is Strength."
I do not wish to be redundant here, but the problem is so cyclical in nature that my point bears repeating, or rather rephrasing. American universities, which used to be oases of free thought and passionate but principled disagreement, are now veritable petri-dishes for breeding oversensitive, under-educated weaklings who can't bear to be exposed to ideas that make them uncomfortable. These weaklings -- it is the right word, I'm sad to say -- go into journalism without understanding or respecting any political belief they themselves don't hold, and not only produce “journalism” which is as weak and narrow-minded as they are, but are praised for their “sensitivity” by colleagues and readers alike. But journalism, like education, is not and never has been for the sensitive: respect for the prejudices and preconceived notions of the readership -- for their political "sacred cows" and feelings generally -- is absolutely no concern of the journalist's art. On the contrary, the outstanding feature of any reporter is his insensitivity, or more specifically, his toughness, and a tough reporter, be he conservative, liberal, libertarian, or what have you in his political sympathies, knows that the best way to deliver informed, even-handed journalism is to understand both sides of the story, not merely the one he is covertly in sympathy with.
In tandem with this, every major press organization, regardless of whether it is print or electronic in nature, must re-open the bureaus they once maintained in what Homer Simpson so eloquently called “the great useless mass of land between New York and Los Angeles we call America.” As I have previously stated, the mixture of shock and disbelief with which the big-name press regarded Donald Trump's election came largely from their own deliberate, willful ignorance of the mood of “the rest” of the United States, which they more or less openly hold in contempt. The deep, searing vein of anger which exists in places like Youngstown, Ohio – a desolation of shuttered shops and dead factories left behind by Obama's economic policies – was either not paid serious heed or simply not grasped at all, because few reporters troubled to go to places like Youngstown, and when they did, few could refrain from depicting the inhabitants as ill-educated, racist bumpkins whose opinions didn't matter a damn anyway. Had the organizations in question stationed reporters in such places year-round, had they bothered to take the American pulse in the non-election years that make up most of our lives, they might have taken Trump, or at least what he represented, more seriously. In my previous article I noted that George Orwell once accused the British ruling class of being unable to grasp the dangers of Fascism, because to do so would have meant studying the theory of socialism, which would have in turn laid bare the gross social and economic inequalities that they, the British ruling class, existed to preserve. "To fight Fascism," Orwell noted, "one must begin by admitting that it contains some good as well as much that is evil." But this beginning never took place. To keep their sense of moral superiority, the British rulers had to remain permanently ignorant, and by this ignorance they led their country to war and their Empire to destruction. So it is with the American press. To understand the Trump phenomenon, or Brexit, or the rise of nationalist movements generally, they would have had to understand the anger which motivated them, which would have meant not only actually wading into the "great, useless mass of land we call America" but also realizing that some of the Heartland grievances were actually legitimate. And this was precisely what they did not do. Indeed, they could not, because the deficiency in their political outlook, their innate "sensitivity" and refusal to grant even the possibility that the "other" party might have a point here and there, made it impossible. Their political ideology -- that only big-city-dwellers, Millennials, hipsters, progressives, liberals, gays, university professors, and people of color and mixed race actually matter -- had forced them into a line of reasoning as fixed and unalterable as a railroad track, except that the track led right off a cliff. The number of bloggers and self-appointed pundits who were Googling "white working class" on election night must damn near have broken the Google servers, but even after belatedly discovering the existence of this seething mass of voters, tens of millions strong, their only reaction was to double down on their disdain. The amount of whining, weeping and tantrum-throwing that followed the election was at once amusing and infuriating.
Only now, with the inauguration-day smoke beginning to clear, do I see any signs at all that the coastal press (Los Angeles, Washington, New York, Boston) is starting to grasp the necessity of "colonizing" the vast heartland of America. If this nascent policy is followed through -- and "if" is the longest word in the English language -- then within a year we ought to be seeing reportage that will make the outcome of the next election far less shocking.
As to my third point, that journalism-for-profit has debased the profession and led to an increase in sensationalism which has, in turn, led to further blurring the line between hard news and entertainment, it's necessary for me to explain that for much of their history, electronic news programs, from the earliest days of radio onward, were operated at a loss, simply as a public service. It was not until 60 Minutes debuted in 1968 that networks realized they could make a profit from radio or television news, and while 60 Minutes is in many ways a fine program, it definitely blurred the line between hard news and “news entertainment,” with its slanted coverage, aggressive, confrontational interviews and ambush tactics. Nowadays, it is taken as a matter of course that news programs of any kind be profitable, but news for profit contains a great evil within its core, to wit: what is true often does not sell. Many of the most important stories of modern times, such as the savings-and-loan collapse in the 1990s, or the fact that the Pentagon cannot account for $6.8 trillion (with a "t") of taxpayer money, hardly entered the public consciousness because no way could be found to “sex them up” in the manner of, say, the O.J. Simpson trial. The desire people have for salacious gossip and tawdry scandal is never going to go away, but that doesn't mean that legitimate news entities have to give these grotesqueries more coverage than they deserve. We must accept, however painfully, that a large majority of the populace is uninterested in meaningful news, and leave them to TMZ and Extra!. But those with functioning brains, who want to know what the hell is being done in the world, deserve the best, the hardest-hitting, the gutsiest journalism we can give them, and never mind the goddamned profit margin.
Now, it so happens that the United States has 540 billionaires, more than any other nation. These individuals have more money than they could ever possibly spend, and many of them look for ways to spend some of it charitably. Surely one or two of them could be convinced, with the right arguments, to pump cash into the hardening arteries of our better news organizations, or failing that, to fund their own: one which approaches stories without worrying about the economics of covering them, or whether their reportage will anger some important sponsor whose advertising money must not be lost. Such a news entity, freed from economic concerns, would provide Americans with a quality alternative to the ideologically bigoted trash we are now being served in super-sized helpings. It would also attract many young people, journalistically inclined, who are not weaklings, as well as many old journalistic veterans who have been marginalized and disenfranchised simply because their standards are too high for the great unwashed. The vets could supply the experience, the standards, and the discipline necessary to maintain them; the young could supply the passion and the technical, social-media savvy.
Such a union, incidentally, would actually assist us with my fourth point, the idea that the 24-hour news cycle, by virtue of having to keep each of those hours filled with content, changed the character of journalism by forcing scribes to create, rather than merely report, the news. Our imaginary news organization – let's give it the working title "5WH" – would operate along old-fashioned lines, appearing for no more than six, or in times of crisis, perhaps eight, hours a day on television. What's more, it would operate under these guidelines:
1. Understand that the first duty of a news agency is to inform the people, while shaping their views is fit only for the op-ed section. This leads us to....
2. Divide the “paper” between hard news and opinion-editorial, and make explicitly clear where the line is drawn, if necessary by the gross expedient of labeling each individual story. This is precisely the opposite of the present trend of mingling news and op-ed in such a way that the distinction no longer exists.
3. Insist on objective, factual reporting at all times. Avoid use of partisan political words and phraseology in reportage. Avoid the impulse to ridicule or condescend to those who are interviewed, or, for that matter, to be overly sympathetic with them. Prevent journalists from injecting themselves, their personalities, or their politics into their hard-news stories, except in those cases (undercover/investigative journalism, for example), where it is impossible.
4. Practice public self-assessment and self-criticism, if necessary through the use of an ombudsman, an independent watchdog within the organization with the power to discipline those who fail to meet standards.
5. Invite scrutiny. Never hesitate to admit, and be held accountable for, errors and mistakes. They are going to occur, and when they do, they can be defused by a simple mea culpa, which will restore public confidence. Above all, avoid the practice of doubling down on tainted stories, a la Rolling Stone with its libelous “A Rape on Campus” piece, simply because you wanted the story to be true. Mistakes happen. Admit them, learn from them, and move on. Journalists live to destroy politicians who would rather cover up a minor peccadillo than confess to it, and what's sauce for the goose should be sauce for the gander.
6. Keep a finger on the pulse of America – not just New York or Los Angeles – by maintaining the aforementioned bureaus, or at least individual reporters, in cities and towns across the country, and keep those reporters reporting. Poke. Prod. Dig. Report. Ask questions and listen to the answers. And don't be afraid of Americans from Cleveland. They probably won't bite you. Probably.
7. As I stated in Part 1, anyone with a working internet connection can call themselves a reporter, and I asked that the reader learn to differentiate between the legitimate and the fake before clicking his or her mouse.
But legitimate journalistic agencies should have even less compunction about calling out hacks, frauds, race-baiters, fear-mongers, and the plain morons of the community for what they are. Here, as in all areas of reportage, objectivity is a must: a left-leaning news agency must drop the hammer on left-wing frauds, and vice-versa. Sympathy with someone's political aims must never be allowed to influence the way we grade the reliability of the news.
It will be seen here that many of my suggestions to date are not solutions in themselves but merely components of solutions, and mostly very broad in character. This is true because the problem we face is itself both broad in nature and deeply rooted in our societal flaws as a whole. A disease which has been progressing slowly for years will not be cured in a day, a fact which is disheartening at the outset of the struggle. On the other hand, a disease left untreated is invariably fatal -- in this case, to our democracy -- and this disease can be stopped. The task is large, the obstacles many, and the people most affected by the problem either unaware of it or too apathetic to do anything about it. This does not serve as an excuse for the awakened, active individual to do nothing: indeed, it robs him or her of that excuse. Generations of Americans, reaching back to the time of the Revolution, to the Civil War, through the age of the Civil Rights Movement and beyond, suffered and in some cases died so that basic human freedoms could be enjoyed by everyone in this country. Perhaps the most basic freedom we enjoy -- the reason "freedom of the press" was enshrined in the First Amendment -- is the right to be told the truth, and to make decisions based on those truths. Freedom of the press, which in this case is synonymous with "the honesty and integrity of the press," is the cornerstone of all democracy. We cannot have the latter without the former, and we cannot have either unless everyone participates in this fight. In the 1950s, when civics was still taught to every schoolchild in America, one of the slogans was "Freedom is Everybody's Job." This holds no less true of freedom of the press. It's everybody's job. It's your job.
Now go out there and do it.
Published on March 12, 2017 13:28
February 24, 2017
HOW JOURNALISTS CAN GET IT RIGHT: FIXING THE NEWS, PART 1
You cannot – thank God! – bribe or twist
The mind of the American journalist
But seeing what, unbribed, he'll do
Thank God! There is no reason to.
– Humbert Wolfe (paraphrase)
Not long ago I made an assessment of everything which ailed the American press (“Why Journalists Keep Getting It Wrong”), and concluded that journalism in the United States was, for lack of a better word, broken. Specifically I made the following accusations:
1. American journalists don't understand their own profession of journalism – its rules and basic standards. Because of their deficient training and knowledge, as well as their personal biases, reporters are unable to see the world as it is, but only as they want it to be.
2. The modern journalist suffers a critical deficiency in his social and political as well as his journalistic education; he is also astonishingly ignorant of the country in which he lives.
3. Journalism-for-profit has debased the profession and led to an increase in sensationalism which has, in turn, led to further blurring the line between hard news and entertainment.
4. The 24-hour news cycle has fundamentally changed the relationship of the press to the news. Before the cycle, the press reported the news; after the cycle, they created it.
5. Modern reporters lack objectivity, and allow that lack of objectivity to color their stories and their predictions of both specific political events (elections, referendums, etc.) and broad historical-social-economic trends (white working-class anger, etc.).
6. The roots of these deficiencies can be traced to the general failure of our educational system to teach critical thinking to our young people, and to a collapse in the overall standards of journalism (which brings us full-circle to #1).
Conspicuous by its absence in that mass of complaints was any constructive suggestion on how to fix our troubled Fourth Estate. To be honest, it is easier to whine that something is broken than to effect repairs, and so I had to take some time and really look at the problem from all angles. It was, to say the least, a daunting task, rather akin to patching up the Hindenburg with a bicycle repair kit; but there is an old joke in which a student, dismayed by the huge task which awaits her, goes to her master and says, “Where should I start?” And the master dryly replies: “You should start where you are.” So, without further ado, I start where I am...which is at the very beginning.
When you learn to box, the teacher tells you that the sweet science is actually very simple...provided you remember two fundamental rules. The first is “hands up, chin down.” The second is, “Boxing is the art of hitting and not getting hit.” Everything, absolutely everything, which follows in terms of conditioning, tactics, technique, diet, psychological preparation, etc. is merely an outgrowth of those two fundamental truths you learn on the first day – hands up, chin down; hit and don't get hit. If you don't learn those two basic lessons, everything else you do, no matter how well intentioned, will ultimately come to grief.
It is exactly the same with journalism. No amount of knowledge, stylistic ability, sincerity, courage, intelligence, industry or integrity will prevent your journalism from sliding into disrepute unless your basics are rock-solid. The first and most serious problem modern journalism faces is that the appellation of “journalist,” which used to have such weight, is now thrown around willy-nilly, like the paper crowns they hand out on New Year's Eve. Anyone with a keyboard and working internet connection who wants to talk about politics or current events is handed a journalistic credential. This is as ridiculous as calling someone a black belt who happens to be wearing a belt which is black. There are basic standards which must be met, and having been met, be maintained, or else the honor is meaningless. In “Das Boot” the commander of a U-boat tells his men that “an Iron Cross is not a medal one receives for a specific act of courage, but must be re-earned continuously lest the recipient lose the right to wear it.” So it is with reportage. A reporter of 30 years' good standing who commits egregious ethical or technical blunders on a current story is not forgiven because of his prior good acts; as both Dan Rather and Brian Williams discovered to their sorrow, the honor of the title “news correspondent” must be continuously re-earned. No resting on past laurels is permitted. But a title, like a medal, has no value if you bestow it upon yourself; I can pin all the Iron Crosses and “journalist” mantles on myself that I desire, but that makes me neither a war hero nor a reporter. In every profession on earth there is a peerage: a community of those in the same profession who set standards. This applies to stonemasons, electricians, barbers, plumbers, lawyers, physicians, you name it. Journalism has such professional watchdog entities, but they are increasingly impotent and recessive; they “police” only the dwindling number of “real” journalists and ignore the great mass of people who claim to be reporters but don't have a clue as to what being a reporter actually involves.
Therefore the responsibility falls to us, the reading public. The first fix is to acknowledge that we are the watchdogs. Not the press, not the government; we the people. Carl Sagan once said that if only Americans would learn how to think scientifically – “carry a baloney-detection kit,” in his memorable phrase – they could be their own intellectual policemen, and would not need others to rescue them from astrological con men and pseudo-scientific hucksters. But first they had to learn to think scientifically. Well, we have to think journalistically. So let's get at it.
In order for a news article to qualify as such, it must meet a certain minimum standard in regard to the questions it answers. That standard, which is known as “The 5 Ws and the H,” has been the same for hundreds of years. It is completely objective and must be applied regardless of the reporter's personal feelings or political leanings. It boils down to six clear-cut questions. They are:
Who?
What?
Where?
When?
Why?
How?
Any newspaper story which does not ask and answer these questions, insofar as it is humanly possible to do so, is not legitimate journalism. What's more, these questions must be answered as swiftly as possible, preferably within the first two paragraphs. For example:
Buffy Anne Summers (who) stabbed a vampire (who 2) through the heart (what) with a stake (how) last night (when) in a cemetery in Sunnydale, California (where). When questioned, Miss Summers, 20, identified herself to authorities as a “vampire slayer” placed on earth to fight the forces of evil (why).
Sometimes it is not possible to answer all six questions due to lack of information. This is perfectly acceptable provided that every effort is made to answer those questions which are, in fact, answerable and to ask knowledgeable authorities to speculate about those which are not. For example:
Colonel Mustard (who) was murdered (what) in the library (where) with a candlestick (how) sometime last night (when) by Miss Scarlet (who 2).
This answers, in a single sentence, everything about the murder but “why,” which may be unknown at the time of Miss Scarlet's arrest. Therefore a reporter could supply as much of the “why” as possible by quoting the police spokesman: “police have no motive for the crime at this time, but friends and family speculate it may have been due to Col. Mustard's excessive flatulence at the dinner table.”
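(For the programmers in the audience, here is a quick, purely illustrative sketch in Python of the completeness test I'm describing. The Story fields and the unanswered() helper are my own invention, not any real library; the sketch simply flags whichever of the six questions a story leaves open, so you can see the checklist work on the Colonel Mustard example.)

# A playful, illustrative sketch of the 5WH completeness check.
# The field names and the example story are invented for demonstration.
from dataclasses import dataclass, fields
from typing import Optional

@dataclass
class Story:
    who: Optional[str] = None
    what: Optional[str] = None
    where: Optional[str] = None
    when: Optional[str] = None
    why: Optional[str] = None
    how: Optional[str] = None

def unanswered(story: Story) -> list:
    # Return the fundamental questions this story fails to answer.
    return [f.name for f in fields(story) if getattr(story, f.name) is None]

# The Colonel Mustard story answers everything but "why":
mustard = Story(who="Miss Scarlet", what="murdered Colonel Mustard",
                where="in the library", when="last night",
                how="with a candlestick")
print(unanswered(mustard))  # -> ['why'] -- ask the police spokesman to speculate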
Okay, so I'm using facetious examples here to lighten the mood, but only because real-life examples of proper journalism are becoming increasingly hard to come by. In 2014, California state senator Leland Yee was arrested by the FBI for his involvement “in an extensive criminal conspiracy to traffic guns and drugs, launder money, assassinate for cash, and influence public policy.” (Vice) When the L.A. Times broke the story, they consistently refused to identify Yee as a Democrat; only a torrent of outraged e-mails and messages sent by their readership forced this left-leaning newspaper to admit that yes, a (D) had been buying guns from the Italian mafia to sell to the Chinese triads. I hardly think it conspiracy-theorizing to say that if he had been an (R), the Times would have trumpeted his political affiliation to the very heavens. The leftist sentiments of the Times are well-known and self-admitted, and not unethical as such (every newspaper and news entity in the world has a political slant of some sort), but that is not an excuse to excise information inconvenient to their politics from their reportage, which, unlike their op-ed pieces, must be objective (we will return to this later). What's important about the Yee example is that the absence of his political affiliation from the story was noticed immediately by thousands of people, and subsequently corrected, but only due to their continuous pressure. This incident gave me hope that the public can play the Watchdog successfully – provided, of course, that we continue to educate ourselves and our neighbors (and most especially our children) as to what constitutes “acceptable” journalism.
Any time you read a news article, it is important to ask yourself, after the second paragraph or so, whether a direct, clearly-worded effort has been made to answer these six fundamental questions. A reporter who fails to do so by, at least, the third or fourth paragraph is almost certainly negligent in his duties. A reporter who leaves any of the questions out without making a diligent attempt to answer them has simply failed in his job, either deliberately or through incompetence. Again, it may seem condescending or pretentious to offer such basic instruction, but the decline in journalistic standards has been so precipitous, so all-encompassing, and so extended in its duration – this has been going on for decades, folks – that many otherwise intelligent people are no longer even aware that these standards exist. They have been so inundated by shoddy, incomplete, unobjective journalism that they can no longer discriminate; indeed, many can put down the newspaper without realizing how little of the story they have been told, or how badly they have been told it. So: the first task in fixing journalism lies equally in the hands of both reporters and the reading public. Reporters must remaster the fundamentals of their profession, and practice clarity and brevity in their writing, so that they dispense the maximum amount of information in the minimum number of words; readers must satisfy themselves that in every news article they read, these six questions have either been answered, or that every effort has been made to answer them. And here is the crucial part: you must train yourself to ask this question every time you read a hard news article, regardless of the subject or the source. This takes mental discipline, which is precisely what is lacking in journalism itself nowadays, and more than mental discipline, it takes a measure of ruthlessness. For we all have our favorite news outlets, whatever they might be, and if we should discover that our favorite is not delivering the goods, then we must cut the cord right then and there, and look elsewhere. This is not as easy as it sounds; humans are creatures of habit, and Americans have been conditioned (in many cases, willingly) to take up residence in an echo chamber which permits them to hear only their own opinions delivered back to them at maximum volume, and never mind their accuracy. Leaving the echo chamber is not easy, nor is it comfortable; but it is necessary. The entire concept of “pick and choose news" is antithetical to the central purpose of journalism, which is finding the truth. Which often hurts.
As I mentioned above, the ability to discriminate between “hard news” and “editorial/opinion” is fundamental to legitimate journalism, and indeed, the line between the two is not fine. It is more like the Berlin Wall: a big sonofabitch made of concrete and barbed wire you can see from space. This Wall was built with both conscious and noble purpose; it is important, actually vital, to know whether a reporter is speaking objectively in the hopes of informing you about facts, or whether he is speaking with a personal agenda and trying to persuade you to come around to his point of view. In a legitimate news organization, the Wall is maintained in such a manner that you could never mistake news for op-ed, or op-ed for news; but in the last 25 years, through a quite deliberate and short-sighted effort, said barrier has been all but obliterated. Nowadays there is literally no difference between “hard news” (which uses the 5WH as its foundation stone) and “op-ed,” which by its nature is unobjective and biased. I can scarcely open the Los Angeles Times or Washington Post without being confused as to whether what I am reading falls in the former or the latter category. Often this sleight-of-hand is achieved subtly. A left-wing paper refers to “undocumented immigrants” or simply “immigrants,” as if there is no distinction between an illegal immigrant and a legal one. This is part of a political agenda, and should be recognized as such. On the other hand, a right-wing paper refers to the former president by his full name, “Barack Hussein Obama,” knowing that the name “Hussein” will inflame many of its readers, and for no other reason. This, too, is part of a political agenda. And while I don't fault news organizations for having political agendas, I do fault them for expressing those agendas outside the walled-off arena of opinion-editorial, for it is precisely when we can't tell the difference between what we want and what we know that we begin to venture into the dreaded area of the map known as “fake news.”
It so happens that the definition of fake news is not as rigid as one might think. The Orwellian idea of the completely fabricated news story, which does not have a relationship with the truth, “not even that implied by an ordinary lie,” is less common than the deliberate, systematic misinterpretation of existing news, the surgical removal of statements or images from their context, for the sole purpose of misleading the reader or slandering an individual. Indeed, in recent times we have been confronted with the idea, once discredited as a conspiracy theory, that much of the news we read is either a) untrue or b) taken out of context to the point where it may as well be untrue. This sort of thing has always taken place, but such incidents used to be exceptional; now they are so common that they go almost unnoticed. In the face of such systematic “warping” of the truth we seem helpless, for we have little in the way of trustworthy, independent means to verify anything we cannot ourselves see and hear; but the truth is that our position is quite strong. The trouble lies in the effort and patience necessary to counteract this sort of twisted reportage. To fix this aspect of broken journalism, we must take the crucial step of identifying and categorizing the news, not by its political leanings but by its reliability. And the way to do this is by judging its speed.
Yes, I used the word speed. It has long been established, though not widely understood, that there is an inverse relationship between the speed at which information is disseminated and its accuracy. For example, many years ago the U.S. Navy vessel Stark was hit by an Iraqi cruise missile while on patrol in the Persian Gulf. The initial report was that the damage was slight, “only” one crewman had been killed, and the ship was proceeding under its own power back to base for repairs. In truth, 37 sailors had been killed and the ship was dead in the water and burning; but it took days for this tragic fact to emerge from behind the literal and figurative smoke. Initial reports are by definition the most unreliable; if one wanted the full, objective facts, unearthed by exhaustive investigation and subjected to professional analysis, one had to wait several months for the official Navy report – which was then subjected to yet further scrutiny by the press and its experts. Between the initial report and the “final” one, the key factor was not sincerity but time. Anything worthy of a news report usually involves some level of chaos or upset, and that chaos and that upset distort the truth the way a stone distorts the surface of a pond. The “real story” is often impossible to ascertain at first glance, no matter how diligently we try. We only know that something – an earthquake, a missile attack, a riot, a fire, an assassination – has occurred, and later, often much later, the details begin to filter in, conflicting with earlier accounts and sometimes making nonsense of them. Therefore the news source which is the fastest is also the least reliable. Understanding this fact is the key to grasping what much of supposedly “fake” news is, and how we can combat it.
Excluding word of mouth, there are five basic means by which we receive information. They are:
1. The internet
2. Newspapers
3. Magazines
4. Books
5. Peer-reviewed scholarly journals
I have stacked these means from fastest to slowest and least to most reliable, but also, without trying, by order of popularity. It so happens that the vast majority of Americans get their news from the internet, which is an instantaneous means of communication; a substantial minority by newspapers, which appear every 24 hours; a certain smallish percentage by news magazines, which appear once weekly or monthly; and a very small percentage from scholarly journals and books, which appear quarterly or even yearly. Therefore the ordinary American gets the majority of his or her news through a patently unreliable source. In an era like ours this is unavoidable, but what is not unavoidable is our reaction to such news. By keeping in mind the inverse relationship between speed and accuracy, we can always put a brake onto our emotions when seeing an inflammatory headline. “How did I come by this information?” is a simple question which is not asked anywhere near enough nowadays, but it begs a second question, to wit: “Is this a source which has been consistently credible in the past?” Remember that it falls on us to rate and judge the quality of what we are reading, based on its commitment to journalistic principles. A great deal of fake news could be killed at birth if people would take the half-second it requires to glance at the source of said news. (Example: I have joined numerous WW2-themed “enthusiast” sites on Facebook, which are for historical interest. One of them turned out to be nothing more than a clickbait “front” for a white supremacist website. Had I looked at where the links were headed before I moved my mouse over them, I could have avoided the embarrassment of having “White Resister Dot Com” on my internet history.)
Before I break, I'd like to add here that when I was in law enforcement, I realized the cliche taught to when I was studying criminal justice in college was entirely true: to properly police a community requires that both police and community work together, and take an equal share of the responsibility for keeping that community safe and orderly. The fact this so seldom happens is part of the reason why the police system in the United States does not work properly. So it is with journalism. It's not enough to say that the profession is broken, or enumerate the reasons why, or even trace them to their root causes; we must act on that knowledge. And the knowledge, however bitter it may taste, tells us that we, the reading public, must take a measure of responsibility for the sorry state of the press. We are too ready to believe anything that reinforces our existing prejudices, too eager to dismiss any fact which challenges them. It's a group effort, and we must pull our weight. But I agree that the majority of that weight resides on "them," and in the next installment, I'll discuss what journalists themselves can -- and should -- be doing to restore their profession to its proper place as the Watchdog of Democracy.
The mind of the American journalist
You cannot hope to bribe or twist
the mind of the American journalist.
But seeing what, unbribed, he'll do
Thank God! There is no reason to.
– Humbert Wolfe (paraphrased)
Not long ago I made an assessment of everything which ailed the American press (“Why Journalists Keep Getting It Wrong”), and concluded that journalism in the United States was, for lack of a better word, broken. Specifically I made the following accusations:
1. American journalists don't understand their own profession – its rules and basic standards. Because of their deficient training and knowledge, as well as their personal biases, reporters are unable to see the world as it is, but only as they want it to be.
2. The modern journalist suffers a critical deficiency in his social and political as well as his journalistic education; he is also astonishingly ignorant of the country in which he lives.
3. Journalism-for-profit has debased the profession and led to an increase in sensationalism which has, in turn, further blurred the line between hard news and entertainment.
4. The 24-hour news cycle has fundamentally changed the relationship of the press to the news. Before the cycle, the press reported the news; after the cycle, they created it.
5. Modern reporters lack objectivity, and allow that lack of objectivity to color their stories and their predictions of both specific political events (elections, referendums, etc.) and broad historical-social-economic trends (white working-class anger, etc.).
6. The roots of these deficiencies can be traced to the general failure of our educational system to teach critical thinking to our young people, and to a collapse in the overall standards of journalism (which brings us full-circle to #1).
Conspicuous by its absence in that mass of complaints was any constructive suggestion on how to fix our troubled Fourth Estate. To be honest, it is easier to whine that something is broken than to effect repairs, and so I had to take some time and really look at the problem from all angles. It was, to say the least, a daunting task, rather akin to patching up the Hindenburg with a bicycle repair kit; but there is an old joke in which a student, dismayed by the huge task which awaits her, goes to her master and says, “Where should I start?” And the master dryly replies: “You should start where you are.” So, without further ado, I start where I am...which is at the very beginning.
When you learn to box, the teacher tells you that the sweet science is actually very simple...provided you remember two fundamental rules. The first is “hands up, chin down.” The second is, “Boxing is the art of hitting and not getting hit.” Everything, absolutely everything, which follows in terms of conditioning, tactics, technique, diet, psychological preparation, etc. is merely an outgrowth of those two fundamental truths you learn on the first day – hands up, chin down; hit and don't get hit. If you don't learn those two basics, everything else you do, no matter how well intentioned, will ultimately come to grief. It is exactly the same with journalism. No amount of knowledge, stylistic ability, sincerity, courage, intelligence, industry or integrity will prevent your journalism from sliding into disrepute unless your basics are rock-solid. The first and most serious problem modern journalism faces is that the appellation of “journalist,” which used to have such weight, is now thrown around willy-nilly, like the paper crowns they hand out on New Year's Eve. Anyone with a keyboard and working internet connection who wants to talk about politics or current events is handed a journalistic credential. This is as ridiculous as calling someone a black belt who happens to be wearing a belt which is black. There are basic standards which must be met, and having been met, be maintained, or else the honor is meaningless. In “Das Boot” the commander of a U-boat tells his men that “an Iron Cross is not a medal one receives for a specific act of courage, but must be re-earned continuously lest the recipient lose the right to wear it.” So it is with reportage. A reporter of 30 years good standing who commits egregious ethical or technical blunders on a current story is not forgiven because of his prior good acts; as both Dan Rather and Brian Williams discovered to their sorrow, the honor of the title “news correspondent” must be continuously re-earned. No resting on past laurels is permitted. But a title, like a medal, has no value if you bestow it upon yourself; I can pin all the Iron Crosses and “journalist” mantles on myself that I desire, but it doesn't make me either a war hero or a reporter. In every profession on earth, there is a peerage: a community of those in the same profession who set standards. This applies to stonemasons, electricians, barbers, plumbers, lawyers, physicians, you name it. Journalism has such professional watchdog-entities, but they are increasingly impotent and recessive; they “police” only the dwindling number of “real” journalists and ignore the great mass of people who claim to be reporters but don't have a clue as to what being a reporter actually involves. Therefore the responsibility falls to us, the reading public. The first fix is to acknowledge that we are the watchdogs. Not the press, not the government; we the people. Carl Sagan once said that if only Americans would learn how to think scientifically – “carry a baloney-detection kit,” in his memorable phrase – they could be their own intellectual policemen, and would not need others to rescue them from astrological con men and pseudo-scientific hucksters. But first they had to learn to think scientifically. Well, we have to think journalistically. So let's get at it.
In order for a news article to qualify as such, it must meet a certain minimum standard in regard to the questions it answers. That standard, known as “The 5 Ws and the H,” has remained essentially unchanged for generations of reporters. It is completely objective and must be applied regardless of the reporter's personal feelings or political leanings. It boils down to six clear-cut questions. They are:
Who?
What?
Where?
When?
Why?
How?
Any newspaper story which does not ask and answer these questions, insofar as it is humanly possible to do so, is not legitimate journalism. What's more, these questions must be answered as swiftly as possible, preferably within the first two paragraphs. For example:
Buffy Anne Summers (who) stabbed a vampire (who 2) through the heart (what) with a stake (how) last night (when) in a cemetery in Sunnydale, California (where). When questioned, Miss Summers, 20, identified herself to authorities as a “vampire slayer” placed on earth to fight the forces of evil (why).
Sometimes it is not possible to answer all six questions due to lack of information. This is perfectly acceptable provided that every effort is made to answer those questions which are, in fact, answerable and to ask knowledgeable authorities to speculate about those which are not. For example:
Colonel Mustard (who) was murdered (what) in the library (where) with a candlestick (how) sometime last night (when) by Miss Scarlet (who 2).
This answers, in a single sentence, everything about the murder but “why,” which may be unknown at the time of Miss Scarlet's arrest. Therefore a reporter could supply as much of the “why” as possible by quoting the police spokesman: “police have no motive for the crime at this time, but friends and family speculate it may have been due to Col. Mustard's excessive flatulence at the dinner table.”
Okay, so I'm using facetious examples here to lighten the mood, but only because real-life examples of proper journalism are becoming increasingly hard to come by. In 2014, California state senator Leland Yee was arrested by the FBI for his involvement “in an extensive criminal conspiracy to traffic guns and drugs, launder money, assassinate for cash, and influence public policy.” (Vice) When the L.A. Times broke the story, they consistently refused to identify Yee as a Democrat; only a torrent of outraged e-mails and messages sent by their readership forced this left-leaning newspaper to admit that yes, a (D) had been buying guns from the Italian mafia to sell to the Chinese triads. I hardly think it conspiracy-theorizing to say that if he had been an (R), the Times would have trumpeted his political affiliation to the very heavens. The leftist sentiments of the Times are well-known and self-admitted, and not unethical as such (every newspaper and news entity in the world has a political slant of some sort), but that is not an excuse to excise information inconvenient to their politics from their reportage, which, unlike their op-ed pieces, must be objective (we will return to this later). What's important about the Yee example is that the absence of his political affiliation from the story was noticed immediately by thousands of people, and subsequently corrected, but only due to their continuous pressure. This incident gave me hope that the public can play the Watchdog successfully – provided, of course, that we continue to educate ourselves and our neighbors (and most especially our children) as to what constitutes “acceptable” journalism.
Any time you read a news article, it is important to ask yourself, after the second paragraph or so, whether a direct, clearly-worded effort has been made to answer these six fundamental questions. A reporter who fails to do so by the third or fourth paragraph at the latest is almost certainly negligent in his duties. A reporter who leaves any of the questions out without making a diligent attempt to answer them has simply failed in his job, either deliberately or through incompetence. Again, it may seem condescending or pretentious to offer such basic instruction, but the decline in journalistic standards has been so precipitous, so all-encompassing, and so extended in its duration – this has been going on for decades, folks – that many otherwise intelligent people are no longer even aware that these standards exist. They have been so inundated by shoddy, incomplete, unobjective journalism that they can no longer discriminate; indeed, many can put down the newspaper without realizing how little of the story they have been told, or how badly they have been told it. So: the first task in fixing journalism lies equally in the hands of both reporters and the reading public. Reporters must remaster the fundamentals of their profession, and practice clarity and brevity in their writing, so that they dispense the maximum amount of information in the minimum number of words; readers must satisfy themselves that in every news article they read, these six questions have either been answered, or that every effort has been made to answer them. And here is the crucial part: you must train yourself to ask this question every time you read a hard news article, regardless of the subject or the source. This takes mental discipline, which is precisely what is lacking in journalism itself nowadays, and more than mental discipline, it takes a measure of ruthlessness. For we all have our favorite news outlets, whatever they might be, and if we should discover that our favorite is not delivering the goods, then we must cut the cord right then and there, and look elsewhere. This is not as easy as it sounds; humans are creatures of habit, and Americans have been conditioned (in many cases, willingly) to take up residence in an echo chamber which permits them to hear only their own opinions delivered back to them at maximum volume, and never mind their accuracy. Leaving the echo chamber is not easy, nor is it comfortable; but it is necessary. The entire concept of “pick and choose” news is antithetical to the central purpose of journalism, which is finding the truth. Which often hurts.
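If you want to make that drill concrete, here is a minimal sketch in Python: a toy of my own devising, not anything a newsroom actually uses. You record whether a story's opening paragraphs answered each of the six questions, and the checklist hands back a verdict in the terms above.

```python
# A toy version of the reader's 5W1H drill. Nothing here is an actual
# newsroom tool; the function and its verdicts are a hypothetical sketch.

FIVE_W_AND_H = ("who", "what", "where", "when", "why", "how")

def grade_story(answers):
    """answers: dict mapping each of the 5Ws and H to True (answered) or False."""
    missing = [q for q in FIVE_W_AND_H if not answers.get(q)]
    if not missing:
        return "Legitimate lede: all six questions addressed."
    return ("Missing: " + ", ".join(missing) +
            " -- acceptable only if unanswerable (with informed speculation "
            "supplied); otherwise the reporter has failed in his job.")

# The Buffy example above answers everything:
print(grade_story({q: True for q in FIVE_W_AND_H}))

# The Colonel Mustard example leaves only "why" open:
print(grade_story({"who": True, "what": True, "where": True,
                   "when": True, "how": True, "why": False}))
```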
As I mentioned above, the ability to discriminate between “hard news” and “editorial/opinion” is fundamental to legitimate journalism, and indeed, the line between the two is not fine. It is more like the Berlin Wall: a big sonofabitch made of concrete and barbed wire you can see from space. This Wall was built with both conscious and noble purpose; it is important, actually vital, to know whether a reporter is speaking objectively in the hopes of informing you about facts, or whether he is speaking with a personal agenda and trying to persuade you to come around to his point of view. In a legitimate news organization, the Wall is maintained in such a manner that you could never mistake news for op-ed, or op-ed for news; but in the last 25 years, through a quite deliberate and short-sighted effort, said barrier has been all but obliterated. Nowadays there is often no difference at all between “hard news” (which uses the 5WH as its foundation stone) and “op-ed,” which by its nature is unobjective and biased. I can scarcely open the Los Angeles Times or Washington Post without being confused as to whether what I am reading falls in the former or the latter category. Often this sleight-of-hand is achieved subtly. A left-wing paper refers to “undocumented immigrants” or simply “immigrants,” as if there were no distinction between an illegal immigrant and a legal one. This is part of a political agenda, and should be recognized as such. On the other hand, a right-wing paper refers to the former president by his full name, “Barack Hussein Obama,” knowing that the name “Hussein” will inflame many of its readers, and for no other reason. This, too, is part of a political agenda. And while I don't fault news organizations for having political agendas, I do fault them for expressing those agendas outside the walled-off arena of opinion-editorial, for it is precisely at the moment we can no longer tell the difference between what we want and what we know that we begin to venture into the dreaded area of the map known as “fake news.”
It so happens that the definition of fake news is not as rigid as one might think. The Orwellian idea of the completely fabricated news story, which has no relationship with the truth, “not even that implied by an ordinary lie,” is less common than the deliberate, systematic misinterpretation of existing news – the surgical removal of statements or images from their context for the sole purpose of misleading the reader or slandering an individual. Indeed, in recent times we have been confronted with the idea, once discredited as a conspiracy theory, that much of the news we read is either a) untrue or b) taken out of context to the point where it may as well be untrue. This sort of thing has always taken place, but such incidents used to be exceptional; now they are so common that they go almost unnoticed. In the face of such systematic “warping” of the truth we seem helpless, for we have little in the way of trustworthy, independent means to verify anything we cannot ourselves see and hear; but the truth is that our position is quite strong. The trouble lies in the effort and patience necessary to counteract this sort of twisted reportage. To fix this aspect of broken journalism, we must take the crucial step of identifying and categorizing the news, not by its political leanings but by its reliability. And the way to do this is by judging its speed.
Yes, I used the word speed. It has long been established, though not widely understood, that there is an inverse relationship between the speed at which information is disseminated and its accuracy. For example, many years ago the U.S. Navy frigate Stark was hit by an Iraqi cruise missile while on patrol in the Persian Gulf. The initial report was that the damage was slight, “only” one crewman had been killed, and the ship was proceeding under its own power back to base for repairs. In truth, 37 sailors had been killed and the ship was dead in the water and burning; but it took days for this tragic fact to emerge from behind the literal and figurative smoke. Initial reports are by definition the most unreliable; if one wanted the full, objective facts, unearthed by exhaustive investigation and subjected to professional analysis, one had to wait several months, for the official Navy report – which was then subjected to yet further scrutiny by the press and its experts. Between the initial report and the “final” one, the key factor was not sincerity but time. Anything worthy of a news report usually involves some level of chaos or upset, and that chaos and that upset distort the truth the way a stone distorts the surface of a pond. The “real story” is often impossible to ascertain at first glance, no matter how diligently we try. We only know that something – an earthquake, a missile attack, a riot, a fire, an assassination – has occurred, and later, often much later, the details begin to filter in, conflicting with earlier accounts and sometimes making nonsense of them. Therefore the news source which is the fastest is also the least reliable. Understanding this fact is the key to grasping what much of supposedly “fake” news is, and how we can combat it.
Excluding word of mouth, there are five basic means by which we receive information. They are:
1. The internet
2. Newspapers
3. Magazines
4. Books
5. Peer-reviewed scholarly journals
I have stacked these means from fastest to slowest and least to most reliable, but also, without trying, in order of popularity. It so happens that the vast majority of Americans get their news from the internet, which is an instantaneous means of communication; a substantial minority from newspapers, which appear every 24 hours; a certain smallish percentage from news magazines, which appear weekly or monthly; and a very small percentage from scholarly journals and books, which appear quarterly or even yearly. Therefore the ordinary American gets the majority of his or her news through a patently unreliable source. In an era like ours this is unavoidable, but what is not unavoidable is our reaction to such news. By keeping in mind the inverse relationship between speed and accuracy, we can always put a brake on our emotions when seeing an inflammatory headline. “How did I come by this information?” is a simple question which is not asked anywhere near enough nowadays, and it raises a second question, to wit: “Is this a source which has been consistently credible in the past?” Remember that it falls on us to rate and judge the quality of what we are reading, based on its commitment to journalistic principles. A great deal of fake news could be killed at birth if people would take the half-second it requires to glance at the source of said news. (Example: I have joined numerous WW2-themed “enthusiast” sites on Facebook, which are for historical interest. One of them turned out to be nothing more than a clickbait “front” for a white supremacist website. Had I looked at where the links were headed before I clicked them, I could have avoided the embarrassment of having “White Resister Dot Com” in my internet history.)
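The relationship can even be sketched in miniature. In the toy table below, the five media and their ordering come from the list above; the intervals and the "reliability" scores are invented placeholders, since the only point being illustrated is that the two quantities move in opposite directions.

```python
# Illustrative numbers only: the five media and their ordering come from
# the list above, but the intervals and "reliability" scores are invented
# placeholders meant solely to show the claimed inverse relationship
# between speed and accuracy.

SOURCES = [
    # (medium, rough hours between editions, assumed reliability from 0 to 1)
    ("the internet",           0.01, 0.2),  # effectively instantaneous
    ("newspapers",            24.00, 0.4),  # daily
    ("magazines",            168.00, 0.6),  # weekly, some monthly
    ("books",               8760.00, 0.8),  # roughly a year in the making
    ("scholarly journals", 13140.00, 0.9),  # quarterly, plus months of peer review
]

# Sorting fastest-first reproduces the least-reliable-first ranking:
for medium, hours, reliability in sorted(SOURCES, key=lambda s: s[1]):
    print(f"{medium:20s} ~{hours:9.2f} h/edition -> assumed reliability {reliability}")
```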
Before I break, I'd like to add here that when I was in law enforcement, I realized that the cliche taught to me when I was studying criminal justice in college was entirely true: properly policing a community requires that both police and community work together, and take an equal share of the responsibility for keeping that community safe and orderly. The fact that this so seldom happens is part of the reason why the police system in the United States does not work properly. So it is with journalism. It's not enough to say that the profession is broken, or enumerate the reasons why, or even trace them to their root causes; we must act on that knowledge. And the knowledge, however bitter it may taste, tells us that we, the reading public, must take a measure of responsibility for the sorry state of the press. We are too ready to believe anything that reinforces our existing prejudices, too eager to dismiss any fact which challenges them. It's a group effort, and we must pull our weight. But I agree that the majority of that weight resides on "them," and in the next installment, I'll discuss what journalists themselves can -- and should -- be doing to restore their profession to its proper place as the Watchdog of Democracy.
Published on February 24, 2017 16:35
February 22, 2017
PA vs. CA: A Tale of Two States
Here are some of the differences between the Pennsylvanian and the Californian as observed by someone who is not native to either state, but has lived ten years in each:
PA = touchy and tense. CA = relaxed and easygoing.
CA = concerned over LGBTQ. PA = thinks LGBTQ is either a rib joint or some kind of prog-rock band.
PA = wears shorts when it is 47 degrees. CA = wears parka when it is 68 degrees.
PA = will knock on your door at 3 AM to tell you you're an asshole. CA = will smile at you in the hallway and then send you a text message at 3 AM telling you you're an asshole.
CA = knows an actress, a model, a comedian, a writer, and a musician, and none of them have jobs. PA = knows a welder, a mechanic, a construction worker, and a waitress, and it's the same fucking person.
CA = thinks you're a tool if you drink cheap beer. PA = thinks you're a tool if you don't.
PA = unpretentious to the point of being slovenly. CA = pretentious to the point of being ridiculous.
PA = poorly educated, and still emotionally in high school. CA = highly educated, and still emotionally in pre-school.
PA = will go out of their way to tell a person they don't like them. CA = will go out of their way to pretend they like someone they can't stand.
PA = buys weed from sleazy drug dealer in an alley near the park. CA = buys weed from a licensed Pot Dispensary while driving kids home from day care.
CA = pickup in front of you on the freeway most likely filled with Mexican illegals. PA = pickup parked next to you at Denny's most likely filled with dismembered parts of 12-point buck.
PA = ancestors fought for North but wears Stars 'n Bars T-shirt. CA = parents' fortune founded on exploitation of poor, but wears Che Guevara T-shirt.
CA = tormented over "white privilege." PA = tormented by complete absence of anything that could be remotely called a privilege, never mind the goddamn color.
PA = poor but tries to look wealthier than he is. CA = rich but pays $350 for ripped jeans.
CA = threatens to sue. PA = beats you with a pipe if you threaten to sue. ("In for a penny, in for a pound.")
PA girls = I like you, I'll fuck you. CA girls = I hate you, I'll fuck you (if you're a casting director).
CA = Thinks NASCAR is a hybrid. PA = Thinks NASCAR is a religion.
PA = is 27 but looks 40. CA = is 50 but dresses like a teenager.
CA = Thinks Negan is a cool character. PA = Confused why their Uncle Charlie is on "The Walking Dead."
PA boys = Is a computer programmer and wears a John Deere hat, but will beat you like a red-headed stepchild. CA boys = Wears Affliction shirts and spends 3 hours a day in the gym, but will piss himself if challenged to fight.
CA = tries to impress you by name-dropping pop stars he knows. PA = tries to impress you by dropping 20 shots of Jaegermeister at happy hour.
PA = Wears "L.A." hat, has never been to California. CA = Wears "Compton" hat, would befoul trousers if they came within 5 miles of the place.
CA = sees public service ad promoting gay tolerance and is moved. PA = sees same ad and is moved...to hurl ashtray at television.
PA = sees no contradiction between rebel flag on front porch and black best friend. CA = sees no contradiction between liberal political views and crossing to the other side of the street when he sees a black person.
CA = claims to care about environment but leaves organic, non-GMO, gluten-free, vegan-approved garbage at state park. PA = recycles cans but shoots deer.
CA = hates socialism because he lives in it. PA = hates socialism but has no idea what the fuck it is.
PA = still talking about the time Charlie Sheen came to town nine years ago. CA = still talking about sleeping with Charlie Sheen three days ago.
PA (and MD) = Says they'll be there at 6:30, shows up at 6:25. CA = Says they'll be there at 6:30, shows up at 8. Then gets out of car and says, "Traffic."
-----------------------------------------------------
As a bonus, I'd like to add a difference between CA, NY/NJ and PA:
NY/NJ = brags about non-existent Mafia connections.
CA = brags about non-existent movie industry connections.
PA = brags about the time Dale Earnhardt puked all over his shoes, but isn't lying.
Published on February 22, 2017 19:38
February 11, 2017
Damn Baby Damn
The following was posted to Facebook some time ago. I'm reposting it here because it's raining here in Burbank, an event about as rare as a total eclipse, and it got me strangely nostalgic for bad weather. Which got me nostalgic for the bad weather of Pennsylvania. Which got me nostalgic about college. Which got me nostalgic about intoxication. Which led to me opening that bottle of Jameson.
Maybe it's the whiskey talking, but I heard The Dream Academy's "Life in a Northern Town" on the radio tonight and I was suddenly hit with a terrific gut-shot of nostalgia for my college days. Unlike most people I've actually gotten less rather than more sentimental as I've gotten older, probably because I managed to extend my adolescence to superhuman lengths, so what is there to be nostalgic about? But goddamn. Goddamn baby those cold Pennsylvania nights. The rain that soaked your acrylic fraternity jacket. The snow that had only half-melted on the tips of your Timberlands when class was over and you had to slog back out into that shit again. The smell of frying chicken wings and cigarette smoke and last night's beer. The way the skin on your knuckles looked after you punched someone in the mouth. The deep scarlet cut on the lip where they'd punched you. The graffiti carved not written carved with the tip of your dormitory key into the wood over the urinal at Murph's and maybe amid all those scratches is your name. The cases of American Light that went for $5.19, so there was enough left over from that sawbuck to buy two packs of Marlboro Reds and a taco on the ride home. The way you had to arch your feet walking over the bathroom linoleum so they wouldn't stick to the floor. Those arguments you had at three in the morning in the alley with that girl that was never quite yours and twenty years later you wonder if she still thinks about you as much as you still think about her, because why doesn't anything now feel the way that did, that moment in the alley in the rain? The way the moisture crawled down the walls of the Depot and the crunch of the glass beneath your boots as you staggered out onto the dance floor with whatsherface from that class you sometimes went to but mostly didn't. The mason jars full of flat beer and the broken bottles of Rolling Rock and the math textbook you never opened and sold for beer money the last week of school, and what, you're only giving me a buck for it, you motherfuckers? Oh well that's two tacos and you like tacos and McNuggets and the ranch dressing and Denny's. And the taste of cigarettes and wine on that girl's tongue. Walking down Jackson to Murph's on those afternoons in the late fall when the air was just so crisp and clear and the sun going down through the trees over the creek behind you looked like a molten coin, six of us in our black jackets and our jeans and lace-up boots, and when you went inside the air was as heavy as an old cavalry blanket but you liked that smell too, and you liked the way the foam clung to the sides of the empty beer pitchers sitting on the edge of the pool table, and you liked the blonde bartender and the glint in her eye that promised and promised and never delivered. The milk jugs of beer and the hammers of malt liquor. The red lightbulb I'd screw into my lamp over the bar in my room, and don't ask me what bar because you wrote your name on it. All of you did. Days that began at sundown. Nights that didn't end until three in the afternoon. That fucking juke box at Murph's. Some jackass always had to play "The Devil Went Down to Georgia" on it but that was better than the girls singing fucking "Brown-Eyed Girl" or "Paradise by the Dashboard Light" for Crissake. And the couches that had so much old weed in the cushions you could pack a bowl with it and sometimes did. And smoked that shit and it was good even when it was bad. And it was bad.
And Baywatch and 90210 (Miles you look just like Brandon with those sideburns no you look like Dylan with that big forehead) and Tuesday Night Fights on the television, and Mike Tyson's Punch Out and Mortal Kombat in all of its forms. And watching Michael Jordan play with the Bulls. And going to the Dive in the watches of the night when you were so drunk the food fell off your fork on the way to your mouth, and the next day you had a row of marks on your lower lip where you'd stabbed yourself with the tines, you dumbass. And standing in those long thin lines by the intramural field watching the rugby games or the football games or the soccer games, and the girls in their white jackets and their tight jeans standing hip to hip to hip. And the way your triceps would quiver when you waited too long to carry your laundry basket to Murph's Other Suds or the Jackson Street Laundromat and fuck, how many blocks do I have to carry this thing it's got every piece of clothing I own. And BBQs with those net bags of cherrystone clams and the kegs in those tubs packed with ice. And Greek Week and Exam Week and Hell Week. And Campbell Hall and that one Coke machine that gave you fountain Coke and it was so sweet it was worth going to that horrible 7 - 10 just so you could have that sucker in its plastic cup with the shaved ice. And that fucking fountain out front. And that fucking rock where you painted your name when you graduated and then they chipped it off, and next year somebody painted his name on the same spot and they chipped that off too. And this and this and this, and other things, little things, things that didn't matter then but seem to matter now. Like when was the last time you climbed out your window onto the flat tin roof three stories up, and drank some beer there? When was the last time you walked down an alley eight blocks long and pissed on a fence? When was the last time your female roommate played the soundtrack to "Grease" so goddamn much you flung the CD into the snow when she wasn't looking, but the joke was on you because three months later the snow melted and she found it lying there on the tin roof and put it back in the player and it worked just fine, and look at me I'm Sandra Dee? When was the last time you walked up the street six deep to the Sunday meeting, and we need to get this business concluded because kickoff is at one o'clock sharp, brothers? When was the last time you saw a man and a woman fight over a Taco Bell coupon, and use their fists? When was the last time you went out on a Tuesday night for pitchers and pool, or played touch football, or got into a street fight, or sang "Piano Man" on the dance floor with twenty sweaty drunks, or made out with a total stranger in the rain? I don't know. I'm not nostalgic so I don't know. It's the whiskey talking. Certainly it is the whiskey.
But damn baby damn.
Published on February 11, 2017 10:53
February 7, 2017
Authors Who Inspire
In the last week I've been wearing many hats -- sketching ideas for blogs, working on a new novel, redesigning my website, and taking advantage of the freakishly wet weather in California to play outdoors. (When you live in a semi-desert climate which has been experiencing an "exceptional" drought for the past five years, rain is always a novelty, but nowadays it's practically a unicorn-sighting.) However, since I have a minute to myself just now, I thought I'd take a moment to talk about a subject dear to my heart, and no doubt to the heart of everyone on this website...their favorite authors. More specifically, I wanted to talk briefly about the authors who inspire me as a writer.
It's important to understand that there is a distinction between enjoying an author's work and being inspired by it. Many authors have entertained me without influencing my style. I enjoyed J.K. Rowling's "Harry Potter" series as a reader, but I can't say she inspired me as a writer. I'm a huge fan of George R.R. Martin's "Game of Thrones" saga as well, but again, I don't believe Martin has actually exerted what could be called a literary influence. What I'm talking about here are the authors whose fingerprints are (metaphorically) all over my own work, the folks who, for lack of a better phrase, taught me to write. And at the risk of destroying my own credibility before we even get properly underway, I have to say I'm going to start with some books which will probably surprise you.
The first time I can remember being stirred by the power of an author's prose was when I slipped a biography called CAPONE, by John Kobler, out of its niche on my father's bookshelf and tentatively scanned the opening chapters. Though Kobler was a historian, his command of the English language was such that opening the book was like falling through a hole in time -- not only could I see what he was describing, I could feel it, taste it, touch it. And this when I was probably still shy of ten years old. I credit Kobler in part with giving me a taste for evocative description in fiction -- even though, as I said, he was writing a historical biography.
Tied with Kobler, or a close second, was an author named Joe Silva (this is probably a pen name) who wrote -- don't laugh, damn it, if you can admit to reading the opening volumes of "Harry Potter" you have no right -- a Captain America novel called HOLOCAUST FOR HIRE. My older brother bought this lurid-covered little volume in the late 70s, and I cracked it open one day out of sheer curiosity. Though it is obviously a YA novel, and a short one at that, I was deeply impressed by Silva's command of the language -- the way he painted alluring word-pictures on the one hand while writing, on the other, in a highly economical vein that never let the story, or the reader, take a rest.
Another YA novelist, Frank Bonham, grabbed my attention while I was still in junior high school, with such disparate works as THE GHOST FRONT and DURANGO STREET. Bonham is one of those authors who writes so realistically that even when his stories wander into implausibility, you feel as if you're there, and that everything happening is happening to you. He achieved this by getting the little details right, which is probably the hardest thing for any novelist to do. He was also a first-class researcher, which is why he was able to achieve equal verisimilitude in a novel about the Battle of the Bulge and another about a young black kid struggling with inner-city gang life.
I'd be lying if I said that Sir Arthur Conan Doyle didn't influence me in very much the same way. Though his prose most definitely belonged to the period in which he wrote (the late 19th and early 20th centuries), he too could make marvelous use of English to paint word-pictures of people and objects. The best of the "Sherlock Holmes" stories (THE SIGN OF FOUR in particular) always leave me with a feeling that I have just stepped out of Victorian London and need to shake out my umbrella. At the top of his game, Sir Arthur was able to leave me feeling as if I could almost touch whatever he was describing -- be it a jewel, a brass pocket watch, or a person. I credit him as being one of the authors who triggered my lifelong affair with what we call "atmosphere."
During my teenage years my influences expanded. I struggled through Frank Herbert's DUNE on the second or third attempt -- the subject matter was above me until I hit about fifteen or so -- and came out realizing that one could employ entirely different techniques from those I'd become familiar with and still leave vivid images in the reader's mind. Specifically, Herbert used a kind of "subtractive drawing" method in descriptions, especially physical ones. That is to say, Herbert often left us with strong impressions of how characters or objects looked, yet when one looks back on the passages in question, there is little in the way of actual, direct description. He describes around the person, one might say, triggering the imagination of the reader and letting him or her fill in crucial details. He applied this method as well to the universe which the characters inhabit, revealing many details but withholding many others, so that once again, the reader willingly fills in the gaps. This style might be called non-linear, oblique, selective, or impressionistic, and while it's a bit risky -- if you fail, you fail badly, and Herbert sometimes did, especially in his later novels -- it achieves its object brilliantly. What Herbert did for me was show that there were alternatives to the direct approach. Which brings me to Tom Clancy.
Everyone who lived through the 80s probably had a go at at least one Clancy novel, and I think I must have read five or so while the Cold War was still reasonably hot. I'm almost embarrassed to admit it now, but Clancy did have a profound, if brief, effect on my own developing style during this time. Like, say, Robert Heinlein, Clancy provided very little in the way of physical description in his novels, and almost no use of color. For someone who had been obsessed with these things, it was jarring to read a writer who didn't seem to give a fuck about them.
While still a teenager, I experimented a great deal with trying to "describe without describing" by talking about secondary details such as haircuts, rings, equipment, accents, hand-gestures and so forth, and while I eventually abandoned the experiments, they remain useful to me to this day. There are times when an author has neither the room nor the desire to describe a character, yet feels he must leave a strong impression nonetheless. The methods of both Herbert and Clancy, though they could not be more different from one another, are both useful in this regard.
Another enormous influence on me, particularly in terms of dialogue, was Thomas Harris, author of BLACK SUNDAY, RED DRAGON, and THE SILENCE OF THE LAMBS. Though it's impossible to credit Harris without crediting his spiritual mentor, Hemingway, Harris took the basic Hemingway techniques -- stripped-down prose, few adjectives, restrained punctuation, terse description, use of simple words to evoke imagery, and of course the "Iceberg Theory" that less is more when it comes to just about everything -- and refined them for the later 20th century. Simply put, Harris wrote the first novels in which the dialogue generally sounded "real" to me -- that is, the way people actually spoke. (This is separate from "good dialogue," which is pleasing to read but not necessarily realistic.) What's more, his sense of restraint on backstory blew my mind. He gave us glimpses and the occasional reveal about our key characters, but kept exposition to an absolute minimum. It was the ultimate example of "showing, not telling" and to this very hour I struggle to emulate him in this regard and curb the impulse to tell the reader too much, too soon. Like Herbert, Harris wrote popular fiction in a literary vein, which ultimately became the key element of my own style.
Two German writers I read (in translation) while still a teen who inspired me enormously were Hasso G. Stachow and Lothar-Günther Buchheim. The former wrote IF THIS BE GLORY, the latter the much better-known DAS BOOT. Stachow, though not the best structurally speaking, is probably the best prose-writer I've ever read in my entire life. His descriptive powers are nothing short of awesome, and I flatter myself that at my best I can do 75% of what he did in his terse little "novel" (it's really a true story written in third person) about a young German soldier fighting in Russia during WW2. Buchheim's "novel" (based on his own experiences) about a U-boat officer, fighting on the same side, in the same war, is equally awe-inspiring, though in a different way. Buchheim, though masterful in description, achieves his apotheosis in dialogue. It's not only witty, sarcastic, and brutally vulgar, it's also so realistic you'd swear you can hear pissed-off German sailors discussing their X-rated shore-leave antics while cowering from depth-charges. And since I mentioned it, the depth-charge sequence in the book, which goes on for 26 pages, is probably the most harrowing thing I've ever read. A masterpiece of atmosphere, description and human psychology.
Three authors who were terrific at "bringing me there" were George Orwell, Lawrence Sanders and P.J. Caputo. Orwell's novels 1984, KEEP THE ASPIDISTRA FLYING, and COMING UP FOR AIR are all amazing in their ability to make you feel whatever it is the protagonist is feeling, be it loneliness, hunger or the stickiness of his socks. Lawrence Sanders, a prolific novelist of the 70s and 80s, also had superlative descriptive powers -- the opening chapters of THE FOURTH DEADLY SIN are as evocative as anything you'll ever read. Likewise, P.J. Caputo's memoir of Vietnam, A RUMOR OF WAR, might as well be a time machine. You are there, whether you want to be or not.
My biggest weakness as an author has always been structure, so I remain perpetually in awe of Howard Fast's classic SPARTACUS. In addition to being a fine if heavy-handed "period" novel about the eternal conflict between the desire of man for freedom, and the desire to exploit one's fellow man for profit, this book is a towering masterwork of structure. It begins near the end of the story, long after the Spartacus Rebellion has been crushed, and then unfolds both in the "now" and through a series of extended flashbacks told from differing perspectives. There is no attempt at linear storytelling except via the main narrative, and yet by the end all the character arcs (and there are many) have been neatly and finally resolved, and the story told. It's a remarkable feat, and part of the reason why I reread this book once a year.
Ernest K. Gann wrote two books that affected me greatly: THE ANTAGONISTS (which later became a lavish mini-series called MASADA) and IN THE COMPANY OF EAGLES. The former is impressive for bringing drama to the story of a long siege during which not much fighting occurred -- Gann makes the story about the opposing generals, Flavius Silva and Eleazar ben Yair, and the struggles each must face while commanding armies in such a horrible place as the Judean desert. The latter, while a deeply flawed book in some ways, was the first time I ever saw an author use the device of having entire chapters written using dialogue alone. I'm always delighted by new approaches to storytelling, and this one was a gem.
From Herman Wouk, author of THE WINDS OF WAR and WAR & REMEMBRANCE, I learned a great deal, not least of which were the possibilities inherent in historical fiction told on a grand scale. These two huge novels encompass the years 1939 - 1945, i.e. WW2, and are told with an overlapping cast of characters spread all over the world. Though the Henry, Jastrow and Tudsbury families are fictitious, the way they interact with such historical figures as Adolf Hitler, Benito Mussolini, Josef Stalin, Winston Churchill, and various lesser generals, politicians, businessmen, etc. of the day seems entirely realistic. Wouk sets out to do nothing less than tell the story of WW2, and insofar as that is possible, he actually achieves it. Epic storytelling seems to be on the decline in recent decades (notwithstanding the popularity of "Game of Thrones") but Wouk handled it beautifully, and more importantly from my point of view, showed me that it could be done at all.
In the last few years I've discovered the dark pleasure of Clive Barker's work, which impresses me -- the little that I've read of it, anyway -- in its fearlessness. Barker is a man with a very checkered past, including a lot of sexual confusion and debauchery in his early life, and something of that experience colors all of his work. There is a strong charge of frustrated and sometimes perverted sexual energy running through his stories, even those stories which don't touch even lightly on sex. From THE BOOKS OF BLOOD and to some extent THE DAMNATION GAME I took away a refusal to be cowed or intimidated by the subject of sexuality in my own work. What's more, Barker writes absolutely first-class descriptive prose when he wants to, stuff that I could barely dream up, much less execute. He has a refreshingly honest approach to horror, viewing it not as an end in itself so much as a means by which fundamental truths about life and humanity can be taught. Conversely, his novel THE HELLBOUND HEART (the basis for the movie HELLRAISER) is a tour-de-force of literary minimalism. Unlike DAMNATION, which was brilliant but bloated, HELLBOUND uses no more words than are absolutely required to tell the grim and grisly tale. A writer who can use multiple approaches to storytelling, who isn't locked into a single gear, always impresses me.
The British author Derek Robinson also influenced me deeply, most notably with PIECE OF CAKE, his cynical, myth-debunking take on the Battle of Britain. This hefty tome of a novel takes us through 365 days of war, from September 1940 to September 1941, using a rotating cast of characters constantly being changed by virtue of death. Robinson's dialogue is often marvelous, and his mastery of point-of-view, which can be so odious to authors writing big, sweeping stories, nails down what might have become a meaningless jumble of individual stories into a single, unifying narrative.
I see here I haven't selected any female authors, so I'll mention Ursula K. le Guin, whose THE LEFT HAND OF DARKNESS showed how it's possible to incorporate artistic techniques into novel-writing. In art, it is possible to paint or draw using anti-intuitive techniques such as pointilism, where the artist paints not with strokes but with dots, or subtractive drawing, where the eraser and not the tip is the key element in creating the picture. (There is even an approach, whose name I can't remember, in which you draw an object by drawing everything around it and leaving the actual object space completely blank.) Le Guin told her story by refusing to stick to any one method of storytelling. Some of the chapters are first-person in the present tense, some told via a journal in past tense, some by interludes which explain the mythology of the planet and society in question without explaining who the narrator is, and through it all there is a shift in perspective between several different characters. It shouldn't work, but Le Guin pulls it off, and in pulling it off, showed me that outside-the-box thinking can do a lot for a novel if it is executed to serve the story and not for its own sake.
I'd be lying if I didn't say that Anne Rice's INTERVIEW WITH A VAMPIRE, which I finished at some point or other in my early 20s after a false start or two, didn't influence me, either -- if only by virtue of re-igniting my taste for Victorian and Gothic-style prose. If writers like Hemingway and Harris have an exact opposite, it's Rice -- at least in the approach she took with her baroque vampire odyssey. Throwing open the sealed-shut doors of the 19th century style, she dusted off a rich, ornate, slow-moving detail-heavy style of storytelling that, even when it went off the deep end into massive internal monologues that sounded like free verse poetry, struck me almost senseless with its beauty. This style doesn't really suit many subsets of fiction-writing, but it was practically made for the vampire genre, and for any historical genre set before, say, 1914. Thanks to Rice I now have this tool in my toolbox, and while I seldom use it, it's nice to know it's there.
Tami Hoag, though she's mainly known for romantic suspense potboilers, is very good at what she does. Her two-book "Deer Lake" series (NIGHT SINS, GUILTY AS SIN) which I read about five years ago after seeing a TV movie of the former, impressed me deeply as examples of how to employ "other elements" as characters in a story -- in this case, a town (Deer Lake) serves as one of the chief antagonists, while the overall atmosphere of suspicion, paranoia, fear and dread caused by a mysterious kidnapping, become yet another. Hell, even the ferocious cold of Michigan in winter is employed to menacing effect, becoming yet a third opponent for our troubled heroine, Megan O'Malley. At the risk of beating a very dead horse, I'm a sucker for atmosphere; I believe it can make a bad project good and a good one better, and whatever the shortcomings of romantic suspense novels, with their tendency toward predictability and formula, Hoag delivered two immensely readable novels in which one is terrorized in broad daylight in a generic American town.
I thought Matthew Stover did a helluva job on his novelization of REVENGE OF THE SITH, utilizing a rich and varied prose to overcome the many natural weaknesses of writing novelizations of films. In a novelization, the author is usually forbidden by virtue of the studio from describing his characters very well (if at all), engaging in long internal monologues, or even employing much color. Yet sometimes laboring under constraints makes us better by forcing us to make the most of the limited canvas upon which we paint. The "Star Wars" universe never got richer treatment of characterization, better-written action sequences, or a keener and more philosophical analysis of the Force in action, than it did in this surprising gem of a book.
Comac McCarthy (NO COUNTRY FOR OLD MEN) and Frank McCourt (ANGELA'S ASHES, 'TIS) both employ an unusual style in their novels which does not utilize quotation marks, get deeply into description, or take the narrative past the surface level. It can become tiring at points, and certainly limits the writer in terms of what they can actually describe, yet it does produce a distinct effect on the reader's mind -- an intensely personal effect, as if you were eavesdropping on these characters rather than simply reading about them. I haven't yet employed this method in any of my own novels, but I have used it on a smaller scale within them or in short stories, and one day intend to try my hand at it. In the mean time, it's nice to know this technique does exist.
These authors I've mentioned are by no means all-inclusive on my list of inspirations -- I could probably name as many non-fiction authors who've had a hand in developing my style as novelists -- but I think they give you a pretty good idea of where my own style came from, and what moves me as a student of the craft. But it's important to note that this fund of names is always being increased: as I continue to read, I continue to be influenced and inspired by "new" authors (meaning ones I hadn't read before). Granted, sometimes I'm inspired only by virtue of saying, "Christ, this sucks! A monkey chained to a typewriter could do better!" but that's another subject entirely.
It's important to understand that there is a distinction between enjoying an author's work and being inspired by it. Many authors have entertained me without influencing my style. I enjoyed J.K. Rowling's "Harry Potter" series as a reader, but I can't say she inspired me as a writer. I'm a huge fan of George R.R. Martin's "Game of Thrones" saga as well, but again, I don't believe Martin has actually exerted what could be called a literary influence. What I'm talking about here are the authors whose fingerprints are (metaphorically) all over my own work, the folks who, for lack of a better phrase, taught me to write. And at the risk of destroying my own credibility before we even get properly underway, I have to say I'm going to start with some books which will probably surprise you.
The first time I can remember being stirred by the power of an author's prose was when I slipped a biography called CAPONE, by John Kobler, out of its niche on my father's bookshelf and tentatively scanned the opening chapters. Though a historian, Kobler commanded the English language so completely that opening the book was like falling through a hole in time -- not only could I see what he was describing, I could feel it, taste it, touch it. And this when I was probably still shy of ten years old. I credit Kobler in part with giving me a taste for evocative description in fiction -- even though, as I said, he was writing a historical biography.
Tied with Kobler, or a close second, was an author named Joe Silva (this is probably a pen name) who wrote -- don't laugh, damn it, if you can admit to reading the opening volumes of "Harry Potter" you have no right -- a Captain America novel called HOLOCAUST FOR HIRE. My older brother bought this lurid-covered little volume in the late 70s, and I cracked it open one day out of sheer curiosity. Though it is obviously a YA novel, and a short one at that, I was deeply impressed by Silva's command of the language -- the way he painted alluring word-pictures on the one hand while writing, on the other, in a highly economical vein that never let the story, or the reader, take a rest.
Another YA novelist, Frank Bonham, grabbed my attention while I was still in junior high school, with such disparate works as THE GHOST FRONT and DURANGO STREET. Bonham is one of those authors who write so realistically that even when their stories wander into implausibility, you feel as if you're there, and that everything happening is happening to you. He achieved this by getting the little details right, which is probably the hardest thing for any novelist to do. He was also a first-class researcher, which is why he was able to achieve equal verisimilitude in novels about the Battle of the Bulge and a young black kid struggling with inner-city gang life, respectively.
I'd be lying if I said that Sir Arthur Conan Doyle didn't influence me in very much the same way. Though his prose most definitely belonged to the period in which he wrote (the late 19th and early 20th centuries), he too could make marvelous use of English to paint word-pictures of people and objects. The best of the "Sherlock Holmes" stories (THE SIGN OF FOUR in particular) always leave me with a feeling that I have just stepped out of Victorian London and need to shake out my umbrella. At the top of his game, Sir Arthur was able to leave me feeling as if I could almost touch whatever he was describing -- be it a jewel, a brass pocket watch, or a person. I credit him as one of the authors who triggered my lifelong love affair with what we call "atmosphere."
During my teenage years my influences expanded. I struggled through Frank Herbert's DUNE on the second or third attempt -- the subject matter was above me until I hit about fifteen or so -- and came out realizing that one could employ entirely different techniques than those I'd become familiar with and still leave vivid images in the reader's mind. Specifically, Herbert used a kind of "subtractive drawing" method in his descriptions, especially physical ones. That is to say, Herbert often left us with strong impressions of how characters or objects looked, yet when one looks back on the passages in question, there is little in the way of actual, direct description. He describes around the person, one might say, triggering the imagination of the reader and letting him or her fill in the crucial details. He applied this method as well to the universe which the characters inhabit, revealing many details but withholding many others, so that once again, the reader willingly fills in the gaps. This style might be called non-linear, oblique, selective, or impressionistic, and while it's a bit risky -- if you fail, you fail badly, and Herbert sometimes did, especially in his later novels -- it achieves its object brilliantly. What Herbert did for me was show that there were alternatives to the direct approach. Which brings me to Tom Clancy.
Everyone who lived through the 80s probably had a go at at least one Clancy novel, and I think I must have read five or so while the Cold War was still reasonably hot. I'm almost embarrassed to admit it now, but Clancy did have a profound, if brief, effect on my own developing style during this time. Like, say, Robert Heinlein, Clancy provided very little in the way of physical description in his novels, and made almost no use of color. For someone who had been obsessed with these things, it was jarring to read a writer who didn't seem to give a fuck about them.
While still a teenager, I experimented a great deal with trying to "describe without describing" by talking about secondary details such as haircuts, rings, equipment, accents, hand-gestures and so forth, and while I eventually abandoned the experiments, they remain useful to me to this day. There are times when an author has neither the room nor the desire to describe a character, yet feels he must leave a strong impression nonetheless. The methods of Herbert and Clancy, though they could not be more different from one another, are both useful in this regard.
Another enormous influence on me, particularly in terms of dialogue, was Thomas Harris, author of BLACK SUNDAY, RED DRAGON, and THE SILENCE OF THE LAMBS. Though it's impossible to credit Harris without crediting his spiritual mentor, Hemingway, Harris took the basic Hemingway techniques -- stripped-down prose, few adjectives, restrained punctuation, terse description, use of simple words to evoke imagery, and of course the "Iceberg Theory" that less is more when it comes to just about everything -- and refined them for the later 20th century. Simply put, Harris wrote the first novels in which the dialogue generally sounded "real" to me -- that is, the way people actually spoke. (This is separate from "good dialogue," which is pleasing to read but not necessarily realistic.) What's more, his sense of restraint on backstory blew my mind. He gave us glimpses and the occasional reveal about our key characters, but kept exposition to an absolute minimum. It was the ultimate example of "showing, not telling," and to this very hour I struggle to emulate him in this regard and curb the impulse to tell the reader too much, too soon. Like Herbert, Harris wrote popular fiction in a literary vein, which ultimately became the key element of my own style.
Two German writers I read (in translation) while still a teen who inspired me enormously were Hasso G. Stachow and Lothar-Günther Buchheim. The former wrote IF THIS BE GLORY, the latter the much better-known DAS BOOT. Stachow, though not the strongest structurally, is probably the best prose-writer I've ever read in my entire life. His descriptive powers are nothing short of awesome, and I flatter myself that at my best I can do 75% of what he did in his terse little "novel" (it's really a true story written in third person) about a young German soldier fighting in Russia during WW2. Buchheim's "novel" (based on his own experiences) about a U-boat officer, fighting on the same side, in the same war, is equally awe-inspiring, though in a different way. Buchheim, though masterful in description, achieves his apotheosis in dialogue. It's not only witty, sarcastic, and brutally vulgar, it's also so realistic you'd swear you can hear pissed-off German sailors discussing their X-rated shore-leave antics while cowering from depth-charges. And since I mentioned it, the depth-charge sequence in the book, which goes on for 26 pages, is probably the most harrowing thing I've ever read. A masterpiece of atmosphere, description and human psychology.
Three authors who were terrific at "bringing me there" were George Orwell, Lawrence Sanders and P.J. Caputo. Orwell's novels 1984, KEEP THE ASPIDISTRA FLYING, and COMING UP FOR AIR are all amazing in their ability to make you feel whatever it is the protagonist is feeling, be it loneliness, hunger or the stickiness of his socks. Lawrence Sanders, a prolific novelist of the '70s and '80s, also had superlative descriptive powers -- the opening chapters of THE FOURTH DEADLY SIN are as evocative as anything you'll ever read. Likewise, P.J. Caputo's memoir of Vietnam, A RUMOR OF WAR, might as well be a time machine. You are there, whether you want to be or not.
My biggest weakness as an author has always been structure, so I remain perpetually in awe of Howard Fast's classic SPARTACUS. In addition to being a fine if heavy-handed "period" novel about the eternal conflict between the desire of man for freedom, and the desire to exploit one's fellow man for profit, this book is a towering masterwork of structure. It begins near the end of the story, long after the Spartacus Rebellion has been crushed, and then unfolds both in the "now" and through a series of extended flashbacks told from differing perspectives. There is no attempt at linear storytelling except via the main narrative, and yet by the end all the character arcs (and there are many) have been neatly and finally resolved, and the story told. It's a remarkable feat, and part of the reason why I reread this book once a year.
Ernest K. Gann wrote two books that affected me greatly: THE ANTAGONISTS (which later became a lavish mini-series called MASADA) and COMPANY OF EAGLES. The former is impressive for bringing drama to the story of a long siege during which not much fighting occurred -- Gann makes the story about the opposing generals, Flavius Silva and Eleazar ben Yair, and the struggles each must face while commanding armies in such a horrible place as the Judean desert. The latter, while a deeply flawed book in some ways, was the first time I ever saw an author use the device of having entire chapters written using dialogue alone. I'm always delighted by new approaches to storytelling, and this one was a gem.
From Herman Wouk, author of THE WINDS OF WAR and WAR & REMEMBRANCE, I learned a great deal, not least of which were the possibilities inherent in historical fiction told on a grand scale. These two huge novels encompass the years 1939 - 1945, i.e. WW2, and are told with an overlapping cast of characters spread all over the world. Though the Henry, Jastrow and Tudsbury families are fictitious, the way they interact with such historical figures as Adolf Hitler, Benito Mussolini, Josef Stalin, Winston Churchill, and various lesser generals, politicians, businessmen, etc. of the day seems entirely realistic. Wouk sets out to do nothing less than tell the story of WW2, and insofar as that is possible, he actually achieves it. Epic storytelling seems to be on the decline in recent decades (notwithstanding the popularity of "Game of Thrones") but Wouk handled it beautifully, and more importantly from my point of view, showed me that it could be done at all.
In the last few years I've discovered the dark pleasure of Clive Barker's work, which impresses me -- the little that I've read of it, anyway -- with its fearlessness. Barker is a man with a very checkered past, including a lot of sexual confusion and debauchery in his early life, and something of that experience colors all of his work. There is a strong charge of frustrated and sometimes perverted sexual energy running through his stories, even those which don't touch on sex at all. From THE BOOKS OF BLOOD and to some extent THE DAMNATION GAME I took away a refusal to be cowed or intimidated by the subject of sexuality in my own work. What's more, Barker writes absolutely first-class descriptive prose when he wants to, stuff that I could barely dream up, much less execute. He has a refreshingly honest approach to horror, viewing it not as an end in itself so much as a means by which fundamental truths about life and humanity can be taught. Conversely, his novel THE HELLBOUND HEART (the basis for the movie HELLRAISER) is a tour-de-force of literary minimalism. Unlike DAMNATION, which was brilliant but bloated, HELLBOUND uses no more words than are absolutely required to tell the grim and grisly tale. A writer who can use multiple approaches to storytelling, who isn't locked into a single gear, always impresses me.
The British author Derek Robinson also influenced me deeply, most notably with PIECE OF CAKE, his cynical, myth-debunking take on the Battle of Britain. This hefty tome of a novel takes us through a full year of war, from September 1939 to September 1940, using a rotating cast of characters constantly being changed by virtue of death. Robinson's dialogue is often marvelous, and his mastery of point-of-view, which can be so odious to authors writing big, sweeping stories, binds what might have become a meaningless jumble of individual stories into a single, unified narrative.
I see here I haven't selected any female authors, so I'll mention Ursula K. Le Guin, whose THE LEFT HAND OF DARKNESS showed how it's possible to incorporate artistic techniques into novel-writing. In art, it is possible to paint or draw using counterintuitive techniques such as pointillism, where the artist paints not with strokes but with dots, or subtractive drawing, where the eraser and not the tip is the key element in creating the picture. (There is even an approach, whose name I can't remember, in which you draw an object by drawing everything around it and leaving the actual object space completely blank.) Le Guin told her story by refusing to stick to any one method of storytelling. Some of the chapters are first-person in the present tense, some told via a journal in past tense, some by interludes which explain the mythology of the planet and society in question without explaining who the narrator is, and through it all there is a shift in perspective between several different characters. It shouldn't work, but Le Guin pulls it off, and in pulling it off, showed me that outside-the-box thinking can do a lot for a novel if it is executed to serve the story and not for its own sake.
I'd be lying if I said that Anne Rice's INTERVIEW WITH THE VAMPIRE, which I finished at some point or other in my early 20s after a false start or two, didn't influence me -- if only by virtue of re-igniting my taste for Victorian and Gothic-style prose. If writers like Hemingway and Harris have an exact opposite, it's Rice -- at least in the approach she took with her baroque vampire odyssey. Throwing open the sealed-shut doors of the 19th-century style, she dusted off a rich, ornate, slow-moving, detail-heavy style of storytelling that, even when it went off the deep end into massive internal monologues that sounded like free-verse poetry, struck me almost senseless with its beauty. This style doesn't really suit many subsets of fiction-writing, but it was practically made for the vampire genre, and for any historical genre set before, say, 1914. Thanks to Rice I now have this tool in my toolbox, and while I seldom use it, it's nice to know it's there.
Tami Hoag, though she's mainly known for romantic suspense potboilers, is very good at what she does. Her two-book "Deer Lake" series (NIGHT SINS, GUILTY AS SIN), which I read about five years ago after seeing a TV movie of the former, impressed me deeply as an example of how to employ "other elements" as characters in a story -- in this case, a town (Deer Lake) serves as one of the chief antagonists, while the overall atmosphere of suspicion, paranoia, fear and dread caused by a mysterious kidnapping becomes yet another. Hell, even the ferocious cold of Minnesota in winter is employed to menacing effect, becoming yet a third opponent for our troubled heroine, Megan O'Malley. At the risk of beating a very dead horse, I'm a sucker for atmosphere; I believe it can make a bad project good and a good one better, and whatever the shortcomings of romantic suspense novels, with their tendency toward predictability and formula, Hoag delivered two immensely readable novels in which one is terrorized in broad daylight in a generic American town.
I thought Matthew Stover did a helluva job on his novelization of REVENGE OF THE SITH, using rich and varied prose to overcome the many natural weaknesses of writing novelizations of films. In a novelization, the author is usually forbidden by the studio from describing his characters very well (if at all), engaging in long internal monologues, or even employing much color. Yet sometimes laboring under constraints makes us better by forcing us to make the most of the limited canvas upon which we paint. The "Star Wars" universe never got a richer treatment of characterization, better-written action sequences, or a keener and more philosophical analysis of the Force in action, than it did in this surprising gem of a book.
Cormac McCarthy (NO COUNTRY FOR OLD MEN) and Frank McCourt (ANGELA'S ASHES, 'TIS) both employ an unusual style in their books which does not use quotation marks, get deeply into description, or take the narrative past the surface level. It can become tiring at points, and certainly limits the writer in terms of what they can actually describe, yet it produces a distinct effect on the reader's mind -- an intensely personal effect, as if you were eavesdropping on these characters rather than simply reading about them. I haven't yet employed this method in any of my own novels, but I have used it on a smaller scale within them and in short stories, and one day intend to try my hand at it. In the meantime, it's nice to know this technique exists.
The authors I've mentioned are by no means an all-inclusive list of my inspirations -- I could probably name as many non-fiction authors who've had a hand in developing my style as novelists -- but I think they give you a pretty good idea of where my own style came from, and what moves me as a student of the craft. It's important to note, though, that this fund of names is always growing: as I continue to read, I continue to be influenced and inspired by "new" authors (meaning ones I hadn't read before). Granted, sometimes I'm inspired only by virtue of saying, "Christ, this sucks! A monkey chained to a typewriter could do better!" but that's another subject entirely.
Published on February 07, 2017 16:24
January 23, 2017
When to Shut Up: Why Prequels Usually Suck
Don't complete your own revolution.
--- Leonardo da Vinci to Michelangelo
I know that I promised I would return with a blog continuing my attack on the American monetary system, but I find that post-Christmas, my desire to talk about finances is at an all-time low. Since yours probably is too, I shall turn away from the windmill that is the Federal Reserve and, like Don Quixote, tilt at it again later. God knows it will still be there. In the meantime I've decided to tackle a more approachable subject: Hollywood. Or, specifically, Hollywood's recent obsession with what are known as “origin stories.”
The name “origin story” is a little deceptive and needs some quick explaining. It denotes a story which details the origin of a character, but it connotes a story which is written after the character's debut in a book, movie or television series. (In “The Godfather” we are introduced to Vito Corleone, but only in “The Godfather, Part II” do we learn the circumstances by which he came to power – his childhood and early life.) However, an origin story is not limited to explaining the backstory of characters; it can also explain the backstory of a universe, or focus on a set of events which led to the circumstances obtaining later in a story's timeline. (For example, the third film in the Kate Beckinsale series Underworld, called Rise of the Lycans, is actually set before the previous two films.)
Origin stories, which in movies are usually called prequels, have always been with us, and I am not opposed to them per se, but I've found that unless they are very skillfully done (see the aforementioned “The Godfather, Part II”), they tend to do more harm than good. And in most cases they can only be skillfully handled if the writer's motivation is passion rather than profit – if he or she feels a need, not just a want but a need, to expand upon the character or the story. In recent years there has been a veritable orgy of backstory in film which has spilled over into the literary world as well; but the motivation for writing it seems to be the opposite of ideal: profit rather than passion. Of course money is not always the reasoning behind such excursions into prequel; in some instances it is simply sentimentality or bad judgement; but regardless of the reason, the outcome tends to be either forgettable, regrettable, or just plain awful.
To understand the core of my grudge against origin stories, it is necessary to understand the so-called “iceberg theory” of Ernest Hemingway, also known as “the theory of omission.” As Hemingway himself said: “If a writer of prose knows enough of what he is writing about he may omit things that he knows and the reader, if the writer is writing truly enough, will have a feeling of those things as strongly as though the writer had stated them. The dignity of movement of an ice-berg is due to only one-eighth of it being above water.” This theory, that less is more, that what is implied is often stronger than what is explicitly stated, is implicit in most great storytelling. The fact is, what we do not know about our characters and universe is often the source of our greatest pleasure, since, as Stephen King told us in Danse Macabre, our minds tend to fill in gaps in information with details far more imaginative and satisfying than anything even the best writers could conceive. If you have been moved by a series of any kind, be it film, television or written word, you've probably speculated a great deal about the characters and universe of that series; you may have also experienced the peculiar emotion which occurs when some cherished theory of yours, possibly held for years, gets smashed to bits by a new storyline that makes nonsense of it.
When I was growing up – and today, for that matter – one of my favorite films was “Alien.” What I loved most about the story was its underlying sense of mystery. Our heroes are hauling mineral ore across the galaxy when diverted to an obscure, uninhabited planet by a mysterious distress signal. They set down and find, upon its barren surface, a huge, derelict alien spacecraft. Within that spacecraft are multitudes of eggs, one of which hatches, with very unfortunate result. For “Alien” the central plot point is the hatching, but for me, a tiny part of the audience, the appeal lay in wondering about these things:
1. Where did the alien ship come from?
2. Why did it land on this barren planet?
3. How long was it there?
4. Are the eggs native to the planet, were they cargo on the ship, or were they laid after the ship's crew was dead?
5. What was the exact text of the “SOS” which turned out to be a warning?
“Alien” presents us with a huge mystery which it never even attempts to solve, and the result is a far more effective and terrifying story than if these questions had been answered, because one of the central themes of the movie is primal fear – and our strongest primal fear is probably fear of the unknown. Ignorance and mystery inspire horror, because horror is the anticipation of a terrifying outcome, and humans, by virtue of the hard experience we call instinct or race memory, tend to fear the worst. Also, by some perversity of nature, we tend to enjoy the feeling of being intrigued, teased, titillated, even. As with certain hobbies and activities, the fun lies in the process and not in its completion – the riddle, and not its answer. Well, the unanswered riddles from this film haunted and intrigued and delighted me...for about 30 years. Then one day Ridley Scott & Co. decided to make a prequel, called “Prometheus,” and I felt my heart sink. On the one hand, the movie might supply these long-craved answers; on the other hand, well...they might supply those long-craved answers. It was a case of be careful what you wish for. As it happens, “Prometheus” tells, or rather begins to tell, the story of how the derelict spacecraft came to be resting on that haunted-looking planet. It answers two of the questions above, and sets up the answer to at least two more...and, I find, it disappoints the living shit out of me every time I think about it. The writers were unable to provide steak equal to the sizzle of the questions posed by the original film; all they did was demystify something most cinephiles consider sacred. It turns out that King was right: what's in the dark is all the more frightening because we can't see it. “Alien,” to me, is so much more effective if you pretend that “Prometheus” never existed, because it re-enshrines the story in darkness and conundrum, and leaves that uneasy question mark hanging in space. Where no one can hear you scream.
Speaking of sacred, let's talk for a moment about The Force. It is one of the cornerstone-concepts of the “Star Wars” universe, yet in the first two movies of that series the total explanations we get of it boil down to a handful of short sentences, such as: “The Force is an energy field created by all living things. It surrounds us, it penetrates us, it binds the galaxy together." And indeed, we neither need nor want deeper explanations than this: the unifying theory of the original “Star Wars” films is simplicity. The story is almost pathetically simple, and the central theme as old as King Arthur, but this is in large part why it works. Just as we accept Merlin's magic as being part of the world of Camelot without experiencing any desire to know where magical power comes from, we accept these fortune-cookie explanations of The Force because it is just magic by another name, and magic is self-justifying. A world that possesses magic, like Middle Earth or the Harry Potter universe, does not as a rule question the source of that magic – it's simply a fait accompli at page one, a deus ex machina we swallow smoothly and whole. And yet George Lucas didn't see it that way when, in 1999, he decided to pin a scientific explanation on The Force. The Force, he tells us in “The Phantom Menace,” is actually caused by microscopic life-forms called midi-chlorians that reside within the cells of all living things, but in some things more than others. When I heard Liam Neeson utter this line in a small-town Pennsylvania theater all those years ago, I recall the immediate sound from the audience was a groan of unbelief, as if they had just seen something holy recklessly profaned. And it is hard to see the midi-chlorians as anything but a profanation, given the deliberately mystical role The Force occupies in the original S.W. trilogy. What Lucas accomplished here was as vulgar as explaining how a magic trick works to a nine year-old; it cheapens what should be a wondrous experience. And this is a hallmark of bad writing -- refusal to give the audience credit for making the leap with you.
The iceberg theory as it applies to film had a fine exemplar in John Carpenter's horror classic “Halloween,” a film which, in a very real if extremely simplified sense, is simply the Book of Job updated to 1978. As with “Alien,” one of the central strengths of the movie lies in the unanswered mystery shrouding its antagonist, Michael Myers. In “Halloween,” we see a young Myers stalk and murder his older sister at the opening of the film, but we never get an explanation as to why he did it, or what he was like before he committed the murder, or why he escapes from the nut-hatch 15 years later and tries to relive the crime with a new set of victims. The only explanation we get is from Michael's psychiatrist, Dr. Sam Loomis, who identifies Myers as “purely and simply evil” and, therefore, not really human at all (he often refers to Myers as "it" rather than "him"). Loomis has a few different speeches on his favorite ex-patient, but nothing he says is really scientific. Science, to Loomis, breaks down at the point of contact with Myers' skin, and beyond that we're in the devil's country, where you don't need fancy talk and theories, you need a freaking gun. Indeed, at the end of the movie, when Michael's would-be victim, Laurie, sobs to Loomis, “Was that the boogeyman?” the little Englishman dryly replies, “As a matter of fact...it was.”
The Boogeyman! Think back to your own first experience with that dread name. Your older brother or sister said to you one night when you were six, “Don't let the Boogeyman get you!” to which you fearfully replied, “What's the Boogeyman?” And they said with a leer, “He comes and gets little kids!” And this ended the conversation – your mind did not require any further knowledge. The origin of the Boogeyman and his motives for wanting to “get” you were both irrelevant; knowledge of his existence was sufficient to make you afraid of him, and the fact that he came with no physical description or known method of “getting” merely provoked your brain into supplying the grisly details. This sense of restraint is the genius of Carpenter's film. The less we know about Myers and his motivations, the more frightening they are; the less we know about why this is happening, the more terrifying the moral is -- that no one is safe, that bad things can happen to good people, that the answer to the question “Why is this happening to me?” is, horribly, “Because! Just because!” to the tune of a plunging kitchen knife.
And yet – ! A few years ago Rob Zombie took it upon himself to “re-imagine” Halloween in a two-part explosion of violence by the same name. In addition to showing more gore and graphic violence in any given thirty seconds than the original film did in its entire running time, Zombie's films take precisely the opposite tack to Carpenter's, and attempt, almost from their first frame, to break down Michael's motivations and influences and let us know exactly who he is and why he is doing what he's doing. Michael, we are shown, is the son of a (very) broken and (very) abusive home, and there is a direct emotional-psychological cause for his eventual transformation into a mouth-breathing spree-killer; indeed, his psychiatrist laments, “I failed you!” to his homicidal ex-patient, which is the polar opposite of the reaction of the original Loomis, who realized that no psychological technique would have availed him anything against The Boogeyman. The Boogeyman is too elemental, too much a force of nature, to be reasoned with or "reached." By playing down this force of nature and turning it into a mere set of causes, A + B = C, we once again rob it of its most valuable element, which is mystery. Why? is so much more devastating a question when left unanswered!
Lest you think I'm picking on the celluloid set, it's not only in film that we find our needless origin stories of late. On the contrary, they have quite a place in literature of all kinds, including the “Hannibal Lecter” series by Thomas Harris. In the first of these four novels the Lecter character is introduced to us in a way very similar to Michael Myers, in that while we know something about his crimes, we do not really understand his motivations or their root cause. In the second book we are told a little more, given a more extended tease, as it were; but the larger questions are, once again, deliberately unanswered. Lecter jeers at his curious interrogator: “Nothing happened to me, Officer Starling – I happened. You can't reduce me to a set of influences.” And indeed, Harris is wise enough not to try here. He's content to leave Lecter somewhere just beyond the comforting frontiers of scientific understanding, an existential question with no definite answer. Unfortunately, this is a course he abandoned completely with the fourth book, “Hannibal Rising.” In this unfortunate tome, Harris does indeed “reduce to a set of influences” the hitherto enigmatic and mysterious doctor. By the end of the book we know everything about him and how he came to be the way he is, down to the last dull detail. The macabre vista of atrocities which were implied by chilling little half-sentences jabbed here and there like slivers of ice in the first two books (“And how is Officer Stewart? I heard he retired after he saw my basement.”) was blotted out by an avalanche of minutiae. The mystery, having been solved, ceased to be interesting.
Origin stories are everywhere and coming in increasingly unusual forms. “Rogue One,” the latest Star Wars story, is just that, “A Star Wars story,” skived off from the fatty tissue surrounding the original film, “A New Hope.” In essence, this film exists to answer a question no one asked, specifically, what did those lines (“Rebel spaceships, striking from a hidden base, have won their first victory over the evil Galactic Empire. During the battle, rebel spies managed to steal secret plans to the Empire's ultimate weapon, the DEATH STAR.”) really entail? The answer wasn't necessary to the integrity of the original trilogy or the prequels; it was supplied because there is money in exploiting the nostalgia surrounding the S.W. franchise. Whether you liked or disliked the result, the fact remains that the story didn't need to be told, and at the same time it filled in details which were perhaps better left to the audience's imagination anyway. I know that most people liked “Rogue One,” and I certainly didn't hate it (I preferred it to “The Force Awakens” by a wide margin), but again, I question the necessity of filling in every nook and cranny in a story -- of, as it were, mapping out the exact size of the iceberg. Let the idea retain some mystery, some borderlands beyond which the rest is unexplored and left to our imaginations. As a friend of mine who is passionate about fantasy games told me recently, “What sells fantasy is the same thing that turns a lot of people off of it – the deep lore.” But it's important to note that “the deep lore” in fantasy is always thickest where it is unwritten. Most of the concepts which underpinned Frank Herbert's DUNE series were either left out of the first book in the series or only very lightly touched upon, and a similar thing could be said of Tolkien's Middle Earth saga, George R.R. Martin's “Game of Thrones” series and Rowling's Harry Potter books. In each instance, the author left large quantities of information about their respective universes out of the stories themselves, which they then published separately in supplementary texts -- the literary equivalent of “DVD special features” you do not have to watch to enjoy the film.
Now, it so happens that I too am a writer, and that one of my principal flaws as a young one was precisely the sin I am castigating here -- over-explanation of story, brought about in part by a lack of trust in the readership's intelligence. (My brother is enormously fond of pointing out that, at the age of 12, I felt it necessary to explain to those in the theater around me during “Return of the Jedi” that Darth Vader was being sarcastic when he uttered the line, "The Emperor is not as forgiving as I am.") It was not until many years later that a tough lesson drove home what it means to run into the iceberg theory, Titanic-style. I had written a huge backstory for a character and decided to use it to introduce him to the audience. My editor said, "Very fine writing -- now cut all of it." He went on to explain that the subsequent actions of the character made his backstory plain; there was no need to elucidate it. "Show, don't tell" is an old rule for novelists, but it is, I admit, also very difficult to follow, especially when your readership (or viewership) is hungry for details. The lesson, however, was plain: keeping the audience a little hungry is better than feeding them too much. The hungry come back for more.
At the beginning of this blog I quoted a line from Irving Stone's epic novel “The Agony and the Ecstasy.” In that memorable sequence, Leonardo da Vinci scolds Michelangelo for taking his new style of painting to such unreachable levels of expertise and mastery that he had, in effect, left nowhere for any other artist to go – including, we are to suppose, Michelangelo himself. He had “completed his own revolution,” and as I write these lines, it seems to me that is the trend nowadays. At every turn, we see a lantern lighting the way, but somehow this does not suit our nature. Our minds seem to crave dark corners in which to project our fears and fantasies and to unleash our imaginations – we don't always want or need a damned lantern. But this frenzy for origin stories continues, here as elsewhere: “Game of Thrones” is in negotiations for a prequel series set when the Targaryens ruled Westeros, there are origin-movies about Han Solo and Boba Fett already in pre-production, and one of the most popular detective series in Europe, “Detective Montalbano” (which has run for 16 years), recently spun off an origin series called simply “Young Montalbano,” which delves deeply into the hitherto undiscussed past of the eponymous Sicilian hero. And so on, and on – so that now there are several "origin stories" in print about The Godfather, too.
As I said before, I am not actually opposed to origin stories as a rule. They are tempting targets for a reason: they seem to shine with infinite possibility, and in some cases I believe they can add a great deal to the canon of a series; but in those instances their author usually had something definite to say, and a powerful reason for wanting to say it. “Grendel” is a prequel to “Beowulf” but adds to rather than subtracts from the lore of its inspiration by giving us an epic from the perspective of its villain: it infringes very little, if at all, on its predecessor. On the other hand, stories that are told simply because there is space left over to tell them, or for purely financial reasons, tend not merely to debase themselves but to damage the integrity of the originals upon which they are founded. A friend of mine, criticizing the Metallica album “St. Anger,” said to me, “This is the sort of music that, if you're in a band, you don't want people to hear – it's garage practice, jam session stuff. It's cutting-room floor, out-takes, blooper reel shit. It's a 'you at six in the morning with no coffee and no makeup' type of deal.” In other words, the album's crime was not in existing but in being shown in public. Well, it seems to me many of these origin stories fall into just that category – their crime lies not in the fact that someone dreamed them up (quite the contrary!), but in the fact that they were made canonical, and so “closed off” yet another avenue for our individual imaginations. If some of the joy of the journey is in the journey itself, then it seems to me that some of the joy of the story is in the wider world that story inhabits and implies. It is a beautiful thing to start a revolution, and to maintain it; but its completion is perhaps best left in the hands of its audience and not its author.
--- Leonardo da Vinci to Michelangelo
I know that I promised I would return with a blog continuing my attack on the American monetary system, but I find that post-Christmas, my desire to talk about finances is at an all-time low. Since yours probably is too, I shall lay turn away from the windmill that is the Federal Reserve, and, like Don Quixote, tilt again at it later. God knows it will still be there. In the mean time I've decided to tackle a more approachable subject: Hollywood. Or, specifically, Hollywood's recent obsession with what are known as “origin stories.”
The name “origin story” is a little deceptive and needs fast explaining. It denotes a story which details the origin of a character, but it connotes a story which is written after the character's debut in a book, movie or television series. (In “The Godfather” we are introduced to Vito Corleone, but only in “The Godfather, Part II,” we learn the circumstances by which he came to power – his childhood and early life.) However, an origin story is not limited to explaining the backstory of characters; it can also explain the backstory of a universe, or focus on a set of events which led to the circumstances obtaining later in a story's timeline. (For example, the third film in the Kate Beckinsale series Underworld, called Rise of the Lycans, is actually set before the previous two films.)
Origin stories, which in movies are usually called prequels, have always been with us, and I am not opposed to them per se, but I've found that unless they are very skilfilly done (see the afformentioned “The Godfather, Part II”), they tend to do more harm than good. And in most cases they can only be skilfully handled if the writer's motivation is passion rather than profit – if he or she feels a need, not just a want but a need, to expand upon the character or the story. In recent years there has been a veritable orgy of backstory in film which has spilled over into the literary world as well; but the motivation for writing it seems to be the opposite of ideal: profit rather than passion. Of course money is not always the reasoning behind such excursions into prequel; in some instances it is simply sentimentality or bad judgement; but regardless of the reason, the outcome tends to be either forgettable, regrettable, or just plain awful.
To understand the core of my grudge against origin stories, it is necessary to understand the so-called “iceberg theory” of Ernest Hemingway, also known as “the theory of omission.” As Hemingway himself said: “If a writer of prose knows enough of what he is writing about he may omit things that he knows and the reader, if the writer is writing truly enough, will have a feeling of those things as strongly as though the writer had stated them. The dignity of movement of an ice-berg is due to only one-eighth of it being above water.” This theory, that less is more, that what is implied is often stronger than what is explicitly stated, is implicit in most great storytelling. The fact is, what we do not know about our characters and universe is often the source of our greatest pleasure, since, as Stephen King told us in Danse Macabre, our minds tend to fill in gaps in information with details far more imaginative and satisfying than anything even the best writers could conceive. If you have been moved by a series of any kind, be it film, television or written word, you've probably speculated a great deal about the characters and universe of that series; you may have also experienced the peculiar emotion which occurs when some cherished theory of yours, possibly held for years, gets smashed to bits by a new storyline that makes nonsense of it.
When I was growing up – and today, for that matter – one of my favorite films was “Alien.” What I loved most about the story was its underlying sense of mystery. Our heroes are hauling mineral ore across the galaxy when diverted to an obscure, uninhabited planet by a mysterious distress signal. They set down and find, upon its barren surface, a huge, derelict alien spacecraft. Within that spacecraft are multitudes of eggs, one of which hatches, with very unfortunate result. For “Alien” the central plot point is the hatching, but for me, a tiny part of the audience, the appeal lay in wondering about these things:
1. Where did the alien ship come from?
2. Why did it land on this barren planet?
3. How long was it there?
4. Are the eggs native to the planet, were they cargo on the ship, or were they lain after the ship's crew was dead?
5. What was the exact text of the “SOS” which turned out to be a warning?
“Alien” presents us with a huge mystery which it never even attempts to solve, and the result is a far more effective and terrifying story than if these questions had been answered, because one of the central themes of the movie is primal fear – and our strongest primal fear is probably fear of the unknown. Ignorance and mystery inspire horror, because horror is the anticipation of a terrifying outcome, and humans, by virtue of the hard experience we call instinct or race memory, tend to fear the worst. Also, by some perversity of nature, we tend to enjoy the feeling of being intrigued, teased, titillated, even. As with certain hobbies and activities, the fun lies in the process and not in its completion – the riddle, and not its answer. Well, the unanswered riddles from this film haunted and intrigued and delighted me...for about 30 years. Then one day Ridley Scott & Co. decided to make a prequel, called “Prometheus,” and I felt my heart sink. On the one hand, the movie might supply these long-craved answers; on the other hand, well...they might supply those long-craved answers. It was a case of be careful what you wish for. As it happens, “Prometheus” tells, or rather begins to tell, the story of how the derelict spacecraft came to be resting on that haunted-looking planet. It answers two of the questions above, and sets up the answer to at least two more...and, I find, it disappoints the living shit out of me every time I think about it. The writers were unable to provide steak equal to the sizzle of the questions posed by the original film; all they did was demystify something most cinephiles consider sacred. It turns out that King was right: what's in the dark is all the more frightening because we can't see it. “Alien,” to me, is so much more effective if you pretend that “Prometheus” never existed, because it re-enshrines the story in darkness and conundrum, and leaves that uneasy question mark hanging in space. Where no one can hear you scream.
Speaking of sacred, let's talk for a moment about The Force. It is one of the cornerstone-concepts of the “Star Wars” universe, yet in the first two movies of that series the total explanations we get of it boil down to a handful of short sentences, such as: “The Force is an energy field created by all living things. It surrounds us, it penetrates us, it binds the galaxy together." And indeed, we neither need nor want deeper explanations than this: the unifying theory of the original “Star Wars” films is simplicity. The story is almost pathetically simple, and the central theme as old as King Arthur, but this is in large part why it works. Just as we accept Merlin's magic as being part of the world of Camelot without experiencing any desire to know where magical power comes from, we accept these fortune-cookie explanations of The Force because it is just magic by another name, and magic is self-justifying. A world that possesses magic, like Middle Earth or the Harry Potter universe, does not as a rule question the source of that magic – it's simply a fait accompli at page one, a deus ex machina we swallow smoothly and whole. And yet George Lucas didn't see it that way when, in 1999, he decided to pin a scientific explanation on The Force. The Force, he tells us in “The Phantom Menace,” is actually caused by microscopic life-forms called midi-chlorians that reside within the cells of all living things, but it some things more than others. When I heard Liam Neeson utter this line in a small-town Pennsylvania theater all those years ago, I recall the immediate sound from the audience was groans of unbelief, as if they had just seen something holy recklessly profaned. And it is hard not to see the midi-chlorians as anything but profanation, given the deliberately mystical tone The Force occupies in the original S.W. trilogy. What Lucas accomplished here was as vulgar as explaining how a magic trick works to a nine year-old; it cheapens what should be a wondrous experience. And this is a hallmark of bad writing -- refusal to give the audience credit for making the leap with you.
The iceberg theory as it applies to film had a fine exemplar in John Carptenter's horror classic “Halloween,” a film which, in a very real if extremely simplified sense, is simply the Book of Job updated to 1979. As with “Alien,” one of the central strengths of the movie is in the unanswered mystery shrouding its antagonist, Michael Meyers. In “Halloween,” we see a young Meyers stalk and murder his older sister at the opening of the film, but we never get an explanation as to why he did it, or what he was like before he committed the murder, or why he escapes from the nut-hatch 13 years later and tries to relive the crime with a new set of victims. The only explanation we get is from Michael's psychiatrist, Dr. Sam Loomis, who identifies Myers as “purely and simply evil” and, therefore, not really human at all (he often refers to Meyers as "it" rather than "him"). Loomis has a few different speeches on his favorite ex-patient, but nothing he says is really scientific. Science, to Loomis, breaks down at the point of contact with Meyers' skin, and beyond that we're in the devil's country, where you don't need fancy talk and theories, you need a freaking gun. Indeed, at the end of the movie, when Michael's would-be victim, Laurie, sobs to Loomis, “Was that the boogeyman?” the little Englishman dryly replies, “As a matter of fact...it was.”
The Boogeyman! Think back to your own first experience with that dread name. Your older brother or sister said to you one night when you were six, “Don't let the Boogeyman get you!” to which you fearfully replied, “What's the Boogeyman?” And they said with a leer, “He comes and gets little kids!” And this ended the conversation – your mind did not require any further knowledge. The origin of the Boogeyman and his motives for wanting to “get” you were both irrelevant; knowledge of his existence was sufficient to be afraid of him, and the fact that he came with no physical description or known method of “getting” merely provoked your brain into supplying the grisly details. This sense of restraint is the genius of Carpenter's film. The less we know about Myers and his motivations, the more frightening they are; the less we know about why this is happening, the more terrifying the moral is -- that no one is safe, that bad things can happen to good people, that the answer to the question “Why is this happening to me?” is, horribly, “Because! Just because!” to the tune of a plunging kitchen knife.
And yet – ! A few years ago Rob Zombie took it upon himself to “re-imagine” Halloween in a two-part explosion of violence by the same name. In addition to showing more gore and graphic violence in any given thirty seconds than the original film did in its entire running time, Zombie's films take precisely the opposite tack from Carpenter's, and attempt, almost from their first frame, to break down Michael's motivations and influences and let us know exactly who he is and why he is doing what he's doing. Michael, we are shown, is the son of a (very) broken and (very) abusive home, and there is a direct emotional-psychological cause for his eventual transformation into a mouth-breathing spree-killer; indeed, his psychiatrist laments, “I failed you!” to his homicidal ex-patient, which is, again, the polar-opposite reaction of the original Loomis, who realized that no psychological technique would have availed him anything against The Boogeyman. The Boogeyman is too elemental, too much a force of nature, to be reasoned with or "reached." By playing down this force of nature and turning it into a mere set of causes, A + B = C, we once again rob it of its most valuable element, which is mystery. Why? is so much more devastating a question when left unanswered!
Lest you think I'm picking on the celluloid set, it's not only in film that we find our needless origin stories of late. On the contrary, they have quite a place in literature of all kinds, including the “Hannibal Lecter” series by Thomas Harris. In the first of these four novels the Lecter character is introduced to us in a way very similar to Michael Myers, in that while we know something about his crimes, we do not really understand his motivations or their root cause. In the second book we are told a little more, given a more extended tease, as it were; but the larger questions are, once again, deliberately unanswered. Lecter jeers at his curious interrogator: “Nothing happened to me, Officer Starling – I happened. You can't reduce me to a set of influences.” And indeed, Harris is wise enough not to try here. He's content to leave Lecter somewhere just beyond the comforting frontiers of scientific understanding, an existential question with no definite answer. Unfortunately, this is a course he abandoned completely with the fourth book, “Hannibal Rising.” In this unfortunate tome, Harris does indeed “reduce to a set of influences” the hitherto enigmatic and mysterious doctor. By the end of the book we know everything about him and how he came to be the way he is, down to the last dull detail. The macabre vista of atrocities which were implied by chilling little half-sentences jabbed here and there like slivers of ice in the first two books (“And how is Officer Stewart? I heard he retired after he saw my basement.”) was blotted out by an avalanche of minutiae. The mystery, having been solved, ceased to be interesting.
Origin stories are everywhere and coming in increasingly unusual forms. “Rogue One,” the latest Star Wars story, is just that, “A Star Wars story,” skived off from the fatty tissue surrounding the original film, “A New Hope.” In essence, this film exists to answer a question no one asked, specifically: what did those lines (“Rebel spaceships, striking from a hidden base, have won their first victory over the evil Galactic Empire. During the battle, rebel spies managed to steal secret plans to the Empire's ultimate weapon, the DEATH STAR.”) really entail? The answer wasn't necessary to the integrity of the original trilogy or the prequels; it was supplied because there is money in exploiting the nostalgia surrounding the S.W. franchise. Whether you liked or disliked the result, the fact remains that the story didn't need to be told, and at the same time it filled in details which were perhaps better left to the audience's imagination. I know that most people liked “Rogue One,” and I certainly didn't hate it (I preferred it to “The Force Awakens” by a wide margin), but again, I question the necessity of filling in every nook and cranny in a story -- of, as it were, mapping out the exact size of the iceberg. Let the idea retain some mystery, some borderlands beyond which the rest is unexplored and left to our imaginations. As a friend of mine who is passionate about fantasy games told me recently, “What sells fantasy is the same thing that turns a lot of people off of it – the deep lore.” But it's important to note that “the deep lore” in fantasy is always thickest where it is unwritten. Most of the concepts which underpinned Frank Herbert's DUNE series were either left out of the first book in the series or only very lightly touched upon, and a similar thing could be said of Tolkien's Middle Earth saga, George R.R. Martin's “Game of Thrones” series and Rowling's Harry Potter books. In each instance, the author left large quantities of information about their respective universes out of the stories themselves, which they then published separately in supplementary texts -- the literary equivalent of “DVD special features” you do not have to watch to enjoy the film.
Now, it so happens that I too am a writer, and that one of my principal flaws as a young one was precisely the sin I am castigating here -- over-explanations of story brought about, in part, by lack of trust in the readership's intelligence. (My brother is enormously fond of pointing out that, at the age of 12, I felt it necessary to explain to those in the theater around me during Return of the Jedi that Darth Vader was being sarcastic when he uttered the line, "The Emperor is not as forgiving as I am.") It was not until many years later that a tough lesson drove home what it means to run into the iceberg theory, Titanic-style. I had written a huge backstory for a character and decided to use it to introduce him to the audience. My editor said, "Very fine writing -- now cut all of it." He went on to explain that the subsequent actions of the character made his backstory plain; there was no need to elucidate it. "Show don't tell" is an old rule for novelists but it is, I admit, also very difficult to follow, especially when your readership (or viewership) is hungry for details. The lesson, however, was plain: keeping the audience a little hungry is better than feeding them too much. The hungry come back for more.
At the beginning of this blog I quoted a line from Irving Stone's epic novel “The Agony and the Ecstasy.” In that memorable sequence, Leonardo da Vinci scolds Michelangelo for taking his new style of painting to such unreachable levels of expertise and mastery that he had, in effect, left nowhere for any other artist to go – including, we are to suppose, Michelangelo himself. He had “completed his own revolution,” and as I write these lines, it seems to me that this is the trend nowadays. At every turn, we see a lantern lighting the way, but somehow this does not suit our nature. Our minds seem to crave dark corners in which to project our fears and fantasies and to unleash our imaginations – we don't always want or need a damned lantern. But this frenzy for origin stories continues, here as elsewhere: “Game of Thrones” is in negotiations for a prequel series set when the Targaryens ruled Westeros, there are origin-movies about Han Solo and Boba Fett already in pre-production, and one of the most popular detective series in Europe, “Detective Montalbano” (which has run for 16 years), recently spun off an origin series called simply “Young Montalbano,” which delves deeply into the hitherto undiscussed past of the eponymous Sicilian hero. So on and on – and on, so that now there are several "origin stories" in print about The Godfather, too.
As I said before, I am not actually opposed to origin stories as a rule. They are tempting targets for a reason: they seem to shine with infinite possibility, and in some cases I believe they can add a great deal to the canon of a series; but in those instances their author usually had something definite to say, and a powerful reason for wanting to say it. “Grendel” is a prequel to “Beowulf” but adds to rather than subtracts from the lore of its inspiration by giving us an epic from the perspective of its villain: it infringes very little, if at all, on its predecessor. On the other hand, stories that are told simply because there is space left over to tell them, or for purely financial reasons, tend not merely to debase themselves but to damage the integrity of the originals upon which they are founded. A friend of mine, criticizing the Metallica album “St. Anger,” said to me, “This is the sort of music that, if you're in a band, you don't want people to hear – it's garage practice, jam session stuff. It's cutting-room floor, out-takes, blooper reel shit. It's a 'you at six in the morning with no coffee and no makeup' type of deal.” In other words, the album's crime was not in existing but rather in being shown in public. Well, it seems to me many of these origin stories fall into just that category – their crime lies not in the fact that someone dreamed them up (quite the contrary!), but in the fact that they were made canonical, and so “closed off” yet another avenue for our individual imaginations. If some of the joy of the journey is in the journey itself, then it seems to me that some of the joy of the story is in the wider world that story inhabits and implies. It is a beautiful thing to start a revolution, and to maintain it; but its completion is perhaps best left in the hands of its audience and not its author.
Published on January 23, 2017 22:12
January 9, 2017
What Lies Beneath Economic Bullshit?
When I was in first grade, I used to take a nickel to school every morning with which to buy milk at lunchtime. In those days -- and yes, sadly I am old enough to have lived through a period I can refer to as "those days" -- milk was a huge part of a schoolkid's diet, and anyone who forgot his nickel was in a sorry state, for the cafeteria at Brookmont Elementary School did not hand out half-pints of milk to first graders on credit. Well, it so happened that one day -- I think I was in third grade by this time -- I went to school and discovered, to my horror, that the price of the half-pint had gone up from five to ten cents over the weekend, and nobody had bothered to inform my mother, who had equipped me only with a shiny nickel. No amount of pleading could persuade the Easter Island-faced cafeteria attendant to accept half-payment, so I went milkless that Monday. Two years later, the very same thing happened again: the price of milk had again risen, this time to fifteen cents, though by that time the importance of milk in the diet had faded, replaced by the allure of Capri Sun, or, on special occasions when a quarter could be obtained, Coca-Cola in a fluted bottle of clear green glass. The issue of rising prices flummoxed me, however, and I asked my father to explain why more and more money had been required to obtain the same amount of milk. Dad lowered his newspaper long enough to explain, in the very precise way a parent uses when they don't really understand the subject about which they are speaking, that there was this thing called inflation, which lowered the value of money over time. Later -- much later -- I discovered that inflation was caused by an imbalance between the amount of money in circulation and the amount of goods actually for sale. If there was too much money in the country, the purchasing power of money would inevitably decrease; if there was not enough money, on the other hand, the value of cash would rise. Supply and demand, I was told, dictated the value of everything -- both money and products. Therefore (drum roll) my milk had tripled in price in just a few years because there were too many nickels in America; or perhaps not enough milk.
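For the numerically inclined, the milk example can be turned into a back-of-the-envelope calculation. Here is a minimal Python sketch; the forty-year gap between my nickel and my niece's dollar is my rough guess, not a documented figure:

    # Implied average annual inflation if a half-pint of milk went
    # from $0.05 to $1.00 over roughly 40 years (a rough guess).
    old_price = 0.05
    new_price = 1.00
    years = 40

    annual_rate = (new_price / old_price) ** (1 / years) - 1
    print(f"Price multiplied {new_price / old_price:.0f}x")
    print(f"Implied average inflation: {annual_rate:.1%} per year")

Run it and you get something in the neighborhood of 7.8% a year -- every year, compounding, for four decades.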
This seemed to make sense, but it led me to the inevitable question(s): How could there be too much money in the country? Who determines how much money is produced, and why would they produce so much that its overall value would decrease? Who sets the value of money, and who benefits from having money lose its value? You didn't have to be a genius to see that if inflation went up continuously, as it seemed to be doing year after year, decade after decade, any fixed sum of money you had saved or received regularly via a paycheck or a pension would decrease accordingly in value, until it was virtually worthless. And it raised an even larger question: why was the trend only one way -- toward increasing worthlessness? Inflation seemed to be constantly rising; sometimes quickly, sometimes slowly, but it was always going up (and even when it went down, the overall trend was skyward). Therefore, if I understood the theory correctly, the amount of money in circulation had to be likewise increasing, because God knew the amount of goods for sale in America wasn't going down.
Trying to answer some of these questions led me to a series of astonishing discoveries. At least they were astonishing to me, who had never been interested in and therefore knew very little about basic economic realities.
Money, it turned out, came in two distinct categories: commodity money and fiat currency. Commodity money is backed up by gold or silver -- either the money itself is made of gold or silver, or it is redeemable for gold or silver; and the total amount of money in existence is determined solely by the total amount of gold or silver in the government's coffers. If your government has a billion dollars in gold in its vaults, you can print up to a billion dollars of currency, and not one penny more, elsewise that currency would be worthless -- literally counterfeit.
Fiat currency, on the other hand, is money which is in itself worthless (paper, for example, or coins made of largely worthless metals) whose value is backed by the government that issued it. Its value comes from "faith and credit" in the government itself. In "olden times," as we used to say when a nickel bought me milk, all governmental currency was commodity money. The concept of fiat currency is relatively new, and indeed, the United States itself remained partially on the gold standard until the Nixon presidency. Yet the differences between commodity and fiat are crucial. Under a commodity money system, the value of your money is determined by two outside factors -- i.e. by the price of gold or silver on the international market, and how much gold or silver your government has in its coffers. Under a fiat system, it is determined by "faith and credit." Under a commodity system, the government cannot issue more money than it has precious metal to back it up. Under a fiat system, the government can print as much money as it damn well pleases. And it is the fiat system that we use today, which is one of the reasons why the price of my milk kept going up, so that now, if my 8-year-old niece wants a half-pint of milk with her lunch, she will pay a buck instead of a nickel.
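To make the difference concrete, here is a minimal sketch of the issuance rule under each system, with hypothetical numbers -- commodity money is capped by the metal in the vault, fiat money has no hard ceiling at all:

    # Hypothetical issuance rules -- a simplification, not a model
    # of any real treasury or central bank.
    def max_commodity_issue(gold_reserves_usd: float) -> float:
        # Under a commodity system, currency cannot exceed the
        # gold or silver backing it.
        return gold_reserves_usd

    def max_fiat_issue() -> float:
        # Under a fiat system, the only limit is "faith and credit."
        return float("inf")

    print(max_commodity_issue(1_000_000_000))  # capped at one billion
    print(max_fiat_issue())                    # inf -- print at will

One system has a hard ceiling; the other has a printing press.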
This was not always the case, and it is instructive to look at the Consumer Price Index between the Revolutionary War and WWI, when America was on either the commodity system or a modified version of one. Because during that time, the purchasing power of a dollar remained almost exactly the same. There was some inflation and deflation in that time, especially during war and economic crisis, but the overall trend over those 137 years was a flat line, so that a dollar in 1914 bought you the same basic amount of goods and services as it would have in 1776. (Yes, you read that correctly: a dollar issued by the Continental Congress during the first year of the War of Independence had the same purchasing power as a dollar during the first year of the First World War!)
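The arithmetic of compounding is what makes that flat line so remarkable. A quick sketch of the contrast -- the 3% figure is an illustrative round number, not a measured average:

    # What fraction of its purchasing power a dollar keeps after a
    # century of compounding inflation.
    def purchasing_power(years: int, avg_inflation: float) -> float:
        return 1 / (1 + avg_inflation) ** years

    print(f"100 years at 0%: ${purchasing_power(100, 0.00):.2f} left")
    print(f"100 years at 3%: ${purchasing_power(100, 0.03):.2f} left")

At zero average inflation the dollar is still a dollar; at a mere 3% a year, a century later it is worth about a nickel.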
Today, in the age of fiat currency and central banks which print trillions of dollars out of thin air and pump them into circulation without any corresponding collateral, this stability seems almost incredible. We expect that inflation will rise. We expect that purchasing power will go down. And we expect these trends always to continue in the same direction. We know we have fiat currency, and we know its value is regulated by supply and demand and "faith and credit," but we still don't know who benefits from this system -- not the ordinary man, certainly. He can only be hurt by inflation. So why does the government, whose sole justification is to serve the needs of the ordinary man, use this system?
Trying to get to the bottom of this question led to another awful realization which I will share momentarily, though you've probably guessed the answer by now. At any rate it led me into a thicket of economic terminology designed to numb the brain and glaze the eyeballs. I concluded very rapidly that most of the terms and phrases used by economists were designed specifically to baffle and bewilder the ordinary person and drive them away from the subject, but I can be quite dogged when I want to be and gradually I got a sense of just what all this fancy, scientific-sounding bullshit actually meant. In particular there were recurring phrases I did not understand, and whose definitions did not leave me with any clear meaning, but which in time I have come to define in my own crude way. What follows is a short list of some of the most common terms you'll hear bandied about by economists, politicians and pundits; grasping them is key to grasping why our economy is always in the soup and why the temperature of that soup is almost always rising.
BULLSHIT TERM #1: FRACTIONAL RESERVE BANKING. Simply put, this is a practice by which banks can lend money they don't actually have. You read that right, but I'll say it again. This is a practice by which banks can lend money they don't actually have, and do it with full knowledge they don't have it. When I was with the District Attorney's Office, we called this "writing bad checks," and we put people in jail for it, often for months, occasionally for years. But banks do this on an infinitely larger scale, lending out huge sums of money with only a fraction of that money actually on hand, and nobody gets arrested, because the basic principle of fiat currency is that, unlike commodity money, it doesn't require a physical commodity, like gold, to back it up. Therefore there are no limits, theoretically, on how much fiat currency can be printed. And I need you to pay particular attention to the former point about what would happen to you for "writing bad checks," because it is a recurring theme in all discussions of economics and government: if you do it, it's a crime. If they do it...well, it's policy.
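The mechanics behind this are what the textbooks call the money multiplier. Here is a deliberately simplified sketch -- real reserve rules are messier than this, and the 10% ratio is just the classic textbook number:

    # Simplified money-multiplier loop: a $100 deposit with a 10%
    # reserve requirement supports roughly $1,000 in total money,
    # because lent money gets re-deposited and re-lent.
    deposit = 100.0
    reserve_ratio = 0.10

    total_money = 0.0
    while deposit > 0.01:
        total_money += deposit
        deposit *= (1 - reserve_ratio)  # the fraction lent back out

    print(f"Total money created: ${total_money:,.2f}")
    print(f"Theoretical limit:   ${100.0 / reserve_ratio:,.2f}")  # 1/r

One hundred real dollars, a thousand bookkeeping dollars. If you or I tried that trick, we'd be writing bad checks.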
BULLSHIT TERM #2: QUANTITATIVE EASING. Godfrey Bloom calls it "counterfeiting and theft." David Stockman calls it "fraud." Andrew Huszar calls it "the greatest backdoor bailout of Wall Street of all time." But it's been practiced on many Wall Streets all over the world, from Japan to Britain, and many consider it the fix-all of any economic crisis. "Q.E.," as it is known, is quite simply the process of printing more money to lower the overall value of currency and degrade existing debt. In other words, it is a deliberate increase in inflation. Say you're a bad rich boy and owe $10,000,000 to your creditors. Through the process of Easing, you still owe the same amount on paper, but the actual value of your money is lowered to the point where your debt burden shrinks in proportion. (The more worthless your dollars, the easier it is for you to shoulder your debts.) The reason Huszar calls this a "backdoor bailout" is that it favors only the rich, who remain rich despite inflation because of what might be called the power of very large numbers: inflation does not make rich people poor, it simply makes them a little less rich. The poor, on the other hand, end up simply being unable to buy milk...or bread...or gasoline. Or much of anything at all. The reason Bloom calls this "counterfeiting" and Stockman "fraud" is because if an ordinary person were to print money on his own, he would immediately be imprisoned, on the grounds that phony money in circulation weakens the overall value of the dollar. Indeed, for much of history counterfeiters had their hands hacked off or were even executed for doing just precisely this. But when central banks print trillions of dollars, pounds or yen out of thin air with nothing to back them up except the "faith and credit" of the government, causing inflation to rise and purchasing power to plummet...it isn't called fraud or counterfeiting, it's called "quantitative easing." It sounds great, but the people it helps are precisely the people whose greed and incompetence created the seeming need for Quantitative Easing in the first place: bankers and the super-rich. Meanwhile, those who have small savings or live on a fixed income, like retirees, find their buying power has plummeted, because their savings are now devalued.
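Here is the bad rich boy's math in sketch form; the inflation rate and time horizon are illustrative, not a claim about any actual QE program:

    # Real burden of a fixed $10,000,000 debt after inflation.
    # The nominal figure never changes; its real value shrinks by
    # (1 + inflation) every year.
    debt = 10_000_000
    inflation = 0.05   # illustrative 5% per year
    years = 10

    real_burden = debt / (1 + inflation) ** years
    print(f"On paper you still owe:      ${debt:,}")
    print(f"In today's purchasing power: ${real_burden:,.0f}")

A decade of 5% inflation quietly forgives nearly 40% of the debt -- paid for by everyone whose savings shrank by the same proportion.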
BULLSHIT TERM #3: MORAL HAZARD - I was once at a poker tournament which did not require a buy-in. We were playing for a monetary prize, but the chips issued to us had no actual monetary value. This encouraged what card players call "suck-out poker" -- competitors betting and bluffing recklessly, going all-in with weak hands and refusing to fold even when they had nothing, simply because they had nothing to lose. Well, in economic terms this is called "moral hazard," which is the practice of taking risks solely because one is protected from the potential consequences. Government bailouts and quantitative easing are two prime examples of things which create moral hazard in economics. When a bank engages in, say, issuing sub-prime mortgages, it knows perfectly well that it is creating a situation in which economic collapse is inevitable, but it is free to do so anyway, because it knows the government will claim it is "too big to fail" and bail it out...with taxpayer money. The banks literally cannot lose, so they keep playing.
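The poker analogy translates directly into expected value. A minimal sketch, with invented numbers, of why a bailed-out bank never folds:

    # Expected value of a reckless bet, with and without a bailout.
    # All figures are invented for illustration.
    p_win, win = 0.30, 50.0    # 30% chance of a $50M payoff
    p_lose, loss = 0.70, 40.0  # 70% chance of a $40M loss

    ev_no_bailout = p_win * win - p_lose * loss   # the bank eats the loss
    ev_bailout = p_win * win - p_lose * 0.0       # the taxpayer eats it

    print(f"EV without bailout: {ev_no_bailout:+.1f}M")  # -13.0M: fold
    print(f"EV with bailout:    {ev_bailout:+.1f}M")     # +15.0M: all in

A losing bet for anyone else becomes a winning bet for the bank, because the downside has been amputated and grafted onto the public.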
BULLSHIT TERM #4: MARK TO MARKET ACCOUNTING - This is a process whereby a business can claim projected profits as actual profits, i.e. they can claim that the money they expect to make on a product or service, based on current market prices, is money they actually made. In financial companies this method of accounting supposedly makes sense, but when used in other businesses it takes on a whole different aspect, one which looks suspiciously like fraud. Why fraud? Because theoretical profits are recorded on the books as actual ones, so that investors (potential stockholders, for example) looking at the company's books will be fooled as to the profitability of the company. This sleight-of-hand is part of what sank Enron a few years back, and if you tried this shit in your own life, you'd be arrested on any number of charges, including filing false tax returns. Again, however, there is a double standard at work.
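A crude illustration of the trick, with hypothetical numbers:

    # Mark-to-market vs. realized accounting on a 10-year contract.
    # All figures are hypothetical.
    projected_profit_per_year = 5_000_000
    years = 10
    realized_to_date = 1_200_000   # what has actually come in

    mark_to_market_books = projected_profit_per_year * years
    honest_books = realized_to_date

    print(f"Booked today under mark-to-market: ${mark_to_market_books:,}")
    print(f"Actually earned to date:           ${honest_books:,}")

Fifty million dollars of "profit" on the books, $1.2 million in the till. The stockholder reading the annual report sees only the first number.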
BULLSHIT TERM #5: SPECIAL PURPOSE ENTITIES - These are shell companies set up by a parent company for specific business purposes. Seems logical enough, but as usual, there's a potential dark side. "Special purpose entities" -- what the Mafia would call "front companies" -- can be, and often are, used for nefarious purposes, to play a sort of shell-game with the truth. In the case of the above-mentioned Enron, S.P.E.'s were employed in huge numbers -- literally by the hundreds -- to hide the debts of the parent company. By doing this, and by using mark-to-market accounting, they managed to put on an appearance of financial strength, keep their stock prices inflated, and hide their liabilities from investors and from Wall Street. The trouble is -- as with all Ponzi schemes and shell games -- there inevitably comes a moment of critical mass, when one ball too many is added to the juggling and everything comes crashing down. I don't mean to sound redundant, but if they call it a "front for laundering money" when the Mafia does it, and an "S.P.E." when a big company does it, well, there's a reason for that.
I chose these terms more or less at random, and I'm sure that many would dispute the cavalier way in which I have defined them. Nevertheless I feel justified in doing so. When I was working for the District Attorney's Office, I came to grasp that a great deal of the fancy-Dan terminology spewed by lawyers in court was bullshit; the basic concepts of our legal system are simple and easy to understand when not obscured by clouds of complex-sounding legal jargon. By and large, that jargon exists solely to intimidate the ordinary man into believing that the law, which belongs to everyone, ought to be the sole province of lawyers. It exists further to couch inherently unjust and corrupt practices in impressive-sounding Latin, so that few people understand the actual nature of what is being said and done. Well, the same principle applies to economics and finance. The terminology is deliberately arcane, so that the most hypocritical, ridiculous and outright criminal behavior takes on an air of almost scientific respectability. As George Orwell noted, clarity is the enemy of the lie, just as obscurity is the enemy of truth. And this brings us not only to our earlier question but a host of others. Why does the government, and Wall Street, hide simple economic truths behind terms deliberately designed to confuse and mislead? Why do they reinforce a system whose shortsighted greed leads to things like the housing bubble and the Great Recession? A system which continuously increases inflation even though such increases only benefit the mega-rich? A system which rewards bankers for taking stupid risks which endanger the economy yet places the financial burden of the business failures which follow those risks on the taxpayer and not the bank?
The answer would seem to be, "To benefit the rich, of course!" And this is, in fact, true; but it is not the whole answer nor even the crucial part. The real one can be summed up in one dread title: THE FEDERAL RESERVE SYSTEM. Understanding what the Fed does for us -- or rather, to us -- is key to grasping most of the evils which presently plague our economy, our society, and to a great extent, human life on this planet. Bold statement? Yes. But backed up by grim battalions of facts. And this will be the subject of my next piece, eloquently titled: WHY THE FEDERAL RESERVE IS THE EMPIRE IN "STAR WARS" (AND WE ARE THE REBELLION!).
Published on January 09, 2017 19:36
December 27, 2016
DEAR 2016: STOP KILLING MY CHILDHOOD ICONS, YOU BASTARD
So, as I sit here digesting breakfast and turning coffee into urine, Carrie Fisher has died. It's enough to make a man throw up his hands, remove his just-buttoned clothes, and crawl back in bed. A few weeks ago, when yet another icon of my childhood went prematurely West, I had a mental image of the Grim Reaper, his scythe resting on one bony, cloak-shrouded shoulder, whistling through his lipless mouth as he tallied up his kills for 2016. But never mind the Reaper image now; today I feel as if there's an overcaffeinated sniper sitting in the upper deck of some grandiose red-carpet event in Hollywood, picking the heroes of my formative years off one at a time. I'm not surprised, mind you; I knew 2016 wasn't finished tallying its Butcher's Bill, but I'd hoped Fisher was a case of misdirection -- a heart-attack scare to have us looking her way while he sneaked away and killed George Michael. But no, not a bit of it, the greedy fucker had to do for them both.
Don't misunderstand me. I'm 44 years old, which is old enough to have gone through several celebrity mass extinctions already. First, when I was in my pre-teens, came the deaths of all my parents' icons -- heroes of WW2, movie stars, journalists, figures from the Vietnam era and the civil rights movement, radio personalities from their earliest childhood. In the majority of cases I had no idea who these people were and didn't much care, though there were notable exceptions -- I can tell you exactly where I was when my brother told me John Lennon had been shot, for example. A broader awareness that I was living through one of those generational transition-periods only came about a decade later, when the character actors who I had grown up watching on TV and in the movies began to expire of natural causes with numbing regularity. Having lived out the years 1980 - 1990 from the ages of 8 - 18, I had grown up watching a lot of TV which predated my birth or my conscious awareness -- shows from the 60s and early 70s. The older actors on these shows, as well as the elder politicians who had guided the country during my earliest youth, and war heroes whom I was now old enough to know existed and to admire or at least respect, were now old men and women and beginning to die off. It made me somewhat sad, but I took the position that these people had all immortalized themselves through their accomplishments, so what difference did it really make if their flesh-and-blood selves were no longer with us? It wasn't like I was ever going to physically meet the casts of Welcome Back, Kotter, All in the Family, M*A*S*H, Battlestar Galactica, Star Trek, What's Happening, Good Times, Sanford & Son, Barney Miller, Three's Company, Hogan's Heroes, Gilligan's Island or any of the other shows I grew up watching anytime soon. Ditto my movie-star heroes. Whether they lived in the world as I did made little difference so long as I could pick up a remote control and summon up permanently ageless images of them.
Truth be told, I've found a lot of philosophical comfort in the idea that cameras, both still and moving, can trap us at a particular moment in time like genies in a bottle. We may turn into mummified wrecks in just a few decades, but a simple device can freeze us at a particular moment of development, and now, thanks to technology, the image itself never needs to fade or decay from acids in the paper. It can be electronically reproduced an infinite number of times, without any cost, and sent anywhere in the world instantaneously with the press of a button. In principle this also obtains in regard to television episodes, movies and even to music. Thanks to DVDs, Netflix, Amazon Prime, iTunes and whatnot, we are no longer hostage to the capricious programming schedules that plagued me as a youth, or to record stores that might or might not have the rare album we wanted in stock. The whole history of entertainment, going back a hundred-odd years, is now available, literally, at our fingertips, often at very little cost. Take, for example, one of my many hobbies -- listening to the great programs from the Golden Age of Radio (The Shadow, Suspense, Escape, Inner Sanctum Mysteries, etc., etc.). Intellectually I am aware that every last performer, every last writer, producer, musician, and even the fucking production assistants involved in these shows are long in their graves. Emotionally, however, it doesn't matter, because when I press play on my iPod, they immediately spring to life once more, as young, as strong, and as vital as they ever were. And, on a more selfish note, the fact that they have achieved this sort of immortality gives me, as a writer, hope that I might someday achieve the same, and I am comforted.
But then -- then you have a year like 2016, and my whole comfy-cozy security-blanket of a philosophy goes up in flames.
As I said before, I know that people die. Even famous ones. As Thomas More says in A Man For All Seasons, "Death comes for us all, my lords. Yes, even for Kings he comes, to whom amidst all their Royalty and brute strength he will neither kneel nor make them any reverence nor pleasantly desire them to come forth, but roughly grasp them by the very breast and rattle them until they be stark dead!" True. And because of this, no year goes by without a steady attrition of famous names in the obituaries. Yet whether it is fact or merely perception driven by internet hysteria, 2016 does seem to be operating against my childhood heroes as if it bears them some kind of deeply personal grudge. A complete list, as of December 27, would go on forever, so I've decided to commemorate those whose passing affected me the most.
Prince. When the news of Prince's death reached me, I had just completed a long hike around the Hollywood Reservoir and was tired, sweaty and ready for a shower and possibly a cold beer. At first, I admit I had no reaction at all except a lift of the eyebrows. Though I knew the usual number of his songs ("1999," "Let's Go Crazy," "Raspberry Beret" etc.) by heart, I had never bought one of his albums or even one of his singles; I had never watched Purple Rain even though it was one of The auditory-cinematic moments of the 80s; and in fact I'd found him, overall, to be unpleasantly arrogant and self-reverential, not to mention inconsistent in the quality of his music. Nevertheless, in the brief interval between initially reading the news off my phone and getting to my car, my eyes began to mist over with tears, and I had to concentrate hard on my driving to avoid an accident on the way home. I began to realize that when one of the pillars of your childhood -- he more or less wrote the soundtrack to my formative period -- dies, it doesn't really matter whether you were a fan of his or not, cared for his music or not, or even liked the personality he exuded in interviews and television appearances. The pillar has been broken. That's what matters. It cut me especially deep when I read sentiments on Twitter and the like which, remarking on Prince's death, read, "Don't worry -- we still share the same planet as Beyonce." Huh? What? How the fuck can you mention a performance artist who doesn't write her own lyrics, arrange her own music, or choreograph her own dance moves in the same breath as someone who played 27 musical instruments, many at the very highest level of technical ability? Who wrote his own songs, at a rate of one per day for 33 years? Arranged, produced, engineered and mastered his own albums? Don't make me fucking laugh. Prince was that rarest and now most endangered of breeds: a genius within his own field, someone who didn't need an army to produce masterpieces. Mourn him, folks, because in an increasingly corporate era of music, we may not see his like again.
Abe Vigoda and Ron Glass. Depending on when you were born, these names may not mean much to you, but their deaths stung me. Vigoda, a long-faced, floppy-eared, sad-eyed actor who seemed to have been born old, is best known for playing Tessio in The Godfather, but I knew him best as Detective Phil Fish on Barney Miller, a 70s-era sitcom about a group of NYPD detectives, mostly racial, ethnic or class archetypes, who do everything well except police work. Fish was a grumpy, world-weary Jew, hen-pecked by his unseen wife; Ron Glass, on the other hand, played Ron Harris, a smooth-talking, lady-killing black detective with expensive tastes and bad judgment. (He went on to achieve fame with a new generation as Shepherd Book on Firefly.) Both were nightly sights in my house -- from Monday through Friday anyway -- when I was growing up.
John Glenn. The term "American hero" is bandied about so much it has lost all value, but Glenn really was an American hero. Trained as an engineer, he flew as a fighter pilot in two wars, earning six Distinguished Flying Crosses and the nickname "Magnet Ass" because his fighter often landed with as many as 250 holes in it, courtesy of enemy flak. He shot down three MiGs over Korea in 1953 -- the last three American aerial victories of the conflict -- and then served as a test pilot, a job generally regarded as being even more dangerous than combat flying. Tapped as one of the "Mercury Seven," he became the first American to orbit the Earth and had a long and distinguished career in NASA. He then ran for, and served in, the United States Senate for 24 years. In 1998, while still in the Senate, he became the oldest astronaut in history, flying aboard the space shuttle Discovery at the age of 77. He was also an ordained Presbyterian minister, a 33rd degree Mason, and remained married to his wife for 73 years until the day he died. I met Glenn once, in '92, during Senate hearings I attended as part of a college assignment, and was struck by the smile he had on his face -- an inward sort of smile, as if he were perpetually chuckling at an inside joke.
Gene Wilder. If you have no feelings for Gene Wilder you've either never owned a television or you simply have no heart. Though he is beloved by most for Willy Wonka and Young Frankenstein, my own favorite Wilder performance was as the Waco Kid in Blazing Saddles. With his mop of gold curls, startling blue eyes and homely, highly mobile face, he made the perfect counterpoint to Cleavon Little's black sheriff. And I'll never forget the way he delivered that line: "Yeah, I was the Kid...it got so that every pissant prairie punk who thought he could shoot a gun would ride into town to try out the Waco Kid. I must've killed more men than Cecil B. DeMille. Got pretty gritty. I started to hear the word 'draw' in my sleep. Then one day, I was just walking down the street, and I heard a voice behind me say, 'Reach for it Mister!' I spun around and there I was face to face with a six-year-old kid. Well I just threw my guns down and walked away. (pause) Little bastard shot me in the ass!!"
David Huddleston. Another veteran of Blazing Saddles, the bald, round-faced, heavyset, cigar-chewing Huddleston was such a staple on TV and in film, usually playing pompous blowhards, that you'd be hard-pressed to find anyone who doesn't recognize his face. He will probably go down in history, however, as being The Big Lebowski in, well, The Big Lebowski.
John McLaughlin. Did I say pompous blowhard? McLaughlin ran his self-titled political show The McLaughlin Group for 34 years. Growing up in Washington, D.C., you could not escape this insufferable know-it-all, who bullied and upbraided his fellow journalists for a half hour each week, but also made some of them (e.g., Pat Buchanan) famous. At first merely a local phenomenon, McLaughlin eventually became such a fixture in Washington politics that he was tapped to appear (as himself) in Independence Day. Although my father knew the man professionally and told me he was a jerk who "could dish it out but couldn't take it," he was disappointed when his own guest appearance on the show got cancelled. The fact is, in an age when most "journalists" have become mere talking heads reading off teleprompters, McLaughlin actually knew his stuff.
George Kennedy. Another staple of American film, Kennedy was a master of both heavy drama and slapstick comedy, bringing a rugged, straight-man demeanor to both with equal ability. In "serious" films he held his own when up against Burt Lancaster, Charlton Heston, Jack Lemmon, Clint Eastwood, Chuck Norris, and others, and won an Oscar for playing Dragline opposite Paul Newman in Cool Hand Luke, but I confess I'll remember him as the earnest, bumbling Captain Ed Hocken in The Naked Gun trilogy. I had a chance to attend a memorial showing of some of his films this year at the Aero and Egyptian Theaters here in L.A., and he was very warmly remembered by those who had worked with him.
Kenny Baker. Baker played R2D2 in six Star Wars films. He was "the ghost in the machine," which was possible only because he stood 3'8" tall, proving once and for all that you don't need height to have impact. Anyone who saw the original trilogy in the theaters fell in love with the feisty little droid, who endured C3PO, Darth Vader, Jawas, swamp monsters, Ewoks, and all sorts of additional drama while trying to save the universe.
Morley Safer. When I was growing up, 60 Minutes was the one program no adult ever seemed to miss, and I confess that I not only watched, but enjoyed this gutsy, scrappy "TV news magazine." The mainstays of the show were Mike Wallace, Harry Reasoner, Ed Bradley, Andy Rooney...and Morley Safer, who, like many great American journalists, was actually Canadian. It's true that the show sensationalized the profession of journalism and favored a confrontational style of interviewing that was only dubiously ethical, but there is no doubt that in its endless run on television, Safer -- who was on the show from 1970 until a week before his death -- did a lot of good as well. He is best remembered by many as the man who brought "the Vietnam war into the living room," but that was before my time. I myself associate him with skewering interviews of various sweaty-browed politicians and businessmen caught with their hands in the cookie jar. His loss ended an era that, given the ongoing decline of journalism, may never return.
Robert Vaughn. The words "silky menace" and "Robert Vaughn" go hand in hand. Few guys acted with more aristocratic hauteur and -- if necessary -- ice-cold nastiness than Vaughn. My mom and dad knew him as "The Man From U.N.C.L.E.," but I remember him from such films as The Magnificent Seven, where he played a gun-for-hire nursing a terrible secret, and Bullitt, in which he airily told Steve McQueen, "Integrity is something we sell to the public." In The Bridge at Remagen he played sympathetic German officer Paul Krueger, who fights the Americans on one hand and his own high command on the other. Even as an old man he retained the same cold fire, dueling with Steven Hill (who also fucking died this year!) in several memorable episodes of Law & Order. Some actors are interchangeable; some are unique. Vaughn didn't have extreme range, but what he could do he did better than just about anyone else.
Alan Rickman. Do I really need to write a eulogy for an actor this goddamn good? Most people in Rickman's profession would give a finger to be remembered for one, signature role in a 30 or 40 year career. Rickman had about half a dozen. There was his letter-perfect turn as suave villain Hans Gruber in Die Hard, his over-the-top performance as the Sheriff of Nottingham in Robin Hood, seven films worth of mean-spirited Snape nastiness in the Harry Potter franchise, washed up sci-fi actor Alexander Dane in the cult hit Galaxy Quest and, in perhaps his subtlest turn, Harry the Almost Cheating Husband in Love, Actually. Waking up to discover him dead was like coming down on Christmas morning to find the tree on fire and the presents melted into glue.
George Michael. As I said, I'm basically an 80's kid, and if you were in any way conscious during the 80's you both loved and hated George Michael. You hated him because for what seemed like aeons he was inescapable on both radio and video with a relentless series of horribly catchy hits, each more cloying -- and yet somehow pleasurable -- than the last. Already a star when he left Wham!, he annihilated the charts and conquered an unwilling world with Faith, a solo album that, beneath its pop-music gloss, has real profundity in it and is one of those records that gains rather than loses luster with time. After a deliberate un-glamming of himself in the early 90s, an ugly fight with Sony Records over his contract, and an embarrassing arrest here in La La Land just a few years back, Michael faded from the public eye, but recently there were intimations of a comeback. Alas, it will never happen now. As John Greenleaf Whitter once wrote: "Of all sad words of tongue and pen, the saddest are these: 'It might have been.'"
And this brings me back to Carrie Fisher. I first met this lady in 1977, when my father took me to Star Wars at the Uptown Theater in Washington, D.C. I was only five years old, but her Princess Leia made quite the impression on me -- ram horn hairdo, flowing white robes, refusal to be cowed by Darth Vader, Governor Tarkin, Han Solo, or Chewbacca, to whom she referred to as a "walking carpet." A few years later she charmed me further in The Empire Strikes Back, giving a more nuanced performance as the coldly efficient Rebel leader smitten with an emotionally unavailable space pirate. And then came Return of the Jedi, and her notorious turn in the "slave bikini," which accelerated the puberty of millions of adolescent boys, your humble correspondent included. After that she faded from our collective radar screen for many years amidst stories of manic depression and heavy drug abuse, only to refashion herself as a novelist and autobiographer of surprising subtlety and depth. "I am a spy in the house of me," she wrote. "I report back from the front lines of the battle that is me. I am somewhat nonplused by the event that is my life." And I, Carrie, am somewhat nonplussed by the event that was your death.
I realize, on the one hand, how absurd it is to mourn for people you didn't know, who, if you ever happened to meet them, might offer you a pained, insincere, slightly bored smile while you mumbled, star-struck, at what an effect they'd had on your life. No matter how polite they are, the fact remains that the relationship you have, or think you have, with them, is entirely one-way: you grew up watching them; they, on the other hand, did not grow up watching you. You are not a familiar, welcome face. You are just some random dude making cow-eyes at them, possibly in a very unglamorous setting -- the CVS on Sunset Boulevard, for example, or a coffee shop on King's Road and Beverly. I know this to be true because I work in the entertainment industry and I live in Los Angeles and those two situations have allowed me to bump into a number of famous folks, some of whom I idolized as a youth, and the result is not always heartwarming. It may seem staggeringly obvious, but actors and comedians and musicians are human beings, and have the usual battery of human failings, and just because they played a great guy or a great gal on TV, or sang a sweet tune, or wrote a kick-ass comedy sketch you still quote after your third beer, doesn't mean they're nice folks. Some -- shocker! -- are what Al Pacino once referred to as "large-type assholes."
At the same time, I also recognize the absurdity of personifying a year in the way I have done with the title of this blog. A "year" is a construct of the human mind; it has no physical reality, no consciousness, no intent. As I said above, I realize, intellectually, that this steady stream of deaths is simply a statistical oddity -- and perhaps not even that, perhaps only the perception of one. People have got to die sometime, and random chance will ultimately see to it that one "year" claims more of their lives than another. So it must be here.
And yet! My mental, emotional image of 2016 remains the same. I still see Death striding eagerly and almost gleefully down an alley somewhere in Hollywood, his trusty scythe gripped in bony hands, his sightless eyes peering into lighted windows of the homes where the celebrities of my youth reside. The hourglass around his neck still has a few sands within it, and he there remain a few names on the list tucked into his belt. Who else can he claim before the clock retires this version of him for good? I can't answer that, but I can say this to him in closing:
If you take Sean Connery, we riot.
Don't misunderstand me. I'm 44 years old, which is old enough to have gone through several celebrity mass extinctions already. First, when I was in my pre-teens, came the deaths of all my parents' icons -- heroes of WW2, movie stars, journalists, figures from the Vietnam era and the civil rights movement, radio personalities from their earliest childhood. In the majority of cases I had no idea who these people were and didn't much care, though there were notable exceptions -- I can tell you exactly where I was when my brother told me John Lennon had been shot, for example. A broader awareness that I was living through one of those generational transition periods only arrived about a decade later, when the character actors I had grown up watching on TV and in the movies began to expire of natural causes with numbing regularity. Having lived out the years 1980 - 1990 from the ages of 8 - 18, I had grown up watching a lot of TV which predated my birth or my conscious awareness -- shows from the 60s and early 70s. The older actors on these shows, as well as the elder politicians who had guided the country during my earliest youth, and the war heroes whom I was now old enough to know existed and to admire or at least respect, were now old men and women and beginning to die off. It made me somewhat sad, but I took the position that these people had all immortalized themselves through their accomplishments, so what difference did it really make if their flesh-and-blood selves were no longer with us? It wasn't like I was ever going to physically meet the casts of Welcome Back, Kotter, All in the Family, M*A*S*H, Battlestar Galactica, Star Trek, What's Happening, Good Times, Sanford & Son, Barney Miller, Three's Company, Hogan's Heroes, Gilligan's Island or any of the other shows I grew up watching anytime soon. Ditto my movie-star heroes. Whether they lived in the world as I did made little difference so long as I could pick up a remote control and summon up permanently ageless images of them.
Truth be told, I've found a lot of philosophical comfort in the idea that cameras, both still and moving, can trap us at a particular moment in time like genies in a bottle. We may turn into mummified wrecks in just a few decades, but a simple device can freeze us at a particular moment of development, and now, thanks to technology, the image itself never needs to fade or decay from acids in the paper. It can be electronically reproduced an infinite number of times, without any cost, and sent anywhere in the world instantaneously with the press of a button. In principle this also holds for television episodes, movies and even music. Thanks to DVDs, Netflix, Amazon Prime, iTunes and whatnot, we are no longer hostage to the capricious programming schedules that plagued me as a youth, or to record stores that might or might not have the rare album we wanted in stock. The whole history of entertainment, going back a hundred-odd years, is now available, literally, at our fingertips, often at very little cost. Take, for example, one of my many hobbies -- listening to the great programs from the Golden Age of Radio (The Shadow, Suspense, Escape, Inner Sanctum Mysteries, etc., etc.). Intellectually I am aware that every last performer, every last writer, producer, musician, and even the fucking production assistants involved in these shows are long in their graves. Emotionally, however, it doesn't matter, because when I press play on my iPod, they immediately spring to life once more, as young, as strong, and as vital as they ever were. And, on a more selfish note, the fact that they have achieved this sort of immortality gives me, as a writer, hope that I might someday achieve the same, and I am comforted.
But then -- then you have a year like 2016, and my whole comfy-cozy security-blanket of a philosophy goes up in flames.
As I said before, I know that people die. Even famous ones. As Thomas More says in A Man For All Seasons, "Death comes for us all, my lords. Yes, even for Kings he comes, to whom amidst all their Royalty and brute strength he will neither kneel nor make them any reverence nor pleasantly desire them to come forth, but roughly grasp them by the very breast and rattle them until they be stark dead!" True. And because of this, no year goes by without a steady attrition of famous names in the obituaries. Yet whether it is fact or merely perception driven by internet hysteria, 2016 does seem to be operating against my childhood heroes as if it bears them some kind of deeply personal grudge. A complete list, as of December 27, would go on forever, so I've decided to commemorate those whose passing affected me the most.
Prince. When the news of Prince's death reached me, I had just completed a long hike around the Hollywood Reservoir and was tired, sweaty and ready for a shower and possibly a cold beer. At first, I admit I had no reaction at all except a lift of the eyebrows. Though I knew the usual number of his songs ("1999," "Let's Go Crazy," "Raspberry Beret" etc.) by heart, I had never bought one of his albums or even one of his singles; I had never watched Purple Rain even though it was one of the defining auditory-cinematic moments of the 80s; and in fact I'd found him, overall, to be unpleasantly arrogant and self-reverential, not to mention inconsistent in the quality of his music. Nevertheless, in the brief interval between initially reading the news off my phone and getting to my car, my eyes began to mist over with tears, and I had to concentrate hard on my driving to avoid an accident on the way home. I began to realize that when one of the pillars of your childhood -- he more or less wrote the soundtrack to my formative period -- dies, it doesn't really matter whether you were a fan of his or not, cared for his music or not, or even liked the personality he exuded in interviews and television appearances. The pillar has been broken. That's what matters. It cut me especially deep when I read sentiments on Twitter and the like which, remarking on Prince's death, read, "Don't worry -- we still share the same planet as Beyonce." Huh? What? How the fuck can you compare a performance artist who doesn't write her own lyrics, arrange her own music, or choreograph her own dance moves with someone who played 27 musical instruments, many at the very highest level of technical ability? Who wrote his own songs, at a rate of one per day for 33 years? Arranged, produced, engineered and mastered his own albums? Don't make me fucking laugh. Prince was that rarest and now most endangered of breeds: a genius within his own field, someone who didn't need an army to produce masterpieces. Mourn him, folks, because in an increasingly corporate era of music, we may not see his like again.
Abe Vigoda and Ron Glass. Depending on when you were born, these names may not mean much to you, but their deaths stung me. Vigoda, a long-faced, floppy-eared, sad-eyed actor who seemed to have been born old, is best known for playing Tessio in The Godfather, but I knew him best as Detective Phil Fish on Barney Miller, a 70s-era sitcom about a group of NYPD detectives, mostly racial, ethnic or class archetypes, who do everything well except police work. Fish was a grumpy, world-weary Jew, hen-pecked by his unseen wife; Ron Glass, on the other hand, played Ron Harris, a smooth-talking, lady-killing black detective with expensive tastes and bad judgment. (He went on to achieve renown with a new generation as Shepherd Book on Firefly.) Both were nightly sights in my house -- from Monday through Friday anyway -- when I was growing up.
John Glenn. The term "American hero" is bandied about so much it has lost all value, but Glenn really was an American hero. Trained as an engineer, he flew as a fighter pilot in two wars, earning six Distinguished Flying Crosses and the nickname "Magnet Ass" because his fighter often landed with as many as 250 holes in it, courtesy of enemy flak. He shot down three MiGs over Korea in 1953 -- the last three American aerial victories of the conflict -- and then served as a test pilot, a job generally regarded as being even more dangerous than combat flying. Tapped as one of the "Mercury Seven," he became the first American to orbit the Earth and had a long and distinguished career in NASA. He then ran for, and served in, the United States Senate for 24 years. In 1998, while still in the Senate, he became the oldest astronaut in history, flying aboard the space shuttle Discovery at the age of 77. He was also an ordained Presbyterian minister, a 33rd degree Mason, and remained married to his wife for 73 years until the day he died. I met Glenn once, in '92, during Senate hearings I attended as part of a college assignment, and was struck by the smile he had on his face -- an inward sort of smile, as if he were perpetually chuckling at an inside joke.
Gene Wilder. If you have no feelings for Gene Wilder, you've either never owned a television or you simply have no heart. Though most people love him for playing Willy Wonka or the title role in Young Frankenstein, my own favorite Wilder performance was as the Waco Kid in Blazing Saddles. With his mop of gold curls, startling blue eyes and homely, highly mobile face, he made the perfect counterpoint to Cleavon Little's black sheriff. And I'll never forget the way he delivered that line: "Yeah, I was the Kid...it got so that every pissant prairie punk who thought he could shoot a gun would ride into town to try out the Waco Kid. I must've killed more men than Cecil B. DeMille. Got pretty gritty. I started to hear the word 'draw' in my sleep. Then one day, I was just walking down the street, and I heard a voice behind me say, 'Reach for it, mister!' I spun around and there I was, face to face with a six-year-old kid. Well, I just threw my guns down and walked away. (pause) Little bastard shot me in the ass!!"
David Huddleston. Another veteran of Blazing Saddles, the bald, round-faced, heavyset, cigar-chewing Huddleston was such a staple on TV and in film, usually playing pompous blowhards, that you'd be hard-pressed to find anyone who doesn't recognize his face. He will probably go down in history, however, as being The Big Lebowski in, well, The Big Lebowski.
John McLaughlin. Did I say pompous blowhard? McLaughlin ran his self-titled political show The McLaughlin Group for 34 years. Growing up in Washington, D.C., you could not escape this insufferable know-it-all, who bullied and upbraided his fellow journalists for a half hour each week, but also made some of them (e.g., Pat Buchanan) famous. At first merely a local phenomenon, McLaughlin eventually became such a fixture in Washington politics that he was tapped to appear (as himself) in Independence Day. Although my father knew the man professionally and told me he was a jerk who "could dish it out but couldn't take it," he was disappointed when his own guest appearance on the show got cancelled. The fact is, in an age when most "journalists" have become mere talking heads reading off teleprompters, McLaughlin actually knew his stuff.
George Kennedy. Another staple of American film, Kennedy was a master of both heavy drama and slapstick comedy, bringing a rugged, straight-man demeanor to both with equal ability. In "serious" films he held his own when up against Burt Lancaster, Charlton Heston, Jack Lemmon, Clint Eastwood, Chuck Norris, and others, and won an Oscar for playing Dragline opposite Paul Newman in Cool Hand Luke, but I confess I'll remember him as the earnest, bumbling Captain Ed Hocken in The Naked Gun trilogy. I had a chance to attend a memorial showing of some of his films this year at the Aero and Egyptian Theaters here in L.A., and he was very warmly remembered by those who had worked with him.
Kenny Baker. Baker played R2-D2 in six Star Wars films. He was "the ghost in the machine," which was possible only because he stood 3'8" tall, proving once and for all that you don't need height to have impact. Anyone who saw the original trilogy in the theaters fell in love with the feisty little droid, who endured C-3PO, Darth Vader, Jawas, swamp monsters, Ewoks, and all sorts of additional drama while trying to save the universe.
Morley Safer. When I was growing up, 60 Minutes was the one program no adult ever seemed to miss, and I confess that I not only watched, but enjoyed this gutsy, scrappy "TV news magazine." The mainstays of the show were Mike Wallace, Harry Reasoner, Ed Bradley, Andy Rooney...and Morley Safer, who, like many great American journalists, was actually Canadian. It's true that the show sensationalized the profession of journalism and favored a confrontational style of interviewing that was only dubiously ethical, but there is no doubt that in its endless run on television, Safer -- who was on the show from 1970 until a week before his death -- did a lot of good as well. He is best remembered by many as the man who brought "the Vietnam war into the living room," but that was before my time. I myself associate him with skewering interviews of various sweaty-browed politicians and businessmen caught with their hands in the cookie jar. His loss ended an era that, given the ongoing decline of journalism, may never return.
Robert Vaughn. The words "silky menace" and "Robert Vaughn" go hand in hand. Few guys acted with more aristocratic hauteur and -- if necessary -- ice-cold nastiness than Vaughn. My mom and dad knew him as "The Man From U.N.C.L.E.," but I remember him from such films as The Magnificent Seven, where he played a gun-for-hire nursing a terrible secret, and Bullitt, in which he airily told Steve McQueen, "Integrity is something we sell to the public." In The Bridge at Remagen he played sympathetic German officer Paul Krueger, who fights the Americans on one hand and his own high command on the other. Even as an old man he retained the same cold fire, dueling with Steven Hill (who also fucking died this year!) in several memorable episodes of Law & Order. Some actors are interchangeable; some are unique. Vaughn didn't have extreme range, but what he could do he did better than just about anyone else.
Alan Rickman. Do I really need to write a eulogy for an actor this goddamn good? Most people in Rickman's profession would give a finger to be remembered for one signature role in a 30- or 40-year career. Rickman had about half a dozen. There was his letter-perfect turn as suave villain Hans Gruber in Die Hard, his over-the-top performance as the Sheriff of Nottingham in Robin Hood, eight films' worth of mean-spirited Snape nastiness in the Harry Potter franchise, washed-up sci-fi actor Alexander Dane in the cult hit Galaxy Quest and, in perhaps his subtlest turn, Harry the Almost Cheating Husband in Love Actually. Waking up to discover him dead was like coming down on Christmas morning to find the tree on fire and the presents melted into glue.
George Michael. As I said, I'm basically an 80's kid, and if you were in any way conscious during the 80's you both loved and hated George Michael. You hated him because for what seemed like aeons he was inescapable on both radio and video with a relentless series of horribly catchy hits, each more cloying -- and yet somehow pleasurable -- than the last. Already a star when he left Wham!, he annihilated the charts and conquered an unwilling world with Faith, a solo album that, beneath its pop-music gloss, has real profundity in it and is one of those records that gains rather than loses luster with time. After a deliberate un-glamming of himself in the early 90s, an ugly fight with Sony Records over his contract, and an embarrassing arrest here in La La Land just a few years back, Michael faded from the public eye, but recently there were intimations of a comeback. Alas, it will never happen now. As John Greenleaf Whittier once wrote: "Of all sad words of tongue or pen, the saddest are these: 'It might have been.'"
And this brings me back to Carrie Fisher. I first met this lady in 1977, when my father took me to Star Wars at the Uptown Theater in Washington, D.C. I was only five years old, but her Princess Leia made quite the impression on me -- ram horn hairdo, flowing white robes, refusal to be cowed by Darth Vader, Governor Tarkin, Han Solo, or Chewbacca, whom she referred to as a "walking carpet." A few years later she charmed me further in The Empire Strikes Back, giving a more nuanced performance as the coldly efficient Rebel leader smitten with an emotionally unavailable space pirate. And then came Return of the Jedi, and her notorious turn in the "slave bikini," which accelerated the puberty of millions of adolescent boys, your humble correspondent included. After that she faded from our collective radar screen for many years amidst stories of manic depression and heavy drug abuse, only to refashion herself as a novelist and autobiographer of surprising subtlety and depth. "I am a spy in the house of me," she wrote. "I report back from the front lines of the battle that is me. I am somewhat nonplused by the event that is my life." And I, Carrie, am somewhat nonplussed by the event that was your death.
I realize, on the one hand, how absurd it is to mourn for people you didn't know, who, if you ever happened to meet them, might offer you a pained, insincere, slightly bored smile while you mumbled, star-struck, about the effect they'd had on your life. No matter how polite they are, the fact remains that the relationship you have, or think you have, with them is entirely one-way: you grew up watching them; they, on the other hand, did not grow up watching you. You are not a familiar, welcome face. You are just some random dude making cow-eyes at them, possibly in a very unglamorous setting -- the CVS on Sunset Boulevard, for example, or a coffee shop at King's Road and Beverly. I know this to be true because I work in the entertainment industry and I live in Los Angeles, and those two circumstances have allowed me to bump into a number of famous folks, some of whom I idolized as a youth, and the result is not always heartwarming. It may seem staggeringly obvious, but actors and comedians and musicians are human beings, and have the usual battery of human failings, and just because they played a great guy or a great gal on TV, or sang a sweet tune, or wrote a kick-ass comedy sketch you still quote after your third beer, doesn't mean they're nice folks. Some -- shocker! -- are what Al Pacino once referred to as "large-type assholes."
At the same time, I also recognize the absurdity of personifying a year in the way I have done with the title of this blog. A "year" is a construct of the human mind; it has no physical reality, no consciousness, no intent. As I said above, I realize, intellectually, that this steady stream of deaths is simply a statistical oddity -- and perhaps not even that, perhaps only the perception of one. People have got to die sometime, and random chance will ultimately see to it that one "year" claims more of their lives than another. So it must be here.
And yet! My mental, emotional image of 2016 remains the same. I still see Death striding eagerly and almost gleefully down an alley somewhere in Hollywood, his trusty scythe gripped in bony hands, his sightless eyes peering into the lighted windows of the homes where the celebrities of my youth reside. The hourglass around his neck still has a few sands within it, and there remain a few names on the list tucked into his belt. Who else can he claim before the clock retires this version of him for good? I can't answer that, but I can say this to him in closing:
If you take Sean Connery, we riot.
Published on December 27, 2016 19:56
December 24, 2016
Unhappy New Year: The "Other" Battle of the Bulge
This week marks the 72nd anniversary of the Ardennes Campaign, better known to us as "the Battle of the Bulge." Few battles in American history carry a greater weight of fame, and it is not difficult to understand why, for the history of the Bulge plays out like a hack writer pitching an over-the-top WW2 movie: "The Germans are just about licked, see? Then, out of nowhere, they clobber us with a huge sneak attack. Infantry! Tanks! Paratroopers! Jet fighters! Commandos dressed up in American uniforms! It's panic. It's chaos. It's a red hot mess. And then, at the critical moment, just when the Nazis are about to break through, the good old Band of Brothers hole up in a town called Bastogne and fend them off! The Germans demand surrender, but the American general says, 'Nuts!' And on Christmas Day, General Patton arrives, and...."
Well, you get the picture. Truth is not only stranger than fiction; on rare occasions it's actually more glamorous, melodramatic, and unbelievable. Of course the full truth about the Bulge may never be known, since the battle itself was followed by a frenzy of record-destroying, ass-covering, and fact-fudging by American generals, the likes of which wasn't seen again until Vietnam. Their problem was not with the outcome of the battle, but that Allied intelligence, which innumerable spy movies would have you believe was about a quarter of an IQ point less omniscient than God, had somehow failed to notice Hitler had amassed three huge armies at the very weakest spot of the American line -- a blunder which was to cost 89,500 U.S. casualties, including 19,000 dead.
Mind you, I don't really blame our generals for taking their Orwellian approach to history. Americans as a whole have an unwillingness, bordering on an incapacity, to accept that their military can be defeated -- even in a battle of wits. It is noteworthy that the only really crushing defeats suffered by the United States Army which are known to our public were inflicted on it by the Confederate States Army during the Civil War -- in other words, by other Americans. When it comes to foreigners getting the better of us, we don't want to hear that shit. In the Bulge, the Germans fooled us badly, caught us smirking and slacking, and kicked us in the jewels about as hard as they possibly could, but that is not the way we as a nation choose to compose our narrative. Instead, we see it as the ultimate underdog story, a sort of Rocky with tanks and guns, where the desperate heroes are bloodied and battered but, in the end, gloriously triumphant. This version has the advantage of being broadly true, but it is also symptomatic of that larger, moral problem -- our willingness, even eagerness, to edit history when it runs contrary to the paradoxical image we have of ourselves -- perennial underdogs who are also perpetual winners.
This problem is moral because if one happens to be one of the American soldiers who fought in a battle the generals, and by extension the public, don't want to remember, then one finds oneself out in the historical cold. This was the fate of the 29,000 Americans doomed to fall under the ghastly heading of "casualties" during the so-called Other Battle of the Bulge, known in American histories as "The Alsace Campaign" and by the Germans, who initiated it, as "Operation North Wind."
When the fight kicked off, on New Year's Eve, 1944, the "real" Bulge battle had been raging for fifteen days, and had drawn fully half of the American forces in Europe into its maelstrom. Hitler saw in this shift an opportunity to administer yet another kick to Uncle Sam's groin, for as Eisenhower concentrated more and more combat power for the purposes of containing and destroying the German armies in the Bulge, he suffered a corresponding depletion of that power elsewhere. The "elsewhere" Hitler had in mind for Operation North Wind was Alsace, the much fought-over province which had just been reconquered by newly-liberated France. At a conference on December 28, just three days before the attack, Hitler claimed to his generals that the purpose of North Wind was not "prestige" but simply "destroying and exterminating enemy forces wherever we find them." One of the principal German commanders of the coming battle, Heinrich Himmler, head of the dreaded SS, secretly disagreed with his master in this regard. "Heini" was determined to capture the capital of the province, Strasbourg, and to fly the swastika from its cathedral as a sort of thank-you gift to Der Fuehrer for placing him in command of a field army. This determination was to have fateful consequences for everyone involved in the looming fight, most especially the soldiers of the U.S. Sixth Army Group, commanded by General Jacob Devers. If the name Devers doesn't ring a bell, that's understandable: though he was one of only two American army group commanders in Europe, he remained a mere three-star general until near the end of the war, largely because Eisenhower both disliked and distrusted him. This feeling, which was reciprocated, was also to have a deep influence on the battle, yet another example of how the "power of personality" is perhaps greater and more terrible than that of bombs.
Nordwind, as the Germans called it, kicked off at just about the stroke of midnight on December 31st, 1944. About twenty German divisions, attacking in successive waves along a 68-mile front that slanted northwest to southeast from Rohrbach to the Rhine River, and along the Rhine itself from the area immediately east of Strasbourg up to the German city of Karlsruhe, slammed into Sixth Army Group (7th U.S. Army, 1st French Army, XXI Corps of 1st U.S. Army), which had been stripped down to the metal by "Ike" to provide troops for the Battle of the Bulge. And indeed, "Ike" was unsympathetic to Devers' pleas for reinforcements. Allied intelligence, which had failed so miserably to predict the Bulge attack, knew all about "North Wind," and Eisenhower had already instructed "Jake" to withdraw if pressed, even if it meant yielding most of Alsace to the Germans -- including Strasbourg. Eisenhower believed, not without cause, that this new German attack was intended to ease American pressure on the "Bulge," and that to shift forces again would be to play Hitler's game. Devers, by temperament and to some extent by blood (his grandmother was from Alsace), was reluctant to obey, and he had key allies in this regard -- namely the French, who were appalled by the idea of giving up Strasbourg merely to save Eisenhower the trouble of defending it. As the historian Charles Whiting has noted, "Strasbourg was second only to Paris in French hearts," and more specifically in the heart of Charles de Gaulle, who informed Eisenhower that he would not permit the city to fall into German hands, even if it meant ordering the French First Army to operate independently of American command. What followed was an ugly and sometimes ridiculous argument between supposed allies which sucked in not only Eisenhower and de Gaulle, but Winston Churchill, who flew out to join a conference of his fellow "great men" on January 3, 1945. Ike accused the Frenchman of allowing political considerations to dictate military tactics, and threatened to withhold American supplies from the French Army, which was utterly dependent on them, if he refused to obey orders; de Gaulle countered by threatening to close French railways and communications systems to American troops. Churchill backed de Gaulle, a rather odd move for a man who would later be accused of trying to have him assassinated, and Eisenhower caved. Derided by his detractors as a "political general" who was more interested in placating his allies than standing up for his countrymen, Ike countermanded his earlier orders to Devers. From now on, Americans would stand, fight, and die for the honor of France.
Die they did. The "other Battle of the Bulge" raged for twenty-five days, and the U.S. VI Corps, consisting of four "ordinary" infantry divisions (42nd, 45th, 70th and 79th), and commanded by a tough, crafty sonofabitch named Ted Brooks, bore the brunt of the fighting, which was almost unbelievably savage. As the History of the 14th Armored Division recalls: "You heard shouting and stifled screams and the identifying brrrrp brrrrp brrrrp of (German) machine guns, the steady cracking of machine guns and small arms fire coming from the windows, crevices, and the church steeple, the deep rumble of tanks. Some tanks no longer moved, black hulks among the charred ruins of homes. White phosphorus shells burst in the streets, with sudden yellow flames and smoke pouring from half-timber buildings. Buildings that only smoked because there was nothing more to burn and made the town look like a ghost town, and still the shells came in. The mortars that never gave a warning, endlessly plopping in, scattering mortar and rubbish. There was the catching voice, crying 'Medic!' The surrounding fields no longer had a mantle of clear white snow. It was now stained with soot from powder, pockmarked with craters and soiled with blood."
Pushed all the way back to the Moder River, Brooks held the line there, stopping the Germans at Haguenau despite intense pressure -- Devers later stated that Brooks had fought "one of the great defensive actions of all time." If so, it had come at a terrifying cost: 29,000 Americans killed, wounded or captured, an average of 1,160 casualties per day. The Germans, driven forward by discipline, love of country (Alsace-Lorraine had been German from 1870 - 1918, and again from 1940 - 1944) and fear of Himmler's wrath, lost 23,000 men in the same period, and never got to hoist the swastika over Strasbourg cathedral. As for the French, their casualties were about 2,000, a figure it is impossible to contemplate without feelings of irony when one considers that the First French Army was probably better than a quarter of a million strong during the time in question. Charles de Gaulle seems to have purchased the honor of France with American blood, a fact he was to forget after the war -- if ever he acknowledged it in the first place.
It is interesting to note that Wikipedia lists the result of Operation North Wind as a "tactical Allied retreat." This is, of course, true; the Sixth Army Group was pushed back, at its furthest point, almost thirty-five miles, and the total territory lost to the Germans was between half and two-thirds as much as was yielded during the "real" Battle of the Bulge. Avoidance of the word defeat, however, is emblematic, for while one could make a very powerful argument that North Wind was not a defeat but merely a change in footwork, so to speak, it's hard to put a happy face on the results. The Germans advanced, and the Americans retreated. The Germans lost 23,000 men, but the Americans lost 29,000. The Germans failed to recapture Strasbourg, but Hitler had never said Strasbourg was the goal of the offensive in the first place, and in any event, the argument over how -- or whether -- to defend it left a bitter taste in the mouth of everyone who had a hand in it. The real legacy of Operation North Wind was the bare essence of war itself -- death and destruction without the satisfying, Hollywood-scripted climax offered by the Bulge. Judged by those lights, it's easy to understand why Nordwind has no place in the American memory. And yet I have to ask -- do battles like this, ugly, inconclusive brawls that leave landscapes desolated and cemeteries full, deserve less attention simply because they don't reinforce our view of ourselves as invincible underdogs? Doesn't the soldier who died defending a town with the unfortunate name of Bitche deserve as much praise as one who lost his life fighting for Bastogne? Hasn't the G.I. from a nameless infantry division who lost an eye, a hand or his testicles during a "tactical retreat" earned just as much right to an HBO mini-series as a paratrooper from a famous outfit?
Today is Christmas Eve, and with New Year's right around the corner perhaps it would do us all some good to reflect a moment on those men who weren't "lucky" enough to have fought in one of the battles we Americans have made a collective agreement to remember. It seems to me their blood looked no different, steaming in the snow of Alsace, than that of our genuine heroes.
Published on December 24, 2016 10:51