Tim Patrick's Blog, page 17

September 3, 2014

American Racism is Basically Over


A recent article in The Independent documented racial angst over a tribute to Robin Williams at this year’s Emmy Awards. During Billy Crystal’s public eulogy for the late comedian, a video showed Williams donning a makeshift hijab and blurting out, “I would like to welcome you to Iran…Help me!” The Twittersphere quickly labeled the shtick as racist, but you shouldn’t believe everything you read on the Internet.


Robin Williams’ poke at the situation in Iran might have offended some people, but it wasn’t racist. Iran is obviously not a race; it’s a country, a geopolitical region defined by sovereign states, not by DNA. Its religion is monolithic, but that too is immaterial, since it is the nation’s recent political history, and not its primary creed, that makes Williams’ humor so biting. The transformation of Iran from a somewhat open, liberal society to one where a joke about the status of women resonates with outsiders happened just a few decades ago, and was a political change, not a racial one.


“Racist” has become the go-to explanation for many of America’s current woes. From police shootings in the Midwest to conflicts in the Middle East and every gripe about President Obama’s policies in between, if there’s trouble in the news, there’s bound to be an accusation of racism soon after. It turns out that calling someone a racist is a great way to silence an opponent. But in the United States in the twenty-first century, such accusations are rarely accurate.


Racism in America is basically over. The country used to be a lot more racist, back in the Civil War and Jim Crow days, and even as recently as the wartime fiasco of Japanese internment. But it’s not like that anymore, not even close. Sure, the KKK and neo-Nazi groups still exist today, but we call them “fringe radicals” for a reason. Over the last 150 years, and especially since the Civil Rights Era of the mid-1960s, the United States has steadily distanced itself from its racist past.


That national evolution doesn’t mean that race is no longer an issue. As proved by the police shooting of Michael Brown in Ferguson, Missouri several weeks ago, and by the riots that followed, race and racial sentiments still have an impact. Does the shooting of a black youth by a white officer substantiate a disturbing trend in race relations? The TV footage says “yes,” but the actual numbers related to police shootings say, “not really.” A post-Ferguson New York Times column looked at the conventional wisdom that says “the use of deadly force by police officers unfairly targets blacks. All that is needed are the numbers to prove it. But those numbers do not exist.”


Young black men are arrested and incarcerated at rates quite disproportionate to their share of the overall population. Is that racist? Perhaps in some situations. But crime statistics by race always have titles like “Crimes by race” instead of “Crimes by race, age, sex, region, type of offense, weapon choice, family background, education level, financial situation, work history, music preference, and local political culture.” Such summary numbers always gloss over poverty and education variations, the distribution, frequency, and extenuating circumstances of different types of crimes, cultural differences between urban and rural environments, the variation in mandatory sentencing laws, the recidivism rates for different regions and demographic groups, changes in family makeup over recent decades, and the role of technology, music, and communications as they relate to criminal activities.


In other words, crime is a complicated and messy thing, and race is just one of many components. The temptation to shout “racist” whenever there is conflict between different ethnic groups ignores that messiness. The rush to stick the racist label on newsworthy events reflects America’s modern tendency to reach for simple explanations to complicated problems. Calling someone a racist is easy, and frees the accuser from doing the hard work of ferreting out the complexities of social interactions between the races, and the individual and situational nuances within a nation of more than 300 million diverse citizens.


The current President, two Supreme Court justices, and a full fifty percent of all schoolchildren in America are classified as minorities. If the White Man has been trying to keep the other races down, he’s certainly done a lousy job at it. The shooting of Michael Brown in Ferguson is troubling, but not because America is racist. The United States is, instead, one of the most ethnically diverse and non-racist places on earth, and a nation in which crying out for help from behind a veil would be a joke.


[Image Credits: Possibly the Associated Press]


August 18, 2014

The Era of Wishful Thinking


Earlier this month, President Obama authorized targeted airstrikes against the terrorist group ISIS in Iraq. While it appeared to be a reversal of his long-term policy of disengagement from that nation, the president made it clear that the operation was limited in scope, and that America was not returning to Iraq. “I ran for this office in part to end our war in Iraq and welcome our troops home, and that’s what we’ve done…. The only lasting solution is reconciliation among Iraqi communities and stronger Iraqi security forces.”


In other words, the President’s plan is to provide the equivalent of an aspirin for a broken leg, hoping for a complete recovery. Minimum input, maximum output. It’s the policy of wishful thinking.


Wishful thinking has become the action of choice for modern American politicians on both sides of the aisle. It’s not hard to understand why: good news is good! Spinning Middle East drama into its best possible outcome sounds better than ordering up fresh military recruits. Liking and keeping your healthcare policy sells better than correcting decades of regulations and laws that act as enablers for high-charging medical firms.


Americans want the happy ending, the tidy solution that resolves all loose ends. They want wishful thinking. And since politicians are people-pleasers, they give the public what it wants. The declaration of the happy goal is paramount; the details on how to get there are downplayed or rejected completely.


Wishful thinking sounds nice, but it ignores reality. Iraq is a mess, and has been since long before its modern form came about in 1916. Thoughtful people can disagree about whether the United States should once again commit its military resources to that region of the world. But we delude ourselves if we think that Iraq will meander safely toward peace while more stable countries watch from afar. It didn’t work in Cambodia in the 1970s; it hasn’t worked in Sudan over the past decade; and it won’t work in Iraq.


Ethnic and denominational relations inside Iraq and with its neighbors have been tense for a long time. Tribal conflicts that include human rights violations crop up regularly, and in the absence of an external diplomatic or military force, there is little reason to expect much change in the near future. As commander-in-chief, President Obama is free to reposition American troops. But there is no military or historical justification for coupling a complete Iraqi withdrawal with assurances that “it will all work out somehow.” It’s not honest or accurate. It’s just wishing.


Politicians engage in wishful thinking whenever they offer rosy promises of simple solutions to complex problems. Americans partake of wishful thinking every time they clamor for a slick candidate who claims to hold the key to social difficulties, or believe that the next presidential election will right all wrongs. There are times when government may need to act (or step back from prior actions) to address some societal difficulty. In general, if these actions are to have a positive impact, they will be expensive, time-consuming, controversial, partly ineffective, and injurious to the private sector or some other national concern. Any politician who tells you otherwise is trying to get reelected.


In many ways, it’s our own fault. The wealth and safety we enjoy as Americans have allowed us to entertain ourselves rather than pursue civic education, to prefer a comfortable peace instead of troublesome truths. We desire sitcom solutions to the ugly problems of life. “We will eliminate poverty in our lifetimes” sounds so much better than the reality that some people will always be poor, no matter how much time and money we throw at the problem. Helping the poor is good; believing the help will succeed in all cases is delusional.


Wishful thinking is ignorance with a happy face. Thomas Jefferson warned against such political illiteracy: “If a nation expects to be ignorant and free in a state of civilization, it expects what never was and never will be.” Every time we allow a politician to lull us into apathy with wishful thinking, we move one step farther from civilization.


[Image Credits: whitehouse.gov and disneyclips.com]


June 25, 2014

Review: Outliers


At a small private school in Seattle, a group of mothers scrounged up $3,000 to purchase computer equipment for the students. The school didn’t really have any computers before that, so the gift was a great opportunity for the kids, especially the young geeks who did what geeks tend to do when confronted with technology. It sounds like a heartfelt story you might hear anywhere in America. But this tale takes place at a school named Lakeside in 1968, and the geeks included future Microsoft founders Bill Gates and Paul Allen.


Malcolm Gladwell, in his book Outliers, uses the introduction of computers at Lakeside to demonstrate one of the key features of those who, like Gates and Allen, excel far beyond the bulk of humanity: opportunity. The countless hours that Allen and Gates devoted to their beloved mainframes (an example of the book’s “10,000-Hour Rule”) helped guide their careers. But Gladwell puts the focus instead on the mothers who purchased the computer equipment, a chance opportunity that gave Microsoft’s pioneers a key advantage. For the outliers discussed in the book, “success arises out of the steady accumulation of advantages: where and when you are born, what your parents did for a living, and what the circumstances of your upbringing were.”


Gladwell is a gifted storyteller, and he is adept at finding just the right anecdote or statistic to move his points forward. But as I read the success stories fleshed out in his text, I found myself unsure if he was drawing the correct conclusions. Bill Gates is a perfect example. He was certainly provided with great advantages, growing up in a time when computing was about to make the transition from business to personal, and coming from a family with the intellectual and financial means to put him in places of opportunity. But he was also a genius of sorts, an aspect of outlier success that Gladwell downplays. To make the anti-genius theme clearer, Gladwell relates the woeful tale of a genius named Christopher Langan who experienced one opportunity setback after another, despite his high intelligence.


Outliers looks to the “web of advantages and inheritances” that great people experience. In the book, the forces that birth outliers are external rather than innate. It was this emphasis that I found lacking. Bill Gates wasn’t the only youth in 1968 to give his every waking hour to computing. But he was one among only a tiny handful of these students to become someone on the level of, well, of Bill Gates. There was something more than opportunity, something more than heritage, that made his success possible, and Gladwell discounts it.


Despite this oversight, the book is still a great read. I found the chapter that discusses the transformation of Korean Airlines from an accident-prone company to one that has one of the best safety records in the industry simply fascinating. Gladwell ends the book with a story of his own family life, hinting not so subtly that he himself is an outlier, and that you, dear reader, might be as well.


June 11, 2014

Guantanamo Swappable Prisoner Supply Reaches Historic Low


Following a recent exchange of five Guantanamo Bay detainees for one American Army soldier held for five years by the Taliban in Afghanistan, the United States has run dangerously low on enemy combatants available for barter purposes. In the years following the terrorist attacks on the World Trade Center, the United States found itself flush with nearly 800 ready prisoners amid a shortage of hostage situations. But the numbers have fallen in recent years. With last month’s transfer, the total now hovers at just 149 human bargaining chips.


Congressional Republicans took advantage of the ominous news to lambast President Obama, calling him out for his “frivolous attacks and spend policy” of dealing with captured insurgents. “We set up the Guantanamo Bay Prisoner Trust Fund for a reason,” said House Majority Leader Eric Cantor. “You can’t just hand out these terrorists five or six at a time. It’s typical left-wing wastefulness.”


The low prison camp numbers come at an especially dangerous time in American international relations. Just days after the release of Taliban prisoner Bowe Bergdahl, North Korea announced that it had arrested an American tourist for unspecified and likely sucky reasons. “We do not negotiate with terrorists,” said departing White House Press Secretary Jay Carney. “But we can negotiate with the Qataris if Kim Jong-un wants to go through them.”


Evil and formerly evil nations like North Korea and Afghanistan may find it hard to trade in their hostages for increasingly valuable Guantanamo Bay ne’er-do-wells. With terrorist supplies at historic lows, the United States may be forced to dip into its National Domestic Prisoner Reserves for the first time since the 1970s. Fortunately, the flagging economy and a spate of high-profile GM vehicle accidents have reduced the need for license plates, “so the prisoners are available for release anytime,” said Secretary of Wardens Bill Westermont.


However, homegrown murderers and white-collar criminals might not be good enough for America’s enemies. The swap of five Taliban prisoners for one American citizen has set a de facto price, and even with an ample supply of American inmates, foreign terror organizations might not want them. Hamid Kahm-Jones, a Taliban mucky muck living on the run in eastern Afghanistan, explained the problem. “We very much appreciate the United States’ attempt to provide homegrown criminals in exchange for our kidnap victims. But let’s face it: American prisoners are soft, with their three square meals per day, their easy access to cable television—including the Food Network—and washing machines. It’s pathetic. Death to America, to Comcast, and to Maytag!”


[Image Credits: Wikimedia Commons]


Humorality


June 4, 2014

Apple Speaks a New Language


At its Worldwide Developers Conference (WWDC) earlier this week, Apple announced the latest iOS 8 and Mac OS X “Yosemite” operating systems. Apple fanboys will appreciate the attention to pixels included in this latest revision. And after all the glitz of the OS updates, the keynote speakers followed up with a new product most Apple users will never even notice: Swift. Swift is a new programming language, crafted to meet the needs of iOS and Mac OS app developers. In the vernacular of software developers: It’s about (reader.Age >= 18 ? GetNextCurseWord() : “”) time!


Anyone who is serious about creating software for Apple platforms has had to contend with Objective-C. It’s a powerful language, in much the same way that a weapons factory is powerful. You can build something really awesome, but first you have to get access to basic components and crude raw materials, engineer a complete assembly-line process with little room for error, and fill out the requisite government forms. Swift, it seems, is out to change that.


A quick glance at some sample Swift code shows it to be an amalgam of different language sources. It has curly braces and other syntactic trappings common to most C-like languages; a love for data manipulation found in JavaScript and other web-centric scripting tools; generics, inferred strong typing, extension methods, and other modern conveniences from platforms like Microsoft’s .NET Framework; and just a hint of excess baggage from Smalltalk. And the most surprising progenitor is found in the very first language Apple ever took seriously: BASIC, especially its ability to process code logic outside the context of a full application. Showing off the immediacy of the language’s always-doing-something compiler was a major selling point during the keynote demo.
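
To give a flavor of that mash-up, here is a minimal, hypothetical sketch in early Swift (the names swapValues and squared are my own inventions, and the beta-era syntax may well shift before release), showing inferred typing, a JavaScript-style trailing closure, a generic function, and an extension method:

// Inferred strong typing: no annotations; greeting is a String, year an Int.
let greeting = "Hello, WWDC"
var year = 2014

// JavaScript-flavored data manipulation via a trailing closure.
let doubled = [1, 2, 3].map { $0 * 2 }    // [2, 4, 6]

// Generics, familiar from platforms like Microsoft's .NET Framework.
func swapValues<T>(inout a: T, inout b: T) {
    let temp = a
    a = b
    b = temp
}

// An extension method grafted onto the built-in Int type.
extension Int {
    func squared() -> Int {
        return self * self
    }
}

var first = 1, second = 2
swapValues(&first, &second)    // first is now 2, second is 1
println(year.squared())        // 4056196

Typed into an Xcode 6 playground, each line’s result appears in the sidebar the moment you finish it, which is exactly the always-doing-something immediacy the keynote demo played up.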


Apple isn’t the only company to craft a new language. Late last year, Google put the finishing touches on Dart, its self-proclaimed JavaScript-killer. Microsoft, already the proud owner of two popular coding systems (C# and Visual Basic), continues work on the open-source TypeScript language, a strongly typed superset of JavaScript. And who can forget Mozilla’s own unfortunately named Rust language? Of these software newcomers, TypeScript and Swift will likely gain the most traction, since they both exist to address widespread difficulties in two very popular development languages, but in ways that leave current investments in legacy source code intact.


A prerelease edition of Swift is available right now as part of Apple’s Xcode 6 beta release. You can also download a free ebook edition of the Swift language guide.


May 28, 2014

Review: Operation Mincemeat


I’m not really into military history. Whether they’re covering the Vietnam War or the battle-of-the-week in the Middle East, such books can’t help going on and on about how these five battalions defeated these other three divisions. Or is it four squadrons? There are only so many times you can read about a general sending someone off to reconnoiter, or successfully outflanking the bad guys. I doubt I’ve ever met anyone in real life who’s flanked.


So I was a little unsure about reading Operation Mincemeat, Ben MacIntyre’s book about a secret British military plan executed smack dab in the middle of World War II. Fortunately, it’s a great wartime read, in part because it’s not really about war at all. In your typical military book, there’s page after page about how specific operations resulted in enemy deaths. That’s where MacIntyre’s book differs: the main character is dead before the operation even begins.


In January 1943, a thirty-four-year-old impoverished Welshman—in and out of homelessness, in and out of mental illness—accidentally or intentionally ingested rat poison and died. His name was Glyndwr Michael, but Adolf Hitler knew him as Major William Martin of the British Royal Navy. Between the time of Michael’s death and Martin’s impact on the Third Reich, he managed to earn a military rank, a sexy girlfriend named Pam, a clandestine submarine trip to within dead-guy floating distance of the Spanish coast, and a briefcase loaded with forged personal letters and bogus top secret military plans that proposed an Allied invasion of Greece instead of the true Axis-fortified target of Sicily. The Germans, upon having the documents virtually forced into their hands, fell for the deception and altered their plans accordingly, helping turn the tide of the war.


Sun Tzu advocated the use of deceit long ago. “All warfare,” he insisted, “is based on deception.” Here, the deceit is explained in fascinating detail. It’s not surprising that Great Britain included subterfuge in its offensive arsenal. The shocking part—something that comes out clearly in the book’s progressive reveal—is that the Germans were begging to be misled in order to confirm, despite recent military setbacks, that they were destined for a millennial victory. Even when the Allies attacked Sicily in force instead of Greece—even as Mussolini implored the Fuhrer to come to his rescue—the Germans kept ignoring Sun Tzu. Two weeks after the Sicily landing, the Italian dictator was out of power thanks to the ploy, and the Axis was on its way to sure defeat.


MacIntyre relates a tale worthy of a James Bond novel, in part because elements of the plan to use a dead body to trick the Nazis sprang from the mind of Bond author Ian Fleming. Extensively researched using declassified military files and interviews with those involved, Operation Mincemeat is hands down the most enjoyable non-military military-history book I’ve ever read.


May 14, 2014

Revolution, BASICally


This month marks the fiftieth anniversary of the BASIC computer programming language. In the early hours of May 1, 1964, Dartmouth professors John Kemeny and Thomas Kurtz activated the new language. Although only a handful of college students had access to that first trial, the vision of Kemeny and Kurtz was nothing short of revolutionary.


BASIC wasn’t the first English-like language. Fortran had been around since 1957. It wasn’t a fun language by any measure, but you didn’t have to be a rocket scientist to use it. You just had to work for one. In this way, BASIC wasn’t too different, despite its promise of being “easy to learn” and “a stepping-stone for students who may later wish to learn one of the standard languages.” You still needed access to a computer, and in the 1960s that typically required engineering smarts. Not much of a coup.


But it was. Kemeny and Kurtz didn’t usher in an era where every kid would write software. But by lowering the entry requirements for programming, they advocated for a world where anyone could control computer resources. Getting access to the first BASIC system required acceptance into the Dartmouth engineering program. Today, you only need to flop down on your sofa, grab your iPad, and start poking at it with your fingers.


To find out more about one of the most popular software development language families used by businesses today, visit the BASIC fiftieth anniversary site at Dartmouth.


[Image Credits: Geology professor Robert Reynolds, at right, and chemistry professor Roger Soderberg develop a computing component for their course work. (Photo by Adrian N. Bouchard/courtesy of Rauner Special Collections Library, Dartmouth College)]


May 7, 2014

The Stupidest College Graduates in the World



Late last week, former Secretary of State Condoleezza Rice announced that she would no longer be the guest speaker at this year’s Rutgers University commencement ceremony. The statement came after 160 Rutgers students, identified as “erratic and irresponsible” by a school administrator, turned the campus into a place where not even a controversial Bush-era government official wanted to be seen.


The protesters, in an open letter to Rutgers President Robert L. Barchi, dredged up a decade-old gripe concerning Ms. Rice’s role in the Iraq War. Specifically, they objected to her approval of waterboarding: “Rice signed off to give the CIA authority to conduct their torture tactics for gathering information from detainees as well.” They also lamented being left out of the speaker selection procedures, accusing school administrators of using an “undemocratic, opaque process” that included a “blatant refusal to pay any heed to what Rutgers University students believe and feel.”


It is sad to discover that these believing, feeling children, despite having twelve years of basic education, and then voluntarily seeking higher education at one of the nation’s top universities, can still know so little. Like all college-aged rebels going back to the 1960s, these youths tell themselves that their generation understands the world in ways never contemplated before their births. In doing so, they call down condemnation on themselves every time their methods lead to a worsening of human suffering.


Despite having a rich selection of humanities courses at their disposal, the noisemakers miss the essential lessons of history. Perhaps this lack stems from the pleasing, manicured campus, or from the East Coast comfort enjoyed by most families who are able to send their children to such a prominent school. Whatever the source, their rejection of Rice as someone who “justifies torture and debases humanity” is laughable in light of the tumult of history and human nature.


If George W. Bush and Condoleezza Rice are guilty of torture, they are certainly some of the most inept torturers in recent memory. Despite having a military-industrial complex at their disposal, all that the administration managed to do was douse three known terrorists with water. If the demonstrators were really concerned about torture, they might consider carving time out of their busy protesting schedules to decry the regular killing of innocent Muslims by Al Qaeda, or the intentional starvation of tens of thousands of North Korean citizens, or the quarter million or more Iraqis abused and killed by Saddam Hussein. That would be the Saddam Hussein removed from power by “war criminal” Condoleezza Rice.


This minuscule segment of the Rutgers student body—not to mention the faculty members who called their own credentials into question by standing in solidarity with these scholars—also shows no educated understanding of human nature. Sitting in their tranquil classrooms on verdant taxpayer-funded campuses dulls them to the harsh realities that most of the world experiences, not because our nation is an oppressor, but because life is hard, and complex, and filled with wicked rulers who take pleasure in exercising malevolent power over their ill-fed and tortured citizens.


Waterboarding may be a grievous wrong. But if these sit-in protesters can’t differentiate between that act and the massive slaughter its use sought to prevent, then the university leaders who confer advanced degrees on these uneducated kids are even stupider than their charges.


April 26, 2014

Gabriel García Márquez



When I first began working on The Well-Read Man Project back in 2010, only five authors from the set of fifty project books were still living, with a recent sixth, J. D. Salinger, having died just a few months before the idea for the project came about. In thousands of years of literary history, these relative youths still managed to achieve classic status with their writings, and were recognized for their talents during their lifetimes, and ours. That author count has dropped to four with the passing of Gabriel García Márquez.


Márquez’s literary masterpiece, One Hundred Years of Solitude, is one of the strangest books in the reading project. As with Salman Rushdie’s Midnight’s Children, Márquez’s book attempts to tell the story of a nation through a technique called “magic realism,” primarily by anthropomorphizing its history in the lives of literary characters. (Read here Rushdie’s ode to the late Colombian author.) In this case, that personification is through the entire Buendía family, a fictional madhouse of conquerors and failures, faithful lovers and prostitutes, rich and poor and poor and poor. To someone not intimately familiar with Colombia’s history, it’s a difficult book, yet never dull.


The New York Times published a full obituary following the author’s death on April 17, 2014. You can also read my mini-review of One Hundred Years of Solitude, or visit its project page.


April 7, 2014

Book Review: Still the Best Hope


America is a uniquely blessed nation. In terms of natural resources, cultural diversity, and basic liberty, the United States has experienced a short yet rich life unparalleled by any other country throughout history. Many citizens see these blessings as something to share with other peoples, making real the “City on a Hill” first mentioned in 1630 by Puritan John Winthrop, one of the earliest orators to comment on America’s influential role.


Unfortunately, there are American citizens who have a difficult time seeing these blessings, and others who, though they may recognize them, are unable to articulate why it is that we find ourselves with such benefits. To address these groups, political commentator Dennis Prager penned Still the Best Hope: Why the World Needs American Values to Triumph. Published in 2012, the book documents America’s blessings by defining its core values and comparing those values to other competing worldviews.


The majority of the book contrasts the American value system with its two modern competitors: Leftism and Islamism. Prager also mentions in passing China’s Confucian-influenced Communist system as a possible fourth major worldview, but does not discuss it in detail because it currently lacks worldwide appeal, generally limited as it is to Chinese domestic society.


For Prager, Leftism is not a tired relic of a failed Soviet Union, but a living movement that is voracious in its appetite for the control of people’s lives and hearts. It is old-time Communism, gussied up in a form palatable to American tastes. It is also a religion of sorts, with communal utopia as its heaven, material inequality as its sin, taxes as its offerings, and the state as its god. Leftism believes that all people are basically good, and that external forces—specifically the economic disparities identified by Karl Marx—make them bad and destroy their lives. To restore goodness and fairness, Leftists must—to quote Barack Obama, an exemplar of Leftism from Prager’s perspective—do the work of “fundamentally transforming the United States of America.” (Quoted from an October 2008 campaign speech in Columbia, Missouri.) To accomplish this, Leftism must displace American values with its own values, by influence and legislation if possible, by force if necessary. As a warning against this latter Marx-approved method, Prager chronicles the moral failures of Leftism throughout the Twentieth Century, including its record of nearly 100 million deaths.


Prager moves on to Islamism, defining it as “holding the belief that not only should all mankind be converted to Islam, but that all Muslim societies be governed by Sharia.” He does separate this view from the more accepted forms of Islam in general. He also says that Islamism does not require, by definition, the use of violence or terrorism. Despite these qualifications, he does include a brief overview of Muslim history, with its sword-based expansion during the Middle Ages and its scriptural support for the subjugation of non-Muslims, all as a warning about Islam’s tendency to see its worldview as something to be imposed on all nations. As with Leftism, force has been an option in the spreading of the Islamic worldview, and the author documents such incidents, both long ago and in our era.


In the final section of the book, Prager formally defines the American values system by invoking the three phrases found on all United States coins: “Liberty,” “In God We Trust,” and “E Pluribus Unum.” Liberty encompasses the freedoms of political, religious, and economic activities, assembly, speech, the press, and above all, freedom from an oppressive state. For the author, this requires a smaller government footprint, since larger governments historically have a tendency to abuse the power entrusted to them.


To reduce the temptation to anarchy that comes with great freedom, the author sees “In God We Trust” as a must. This is not a demand for a Christian nation—Prager is a Jew—but for the core of Judeo-Christian values, what he calls “ethical monotheism.” People are not good by nature, and morality must be inculcated, either by God or by the state. Prager recommends God.


Finally, he invokes “E Pluribus Unum” (“out of many, one”) as the third part of the “American Trinity” of values. As a melting pot, America promotes nationalism over racism—despite some missteps—with a rule that anyone of any background can be an equal citizen. People are accepted because they have value as people, and not because they are of a specific bloodline.


Prager is a master at breaking a worldview down into its most minute components, and then comparing those components across the spectrum of competing systems. He does this in a way that speaks to the common man instead of to Ivy League philosophy types. The book uses numerous examples—too many, actually—to bolster points, and covers all of the modern hot-button topics, including terrorism, homophobia, global warming, education, mass media, and homelessness, as well as less prominent issues such as swine flu, foul language, and anorexia.


In Still the Best Hope, the goal is liberty, a liberty that can only be realized by those who believe in an objective God, and who put shared values before race. Leftists and Islamists hate such American liberty because it acts as a brake on their forward momentum. It offers freedom of thought and choice over dogmatic religious laws and capricious power grabs by self-appointed saviors. Prager calls American values the “Best Hope” for the future, but warns readers that Leftism and Islamism will work hard to keep that from happening.
