R.P. Nettelhorst's Blog
April 2, 2013
The Equality of Women

Pastor Ralph Drollinger, who used to be a minister associated with Grace Community Church in Southern California, was quoted in 2004 as writing that a woman is scripturally and specifically “prohibited from leadership in the institutions of marriage, family and church.” Furthermore, he wrote, “She is not explicitly prohibited from leadership in government or commerce—that is, so long as she does not have children at home.” Otherwise, according to him, she is sinning. (His relationship with Grace Community Church ended in 2009.)
Many years ago I briefly taught at the Master’s College. This was during the first year after Grace Community Church took over what had formerly been called Los Angeles Baptist College. While I was teaching there, I came across a copy of a little anonymous booklet that Grace Community Church had printed and distributed called, The Role of Women. Drollinger’s comments are consistent with what that booklet argued. One particular passage from the booklet stands out to me in this regard:
The biblical pattern for raising and instructing children in God’s truths was established in Deuteronomy 6 where children are to be taught by parents “when you sit in your house and when you walk by the way and when you lie down and when you rise up.” Parents are responsible for the spiritual education of their children, and mothers who work full-time outside their homes usually lack the quality time to instruct their children adequately. Nor can the responsibility for this instruction simply be transferred to someone else. (The Role of Women, p. 10)
I’ll ignore the odd leap from “parents” to “mothers” in the passage. More significant is the author’s interpretation of the repeated use of the word “you,” the person to whom the passage in Deuteronomy 6 is directed. Since I can read biblical Hebrew, I couldn’t help laughing at the booklet’s incredibly bizarre interpretation of Deuteronomy 6. Something that neither the author nor Grace Community Church apparently knows is that the passage from which they derive their warped ideas about what women can and cannot do is not even addressed to women at all! You see, the words of Deuteronomy 6 are written exclusively to men. Hebrew has four forms of the pronoun “you” available: masculine singular, masculine plural, feminine singular, feminine plural. Guess which is used in Deuteronomy 6? Masculine singular. The role of women is simply not being addressed by this passage. Instead, it is addressing the role of men and only men.
Drollinger’s whole argument, as well as that of Grace Community Church, is based on a ludicrous and incredibly ignorant misreading of the Bible.
In fact, if the author of The Role of Women wants to argue, based on this passage, that a particular parent shouldn’t be working outside of the home if there are children in that home, then he better start tongue-lashing most of the men. They are the awful sinners since they aren’t home with their kids. They abandon them and go off to work on a regular basis. For shame!
Frankly, I have found that many of those who argue against women working outside the home believe that the world has corrupted women with worldly, sinful notions. Somehow, they think that when women want to work outside the home, get careers and the like, it is because the feminist movement has filled them with wrong-headed dreams and aspirations. Those darn feminists are the ones who’ve made women unhappy and dissatisfied with their “proper, God-given roles” in the home. They’ve decided that the world of the 1950s-era sitcom is scriptural, and they don’t like uppity women, so they try to find a scriptural justification for their bigotry and hatred.
Those who believe in the oppression of women frankly remind me of the slave holders of a different era, who complained that “If it weren’t for those durn abolitionists filling the slaves with wool-headed ideas they wouldn’t be near the trouble; getting them all riled up about liberty and equality and who knows what other gosh durn foolishness!”
The reason a woman might like a career and be dissatisfied fulfilling the role of a slave is that she is a human being, created in the image of God, with the same ideals and aspirations, hopes and fears, that fill the male half of humanity. A woman is as much a part of humanity as a man. People like Drollinger and the leaders of Grace Community Church just don’t get it (besides not even knowing how to read Hebrew and spouting nonsense as a result). Abraham Lincoln told people to ask themselves a simple question: would you care to be a slave? If the answer is no, then slavery is obviously evil. I would suggest that those men who believe women should be excluded from positions of authority, who must not work outside the home, and so on should ask themselves the same question that those who wondered about the goodness of slavery asked: would you want to be treated the way you think women should be treated? Would you care to have such restrictions imposed on you? Would you like being told what jobs you can and cannot have simply by virtue of your gender? When you think certain people, certain groups, must be restricted, must behave, must do certain things that don’t apply to you—then probably you’re wrong.
“Do to others as you would have them do to you,” said Jesus (Luke 6:31). And likewise, Jesus said that all the laws and regulations of the Bible come down to two things: to love God and love people (see Matthew 22:36-40; Romans 13:8-10; Matthew 7:12; Colossians 3:14). Grace Community Church, and those who agree with its position on the status of women, are interpreting certain biblical passages in ways that violate the rest of the Bible: both the Golden Rule and the primary commandment to love others. Here’s a clue: when your interpretation of scripture leads to a violation of “love your neighbor as yourself” and “do unto others as you would have them do unto you,” then you have misinterpreted the text. Go back and try again.

April 1, 2013
Intelligent Design

I’ve disliked the Intelligent Design concept since I first heard about it several years ago. From the theological standpoint, I believe that the theory is deeply flawed. It is simply a new version of a very old error: the God of the Gaps fallacy. To put it simply, the God of the Gaps fallacy argues that God is to be defined as mystery. Where there is mystery, there is God: if we find something in the world we don’t understand, the explanation is always the same: God did it.
This is an incredibly lazy approach to the world. When explanations for objects and events are found—as they inevitably are—the God of this fallacy shrinks. Needless to say, those caught in the grip of this fallacy fear explanations. Each time humanity’s understanding of the universe grows, another little piece of their God is chipped away. They little realize that they’re worshiping a false God who needs to disappear.
Most theologians, along with most scientists, discarded the God of the Gaps fallacy a long time ago. God is not dependent for his existence on ignorance.
I recently read something in which the author, thinking about God, wrote that “since natural laws are His, presumably He can violate them any time He feels like it.” This reflects a widespread assumption regarding omnipotence, one which I don’t think is correct.
I do not think it is accurate to say that omnipotence means God can do just anything at all. I also disagree, therefore, that a “miracle” is in any way a violation of natural law. The eighteenth-century philosopher David Hume’s comments on miracles are devastating to the traditional concept, but only if this widely accepted definition is accurate. At least since the late nineteenth century, most theologians who have thought about the issue have attacked Hume’s conclusions by dismantling his key presupposition (one that most people, unfortunately, still believe): that “miracle” means “violation of natural law.” A more precise definition is that a miracle is a “sign,” or an “intervention” by God, by which he hopes to get the viewer’s attention.
Most people would find it difficult to commit murder. Their morality constrains their behavior. Most would argue that God is moral and thus is unable to violate his moral precepts, especially given the additional assumption that God is perfect. What if we now also assume that the laws of nature are as much a part of who God is as the moral laws? What if we modify the definition of omnipotence to mean that God is capable of doing anything that is consistent with his nature? God is constrained, I would argue, by his own nature and can do nothing in violation of it; nor do I think that he can do anything that is logically absurd. God can no more make two and two be five than I can.
God, then, might no more be able to violate natural law than he is able to violate his moral law. Certainly believers in God accept that he does spectacular things, but do those spectacular things require violation of natural law? An airplane would be mighty spectacular to a person living in the Middle Ages, as would flights to the moon or computers. But none of those spectacular things are violations of natural laws. We simply know the natural laws well and can manipulate them in very creative ways. God, to put it oddly perhaps, should be seen as simply more technologically advanced than we are. And thus, in a universe where God is like this, science would be compatible with the nature of God.
Given a God constrained by his own nature, who operates in accord with his own laws and never violates them, I would expect that we could learn how the universe functions down to the smallest level. Explanations do not negate God; they illuminate Him. I thus am content with modern science. I do not feel a conflict between science and religion. I do not imagine that the unknown is God, only that it is the unknown—for now. I remain a theist who believes that God is intimately involved with his universe. I believe that God’s manipulations are no more intrusive or problematic than the manipulations of his creative creatures and differ from them perhaps in degree, but not kind. I would also point out that God made us free, and thus it is always going to be possible for us to explain Him away, precisely because we would not be free otherwise. How free are you when you are aware that your boss is watching your every move? God didn’t want us to live that way, either.
I suspect that the moral laws and natural laws are both a reflection of God’s fundamental nature and that he cannot be other than who he is. For instance, the fundamental forces of the universe (weak force, strong force, gravity, electromagnetism) must exist in a certain ratio with one another—to several decimal places—in order to have a universe capable of supporting life as we know it. We can logically posit universes where the forces are different than in our universe, but such universes would be very uncomfortable for us and incompatible with our existence. God is constrained by two and two always having to equal four. Likewise, “thou shalt not murder” is probably a necessary constraint on a properly functioning universe, too; anything else would be uncomfortable.
While God could have done and could do anything, I believe he is constrained by who he is, just as my behavior is constrained by who I am. What are the odds that I will voluntarily drive on the wrong side of the street, even though the only thing stopping me is a double yellow line painted on the asphalt? Hardly an insurmountable physical barrier. But I’m not a complete moron. And likewise, God is an intelligent being, even more than I am.

March 31, 2013
A Memory

Neil Armstrong, the first human being to set foot upon the moon, passed away the afternoon of Saturday, August 25, 2012. The flags of the nation flew at half-staff in remembrance, and there were outpourings of tributes. And yet, only half the people alive today have any memory of his first step on the moon. The rest hadn’t even been born yet. It is also noteworthy that no one born after 1935 has ever walked on the moon. The last time anyone set foot there was December, 1972. A total of 12 people walked on the lunar surface. With the death of Armstrong, only 8 of them are still alive. The youngest of those who remain is 76 years of age.
In July, 1969 I had just moved from Oklahoma to Ohio with my parents. My father was in the Air Force and he was about to leave for Viet Nam on his second tour of duty. Come Autumn, I was going to begin Junior High—what today is called Middle School. I had just become a teenager in March.
The lunar module separated from the command module in lunar orbit shortly after 1 PM Eastern Daylight Time on July 20, 1969. At 4:18 PM Eastern Daylight Time, it touched down on the surface of the moon. My memories of the event have faded with the years. I recall that I was in a department store with my mother and watched it on a television there in the store.
About six hours later, Armstrong opened the hatch on the lunar module and began his descent to the surface. He put his boot in the lunar soil at 10:56 PM.
My family was in the living room at my great aunt’s home; my mother, my father and my great aunt were watching the event on television. My great aunt had been born near the end of the nineteenth century. In her lifetime, she had seen the invention of the airplane, the advent of radio and television, the jet aircraft, the atomic bomb and now this. My father remembered when electricity first came to his parents’ farmhouse when he was a boy.
My great aunt found the whole thing barely comprehensible, so much so that she had a hard time really believing it. For my parents, it was certainly a wonder. But they’d imagined such a thing for a good portion of their lives, having grown up on movies and science fiction stories of people traveling to the moon. They were big fans of Star Trek, which had only recently ended its original television run.
What truly amazed my parents about the event was the fact that, as Neil Armstrong descended the ladder toward the lunar surface, he pulled a D-ring which activated a television camera. In all the movie versions of the moon landing that anyone had seen, in all the science fiction stories they had read, no one had imagined that people on Earth would be able to watch the first man step onto the lunar surface live on television. It is estimated that 600 million people saw that moment: one of the largest television audiences in history. That means roughly 17 per cent of the human race watched Neil Armstrong make history in real time.
I had watched the launches and followed the space program all my life, having been born the same year that the first satellite was launched into orbit. I had been fascinated with astronomy and all things space from my earliest memories. And so for me, it was not so much a wonder as it was simply the way things were supposed to be. From my perspective, Neil Armstrong’s steps seemed inevitable. That humans should fly into space and set foot on the lunar surface was exciting, but not nearly as amazing as it was for my great aunt or my parents.
Like most children my age, I imagined that the future in space would be like what I saw in the books and television shows that I consumed. I believed that the movie 2001: A Space Odyssey and the Wonderful World of Disney on Sunday nights revealed what would actually happen in my lifetime.
Sadly, of course, moon bases, vacations on Mars, and hotels in giant spinning wheels orbiting the Earth remain unrealized. And the reason those things never came to be had nothing to do with their feasibility. Instead, it all came down to the money.
Since the end of the lunar program, NASA’s budget has remained at about one half of one percent of the federal budget. If the United States had chosen to invest even half the money Americans spent just on beer between 1972 and today, the level of human activity in the solar system might well have matched or exceeded what was imagined in the movies.

March 30, 2013
How to Do Research

My undergraduate degree was in history, and one of the required courses covered the methodology of research. Our primary textbook was The Modern Researcher by Jacques Barzun and Henry F. Graff; the sixth edition paperback is rather expensive—far more expensive than it was when I got it in college.
Another book I read was Historian’s Fallacies: Toward a Logic of Historical Thought, by David Hackett Fischer. It is much more reasonably priced. It demonstrates some of the mistakes that can be made in historical research, and uses actual works by historians to illustrate the problems.
Both books, together with the class, gave me a good foundation in how to conduct research. The principles are not difficult or complicated, and once learned, seem so obvious that it’s a wonder we have to be instructed. But instruction we indeed need.
Together with a course in logic, and an understanding of both the scientific method and Occam’s Razor, an understanding of basic research methodology could go a long way toward limiting the trouble people have in figuring out whether something is true or whether the evidence is lacking. Once you get the principles inside of you, even if you don’t know what the truth really is, it becomes pretty easy to recognize lies, stupidity, and general misinformation.
I am frequently annoyed by what I read on Facebook, what I hear from politicians, or what I endure from the poorly researched articles created by journalists and pundits.
Some Basic Principles of Research
Various basic principles have become so generally established, so tried in the fires of experience, that the scholars concerned hardly ever feel the need to even mention them in print. They include the following:
1. The primary importance of facts
Priority must always be given to tangible, objective data, and to external evidence over subjective theory or speculative opinions. Facts must control theory and not vice-versa. Source material must always be scrutinized in this light.
2. The importance of primary sources
In research, one must always seek out the original source material—seek to discover the origin of a tale, legend, or incident. For instance, the story of George Washington chopping down a cherry tree originates with Parson Mason Locke Weems’ book The Life of Washington, first published in 1800 (the cherry tree anecdote itself first appears in the fifth edition of 1806). The story appears nowhere prior to Weems. He attributes it to “…an aged lady, who was a distant relative, and, when a girl, spent much of her time in the family…” He does not name her, so it is impossible to verify.
3. A positive attitude toward source material
You could summarize this briefly as: the source material is innocent until proven guilty. It is normal practice to assume the general reliability of statements in our sources, unless there is good, explicit evidence to the contrary. Unreliability, secondary origins, dishonesty of a writer, or tendentious traits — all these must be clearly proved by tangible evidence, and not merely inferred to support a theory. So, given the axe he had to grind, there is reason to have some doubts about everything that Josephus wrote: he was attempting to justify his behavior at the battle where he surrendered to the Romans, and he was attempting to explain why the Romans should not dislike the Jewish nation. It is unlikely that Josephus flat-out lies, but it seems probable that he selected his material and described it in the best possible light for his purposes. It is similar to a fictional tale about a two-man race between an American athlete and a Soviet athlete during the Cold War. The American won. Pravda reported the race as follows: “The imperialist American came in next to last, while the glorious Soviet worker came in second.” Nothing precisely inaccurate in the account, but the impression is another thing altogether.
4. The inconclusive nature of negative evidence
Negative evidence is commonly not evidence at all, and is thus usually irrelevant. If some person, event, etc. is mentioned only in documents of a later age, the absence of any directly contemporary document referring to such a person or event is not in itself a valid or sufficient ground for doubting the historicity of the person, event, etc.
It must always be remembered that the absence of evidence too often merely reflects the large gaps in our present-day knowledge of historical periods. The gaps in our knowledge of even the relatively well-documented culture of, say, Ancient Egypt are significant. Much relevant evidence still awaits discovery or decipherment, or else it has simply been lost. Although cuneiform tablets and fragments in the world’s museums are numbered in the hundreds of thousands, they are but a fraction of all that were written — perhaps ninety-nine percent are still in the ground. In the words of Cyrus Gordon, “for every mound excavated in the Near East, a hundred remain untouched.” For instance, the fact that there are no contemporary documents besides the Bible that mention Jesus is not surprising: Palestine was a minor province in the Roman Empire, and far from Rome. Jesus was not important or significant (at that time) to the Romans or anyone outside a small group of people in Palestine, most of whom would have been considered insignificant.
5. A proper approach to apparent discrepancies
The basic harmony that ultimately underlies extant records should be sought out, even despite apparent discrepancy. Throughout ancient history, our existing sources are incomplete and elliptical. We must weigh and take into account all relevant sources, and make allowance for missing or ill-interpreted factors. Finally, in speaking of error or inconsistencies, one must distinguish clearly between primary errors (mistakes committed by the original author of a work) and secondary errors (not in the original, but resulting from faulty textual transmission or the like).
6. Secondary sources should not all come from one political, economic, cultural or religious point of view.
Likewise, it is important in analyzing secondary sources to make sure that they aren’t simply quoting from one another, or all saying exactly the same thing in the same way.
An Example of Bad Research
A good illustration of a failure of proper research methodology, and of its devastating consequences, is the bestselling book The Coming Economic Earthquake by Larry Burkett, which appeared in 1991 and was published by Moody Press (that it won praise and awards demonstrates the failure of editors, fact checkers at the publisher, reviewers and readers alike to have any idea of how to do proper research). Someone gave me the book and told me it was a good book. I read it expecting that it would be good.
It wasn’t.
The factual errors in the book were so obvious and so bad that I was flabbergasted. Of course, the mistakes were obvious to me only because my undergraduate degree was in modern European history and because I had taken an introductory course in economics. The average, uninformed reader would not necessarily notice the problems that easily. The errors were not minor; they were so serious that they entirely undermined the point of the book and cast its conclusions into doubt.
Five examples (there were many more problems than this) can serve to illustrate the failure of his research. Remember, these are not matters of opinion. They are matters of plain fact.
1. p. 27:
“Their spokesman for this New Deal was an articulate aristocrat with a household family name: Roosevelt. Franklin Roosevelt was born to wealth, raised to wealth, and educated in wealth at Harvard, where he was exposed to the philosophies of Dr. John Maynard Keynes of England. Keynes, an avowed socialist, had long advocated the use of government control over banking and business to ensure prosperity for all. This philosophy was not new. Karl Marx had advocated essentially the same doctrine, only to a more radical group — the poor.”
a. John Maynard Keynes was not a socialist. According to the Encyclopaedia Britannica:
In Cambridge, to which Keynes now returned, his reputation was rather different. He was quite simply esteemed as the most brilliant student of Alfred Marshall and A.C. Pigou, the two Cambridge economists who between them had produced the authoritative explanation of how competitive markets functioned, business firms operated, and consumers spent their incomes.
Although the tone of Keynes’s major writings in the 1920s was occasionally skeptical, he did not directly challenge the conventional wisdom of the period that held laissez-faire, only slightly tempered by public policy, the best of all possible social arrangements.
(Encyclopaedia Britannica, volume 10, p. 447, 1984)
b. It is impossible that Roosevelt was influenced by Keynes at Harvard: Keynes was born on June 5, 1883, and Roosevelt on January 30, 1882. Roosevelt was older than Keynes, and they were both in college at about the same time. It seems unlikely that Roosevelt would be studying the philosophy of someone who was himself still taking classes at Cambridge, from firmly laissez-faire capitalist economics teachers — especially when you consider that Keynes had yet to develop the economic philosophy of which Burkett is so critical.
c. Keynes’ book, in which he propounded his economic theory of unemployment (which Larry Burkett terribly misrepresents and apparently doesn’t understand in the first place), was called The General Theory of Employment, Interest and Money; it appeared in England early in 1936. Roosevelt had been elected president in 1932, years before the book existed.
This is how the Encyclopaedia Britannica summarizes Keynes’ argument in his book:
The central message is readily translated into two powerful propositions. The first declared the existing theory of unemployment nonsense. In a depression, according to Keynes, there was no wage so low that it could eliminate unemployment. Accordingly, it was wicked to blame the unemployed for their plight. The second proposition proposed an alternative explanation about the origins of unemployment and depression. This centered upon aggregate demand — i.e., the total spending of consumers, business investors, and public agencies. When aggregate demand was low, sales and jobs suffered. When it was high, all was well.
From these generalities there flowed a powerful and comprehensive view of economic behaviour. Because consumers were limited in their spending by the size of their incomes, they were not the source of business cycle fluctuations. The dynamic actors were business investors and governments. In depressions the thing to do was either to enlarge private investment or to create public substitutes for private investment deficiencies. In mild economic contractions, monetary policy in the shape of easier credit and lower interest rates just might stimulate business investment and restore the aggregate demand caused by full employment. Severer contractions required as therapy the sterner remedy of deliberate public deficits either in the shape of public works or subsidies to afflicted groups.
(Encyclopaedia Britannica, volume 10, p. 448, 1984.)
Whether Keynes is right or not is a separate issue. But Burkett’s presentation of him is far from accurate, therefore rendering Burkett’s conclusions very suspect.
2. p. 72:
“It was assumed that by injecting a modest amount of new currency into the economy, only a modest amount of inflation would follow. Advocates of this plan assured the Kaiser that a modest amount of inflation would be manageable and would actually allow producers to reap more profits, thus helping to repay the Weimar Republic’s debts with cheaper currency.”
a. Germany did not have a Kaiser after World War I. How, then, could advisers have been assuring this by-then powerless man of anything? Following World War I, the Kaiser abdicated and moved to Holland, together with his family. He had no power or influence over Germany after that. Before the rise of Hitler, Germany had a popularly elected, democratic government that the Kaiser had nothing to do with, since he was in exile. In Holland. Where he died in 1941.
3. p. 165:
“This is what George Orwell described as ‘government speak’ in his novel 1984.”
Orwell called it “Newspeak” in 1984, not “government speak.”
4. p. 166:
“Then in the sixties President Nixon substituted the use of base metal coins for silver coins effectively removing all fixed asset value from U.S. currency.”
The coins were changed from silver to nickel/copper sandwiches in 1965. I’m a coin collector. I have these coins. And Nixon did not take office as president until January, 1969, four years after the deed was done. He had nothing to do with eliminating silver from the coins.
5. p. 198:
“Once the word was made public, investors outside the U.S. rushed to convert their U.S. dollars into the E.C. Eurodollar, adopted as the official world currency by virtually all members of the World Economic Council, excluding the United States of course.”
Although Burkett was describing a fictionalized account of a possible future crisis in 1999, what he described would be a remarkable trick indeed, considering what Eurodollars are, according to the Encyclopaedia Britannica:
Eurodollars, deposits of United States dollars in foreign banks obligated to pay in U.S. dollars when the deposits are withdrawn.
(Encyclopaedia Britannica, volume III, p. 998, 1984.)
What does that mean? Eurodollars are simply U.S. dollars that happen to be sitting in European banks. If Burkett’s scenario took place, I suspect the Europeans would find their Eurodollars just as worthless as the U.S. dollars — since they are the same thing.
The fundamental problem with Burkett’s book can be traced back to his research methodology, or lack thereof. First, facts were apparently not of primary importance in driving his theory. Second, he did not look at primary source materials. He didn’t even double-check the information he’d gotten. Say, in a basic reference book. Like an encyclopedia. Third, and perhaps most significant, the sources he used, according to his surprisingly short bibliography, all came from the same economic and political point of view. Fourth, his source texts (which are secondary or tertiary sources) are essentially only one source—and an unreliable one at that—because they all quoted from one another and from one author who apparently came up with the misinformation in the first place.
I wrote Larry Burkett a letter pointing out these problems. He was unconcerned. One of his associates told me that yeah, there were some typographical errors in the book, but so what? After all, the book was selling well and garnering awards.
I’ve seen similar behavior on Facebook; people will post false statements. Even after learning that their statements are false (not a matter of opinion; demonstrably, factually false, like arguing that 2 and 2 equal 73), they and their friends ignore the criticism and continue praising, sharing, and reposting the misinformation.
It puzzles me, though it no longer surprises me. If a high percentage of politicians, journalists, pundits, economists and other people in positions of power and influence are unconcerned with reality, why should I expect the percentages to be any different among any other groups of human beings?
I’ve learned that many people simply are not particularly interested in the truth. They are interested in having what they think to be the case confirmed. I’ve had people come up to me and ask me what I thought about a particular interpretation of the Bible, or theological position. When I start to show a different way of looking at things, or demonstrate that what they’ve told me simply can’t be true, nine times out of ten the reaction is not particularly positive. Instead, they quickly disengage from the conversation and walk away. Many people are not really looking for the truth; instead they are looking for a hug and confirmation. If it doesn’t fit, then they will ignore it or explain it away, or in some way justify their continued hold on illusion. Likewise, in my experience, most people aren’t interested in learning new things, nor are they willing to alter their beliefs or opinions, regardless of the data or new information–which they resist hearing or learning about and sometimes actively avoid.
People seem to approach much of reality in the same way sports fans approach their favorite teams: the umpire is always wrong, the ball was always in play, the batter was out, and their team was robbed when it loses.

March 29, 2013
How High

One day when my oldest daughter was no more than about four years old, she announced that “I’m taller than any tree that’s really short.” I suppose that’s even more true of her today, when she can drive herself and has a paid summer internship in the corporate offices of Guess in Los Angeles. Even as children we seem to be obsessed with the shortest, longest, fastest and highest. The world’s highest mountain is Mount Everest, at 29,029 feet. That mountain is the highest point to which a human being can walk, first reached on foot by Tenzing Norgay and Edmund Hillary at 11:30 AM on May 29, 1953. As of today, more than 3000 people have made that walk. If you’re healthy and in good physical shape, you can join one of the annual climbs—assuming you can afford the months off work and the cost of anywhere from 40,000 to 77,000 dollars for the tour service, not including 8,000 to 15,000 dollars for your equipment and clothing and perhaps 5,000 dollars to fly to Everest in the first place.
It is far cheaper, if all you’re interested in is getting high, to book a flight on a commercial airliner. Your airplane will cruise above the height of Mount Everest by at least 6000 feet—more than a mile—since most cross-country flights stay around 35,000 feet. The old Concorde supersonic aircraft, which ended service in 2003, usually flew at a cruising altitude of 56,000 feet, though its maximum cruise altitude was 60,039 feet.
We take such heights for granted now. But it hasn’t always been so easy to fly high. No one reached even one mile high in an airplane until late in the afternoon of Saturday, July 9, 1910, when Walter Brookins took off from Atlantic City, New Jersey. During his brief flight, he managed to reach an altitude of 6175 feet. He was notable for having been the first pilot trained by the Wright Brothers for their exhibition team. Born in 1889, he died on April 29, 1953. By 1930, the altitude record stood at 43,168 feet—more than eight miles. It was set by A. Soucek in a Wright Apache propeller-driven plane. The highest altitude a propeller-driven plane ever reached was 56,850 feet, on October 22, 1938, when Lt. Col. Mario Pezzi, an Italian Air Force pilot, flew a biplane while wearing a special electrically heated pressurized suit and an airtight helmet. That altitude record wouldn’t be broken until August 28, 1957, when Mike Randrup flew a turbojet-powered English Electric Canberra B.2 with a Scorpion rocket motor to 70,310 feet. Leroy Heath and Larry Monroe beat that in December, 1960 in a North American A-5, flying to 91,419 feet.
The current record for an aircraft was set on October 4, 2004 by Brian Binnie when his air-launched, rocket powered SpaceShipOne flew to 69.6 miles, beating the previous record set by Joseph Albert Walker in an X-15 rocket plane on August 22, 1963, when he flew to 66.9 miles.
But flying high was something that happened long before the airplane was ever invented. In fact, it happened before the United States was invented (if we assume our invention occurred when our Constitution was ratified). The first person to fly higher than a mile managed that feat in a balloon the same year that the Treaty of Paris ended the American Revolutionary War and the last British troops left New York City. On December 1, 1783, Jacques Alexandre Charles flew a hydrogen balloon to a height of 8900 feet from Paris, only ten days after the first human flight ever, by Jean-Francois Pilatre de Rozier in a hot air balloon. Benjamin Franklin was among the crowd who witnessed the event.
The record altitude for a hot air balloon is 69,850 feet, set on November 26, 2005 by Vijaypat Singhania of India. The highest balloon flight ever was 113,740 feet on May 4, 1961. That record was set by Commander Malcolm D. Ross and Lieutenant Commander Victor A. Prather, Jr. in Strato-Lab V. On that day they became the highest flying Americans ever.
But they held that record for barely one day. On May 5, 1961, astronaut Alan Shepard flew the Freedom 7 mission in a Mercury capsule launched by a Redstone Rocket. He became the first American to fly into space on a suborbital mission which took him 116 miles up.
Currently, the “highest” human beings have ever flown is about 240,000 miles. It’s a record shared by the twenty-four astronauts who flew to the moon on the Apollo missions, with the crew of Apollo 13 swinging farthest of all around the moon.

March 28, 2013
Image of God

Genesis 1:26-27 records the bare statement that Adam and Eve were made in His likeness:
Then God said, “Let us make man in our image, in our likeness, and let them rule over the fish of the sea and the birds of the air, over the livestock, over all the Earth, and over all the creatures that move along the ground.”
So God created man in his own image, in the image of God he created him; male and female he created them.
What does all that mean? The significance of humans existing in the image of God is that they therefore are very valuable and important. In Genesis 9, following the flood, God stresses the image of God as the fundamental difference between humans and animals. Animals were for food—but not people.
Perhaps the significance of the image of God is all that we need to understand. However, Christians have long wondered exactly what God might have meant when he said we were created in “His image”. The traditional answer given by most Christians runs as follows:
Men and women possess attributes of personality:
• Reason
• Creativity
• Love
• Morality
• Freedom
• Responsibility
• The ability to commune with God
The above list is all well and good, but it raises two valid questions. The first is: how does this list significantly differentiate us from the animals? Certainly there is a difference of degree between humans and animals, but the Bible strongly suggests that there is a significant difference of kind, which the above list doesn’t clarify. The second question that needs answering is likewise devastating: where in the Bible is the “image of God” ever defined as it is in the list above?
Could we define “image of God” to mean simply that we look like him? If the reader searches a concordance, he or she will find that every occurrence of the Hebrew words for “image” and “likeness” refers to a physical resemblance. In fact, “image” is often used to describe an idol. It is most logical, therefore, to conclude that the image of God in human beings is exactly what a natural understanding of the words implies: human beings were made to look like God. Moses contrasted what was normal in his society with a different reality: where in the ancient Near East it was universally the case that people made images of gods, God now taught that instead, He made people as images of himself.
Many of the early church fathers wrote that the image of God must include the physical and bodily characteristics—not just the immaterial.
But then we find passages in the Bible that speak of other forms for God, e.g., “his wings” (Psalm 17:8; 91:4; and Ruth 2:12). In Genesis 15, he appears as a smoking fire pot. As the Israelites wander the wilderness, God appears as a pillar of fire at night and a pillar of cloud by day. In Exodus he appears to Moses as a flaming bush.
One must ask whether appearances are symbolic in the same sense that language—words—are symbols of the underlying reality but are not that reality themselves. The black lines on the page spelling out “water” do not quench my thirst or wash my body; they merely symbolize the sound which symbolizes the substance that I could fill my swimming pool with.
So, when God manifests himself, is his appearance part of the symbolism that allows clear communication in other respects, such as his choice of language and vocabulary, and general adjustment to the cultural background of those he contacted? God, after all, wished to be perfectly clear to those to whom he talked. So, when we see God appearing as a biped in Genesis 18, is that appearance the one that corresponds to ultimate reality, or is it his appearance in Genesis 15 as a smoking fire pot?
Worse, if we argue that physical appearance is what constitutes the image of God, then what of human beings who are deformed, whether by birth or through some tragic accident? Do such people then lack the imago dei?
This same problem faces us if we return to the more traditional formulation of “image of God” as a list of cognitive elements, such as reason. If the traditional list is right, then what of the retarded, the autistic, the insane, those with Alzheimer’s, or even the fetus? Is their lack of—or severely damaged—reason, volition, emotions and the like indicative of their no longer having—or perhaps never having—the image of God in them? The eugenic Nazi might be happy to argue that way, but for the rest of us, it demonstrates that neither the physical nor the mental is likely a correct, or at any rate complete, understanding of the “image of God.”
And the problem only grows. If the thought of excluding some human beings from the image, whether for physical or sentient reasons, is distasteful and repugnant, what will we do with non-human sentience, whether extraterrestrial or electronic, that we may come upon or create in the future?
Obviously, this question of the “image of God” is far more complex than it may at first appear.
There is, I think, a way out, however.
In the New Testament, the Church is described as the “body” of Christ. No individual Christian is the body of Christ, but he or she is part of that body (cf. 1 Corinthians 12). Perhaps the “image of God” in man is not in individuals, but is in the species as a whole. That is, humanity is collectively the “image of God” and each individual is a part of that—an important part. Just as each individual Christian is important, serving a function, so each individual human being does the same for humanity as a whole.
That this could be the sense of what was intended becomes likely when we consider that the first nine chapters of Genesis consistently use the term “the Man” (Heb. ha-adam) to refer first to the single individual who was the first man, and then as a general term for the species as a whole. This usage continues from the creation through to the flood of Noah, when God decided to destroy “the Man.”
Likewise, assuming the image of God is expressed in humanity as a whole explains and clarifies the comment God makes in Genesis 11, when he says that “nothing will be impossible for them”–for humanity. That is, God states that if humanity imagines it, humanity can do it. Humanity’s capability is unlimited, unless God himself intervenes. Human beings can do what God can do, and in fact, that was God’s original intention. The essentially unlimited potential of humanity is the inevitable consequence of God making creatures like himself. It is fundamental to the meaning of the statement “in our image, after our likeness.”
Perhaps the plurality of humanity, then, is a reflection of the plurality of God, seen in God’s use of plural pronouns for himself at the moment of humanity’s creation. Understanding the nature of the image of God in humanity makes the lie of the serpent (Genesis 3) all the more destructive, because he implied to Eve that unless she ate from the fruit, she (and by implication) her species would forever fail to truly be “the image of God.”
The concepts in the New Testament of the church becoming the bride of Christ, of Christians being adopted into God’s family, of becoming the friends of God, and of Jesus being our brother, are simply other ways of stating that we collectively reflect God. A consequence of this is that to criticize something because “we’re playing God” may not be reasonable. It appears that humanity “playing God” was precisely God’s plan from the start. Perhaps just as Eve was a “helper fit for” or “equivalent to” Adam, so humanity is that for God.

March 27, 2013
Find a Busy Person

If you want something done, find a busy person. Why? Because a busy person is someone who can get stuff done—otherwise he or she wouldn’t be so busy. Unfortunately, I’m a busy person. When I finished writing A Year With God, something I accomplished from start to final rewrite in three months, I was tired. But the book was finished and accepted by the publisher, which meant I finally got paid the final third of my promised payment. And having a pay day meant that my wife’s plan for our master bathroom could finally be set in motion.
If I were a bestselling author on the order of J.K. Rowling, who wrote the Harry Potter books, then I would simply have found some contractors, collected bids, and then spent the next couple of weeks in a hotel while they tore my bathroom apart and put it back together. Or maybe I’d just buy a new house. Unfortunately, my books are not quite as popular as Harry Potter, so the remodel plan meant the contractor was me. Another job on top of all the other stuff I was doing. Like trying to work on my next book, A Year With Jesus, with its deadlines fast approaching. How fast? At the time, I had barely two months left before I had to have that book finished, including the final rewrites and edits. All three hundred eighty-four pages of it.
But my wife had been waiting a long time for her new bathroom. And over the previous two years I had somehow remodeled my daughters’ bathroom, remodeled my kitchen, and remodeled my living room. The master bathroom was simply the last piece in the grand plan that my wife had for our house.
This remodel began as all remodels begin: with a trip to the local hardware store so my wife could show me what she had in mind. The centerpiece of her plans was the replacement of the then current shower stall with a Jacuzzi-style bathtub. Given that the dimensions of the shower stall were the same as those of the tub she wanted, how hard could it be?
The next step, therefore, was to take out the old, to make way for the new. For this, I needed a sledgehammer. Probably the idea of taking a sledgehammer and hitting a bunch of stuff with it sounds appealing, perhaps a way to get out all your aggressions. That theoretical dream fails to take into consideration the physical reality of swinging a sledgehammer. Sledgehammers are heavy. So the fun of smashing stuff lasts, oh, maybe two or three swings. After that, it is just hard work.
On top of that, I have severe allergies and asthma. And I was doing all this during the time of year when pollen levels were at their highest. Thankfully, I had, and still have, good prescription medication, so I barely notice my allergies.
Unless I’m pounding on a wall of tile with a sledgehammer. Oddly, that created quite a bit of dust. Which led to sneezing. And a mild asthma attack. But I avoided a hospital visit.
After I worked for about eight hours on Saturday, the bathroom was at last stripped of tile and the old shower was gone. The old shower stall had a drain in the exact center. I had hoped, when I took up the shower floor, that I would discover the drain pipe running from the wall to that center. Unfortunately, I simply found concrete, except for a two-square-foot spot in the center where the drain pipe was. Given that the new tub had its drain pipe on the far right side, as most tubs do, I had a bit of a problem.
My first thought was to drill holes through the concrete slab—but given that it took me about a half hour to drill one ¾ inch hole through five inches of concrete, I decided that was not a practical solution. My next thought was that I would rent a jackhammer to smash through the concrete.
By then it was late, so I went to bed. Easter Sunday, I awoke with a headache that lasted most of the day. We spent that morning with my brother-in-law and his family at his church, an hour’s drive from my house, where his youngest son and daughter were getting baptized by my father-in-law (who was a retired pastor). Feeling tired and grumpy, all I had wanted to do was sit. But the pastor of this distant church knew me, because he’d attended several of the conferences where I’ve spoken. So he asked me to assist him that morning in serving communion.
After enjoying Easter dinner with my in-laws, I finally got home late in the afternoon and promptly took a nap for about three hours. Thankfully, my headache left me. I also came up with a solution to the plumbing for my new tub that wouldn’t require the jackhammer: I would simply build a four inch platform for the new tub and then run the pipe across the top of the concrete and down into the old drain. This worked splendidly, and the raised tub actually looks nice.
Of course, just because my wife’s plans for my house were nearly done, didn’t mean I didn’t have more remodeling to do. My wife has parents. And their house needed a bathroom remodel, their kitchen sink needed to be replaced, and they wanted me to put ceramic tiles in their dining room and kitchen.
Maybe someday I’ll have a best seller and then I can just hire someone to do all that stuff for me. Or maybe someday I’ll learn how not to be a busy person.

March 26, 2013
Dawn

Early in the morning of September 27, 2007, a Delta II blasted off from Space Launch Complex 17B at the Cape Canaveral Air Force Station. On board was Dawn, an unmanned spacecraft designed to visit two asteroids: Vesta and Ceres. Vesta is the fourth asteroid ever discovered and the second most massive of the asteroids. Ceres is both the first asteroid ever discovered as well as the largest. In fact, it is so large that it is now classified as a dwarf planet.
Weighing 2800 pounds and bearing two solar panels that stretch 65 feet from tip to tip, Dawn is powered not by chemical rockets, but by an ion propulsion system first tested on the Deep Space 1 probe, which launched in October, 1998. Deep Space 1 flew past the asteroid Braille and the comet Borrelly. Its mission was completed by the end of 2001.
On a normal spaceship, the chemical propellant (usually either liquid oxygen and liquid hydrogen, or liquid oxygen and kerosene) makes up the bulk of the weight. For example, when the space shuttle rockets into orbit, it burns more than two million pounds of solid rocket propellant in the twin solid rocket boosters within two minutes. Meanwhile, the large external tank carries more than 1.5 million pounds of liquid oxygen and hydrogen, every drop of which is used in the eight and a half minutes it takes the shuttle to climb to orbit, going from zero to 18,000 miles per hour in order to reach an altitude barely 150 miles up. It thus burns through more than 16,000 pounds of solid rocket fuel per second and nearly 3000 pounds of liquid fuel each second. The space shuttle’s twin solid rocket engines alone put out nearly six million pounds of thrust.
Dawn’s ion propulsion, in contrast, is only a tiny fraction as powerful as the chemical engines used on a shuttle. The amount of thrust put out by Dawn’s engines is the equivalent of what your hand feels holding a single sheet of 8½ by 11 inch paper. But while a chemical rocket dumps its whole load of several million pounds of propellant in just a few minutes, an ion engine can burn its propellant continuously, non-stop, for years. Dawn was sent out of Earth orbit using standard chemical propulsion—but once out in interplanetary space, the ion engines took over. After burning for 11 days non-stop, just to make sure they were functioning properly, Dawn began its long-term cruise phase on December 17, 2007. The engines then burned nearly non-stop until October 31, 2008, by which time it was on course for its gravity-assisted flyby of Mars on February 17, 2009, which sped it on its way into the asteroid belt. During those 270 days of thrusting, it used only 158 pounds of propellant. During its entire mission—flying to Vesta, orbiting Vesta, leaving Vesta and then going to Ceres and orbiting Ceres, with its engines firing continuously for years—it will use a grand total of barely 900 pounds of propellant. Contrast that with the space shuttle, which uses more than three and a half million pounds of propellant in less than nine minutes just getting to orbit, 150 miles up.
If automobiles got the sort of mileage Dawn gets, you’d fill up your car once when you bought it and your grandchildren’s grandchildren would still be driving around on that same tank of fuel. Of course, it would take you a full day to get from zero to sixty miles per hour.
What exactly is ion propulsion? The propellant is xenon, a colorless, odorless, heavy noble gas that exists in trace amounts in Earth’s atmosphere. Electricity from Dawn’s large solar panels ionizes the gas and sends its ions zooming from the nozzle of the engine. Though the ions don’t weigh much, their speed is enormous. That makes for an efficient, if low-power, thrust that builds up over time, ultimately giving the spacecraft a very high velocity.
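If you want to put rough numbers on that efficiency, the standard tool is the Tsiolkovsky rocket equation: delta-v = Isp × g0 × ln(m0/mf). Here is a minimal back-of-the-envelope sketch in Python. The specific impulse figures (about 3,100 seconds for an ion engine like Dawn’s, around 450 seconds for a good chemical engine) and the mass numbers are commonly published values I am assuming here, not figures taken from this post:

import math

G0 = 9.81  # standard gravity, m/s^2

def delta_v(isp_seconds, wet_mass_kg, dry_mass_kg):
    # Tsiolkovsky rocket equation: delta-v = Isp * g0 * ln(m0 / mf)
    return isp_seconds * G0 * math.log(wet_mass_kg / dry_mass_kg)

# Assumed figures: Dawn's launch mass was about 1,250 kg (2,800 lb),
# carrying roughly 410 kg (900 lb) of xenon propellant.
wet, xenon = 1250.0, 410.0

ion = delta_v(3100.0, wet, wet - xenon)  # ion engine, Isp ~ 3,100 s
chem = delta_v(450.0, wet, wet - xenon)  # chemical engine, Isp ~ 450 s

print(f"Ion:      about {ion/1000:.1f} km/s of total velocity change")
print(f"Chemical: about {chem/1000:.1f} km/s from the same propellant")

Run as written, the sketch gives roughly 12 km/s of velocity change for the ion system versus under 2 km/s for the chemical one, which is in the neighborhood of the total mission delta-v that has been published for Dawn. The same few hundred kilograms of propellant go nearly seven times as far.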
On July 16, 2011, Dawn settled into orbit around Vesta. Escape velocity from Vesta is only about 800 miles per hour, compared to 25,000 miles per hour for Earth. Vesta averages only about 330 miles in diameter, and its surface gravity is roughly a fortieth of Earth’s, so an astronaut standing on its surface would feel nearly weightless. Even so, Vesta is massive enough that no pitcher could hurl a fastball onto an escape trajectory, and an astronaut who jumped too hard would drift slowly back down rather than float away into interplanetary space.
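Those escape velocity figures are easy to check against the standard formula v = √(2GM/r). A short sketch along the same lines, using published mass and mean-radius values for Earth and Vesta (the constants are my assumptions, not numbers from this post):

import math

G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2

def escape_mph(mass_kg, radius_m):
    # v_esc = sqrt(2 * G * M / r), converted from m/s to miles per hour
    return math.sqrt(2 * G * mass_kg / radius_m) * 2.23694

print(f"Earth: {escape_mph(5.972e24, 6.371e6):,.0f} mph")  # roughly 25,000 mph
print(f"Vesta: {escape_mph(2.59e20, 2.63e5):,.0f} mph")    # roughly 800 mph

A 100-mile-per-hour fastball falls short of Vesta’s escape velocity by a factor of about eight.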
Vesta is not quite round, but it’s close. After spending a year orbiting, mapping and studying Vesta, Dawn fired up its engines and left orbit on September 5, 2012. Dawn is now en route to Ceres, with a scheduled arrival in February, 2015 (the same year that the New Horizons spacecraft will reach Pluto). Unlike Vesta, Ceres is actually round. It is nearly 600 miles in diameter and is thought to have an enormous amount of water ice on and in it. In fact, the evidence at this point suggests that Ceres might actually have more fresh water than Earth does, albeit not in liquid form.

March 25, 2013
English

English is currently the primary or secondary language in many countries, and in fact it is the most widely taught and understood language in the world. Although Modern Standard Chinese has more native speakers, English is used by more people as a second or foreign language. Over 400 million people speak English as their first language. Estimates about second language speakers of English put their number at around 1.5 billion. English is the dominant international language in communications, science, business, aviation, entertainment, diplomacy and the Internet. It has been one of the official languages of the United Nations since its founding in 1945 and is considered by many to be the universal language.
Those who fear that the language of some new immigrant group to the United States is going to somehow displace English are worrying about something that simply won’t happen. Their fear flies in the face of both the overwhelming dominance of the English language and the history of all previous immigrant groups in the United States. In the nineteenth century, for instance, new immigrants from Europe gathered in linguistic enclaves, established newspapers in their old European languages, established businesses and put up bilingual signs. Within three generations, all those old European languages were forgotten, and the newspapers had gone out of business or transformed themselves into English dailies. Yet far fewer people spoke English in the nineteenth century than now.
Far from being threatened, English is more likely to kill off several other living tongues over the next few generations, since the numbers and percentages of English speakers are increasing every year.
English, of course, does not sound the same everywhere it is spoken. Though one can easily converse with someone from England or Australia, every so often words or phrases pop up that make everyone scratch their heads. Despite all the contact between Australia and the United States, for instance, most Americans hearing the lyrics to the song Waltzing Matilda believe that it has something to do with dancing. Instead, a “Matilda” is the swag–the bundle of belongings–that an itinerant worker carried on his back, and so the song is actually describing the life of a vagabond.
Although my books The Bible’s Most Fascinating People and The Bible: A Reader’s Guide were published in the U.S., my primary publisher is actually Quarto, a British publisher in London. As a consequence, my editors were in London, and all my correspondence and phone calls were with people whose London accents would sound at home on the BBC. And although I rarely had any difficulty understanding them, there were occasional, momentary puzzles.
Once, while my family and I were at Disneyland, I got an email and later a voicemail from my London editor informing me that my contract (I forget for which book) was ready and that she’d “courier” it to me. My first thought was that some little guy on a bicycle would be carrying the contract to me. However, I quickly dismissed that as improbable, since last I checked there were no bike lanes across the Atlantic. So my second thought was that she meant what an American would mean by saying she’d “overnight” it, or, more commonly, “FedEx” it.
My second thought was the right one, of course. The contract arrived in a sealed plastic envelope with the name of the British company emblazoned on the outside: “Inter Continent Couriers Ltd.” Perhaps that firm is as familiar to the British as Federal Express is to us in the U.S. Once the envelope had reached our shores, it was DHL that picked it up and actually delivered it to my door.
So I signed the contract and faxed it back to her. The friends and family to whom I’ve shown the emails I get from London comment on how “British” they sound. Even written language has an “accent.” Despite the fact that the British and Americans speak the same language, it takes only a handful of sentences for us to realize that we’re not reading something written by a guy in Kansas, or, vice versa, by someone from South Kensington.

March 24, 2013
Where Does the Time Go?

During my senior year of high school I lived in Homestead, Florida, a town that described itself as the “gateway to the Everglades.” My dad was in the Air Force, and he was stationed there. What that meant in practical terms was that it was humid and we were surrounded by a swamp. Mosquitoes were common and large, though to say that they carried off small house pets would be an exaggeration.
When I am under deadline for a book, I will often work many days in a row without a break. While I was working on A Year With God, I worked a whole month with no days off, a consequence of laboring to finish the second third of it in time to meet a deadline. I had about 36 days each for the first and last thirds, but only 29 days for the second. It was tiring, but I managed it. So when I finished that second third, I finally took a weekend off and relaxed. Recalling that time–and even now, without any pressing deadlines–I have been reflecting on how busy I tend to be.
Which got me to thinking about my swampy senior year of high school. I don’t recall feeling so overwhelmingly busy back then, and I’m not entirely sure why, since I really was very busy. Admittedly, children usually do not have jobs, but they are still in school all day, five days a week. My senior year I had to get up before the sun and take a long bus ride to school. I had homework when I got home–a lot of it, since I took honors classes and still managed to graduate with all As. Despite that, I recall spending endless hours on my model railroad and my stamp and coin collections, and listening to classical music on the radio. Admittedly, I was an odd teenager. Somehow, in all of that, I also found time to write novels (I wrote my first when I was 16, and from then till now I have, for the most part, written at least 10 pages a day). I also read voraciously, and watched television not infrequently.
But I don’t recall any sense of being overworked or overtired. How did I do it?
What is my problem now, that I feel like I have no time for myself? Does it somehow relate to the sad reality of being middle-aged? Is it just that I’m simply not as spry as I used to be? Possible, I suppose, but I know some younger people who have trouble keeping up with me. In fact, my children constantly complain about how fast I walk, and they don’t like how I pick parking spots as far from the stores as possible. Walking is good for your health, I tell them, but they don’t believe me.
So I’m wondering if my problem with feeling overworked is simply an attitudinal issue. If so, I’m going to have to work on altering it and recovering my high school mindset. Perhaps I need to discover again the art of leaving my work at work rather than carrying it with me in my head all the time. When I was in high school, I was able to keep my school life and my non-school life separated. Maybe I need to do that with my work.
In a world filled with cell phones and the internet, it can be hard to go completely off the clock. There’s almost nowhere to go to get away from our labor. People can call us, text us, and email us no matter where we happen to be, day or night. When I was working on A Year With God (a bit of a misnomer, since I only spent three months writing it, including all rewrites and revisions), I regularly got requests from my editor at 11:30 PM–and she was back east, while I’m on the west coast! I’m not sure that woman ever slept.
Since I work from an office in my home, getting away from work is even harder. When do I begin work? When do I end work? And how can I tell? In the old days, I physically left my job. When I was in graduate school at UCLA, there was actually a time clock that I punched when I arrived at my job and when I left. But not anymore. Now I commute from my bedroom to my office and I find that I hardly ever leave my office until I’m back in my bedroom to sleep.
So what am I going to do? Something all of us need to do: turn off our connections to work at the end of the day. We need to shut off the office in our heads and find something else to think about and do. Eight or so hours a day is long enough to work, and we need to work only five days a week. It is good to work hard, but we can work harder and better if we know when to quit. “All work and no play makes Jack a dull boy,” so goes the old cliché. The dullness is not so much a matter of making Jack boring and uninteresting to be around as of rendering him dull of mind and dull of energy. If we don’t take time off, we might be able to impress our peers by telling them how many hours we’ve put in and how tired we are. But in the end, in cold reality, we will just make ourselves dull: accomplishing less and doing it less competently.
