Every few years, a storm erupts when some public figure blurts out, “America is a Christian nation!” She was once, and a majority still call themselves Christians. But our dominant culture should more accurately be called post-Christian, or anti-Christian, for the values it celebrates are the antithesis of what it used to mean to be a Christian. “I am the Lord thy God; thou shalt not have strange gods before me” was the first commandment Moses brought down from Mount Sinai. But the new culture rejects the God of the Old Testament and burns its incense at the altars of the global economy.