The Dechristianization of America
Religion, more than at any other time in history, is proving to be a divisive issue among Americans. At a time when tolerance is supposedly the aim for all, one would think that all religious beliefs would have an equal ability to thrive. This doesn't seem to be the case, according to Gary Laderman, chair of the Department of Religion at Emory University and author of an article in the Huffington Post.
Laderman wrote, "Religion in general is not diminishing its social impact, but Christianity specifically is losing its authoritative power across society. What we are witnessing today, and what has been especially visible in the past for some time now, is a process of dechristianization (not secularization)."
Christianity as a faith, not a religion, has always offered meaning, fulfillment and hope for our natural life and for life in the hereafter. What has challenged, and even begun to displace, all that Christianity offers? Laderman cites "the growing power of popular and entertainment cultures to provide what Christianity no longer can -- meaning and fulfillment, pathways for transcendence and ideals to live up to, satisfactory explanations for death and a true, revelatory sense of personal identity." If this is true, then our society is purely self-focused, satisfied with false claims and doomed to implode. Were it not for the promises of God in Scripture, I would be very fearful. Instead, as a Christian, I am doing all I can to rally others to examine what they believe and why, so they can be certain that they are basing their life - their eternal life - on truth.
I welcome your thoughtful comments here and with each post. Click on the headline title to open the comment box.