Christians have faith in God and, by and large, they believe and hold fast to the central truths of the Christian tradition. But while they have faith, they have also been formed by the larger post-Christian culture, a culture whose habits of life bear less and less resemblance to the vision of human flourishing provided by the life of Christ and the witness of scripture.