In the late nineteenth and early twentieth centuries, a liberal Christian movement called the Social Gospel transformed American Christianity. The Social Gospel called on Christians to establish the kingdom of God on earth by reforming oppressive economic systems. Social Gospel leaders took an optimistic view of human nature, focusing less on human fallibility than on the ability of good-hearted individuals to work for social justice.
