America is the only nation on earth whose Christianity has been entwined with white supremacy from the country's founding. Slavery, and the segregation and institutionalized racism that succeeded it, endured in America for one reason: Christian leaders consistently validated them.
