Many white people like to think that racism is really just a Southern problem. Sure, there are ignorant people everywhere, and yes, occasionally, racist incidents happen in California or New York, but those are exceptions. The South was the place that wanted slavery, and the rest of the country fought to end it, right? So how bad can it be once you leave the South? The answer is: very bad.

