But it is a truth universally acknowledged that America is centered on the success, promotion, pleasure, and whims of white people. White people control the levers of power in America, whether economic, political, educational, occupational, religious, social, or cultural.