Fieldnotes on Allyship: Now in Print

It is a truth universally acknowledged that America is centered on the success, promotion, pleasure, and whims of white people. White people control the levers of power in America, whether economic, political, educational, occupational, religious, social, or cultural.
Published on October 10, 2020 13:30