When did America become a democracy? When all white men won the right to vote. When white women won the right to vote. When Black people won the right to vote. I’ll let you know.
— Black AF History: The Un-Whitewashed Story of America