Yet empire is not only a pejorative. It’s also a way of describing a country that, for good or bad, has outposts and colonies. In this sense, empire is not about a country’s character, but its shape. And by this definition, the United States has indisputably been an empire and remains one today.
How to Hide an Empire: A History of the Greater United States