Yet empire is not only a pejorative. It's also a way of describing a country that, for good or ill, has outposts and colonies. In this sense, empire is not about a country's character but about its shape. And by this definition, the United States has indisputably been an empire and remains one today.