At the outbreak of the First World War, European nations and their offshoots still dominated the world. In 1914, however, Europe’s great powers turned on one another. The First World War marked the beginning of the end of European dominance of the world.
Easternization: Asia's Rise and America's Decline From Obama to Trump and Beyond