The West has shaped the world we live in. Even now, with signs of a growing challenge from China, the West remains the dominant geopolitical and cultural force. Such has been the extent of Western influence that it is impossible to think of the world without it, or to imagine what the world would have been like had that influence never existed. We have come to take Western hegemony for granted.

