To say that the United States is a settler colony means that the land was overtaken by white Europeans over the course of several centuries in a way that differed from how most countries in Africa and Asia were conquered. The white settlers came to stay, and the native population was excluded, by definition, from the nation they built. For the new white and Christian country to take form, the indigenous population had to get out of the way.