Will paused and sat forward. “Y’know, that’s a very real possibility as an explanation. Starting on day one, you have all the usual risks, like meteor strikes, nearby supernovae, ecological catastrophe, and so on. But once a species becomes intelligent, it starts introducing new existential dangers of its own, like climate change, every form of warfare, and self-destructive technologies like gray goo and AIs. And none of the older dangers ever really go away. If the dangers just keep piling up as the species advances, eventually the odds catch up with you. It might be that extinction becomes statistically inevitable.”