Should one person or group get to decide the goals adopted by a future superintelligence, even though there's a vast difference between the goals of Adolf Hitler, Pope Francis and Carl Sagan? Or is there some set of consensus goals that forms a good compromise for humanity as a whole?