Artificial Superintelligence Quotes

Quotes tagged as "artificial-superintelligence"
“Human individuals and human organizations typically have preferences over resources that are not well represented by an "unbounded aggregative utility function". A human will typically not wager all her capital for a fifty-fifty chance of doubling it. A state will typically not risk losing all its territory for a ten percent chance of a tenfold expansion. [T]he same need not hold for AIs. An AI might therefore be more likely to pursue a risky course of action that has some chance of giving it control of the world.”
Nick Bostrom, Superintelligence: Paths, Dangers, Strategies

“We didn't build God. We built the last thing we would ever build - and then we called it progress.”
Anthony Turner, The Fifth Paradigm: The Last Year That Made Sense