Roko's basilisk is a thought experiment about the potential risks involved in developing artificial intelligence. The experiment's premise is that an all-powerful artificial intelligence from the future could retroactively punish those who did not help bring about its existence, including those who merely knew about the possible development of such a being. It resembles a futurist version of Pascal's wager, in that it suggests people should weigh possible punishment versus reward and as a result accept particular singularitarian ideas or financially support their development. It is named after the member of the rationalist community LessWrong who first publicly described it, though he did not originate it or the underlying ideas.
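To make the wager-style comparison concrete, here is a toy expected-utility calculation in Python. It is a minimal sketch: the probability, punishment, and cost figures are invented for illustration and do not come from Roko's post or from LessWrong; the point is only that an astronomically large posited punishment can dominate the arithmetic even when its probability is assumed to be tiny.

# Toy Pascal's-wager-style expected-utility comparison.
# All numbers below are illustrative assumptions, not figures from Roko's post.

P_BASILISK = 1e-6          # assumed probability that the punishing AI ever exists
PUNISHMENT = -1e12         # assumed utility of being retroactively punished
COST_OF_HELPING = -1_000.0 # assumed utility cost of supporting the AI's development

def expected_utility(help_ai: bool) -> float:
    """Expected utility of each choice under the wager framing."""
    if help_ai:
        return COST_OF_HELPING          # pay the cost; punishment is avoided
    return P_BASILISK * PUNISHMENT      # risk punishment if the AI ever arises

print("help the AI:      ", expected_utility(True))   # -1000.0
print("don't help the AI:", expected_utility(False))  # -1000000.0

On these made-up numbers, "don't help" scores a thousand times worse than "help", which is exactly the structure of the wager: the conclusion is driven by the assumed magnitude of the punishment rather than by any evidence that such an AI will ever exist.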
Despite widespread incredulity,[3] this argument is taken quite seriously by some people, primarily some denizens of LessWrong.[4] While neither LessWrong nor its founder Eliezer Yudkowsky advocates the basilisk as true, they do advocate almost all of the premises that add up to it.
Roko's posited solution to this quandary is to buy a lottery ticket, because you'll win in some quantum branch.