Questions for Eliezer, SIAI and consciousness

From: sam kayley (thedeepervoid@btinternet.com)
Date: Tue Oct 26 2004 - 13:12:28 MDT


What probability do you give for consciousness involving strange physics?

An AI, even one designed to self-improve, has some basic assumptions that it
does not make sense for it to change, such as its goal, and probably several
more subtle ontological assumptions. Is it a design goal for SIAI to avoid
building in assumptions likely to result in nonconsensual uploading of humans,
even if evidence is found that physical effects occur in brains that do not
occur elsewhere in the observable universe?
