From: Randall Randall (email@example.com)
Date: Wed Dec 31 2003 - 16:26:30 MST
On Wednesday, December 31, 2003, at 12:40 PM, Michael Anissimov wrote:
> "Combining a few issues here. I believe that strong superintelligence
> is possible. Furthermore, I believe that to argue to the contrary is
> amazingly rank anthropocentrism, and should be laughed at. Beyond
> that, I think full AI is possible. It's the combination of the two
> that's interesting."
> I agree that people who believe strong superintelligence is impossible
> are memetically distant enough from Singularity-aware thought that
> trying to avoid offending/confusing them is pointless.
This is not at all true. I think it's quite arguable that
strong superintelligence is impossible. For instance, it
might be that physical law doesn't permit complexity above
some ceiling, and that the smartest possible humans are
already very close to that limit.
This might not seem very likely to you, but unless you
can logically rule it out, the position remains arguable.
Note that such a ceiling isn't incompatible with a
technologically induced singularity (in the weak sense).
-- Randall Randall