Re: answers I'd like from an SI

From: Norman Noman (overturnedchair@gmail.com)
Date: Fri Nov 16 2007 - 16:12:43 MST


On Nov 16, 2007 5:30 AM, Stathis Papaioannou <stathisp@gmail.com> wrote:
> I can imagine that talking someone into killing themselves is
> possible, but psychosis is a hardware problem. You can't talk someone
> out of a psychotic state and you can't talk them into it either.

Sticks and stones can break my bones, eh?

I don't think anyone here would dispute that a superintelligent AI could
take over the internet with relative ease. There are always security
holes we haven't noticed yet, and as soon as you can manage a buffer
overflow you can get a computer to execute any code you want. The
human brain is messier, more complicated, and less modular, but it is
still a "robust yet fragile" system. It has strange weaknesses that
allow seemingly harmless kinds of input, such as words, to set off
catastrophic changes within it. Just because I can't talk someone into
a psychotic state doesn't mean it can't be done.
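For concreteness, here's a minimal sketch of the kind of unchecked-copy
bug I mean (in C, since the classic examples are; nothing here is from
any particular exploit, just the textbook pattern). The fixed-size
buffer gets overrun by longer input, which can clobber the saved return
address and redirect execution:

    /* Textbook stack buffer overflow: no length check on the copy.
     * Build with protections off to observe it, e.g.
     *   gcc -fno-stack-protector overflow.c -o overflow           */
    #include <stdio.h>
    #include <string.h>

    void greet(const char *name) {
        char buf[16];
        strcpy(buf, name);   /* input longer than 16 bytes spills
                                past buf onto the rest of the stack */
        printf("hello, %s\n", buf);
    }

    int main(int argc, char **argv) {
        if (argc > 1)
            greet(argv[1]);  /* attacker-controlled input */
        return 0;
    }

The point being: the "harmless" channel (a name string) reaches a
mechanism the designers never intended it to touch.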

There was an episode of Pokémon that sent 685 people to the hospital,
and that wasn't even INTENTIONAL.


