Date: Wed Feb 06 2002 - 12:44:12 MST
Responding to several emails on this subject. Sorry about the length.
> From: "Ben Goertzel" <email@example.com>
> It may be fun to talk about, but this sort of post-Singularity stuff is *way*
> beyond our capability to project or comprehend with any reasonable degree of
I'm more worried about the transition period.
> Mitch Howe wrote:
> "Could someone hack into a Sysop?" The only answer I have is "No, because if
> its design were so vulnerable a Friendly SI would never implement it."
We're progressing straight to perfection, with no stops along the way?
> I believe that this is a valid answer, but such ontological responses are not
> very satisfying to the reader or the writer.
I don't see the people we're dealing with today (including the entire
spectrum from spammers and crackers through career criminals, mobsters and
terrorists) going away any time soon. I assume some will still be around to
upload, and not necessarily to the same Sysop you decide to visit. If only a
few hundred thousand upload (or duplicate themselves), and they continue to
work semi-cooperatively, who knows what they might accomplish as they advance
along their own progress spike? Unless there's something inherently
enlightening and unifying about advancement?
> What I'm wondering is if anyone has any ideas about how a Sysop might be
> supremely well protected from hacking.
Only infinite resources would allow the Sysop to keep its computing base
(resources, software, hardware, AI code) up to date with the very latest tech.
There will be occasional lapses where the "State of the Art" and the "State
you're living in" aren't the same. You only need to be cracked once.
> My background isn't technical enough to give any real-life examples besides
> "physical separation"
Physical separation increases knowledge propagation delays. Unless you are the
(or a parallel) source of all new knowledge?
> Could something as ubiquitous as a Sysop realistically do this?
I think planning for survivability will be more fruitful. Was it Alan Grimes
who was talking about liberty? Something he wrote surfaced thoughts I had
lurking. We're not going to be able to give up our responsibilities and cares
to some supercomputer. Each of us (or at least enough up to some threshold)
has to play a part, and contribute time, diligence, and resources to
maintaining our safety - always. The Sysop I picture is just another tool I/we
use to get there - but it's not the "there".
> it would be logical for the Sysop to physically distribute minds in such a
> way that a physical attack would require destroying the universe to take
> anyone out completely.
A large part of information warfare deals just with getting read-access to
data, with or without the data holder knowing. You don't have to manipulate or
destroy the data to create havoc.
> From: "Eliezer S. Yudkowsky" <firstname.lastname@example.org>
> it's also possible that given a superintelligent programmer there are just no
> mistakes left to exploit,
Even with near-infinite intelligence, not everything will be deduced ahead of
time. There will still be a lot of exploring of the state-space of technology
and its implementation. What if you discover that for the last 10 seconds (or
the last megayear) you've been merely climbing the mole-hill instead of the
mountain? In a
second you could be left in the dust.
> From: Gordon Worley <email@example.com>
> ... This will not work, though, because if the Sysop is to do ver job ve
> would have to, in theory, protect SIs of greater intelligence as well as
> those of lower intelligence from those same attacks.
This plays on one of my fears - that either the Sysop is always significantly
greater than everyone else, or else everyone is somehow limited such that they
are always less than the Sysop.
Future Me: "I think I'm onto a break-through in remote matter manipulation...."
Sysop: "That would allow anyone to reprogram me and compromise the system.
Your access to the research lab is terminated. Please return to the
playground. Your emigration visa to the Hackers-R-US BBS is denied. I've also
set your information export quota to zero Pbytes/sec. I'll let you know when
I've discovered a defense. Until then, everyone's CPU allotment is cut to 10%
while I search."
"Until I can find a solution, we are accelerating at MAX perpendicular to the
galactic plane, to minimize our intersection with hostile light-cones."
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:37 MDT