From: Michael Vassar (email@example.com)
Date: Fri Aug 12 2005 - 15:13:18 MDT
If SingInst is to be humanity's last hope, it must remain focused. At the
very least, it must keep its attention on items which are not the subject of
widespread attention, in other words, on items which we can realistically
hope to impact. Interpreted most broadly, that still doesn't include nukes,
though it may include technologies other than FAI if they have the key
feature of recursive self-improvement.
The odds against a meteor impact are the definition of astronomical.
>Michael Vassar <firstname.lastname@example.org> wrote:
> >I'm sorry, I must be on the wrong list. I was looking for the list
> >where people discussed the nature, impact, and design strategy for
> >survivable, recursively self-improving, generally intelligent systems.
>Sure, that is the primary focus of this list. But I don't think the SL4
>list is specifically for discussion of SL4 technologies as defined by
>Yudkowsky's essay, "Shock Level Four". It says right on the intro page that
>there are other subject matters of relevance, including extinction threats
>(nukes). If this list is to be "humanity's last hope", I think narrow
>focus upon achieving an FAI AGI is a dangerous dismissal of all other
>potentialities. Molecular Manufacturing, extinction threats or shrieks, IA
>via drugs/bio or implants, uploads, and any technologies enabling any of
>these all seem fair game.
> I posted my rambling estimates of the number of ET civilizations because
>if they could be contradicted enough to suggest aliens might live in our
>own supercluster or galaxy, even talk of little green men might be SL4. I
>almost played Guns, Germs, and Steel as SL4 material, but these are
>extinction threats we have already passed. We have to be careful that new
>technologies many of the people on this list have influence over don't
>create greater risks than would exist without them. Time travel seems SL5,
>and I'm sure there are potential technologies beyond this; these too are
>fair game if it
>can be shown there is any chance of achieving these before AGI. I would
>love to help in efforts to work with the nuts 'n bolts of AGI but apart
>from chipping in my two cents on the philosophy behind it, I'm focused on
>MM. The above fields are too large and uncharted for any one expert to hit
>all of them in depth, so a forum for measuring each against the rest is
>needed, and SL4 looks to be best for this. It
>would be a shame for AGI researchers to have a seed AGI which they expect
>will be FAI but have still been "testing" it for the last 12 months, wiped
>out by a meteorite because they haven't turned on the news in a year.
This archive was generated by hypermail 2.1.5 : Sat May 25 2013 - 04:00:58 MDT