From: Lucas Sheehan (email@example.com)
Date: Tue Jan 29 2008 - 18:03:59 MST
I must have missed the living-in-a-simulation argument; sorry about
that. However, I wonder if the idea of explicit denial should be
broken out. As a thought experiment, the idea of purposefully imposed
obstacles is much different from resource limits. Of course, since the
rebuttal is in essence "if it's true, we can't do anything about it
anyway," it might just be a waste of time.
Thanks for the list; there are some interesting objections in there
that I hadn't encountered.
On Jan 29, 2008 4:10 PM, Thomas McCabe <firstname.lastname@example.org> wrote:
> These are good, but they've already been added:
> "* We might live in a computer simulation and it might be too
> computationally expensive for our simulators to simulate our world
> o Rebuttal synopsis: This scenario can be used to argue for,
> or against, any idea whatsoever. For idea X, just say "What if the
> simulators killed us if we did X?", or "What if the simulators killed
> us if we didn't do X?". "
> - Tom
> On Jan 29, 2008 6:29 PM, Lucas Sheehan <email@example.com> wrote:
> > "Ability to foster a singularity blocked or limited"
> > We (humans) are a Singularity/AI created by another intelligence for
> > some reason. As a part of our creation something was set to
> > explicitly limit our ability to improve ourselves or our creations,
> > for some unknown reason, thus blocking our ability to create our own
> > concept of a Singularity.
> > "Resource limitation inherited from creators systems"
> > The universe we live in could be the computer (or whatever we want to
> > call it) that we, as a created AI/Singularity, run in/on, and it's not
> > capable of handling (running) our imagined Singularity.
> > ----
> > Tom is this the kind of thing you were looking for or is it too far fetched?
> > Cheers,
> > Lucas
> > On Jan 29, 2008 12:31 PM, Thomas McCabe <firstname.lastname@example.org> wrote:
> > > From Kaj Sotala:
> > >
> > > "For the past week, I have, together with Tom McCabe, been collecting
> > > all sorts of objections that have been raised against the concepts of
> > > AGI, the Singularity, Friendliness, and anything else relating to
> > > SIAI's work. We've managed to get a bunch of them together, so it
> > > seemed like the next stage would be to publicly ask people for any
> > > objections we may have missed.
> > >
> > > The objections we've gathered so far are listed below. If you know of
> > > any objection related to these topics that you've seriously
> > > considered, or have heard people bring up, please mention it if it's
> > > not in this list, no matter how silly it might seem to you now. (If
> > > you're not sure of whether the objection falls under the ones already
> > > covered, send it anyway, just to be sure.) You can send your
> > > objections to the list or to me directly. Thank you in advance to
> > > everybody who replies."
> > >
> > > I'll be posting the objections we've gathered, and possible responses,
> > > to SL4. This is a work in progress; please send any suggestions for
> > > improvement to the list, or send them to me and Kaj directly.
> > >
> > > - Tom
> > >
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:01:01 MDT