Re: Happy Box

From: Lee Corbin (lcorbin@rawbw.com)
Date: Sat Apr 26 2008 - 00:50:21 MDT


Stuart writes

>> > We could instruct our "happy box" to make us
>> > deliriously happy while AT THE SAME TIME we do
>> > a lot of other, productive things
>>
> [Attribution missing] wrote:
>
>> Well sure you could, but would you? I'm not sure you'd even want to.
>
> Yes, I would want to. And if given the option, I would.

Certainly. Even now I sometimes pass up a comedy that I
would enjoy more, in favor of an opportunity to invest some
time in, say, enhancing my computer skills.

> It probably depends on how the "happiness box" evolves. I'm sure you
> can be ecstatically happy without turning into a drooling imbecile. It
> just depends on whether the first "happiness box" offers that option,
> and how long until one that does is developed.

If cryonics works, and if AIs have taken over, I expect to
be awakened to find a robotic-looking entity standing next
to my bed (whether I'm uploaded or not), who says, "Lee,
we know you want to evolve quickly, give old copies
of yourself run time, and so on---we have, after all, read
everything, including all the old email archives---and so
the question is: how much smarter do you want to be,
right now, while we talk? As you have gone on record,
90% of the resources allocated to you will be used for
further advancement and 10% for immediate gratification."

My answer: "How much smarter right now? Ten IQ points,
please. Tell me now, quickly, what are the other tradeoffs?"

And the AI launches into a very convincing (of course!)
commentary on the tradeoffs between being too deliriously
happy to accomplish anything and becoming smart so
fast that I turn into someone else---and all the options
in between, especially those that haven't occurred to any
of us yet.

Lee

