From: Keith Henson (hkhenson@rogers.com)
Date: Sat Nov 06 2004 - 09:40:49 MST
At 07:01 AM 06/11/04 +0000, David Picon Alvarez wrote:
snip
[Keith]
> > I have come to the horrid realization that wars are what happens when some
> > significant segment of a human population thinks they are facing a bleak
> > future. We evolved as gene survival machines. Now *genes* by the very
> > nature of the process are rational. But for a gene to be rational, that
> > is, to survive into the next generation, in some circumstances it must induce
> > irrational thinking and behavior in its carriers. The mechanism by which
> > this is done isn't understood yet, but the logic is
> > airtight--unfortunately. :-(
>
>Wrong. Evolution has not come to an end. Genes are only _more rational_ than
>the iteration which came before.
Using "rational" for a process like evolution is kind of a metaphor, but to
put it in more prosaic terms, a gene has done the right thing
(the rational thing) if it builds (takes part in building along with tens
of thousands of other genes) a vehicle (body) which succeeds in propagating
said gene into the following generation. It's a binary switch. Looking
into the past, the gene either did the right (rational) thing or it failed
(sometimes by just being unlucky). The genes we have with us today did the
rational thing, the right thing (by this standard) every single time.
>In addition, collective volition would
>extrapolate from the volition of the human agents, not their genes,
The problem is that the logic of gene survival builds psychological mechanisms
into our brains that, under some circumstances, shut off rational thinking. (I
bet you could *see* it in action with fMRI; I would expect to see reduced
metabolism in the prefrontal cortex.) There is more than an example a week
in the news--suicide bombers. There is a good reason from our evolutionary
past that we should have this mechanism. This mechanism is part of what
contributes to the collective volition of human agents.
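To spell out the logic I keep calling airtight: this is essentially kin
selection. Hamilton's rule (my gloss here, not something David raised) says a
gene for a behavior that costs its carrier can still spread if

  r x B > C

where C is the reproductive cost to the actor, B is the benefit to the actor's
relatives, and r is how closely they are related. By that accounting a gene can
be perfectly "rational" while building a carrier who does something fatally
irrational for himself, so long as enough copies of the gene in kin or tribe
come out ahead.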
Understand I *don't like* this conclusion but so far I have found it
inescapable.
>and
>(taking the mostly accepted assumption on this list that humans could be run
>in a different substrate than genes) it is entirely possible that part of CV
>would be getting rid of genes and genetic evolution.
That's possible. But if you are going to run humans on a different
substrate than meat bodies (something I too think can be done) then you are
going to have the same scary psychological mechanisms present--unless you
edit them out. The problem is not the substrate, but the mechanisms that
were built into human brains/minds by millions of years of evolution as
hunter-gatherers.
> > I hope you are right about this. If you are, I don't think it will be due
> > to the CV of large populations, but because the people who develop AI just
> > impose their collective ideas on what the AI should value.
>
>In which case CV wouldn't have been implemented.
So be it.
I don't care how long they have had to think about it or how much thinking
power they have; I frankly *don't want* the CV of a mess of fundamentalist,
rationally impaired suicide bombers contributing to the values of an
extremely powerful AI. Ghod knows I have had enough trouble from rationally
impaired Scientologists.
> > I suspect that Liberal-Democratic memes are a historic anomaly, caused by
> > a few centuries of tech-driven, mostly optimistic future prospects and
> > possibly the effect of large families (see Frank Sulloway).
>
>It doesn't very much matter what causes them, so long as they are deemed by
>a hyper-rational human with loads of time and knowledge to be desirable.
Good times, happy people. Desirable? Yes.
The problem (in the past anyway) was that good times led to an expansion of
the population, which inevitably led to limited resources per person, i.e.,
bad times. Unhappy people with a bleak future were a fertile substrate for
xenophobic memes . . . .
>In addition, many of the bases for Liberal-Democracy go back as far as
>Greece. Maybe conditions in the past were not such that
>Liberal-Democracy (or whatever other "progressive" political system) was
>viable, but it would be viable now. Also, the issues of scarcity
>would very much change after the singularity.
For a while, yes. But it won't last, not without fundamental changes in
human nature.
> > You should think about this a bit more. The "evolutionary detritus of our
> > past" should be discarded with *extreme* care. Most of the problems we
> > have can be solved by enough room rather than changing human nature. Of
> > course, maintaining the ratio of resources to humans might require
> > tinkering anyway.
>
>There's never enough room. Lightspeed is limited, and humans could expand at
>exponential rates. However, if what you say is true (id est, human nature
>should not be changed)
Didn't say that; just be really careful. The consequences might be dire
indeed.
>CV could come to that outcome just as well, and give
>us enough room, instead of helping us to live together in the room we have.
>Of course, within the constraints of the possible.
CV could come to the conclusion that high average happiness required
periodic population culls too.
I just don't know.
Keith Henson