From: Mike Dougherty (msd001@gmail.com)
Date: Sat Apr 29 2006 - 22:13:02 MDT
"Brain limited"? I'm guessing all the measurements done by respectable
research have been on untweaked brains. Suppose we do manage to grow a
synthetic equivalent of a human brain in a lab. Before 'it' claims
inalienable rights for itself, we will likely (ignorantly?) be performing
the kinds of experiments that would be immoral to perform on real people
(or monkeys, or rats, or what-have-you).
How much overclocking is the human wetware capable of supporting? Assuming
we have a guaranteed replacement warranty, how far can human neural hardware
be pushed before meltdown? For example, I've noticed that ginkgo does affect
my concentration. If the reason for this is nothing more than a slight
increase in oxygenated blood flow to the brain, then what other nutritional
deficits could easily be corrected to optimize brain performance? (A
rhetorical question; obviously I could google an answer with a few hours of
research.) My point is that many people neglect their own physiological
requirements for optimal performance.
Back to the home-grown brain. Assume we have an interface we can interact
with and somehow take metrics from. Can we get significant increases in
performance from a meat-based CPU by radically altering the chemicals
available to it? Is there an exploitable side effect of inducing synesthesia
for multiplexing information? Sure, the stresses would probably make a human
test subject "opt out" long before the usage was perfected or controlled,
but if we're talking about lab-grown brain blocks, who would complain? Are
the common themes of hallucinogens like DMT, LSD, etc. 'merely' a product of
conditioning, or do those themes arise from a particular compound's effect
on the observer's information-processing resources? I am not advocating
controlled-substance abuse, though this example is probably more compelling
than caffeine or increased blood sugar. I am also not advocating any actual
experiments along this line of thinking. This might be more appropriate for
the list of unwelcome paths toward an unmitigated UFAI probing the same
question I just asked.
Still, I am curious what the opinions are of much more informed thinkers
than myself. Clearly the speed of meat-based thinking is far slower than
silicon (or computronium, etc.). Will there be any redeeming use for
biological/chemical architecture once we have not so much bootstrapped as
shoehorned ourselves into new hardware? I assume that by the time we have
destructive scanning uploads, a method of growing artificial neurons would
be equally doable, so the uploaded could also be downloaded if there were
any desire to do so (e.g., upload to preserve Self, wait around in either
subjective real-time or zero-time for a new body to be
grown/harvested/produced, then download into the new and improved computing
substrate). How weird to think that a completely new body might be 'easier'
for medical science than fixing an existing body with cancer... OK, maybe
the technology to cure disease will be available - so what makes one body
more special than any other?
I apologize if you think this was simply too transhumanist. I started from
the "human-brain-limited point" and ended in new territory. I don't know
where to draw the line between thinking of a brain, my brain, general human
brains, or our global collective processing power as a brain... My guess is
that the Singularity will redefine that line anyway.
PS: I don't think it would have required much for GMail to run a search on
the word "computronium" before suggesting that it was misspelled -
especially since the first Google hit is a Wikipedia link. This should be a
'no-brainer' (pun intended) improvement to the tools the smart monkeys use
in their endeavors to make smarter tools.
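To make the idea concrete, here is a rough sketch - entirely my own
illustration, not anything GMail actually does - of a spell-checker that
only flags an out-of-dictionary word after first asking Wikipedia's public
MediaWiki API whether an article with that title exists. The toy dictionary,
function names, and fallback behavior are all assumptions made up for the
example, and it needs network access to run:

import json
import urllib.parse
import urllib.request

# Toy stand-in for a real spell-check dictionary.
BASE_DICTIONARY = {"the", "smart", "monkeys", "use", "tools"}

def wikipedia_has_page(word, timeout=5.0):
    """Return True if en.wikipedia.org has an article titled `word`."""
    query = urllib.parse.urlencode({
        "action": "query",
        "titles": word,
        "format": "json",
    })
    url = "https://en.wikipedia.org/w/api.php?" + query
    request = urllib.request.Request(
        url, headers={"User-Agent": "spellcheck-sketch/0.1 (example)"})
    try:
        with urllib.request.urlopen(request, timeout=timeout) as response:
            pages = json.load(response)["query"]["pages"]
        # The API reports a missing title with the sentinel page id "-1".
        return "-1" not in pages
    except (OSError, KeyError, ValueError):
        # On any network or parsing failure, assume the word is unknown.
        return False

def is_misspelled(word):
    """Flag a word only if it is in neither the local dictionary nor the web lookup."""
    if word.lower() in BASE_DICTIONARY:
        return False
    return not wikipedia_has_page(word)

if __name__ == "__main__":
    for candidate in ("computronium", "compewtronium"):
        verdict = "flag as misspelled" if is_misspelled(candidate) else "leave alone"
        print(candidate + ": " + verdict)

Run against "computronium", the lookup finds the Wikipedia article and the
word is left alone; a genuinely garbled token like "compewtronium" still
gets flagged, which is the whole point.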
On 4/29/06, Richard Loosemore <rpwl@lightlink.com> wrote:
>
> There is one other point that connects with this: the improvements we
> have seen so far might be approaching a human-brain-limited point
> anyway. What that means is that it doesn't matter what all those lovely
> exponential curves from previous technologies are saying, we might be
> approaching a plateau right now (AGI excepted, of course).
>