From: David Picon Alvarez (eleuteri@myrealbox.com)
Date: Wed Dec 14 2005 - 13:46:17 MST
From: "micah glasser" <micahglasser@gmail.com>
member. One more thing I must clarify. I believe (for a plethora of reasons)
that all rational agents will necessarily have for their goal increasing the
state of freedom as a super goal of the individual and society. If I am
correct in this (and I am) then it will not be possible to program a truly
rational agent without including achieving greater freedom (power/knowledge)
as a super goal.
I think you're entirely confused about what a rational agent is, or
perhaps are using a non-standard meaning of rationality. A rational agent
is one that maximizes expected utility, whatever its utility function may
be. A thermostat can be rational.
--David.
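
[Editor's note: to make the decision-theoretic definition above concrete,
here is a minimal sketch in Python of an agent that is "rational" only in
the sense of maximizing expected utility, using David's thermostat as the
example. The target temperature, actions, noise model, and all names are
illustrative assumptions, not anything from the original exchange.]

import random

TARGET = 20.0  # assumed target temperature for the toy thermostat

def utility(temperature):
    # Higher utility the closer the room is to the target.
    # The agent's rationality is independent of what this function values.
    return -abs(temperature - TARGET)

def sample_outcomes(current_temp, action, n=100):
    # Assumed stochastic model of the next temperature: heating tends
    # to warm the room, idling lets it cool, with a little noise.
    drift = 1.5 if action == "heat" else -0.5
    return [current_temp + drift + random.gauss(0, 0.2) for _ in range(n)]

def expected_utility(current_temp, action):
    # Average utility over sampled outcomes of taking the action.
    samples = sample_outcomes(current_temp, action)
    return sum(utility(t) for t in samples) / len(samples)

def rational_action(current_temp):
    # Rationality here is nothing more than argmax over expected utility.
    return max(("heat", "idle"), key=lambda a: expected_utility(current_temp, a))

print(rational_action(17.0))  # below target -> "heat"
print(rational_action(23.0))  # above target -> "idle"

Note that nothing in the sketch constrains what the utility function
rewards; swap in a different utility and the same argmax machinery is
equally "rational", which is the point of the thermostat example.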