From: Eliezer Yudkowsky (sentience@pobox.com)
Date: Tue Mar 10 2009 - 23:43:40 MDT
The Singularity Institute and the Future of Humanity Institute are
beta-testing a new site devoted to refining the art of human
rationality, LessWrong.com. LessWrong will be the future home of
Eliezer Yudkowsky's massive repository of essays previously written
on Overcoming Bias - with, we hope, some added features to make them
much easier to read; which is where you come in.
LessWrong is based on the open-source Reddit codebase and written in
Python. The current site is up and running, but many features remain
to be added. Volunteer developers for this open-source project would
be most welcome, especially fast coders fluent in Python with large
blocks of free time to contribute major new modules.
For more information, see:
http://wiki.github.com/tricycle/lesswrong/contributing-to-less-wrong
http://lesswrong.com/lw/1t/wanted_python_open_source_volunteers/
Please note that to prevent topic drift during the establishment of
Less Wrong, we've banned discussion of the Singularity / AGI / FAI
until May 2009 - the goal of this blog is to create an ongoing
conversation about human rationality. If you aren't much interested
in discussing human rationality per se, and just want to discuss
transhumanist projects, then you're politely requested not to comment
on Less Wrong.
But it goes without saying that SIAI and FHI have reasons for
launching this project - for example, it turns out that it's
surprisingly hard to convey our arguments to someone who doesn't
realize why you ought to maximize expected utility instead of running
out to save cute puppies. So any Python developers with large blocks
of open-source development time may indeed consider this a formal
request for help.
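To make the expected-utility point concrete, here is a minimal sketch
in Python - the probabilities and utilities are invented purely for
illustration, not taken from any real analysis - showing how
probability-weighted reasoning can favor an unglamorous action over a
vivid one:

```python
# Illustrative sketch only: the numbers below are hypothetical,
# chosen to show the arithmetic of expected-utility comparison.

def expected_utility(outcomes):
    """Return the sum of probability-weighted utilities."""
    return sum(p * u for p, u in outcomes)

# Action A: rescue the cute puppy in front of you
# (near-certain success, small payoff).
save_puppy = [(0.99, 1.0)]

# Action B: back a long-shot project with a very large payoff
# if it succeeds.
long_shot = [(0.01, 1000.0), (0.99, 0.0)]

print(expected_utility(save_puppy))  # 0.99
print(expected_utility(long_shot))   # 10.0 - higher, despite feeling less vivid
```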
Sincerely,
Eliezer Yudkowsky.
--
Eliezer Yudkowsky
Research Fellow, Singularity Institute for Artificial Intelligence