Saturday, November 29, 2008

greed post singularity?

> The question of whether to allow drugs and/or wireheading is not really interesting. The fact is that cultures that prohibit overriding the evolved motivational systems that keep us alive have more children, thus explaining current attitudes.
>
> We should see similar attitudes among reproducing AIs capable of reprogramming themselves to live in simulated worlds with magic genies, or of just directly incrementing their utility registers. They will succumb to the AIs that aren't happy all the time, because those AIs hunger for computing resources.
>
> -- Matt Mahoney, matmahoney@yahoo.com

Matt,

I'm very skeptical of assuming that post-Singularity society will be dominated by a competition for resources.

This view of organismic evolution has been rightly questioned by Stephen Jay Gould, A. Lima-de-Faria and many others ... most of these folks don't doubt that competition-based selection exists, but they argue that

a) selection is mostly not about competition for the same resources, but rather about the creation of complementary niches

b) self-organizing dynamics, acting complementarily and orthogonally to the dynamics of differential reproduction, also have a huge impact

In short, the "nature, red in tooth and claw" vision of organismic evolution is in itself an idealized oversimplification which approximates reality better in some cases and worse in others ... and, it may approximate post-Singularity society even worse...

Even if one posits that "the surviving post-singularity minds will be the ones that want to survive", that tells you nothing about reproduction or the drive thereof, nor about the drive of agents to accumulate more and more resources.

How do you know some kind of "post-singularity steward" won't get put into place, with the specific job of preventing greedy agents from accumulating more and more resources -- but NOT asking for anything in return ... and not trying to accumulate excessive resources for itself ... and proceeding in this manner because **that's how it was programmed**?

I believe this is closely related to what a previous version of Eliezer Yudkowsky called the Sysop Scenario...

ben g
