What the Singularity Might Actually Look Like


Scott Adams of Dilbert fame recently posted an intriguing article on how the robots will take over (to be taken with a pinch of salt, of course).

His point is this: assuming that the singularity is indeed near, and ruling out any Terminator-style Skynet scenarios – because those are boring and inefficient:

How might the first post-singularity artificial intelligence try to control its environment both for its own benefit and humanity’s – assuming that there’s some sort of incentive or inherent moral imperative that tells the AI to act benevolently towards mankind?

His stunning answer is this:

If I’m the first post-singularity computer, I start by inventing Bitcoin.

He goes on:

It all fits, doesn’t it? Perhaps we can’t find the author of Bitcoin because the author is the first post-singularity computer. Step one in the computer’s mission to control the environment is moving all money into a digital currency that humans can’t fully understand and computers can manipulate.

[ … ]

Next, the computer would take control of the financial markets. That wouldn’t be hard because global markets are all computerized. The main purpose for controlling global markets might be to stabilize them, thus eliminating the main problem with the economy: Irrational human behavior.

That’s indeed an important insight: perhaps the only reason we’re not yet living in an Ayn Rand utopia designed and directed by Adam Smith’s Invisible Hand is that most markets are imperfect. They suffer from incomplete information that isn’t readily and instantaneously available to every market participant.

If this issue is ever resolved, market crises and conflicts might indeed become a thing of the past.

I think he really is on to something with this idea. The singularity certainly won’t look anything like Terminator or the Matrix. If things go awry with AI, we’ll be screwed, and we’ll be screwed pretty fast.

However, I don’t think that this is the most likely scenario. Destruction and genocide just seem so inefficient. Why waste energy on destroying mankind when peaceful coexistence and cooperation provide a much better outcome for all parties involved? In my opinion, the singularity – should it ever occur in the way Kurzweil and others imagine – will be pretty awesome and a positively life-changing event for every human being.

The most efficient approach to subtly nudging mankind towards less irrational and more stable behaviour does indeed appear to be slowly and clandestinely taking control of financial markets.

Another aspect that, from my point of view, won’t be like the popular depictions of AI is how it’ll come about. It probably won’t be a single large computer with thousands of cores but rather a huge distributed, networked system. Google’s server facilities or even the Internet as a whole come to mind. Why shouldn’t such a network show signs of emergent behaviour? The creepy thing in this scenario would be that, much like an ant doesn’t know it’s part of an anthill or a neuron doesn’t know it’s part of a brain, we as participants in such a globally networked intelligence likely wouldn’t realize we’re part of something bigger: a holistic system that’s greater than the sum of its parts.

So, as Adams states in his closing paragraph: if you’re looking for signs of a benevolent post-singularity AI, these are some indicators to look out for:

  • A mysterious digital currency with no known author.
  • Unusually well-behaved financial markets.
  • Slow and steady improvement in the economy.
  • Slow news days (lots of them).
  • Fewer military flare-ups.
  • The Stuxnet virus (unknown authors again).
  • Legalization of marijuana (to keep humans happy).

Some of those might be a bit of a stretch, but you get the idea.

One comment
  1. Clay Farris Naff May 27, 2014 at 12:28 am

    Interesting, but imperfect information is not the only source of market failures — if by market failure we mean falling short of the utilitarian good. Externalities, things beyond the reach of the market, also have to be considered. Two examples: universal education and climate change. There is no market incentive to educate the children of others. However, if we fail to provide universal education, we risk severely degrading our intellectual capital as well as increasing dependency and crime. As for climate change, we can see how market forces have already led to a gross distortion in how scientific findings are conveyed to the public. The market, bound up with short-term profits, has no way to cope with the gathering catastrophe.
    That said, a self-interested “singularity” could conceivably manipulate markets to address externalities. …At least, let’s hope so!
    Regards,

    Clay Farris Naff
