
The Singularity Is Sci-Fi's Faith-Based Initiative

Posted by Soulskill
from the omnipotent-god-computers-will-not-run-your-life dept.
malachiorion writes: "Is machine sentience not only possible, but inevitable? Of course not. But don't tell that to devotees of the Singularity, a theory that sounds like science, but is really just science fiction repackaged as secular prophecy. I'm not simply arguing that the Singularity is stupid — people much smarter than me have covered that territory. But as part of my series of stories for Popular Science about the major myths of robotics, I try to point out the Singularity's inescapable sci-fi roots. It was popularized by a SF writer, in a paper that cites SF stories as examples of its potential impact, and, ultimately, it only makes sense when you apply copious amounts of SF handwavery. The article explains why SF has trained us to believe that artificial general intelligence (and everything that follows) is our destiny, but we shouldn't confuse an end-times fantasy with anything resembling science."
This discussion has been archived. No new comments can be posted.


  • by kylemonger (686302) on Wednesday May 28, 2014 @03:47PM (#47112185)
    ... out of hand, consider that for every other species extant on this planet the singularity already happened: It was us, humans. To think that it can't happen to us is simple hubris.
  • by StefanJ (88986) on Wednesday May 28, 2014 @03:59PM (#47112337) Homepage Journal

    In, ah, 1997, just before I moved out west, I went one last time to the campus SF convention that I'd once helped run. The GOH was Vernor Vinge. A friend and I, seeing Vinge looking kind of bored and lost at a loud cyberpunk-themed meet-the-pros party, dragged him off to the green room and BSed about the Singularity, Vinge's "Zones" setting, E.E. "Doc" Smith, and gaming for a couple of hours. This was freaking amazing! The next day, a couple more friends and I took him for Mongolian BBQ. More heady speculation and wonky BSing.

    That afternoon we'd arranged for a panel about the Singularity. One of the other panelists was Frederik Pohl. I'd suggested him because I thought his 1965 short-short story, "Day Million," was arguably the first SF to hint at the singularity. There's talk in there about asymptotic progress, and society becoming so weird it would be hard for us to comprehend.

    "Just what is this Singularity thing?" Pohl asked while waiting for the panel to begin. A friend and I gave a short explanation. He rolled his eyes. Paraphrasing: "What a load of crap. All that's going to happen is that we're going to burn out this planet, and the survivors will live to regret our waste and folly."

    Well. That was embarrassing.

    Fifteen years later, I found myself agreeing more and more with Pohl. In his fifty-plus years of writing and editing SF, and keeping a pulse on science and technology, he had seen many, many cultish futurist fads come and go, some of them touted by SF authors or editors (COUGH Dianetics COUGH psionics COUGH L-5 colonies). When spirits were high, these seemed logical and inevitable and full of answers (and good things to peg an SF story to); with time, they all paled, seeming in retrospect a bit silly, and the remaining true believers kind of odd.

  • by Anonymous Coward on Wednesday May 28, 2014 @04:06PM (#47112431)

    You are assuming that the human brain cannot be improved, or that you need machines to do it.

    What if a pill could raise your IQ to 200+ and/or give you total recall?

    Just doing that en masse would be a Singularity compared to any society that existed before it.

  • by mcgrew (92797) * on Wednesday May 28, 2014 @06:03PM (#47113785) Homepage Journal

    Wikipedia [wikipedia.org] disagrees with you, and neither the OED nor Webster's defines "technological singularity".

    The technological singularity, or simply the singularity, is a hypothetical moment in time when artificial intelligence will have progressed to the point of a greater-than-human intelligence, radically changing civilization, and perhaps human nature.[1] Because the capabilities of such an intelligence may be difficult for a human to comprehend, the technological singularity is often seen as an occurrence (akin to a gravitational singularity) beyond which the future course of human history is unpredictable or even unfathomable.[2]

    Technology has always displaced human labor. As to Wikipedia's definition, which is what this thread is about: as someone who knows how computers work, down to the schematics of the gates inside your processor (read The TTL Handbook some time), who has programmed in hand-assembled machine code, and who wrote a program on a Z-80 computer with 16K of RAM that fooled people into believing it was actually sentient, I'm calling bullshit on the first part of the definition (first put forward in 1958 by von Neumann, when my Sinclair had more power than the computers of his day).
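    [Ed.: mcgrew doesn't describe how his Z-80 program worked, but the classic trick behind chatbots of that era, going back to Weizenbaum's ELIZA, is keyword matching plus pronoun reflection. A minimal sketch in Python; the specific rules and replies here are illustrative, not from the original program:]

```python
import re

# Pronoun reflections: echo the user's words back from the program's point of view.
REFLECTIONS = {"i": "you", "me": "you", "my": "your", "am": "are",
               "you": "I", "your": "my", "are": "am"}

# Keyword-triggered reply templates; {0} is the reflected remainder of the input.
RULES = [
    (re.compile(r"i feel (.*)", re.I), "Why do you feel {0}?"),
    (re.compile(r"i am (.*)", re.I), "How long have you been {0}?"),
    (re.compile(r"because (.*)", re.I), "Is that the real reason?"),
]

def reflect(fragment: str) -> str:
    # Swap first- and second-person words so the reply reads naturally.
    words = fragment.rstrip(".!?").split()
    return " ".join(REFLECTIONS.get(w.lower(), w) for w in words)

def respond(line: str) -> str:
    for pattern, template in RULES:
        match = pattern.search(line)
        if match:
            return template.format(reflect(match.group(1)))
    return "Tell me more."  # content-free fallback keeps the conversation going

print(respond("I feel my work is pointless"))
# → Why do you feel your work is pointless?
```

    [The program understands nothing; it pattern-matches and parrots. That a few dozen such rules routinely convinced people they were talking to a mind is exactly the point of the comment above.]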

    As to the second part, it's already happened. The world today is nothing like the world was in 1964. Both civilization and "human nature" (read this [psmag.com]) have changed radically in the last fifty years. Doubtless it changed as much in the first half of the 20th century, and someone from Jefferson's time would be completely lost in today's world.

  • Homo-singularity (Score:2, Interesting)

    by TapeCutter (624760) on Wednesday May 28, 2014 @09:02PM (#47115557) Journal
    Indeed, this post most likely bounced off an Arthur C. Clarke satellite on its way to the US. The singularity idea suffers from the same problem as Lovelock's Gaia idea: it gets adopted, expanded, and contorted by spiritualism. "Gaia" is just the original name for "the biosphere". Likewise, the "singularity" is just a label for a hypothetical point in time when AI becomes "more intelligent" than its creator.

    Science is nothing if not explicit; giving something a name is the first step in understanding (and controlling) it. Language is intimately connected to human intelligence: the name tags a concept or thing, which in turn allows the human imagination to play with it. This is why quantum mechanics can only really be "understood" by those who can understand the maths; there is no everyday metaphor for the mind to grasp. Infinity and nothingness don't really fit in the human mind, but we need only look up for an example of infinity, so we have symbols for them where they occur in nature. If your mind cannot package its own concepts into a word or short phrase, they will not spread very well as "memes"; for example, try telling someone about the periodic table without using a noun to identify the table itself.

    Personally, I'm not a fan of the singularity idea. I think "smarter than a human" is a vague and largely irrelevant way to measure intelligence in an AI system; it's only useful in that we can compare the different behaviour of the two systems to learn more about both.

    The linguists are correct that the reason humans are the smartest thing raping the planet is the sophistication of our language. About 50-60 kya we acquired the ability to tell stories using words and pictures; more importantly, the stories could be recombined to form new stories and handed down the generations (education). The ability was clearly a beneficial mutation, since it spread through the global population like a dose of the flu and we immediately jumped to the top spot in the food chain. The number of "stories" we have (and have forgotten) in the last 50 kyr continues to grow exponentially without limit (homo-singularity already happened?).

    Computers are pretty good at "understanding" stories these days: systems exist that can write a pretty good HS book report on a random novel* in less than a second, and of course IBM's Watson has demonstrated that computers can do better at the open-ended domain of general knowledge than the best humans. These systems are wonderful tools, a product of the recent (last-century) explosive growth in human stories; they are a tool for creating more stories, faster, much like a space telescope is a tool for rapidly generating pictures that inform our current stories about the cosmos.

    Which gets back to the reason why I'm not a fan of the singularity. To me, "something smarter than a human" implies a level of conceptual abstraction above storytelling; if we knew what that was, it would introduce a tautology into the singularity story, i.e. we would already be "smarter than a human".

    *Novel - computers are not so good at children's stories - any linguist can explain why.

Machines certainly can solve problems, store information, correlate, and play games -- but not with pleasure. -- Leo Rosten
