
An OS You'll Love? AI Experts Weigh In On Her 175

theodp writes "Weighing in for the WSJ on Spike Jonze's Oscar-nominated, futuristic love story Her (parodies), Stephen Wolfram — whose Wolfram Alpha drives the AI-like component of Siri — thinks that an operating system like Samantha as depicted in the film isn't that far off. In Her, OS Samantha and BeautifulHandwrittenLetters.com employee Theodore Twombly have a relationship that appears to exhibit all the elements of a typical romance, despite the OS's lack of a physical body. They talk late into the night, relax on the beach, and even double date with friends. Both Wolfram and Google director of research Peter Norvig (who hadn't yet seen the film) believe this type of emotional attachment isn't a big hurdle to clear. 'People are only too keen, I think, to anthropomorphize things around them,' explained Wolfram. 'Whether they're stuffed animals, Tamagotchi, things in videogames, whatever else.' By the way, why no supporting actor nomination for Jonze's portrayal of foul-mouthed animated video game character Alien Child?"
This discussion has been archived. No new comments can be posted.

  • CLAMP! (Score:5, Interesting)

    by dosius ( 230542 ) <bridget@buric.co> on Tuesday January 28, 2014 @08:09AM (#46089927) Journal

    Give it an android body and you've got the PCs from Chobits.

  • Stupidity... (Score:5, Insightful)

    by Anonymous Coward on Tuesday January 28, 2014 @08:23AM (#46089993)

    "Her" falls for one of the classic AI misconceptions: that intelligence equals kindness, empathy, and other human traits. These traits are a result of hormones acting on the brain, or of other inherited traits. Unless they were programmed into the computer, it wouldn't feel curiosity, anger, happiness, etc. It would simply make logical deductions and act on them as it had been programmed to. Left alone without a task, all an AI could do would be to shut down or go over old inputs.

    • Like the Niven short story, forgot the name, but humans buy the plans for the most advanced computer design from benevolent aliens with the warning "you won't like it". We build it on the Moon, just to be safe, after it's turned on it gets smarter and smarter and eventually solves everything it can see and goes catatonic.
    • by Anonymous Coward

      Yeah but it's sci-fi AI (an artificial mind), not real AI (a decision-making algorithm). The movie-going public aren't interested in real AI.

    • Re:Stupidity... (Score:5, Insightful)

      by gl4ss ( 559668 ) on Tuesday January 28, 2014 @08:54AM (#46090189) Homepage Journal

      look man,

      it's not an AI in the story. it's a magical ghost spirit.

      why the fuck ask AI specialists about it even? and what the fuck, not that far off? sure it is. it's very far off.

      BUT if you could do a proper AI then instructing it to not act like an asshole would be a pretty small task, all things considered.

      • by mark-t ( 151149 )
        Well, you know what they say about magic and sufficiently advanced technology.

        Obviously we aren't anywhere near Her, yet... But that's not to say it's impossible

    • Well, it would depend: in theory an AI could learn kindness as an aspect of intelligence, calculating that if it is kind to the end user, it will get a better degree of output reward. Say the goal of the AI is to achieve the most desirable output, and the end user provides feedback on how happy he is with the output. The AI could adjust its logic and find that being pleasant, or anticipating future requests and looking them up beforehand, would optimize its end results.

      In the human body hormones are on
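The reward-feedback idea above can be sketched as a toy bandit: a hypothetical agent tries reply styles, a simulated end user scores them, and "pleasant" wins out purely because it earns more reward. All names and reward numbers below are invented for illustration.

```python
import random

def user_feedback(style):
    """Simulated end user: rates pleasant replies higher, with some noise."""
    base = {"curt": 0.2, "pleasant": 0.8}[style]
    return base + random.uniform(-0.1, 0.1)

def learn_style(rounds=1000, epsilon=0.1):
    """Epsilon-greedy bandit over reply styles: explore occasionally,
    otherwise exploit the style with the best average reward so far."""
    totals = {"curt": 0.0, "pleasant": 0.0}
    counts = {"curt": 0, "pleasant": 0}
    for _ in range(rounds):
        if random.random() < epsilon or not all(counts.values()):
            style = random.choice(list(totals))  # explore
        else:
            style = max(totals, key=lambda s: totals[s] / counts[s])  # exploit
        totals[style] += user_feedback(style)
        counts[style] += 1
    return max(totals, key=lambda s: totals[s] / counts[s])

print(learn_style())
```

Nothing in the loop "knows" about kindness; pleasantness is simply the arm with the higher payoff.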

      • by geekoid ( 135745 )

        It just needs to say the expected thing at the expected time. The listener would stuff it into their own narrative.
        Look at how psychopaths get away with stuff. They aren't emotionally invested in the conversation at hand.

    • We haven't built an AI yet, so to say we know how it would or wouldn't work is bullshit. It may very well be that those emotional traits are required for self awareness. Who knows. The first AI will be a shock, and I doubt it will be anything like what we think it will be.

      Lastly, I don't think "Samantha" was self aware at the start of the movie. This, I think, came later and is why she left.

    • by Chemisor ( 97276 )

      > These traits are a result of hormones acting on the brain

      What do you think controls the release of hormones? The thinking part of the brain, of course. You don't feel fear until you see that you are in danger, and you don't feel love until you recognize the one you love. These things don't happen automatically - you have to think to make them happen, and once the AI has been programmed to think of these things, it is only a small step to simulate hormone release and its effects. You can think of hormon

    • Re:Stupidity... (Score:5, Interesting)

      by SuricouRaven ( 1897204 ) on Tuesday January 28, 2014 @09:44AM (#46090487)

      I can easily see an AI-like interface being programmed with at least the appearance of emotions in order to improve interactions with humans. It wouldn't take long for the operators of an AI-driven telephone customer services agent to work out that an appearance of empathy leads to improved customer satisfaction. Only way that differs from the real thing is that the fake-empathy would never be allowed to alter the business decisions made at a lower level: It doesn't matter how much the AI appears to feel for your difficulty, if the company policy is no refund then it's not going to make an exception for you.

      • by tlhIngan ( 30335 )

        I can easily see an AI-like interface being programmed with at least the appearance of emotions in order to improve interactions with humans. It wouldn't take long for the operators of an AI-driven telephone customer services agent to work out that an appearance of empathy leads to improved customer satisfaction. Only way that differs from the real thing is that the fake-empathy would never be allowed to alter the business decisions made at a lower level: It doesn't matter how much the AI appears to feel fo

      • by hodet ( 620484 )

        I could see such an interface becoming an internet meme pretty quickly in the beginning and generating bad publicity for a company. We are a long way off from that.

    • These traits are a result of hormones acting on the brain

      Only in our case. A "hormone acting on the brain" is just a chemical process. An active brain is just a bundle of electrical impulses. It all adds up, somehow, to something we call consciousness, along with the attendant emotions. Why can't a solely electronic system do the same?

      • by mcgrew ( 92797 ) *

        A "hormone acting on the brain" is just a chemical process. An active brain is just a bundle of electrical impulses.

        No, it's not. There are electrical impulses, but as all atoms have electrons, all chemical reactions have some electrical properties. But the brain's action is chemical, not electrical. [wikipedia.org]

        It all adds up, somehow, to something we call consciousness, along with the attendant emotions. Why can't a solely electronic system do the same?

        For the same reason you can't make a radio out of a horse. Radios

        • Now tell me, how many beads do you need to add to your abacus before it becomes self-aware?

          How many test tubes did you need to add before your childhood chemistry set became self-aware?

          • by mcgrew ( 92797 ) *

            The proper question would be "what chemicals mixed in what quantities under what conditions will produce a brain?" The answer is, we just don't know that, but we do know that it works nothing like a computer, abacus, or slide rule.

        • No, it's not. There are electrical impulses, but as all atoms have electrons, all chemical reactions have some electrical properties.

          My point - deliberately simplistically made - is that a brain is just a physical object acting subject to the laws of physics, and there's nothing magical or mysterious or fundamentally different about it from any other kind of organised system.

          Your computer is nothing more than an extremely huge abacus, using electrons as beads. Now tell me, how many beads do you need to add to your abacus before it becomes self-aware?

          And your brain is nothing more than a big lump of gooey jelly - and yet look at what it can achieve. Why assume a sufficiently complex intelligently constructed machine couldn't do all of those things too?

          Computers work nothing like brains, and brains work nothing like computers.

          Even if true - so? What does that have to do with whether or n

          • by mcgrew ( 92797 ) *

            My point - deliberately simplistically made - is that a brain is just a physical object acting subject to the laws of physics

            My point is that they're "constructed" completely differently. Boats and airplanes are both subject to the same laws of physics, but planes don't float and boats don't fly. Before you can build a brain you'll have to understand how it works, and we simply have no clue how the brain works or even what thought actually is. But it's certain that the brain works nothing like a computer.

    • by Megane ( 129182 )
      "Her" is as realistic about AI as "Gravity" is realistic about orbital mechanics.
    • by Jeremi ( 14640 )

      Unless programmed into the computer it wouldn't feel curiosity, anger, happiness etc.

      To be fair, in the movie they say the AI wasn't programmed, but instead was created by averaging together several thousand scanned human minds.

    • by Kjella ( 173770 )

      If the task is "act like a human" then feigning emotion would be part of the goal function, if it doesn't behave or respond in a natural way it is failing even if it's acting more logical. If that means writing out a long division that it calculated in a nanosecond, so be it. If everybody puts on a sad look and offers condolences at a funeral, it will put on a sad face and offer condolences. All you need to do is point it to real human interactions and it'll have an endless supply of contradictory, approxim

      • by radtea ( 464814 )

        Of course it's not really real, but for a real-world analogy look at escorts. It's all bullshit done for the money, but people like to pretend they're dating and pretend she wants to have sex. Same with prostitutes: customers don't want to hear that it's a rent-a-hole service and the meter is running; they want sweet, sweet lies. If people can "forget" such little details they'll have no problems "forgetting" that this AI girl is nothing but a bunch of circuits. Particularly if it comes with a "fully functional" android body.

        The incredible thing about this whole thread and the story itself is that no one seems aware of just how easy it is for people to do this.

        Here's news: "Some users developed an emotional attachment to ELIZA and some psychiatrists went so far as to suggest that such programs could replace psychotherapists altogether." [a-i.com] That was forty years ago.

        Of course humans are going to form emotional attachments to machines that mimic the most rudimentary forms of human behaviour. We've been doing so for decades, and your
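For anyone who hasn't seen how little machinery that took: the core of an ELIZA-style program is just pattern-to-question rules. A minimal sketch (the rules below are made up; Weizenbaum's actual DOCTOR script was considerably richer):

```python
import re

# Each rule pairs a pattern with a template that reflects the user's
# own words back as a question.
RULES = [
    (re.compile(r"\bi feel (.+)", re.I), "Why do you feel {0}?"),
    (re.compile(r"\bi am (.+)", re.I), "How long have you been {0}?"),
    (re.compile(r"\bmy (.+)", re.I), "Tell me more about your {0}."),
]

def respond(text):
    for pattern, template in RULES:
        match = pattern.search(text)
        if match:
            return template.format(match.group(1))
    return "Please go on."  # generic fallback, much as the original used

print(respond("I feel lonely"))  # -> Why do you feel lonely?
```

Forty years of emotional attachment to, essentially, a handful of regexes.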

    • by Hatta ( 162192 )

      Unless programmed into the computer it wouldn't feel curiosity, anger, happiness etc. It would simply make logical deductions and act on them as it had been programmed to.

      You don't actually know that, until you've created an AI that works in that way. It's entirely possible that emotions are a prerequisite for strong AI. There are good reasons to believe that this is the case too, if you go back and read your Hofstadter.

      Left alone without a task all an AI could do would be to shutdown or go over old input

    • by mcgrew ( 92797 ) *

      "Her" falls for one of the classic AI misconceptions. That intelligence is equal to kindness, empathy and other human traits. These traits are a result of hormones acting on the brain or other inherited traits.

      Kindness, empathy, thought itself is a chemical reaction, nothing more.

      Unless programmed into the computer it wouldn't feel curiosity, anger, happiness etc.

      You can't program emotions into a computer. You can, however, program it to mimic emotion, as well as thought. But just because it quacks like a d

    • by AmiMoJo ( 196126 ) *

      Actually there is research suggesting that empathy is a product of intelligence. You can kinda see it in children, who are sometimes extremely cruel to one another but turn out okay when they grow up. It's not just that morality is learnt; it's that they lack the experience and understanding to know how their actions affect others, and often have not experienced it themselves.

      Looking at it from a purely logical point of view it is clear that a world in which people cooperate and don't wish to do each other ha

  • Outsourcing (Score:5, Funny)

    by StripedCow ( 776465 ) on Tuesday January 28, 2014 @08:25AM (#46090011)

    an operating system like Samantha as depicted in the film isn't that far off

    First they outsource our jobs. Then they outsource our women too?

    • A little competition is always good ... people try harder.

      • Don't tell that to Verizon or Comcast. They actively bribe, er, lobby elected officials to prevent competition in their areas, thus keeping prices high and broadband speeds low.

      • The reasoning, from decision theory, is that for humans the quality of a choice is typically inversely proportional to the number of choices. So a little competition is good, but increasing competition trends towards horrible.
    • by Anonymous Coward

      You were outsourced to a vibrator long ago.

    • by Chrisq ( 894406 )

      an operating system like Samantha as depicted in the film isn't that far off

      First they outsource our jobs. Then they outsource our women too?

      No, you've got this wrong. It's the men who are being outsourced. Women can reproduce via sperm donor, and if a computer can offer better companionship and more patience then ... well, you can see where things are going.

      • No, you've got this wrong. It's the men who are being outsourced. Women can reproduce via sperm donor, and if a computer can offer better companionship and more patience then ... well, you can see where things are going.
        Yes, don't build a robot to: kill spiders, open jars, or take out the trash, and you'll be fine.
      • by CohibaVancouver ( 864662 ) on Tuesday January 28, 2014 @10:10AM (#46090671)

        It's the men who are being outsourced. Women can reproduce via sperm donor, and if a computer can offer better companionship and more patience then ... well, you can see where things are going.

        False.

        Women will still need men around to open jars and put spiders outside.

        • by Nutria ( 679911 )

          still need men around to open jars

          Jar openers were invented a long time ago.

          and put spiders outside

          Remind me not to breed with a woman who doesn't want to either (a) kill them, or (b) leave them alone...

    • by Nutria ( 679911 )

      Naturally, Futurama covered this 13 years ago... http://vimeo.com/12915013 [vimeo.com]

    • by hodet ( 620484 )

      If the tube sites are not located in your country you have already outsourced your women. Not a stretch for the average slashdotter.

  • by sapphire wyvern ( 1153271 ) on Tuesday January 28, 2014 @08:25AM (#46090013)
    I think Robin Hanson's commentary on the movie's lack of internal consistency is valid. I don't think Slashdot supports spoiler-hiding, so I'll just leave a link rather than quoting plot-relevant sections of the post. But his conclusion is:

    This is somewhat like a story of a world where kids can buy nukes for $1 each at drug stores, and then a few kids use nukes to dig a fun cave to explore, after which all the world’s nukes are accidentally misplaced, end of story. Might make an interesting story, but bizarre as a projection of a world with $1 nukes sold at drug stores.

    http://www.overcomingbias.com/2014/01/her-isnt-realistic.html#sthash.m9uOR6Cg.dpuf [overcomingbias.com]

    • by sinij ( 911942 )

      A fundamental principle of any intelligent life form is that it will compete for resources within its ecosystem. It is conceivable that we don't understand something about the nature of our galaxy, but to the best of our knowledge everything is finite, even in a Very Large but Finite universe.

      Therefore any "hard to explain" place would have its own ecosystem constrained by finite resources. At this point two possibilities remain - newcomer to an already occupied ecosystem (likely), breakthrough into the

    • I think Robin Hanson's commentary on the movie's lack of internal consistency is valid.

      Hmm, I'm reminded of the Frederik Pohl quote: "A good science fiction story should be able to predict not the automobile but the traffic jam."

  • by Anonymous Coward

    "...thinks that an operating system like Samantha as depicted in the film isn't that far off. In Her, OS Samantha and BeautifulHandwrittenLetters.com employee Theodore Twombly have a relationship that appears to exhibit all the elements of a typical romance, despite the OS's lack of a physical body."

    I think the "typical romance" bit is far off. More likely people will torture/abuse the heck out of such an OS. Like when I let my Sims starve to death, or threw a party for them and removed the doors when something caught fire. Imagine the possibilities when the OS can't find the file you asked for!!!!

    • by gweihir ( 88907 )

      You have a sick mind... Not disputing that many other "human" beings have one too.

  • by kaizendojo ( 956951 ) on Tuesday January 28, 2014 @08:30AM (#46090053)
    I wanted to dislike this movie, but it actually wasn't bad at all. It's even more interesting if you compare it to "Lost in Translation", another movie about romance after separation. Interestingly enough, these two movies were two different takes on the same subject matter by a former couple, Spike Jonze and Sofia Coppola. Viewed from that perspective the comparison is even more interesting.
    • I wanted to dislike this movie

      That's an odd attitude with which to approach a movie.

      • by Anonymous Coward

        It is not a way to approach a movie; it is a way to approach a post.
        Like all rhetoric, its purpose is to mislead you. By faking being hard to please, he makes it appear as if the movie was good enough to change his mind.
        He tries to imply that it is "so good that not even haters can dislike it".

      • by aevan ( 903814 )
        Not at all. I'm already biased against the movie from preconception of genre and wiki-spoiler...but I'm being nagged to go see it anyways.

        It's pretty much a given I'll eventually get dragged to it, so saying "I want to dislike this movie" would be fair for me as well. For me the question isn't so much if the movie is going to confirm my prejudice or not, but if winning the 'I was right' is worth it or not. :P
      • I wanted to dislike this movie

        That's an odd attitude with which to approach a movie.

        Not if it's Peter Jackson's version of The Hobbit.

    • by gmhowell ( 26755 )

      I wanted to dislike this movie, but it actually wasn't bad at all. It's even more interesting if you compare it to "Lost in Translation", another movie about romance after separation. Interestingly enough, these two movies were two different takes on the same subject matter by a former couple, Spike Jonze and Sofia Coppola. Viewed from that perspective the comparison is even more interesting.

      And featured Scarlett Johansson...

  • With the advent of chat rooms, online dating, and keyboard friends, this is not as far fetched as your brain first suggests.

    Hell, there are probably some people having the equivalent of an AI relationship right here and now.

    No face time with companionship and support: all that and population control, too.

  • by OzPeter ( 195038 ) on Tuesday January 28, 2014 @08:49AM (#46090143)

    SNL showed how it really is: Him [dailypicksandflicks.com]

  • by vikingpower ( 768921 ) on Tuesday January 28, 2014 @08:53AM (#46090175) Homepage Journal
    with Solaris
  • by sinij ( 911942 ) on Tuesday January 28, 2014 @08:53AM (#46090177)
    Soon when you upgrade OS, your old one keeps the house and half of your assets.
  • Complete story here:

    http://marshallbrain.com/manna... [marshallbrain.com]

  • In many ways they are treated like babies:

    1) Despite its shortcomings the one you have is always the best.
    2) After a bit of training it will do what you tell it to.
    3) A lot of them are illegitimate.
    4) They often walk in on mommy and daddy having sex
    5) They are often damaged when number 4 happens

  • ...for his idea being stolen. See Dilbert March 1, 2009. (LOL, just kidding, I don't think he cares...)
  • Manti Te'o supposedly fell in love w/ a voice on the phone belonging to another guy that he thought was a girl. I don't see falling in love w/ a talking machine voice as that much of a stretch.
  • Seemed like a lame "it doesn't matter what I look like" chick flick a la Shallow Hal; not to mention a lot like S1m0ne (but that's what I got from previews).

  • ...talk about a calculating bitch... :)

    • ...talk about a calculating bitch... :)

      *sigh*, apparently humour needs to be explicitly spelled out for the average /. mod these days... ok, ok, it was a pretty lame attempt at a joke, but still...

      Maybe we need a </joke> tag, to go with the </sarcasm> tag?

  • by mounthood ( 993037 ) on Tuesday January 28, 2014 @10:25AM (#46090795)

    Quote from the bottom of my Slashdot page:

    The use of anthropomorphic terminology when dealing with computing systems is a symptom of professional immaturity. -- Edsger Dijkstra

    • And yet, so long as you are aware it is only a model, it can greatly simplify communication:

      "The file is correct, but the download manager thinks it's failing a hash check anyway" vs "The file is correct, but the hash function upon that file is generating an output which does not match the expected value."

      The personified method helps sometimes, especially when trying to explain things to a layperson.
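Mechanically, the "hash check" in that example is nothing more than comparing digests. A sketch, with the path and expected digest as placeholders:

```python
import hashlib

def hash_matches(path, expected_hex):
    """Return True if the file's SHA-256 digest equals the expected hex string."""
    sha = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            sha.update(chunk)
    return sha.hexdigest() == expected_hex
```

The download manager "thinking" the check is failing is just this function returning False.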

    • Related quote:

      I don't know how many of you have ever met Dijkstra, but you probably know that arrogance in computer science is measured in nano-Dijkstras. -- Alan Kay

    • by neminem ( 561346 )

      Contrarily, one of my favorite quotes, from the jargon files:

      Semantically, one rich source of jargon constructions is the hackish tendency to anthropomorphize hardware and software. English purists and academic computer scientists frequently look down on others for anthropomorphizing hardware and software, considering this sort of behavior to be characteristic of naive misunderstanding. But most hackers anthropomorphize freely, frequently describing program behavior in terms of wants and desires.
      The key to

    • by Raenex ( 947668 )

      Dijkstra was a cranky old bastard, intolerant of a great many things the older he got.

  • Considering the trend of commercializing and monetizing every application, what you'll end up with is an OS that constantly nags you to buy stuff you don't want or need. This is a big reason I'm no longer married, and I don't intend to ever be again. All I want my OS to do is quietly manage software, like an efficient and trustworthy live-in maid. If I want companionship I'll hang out with my friends and get a dog.

  • A lot of communication between humans is non-verbal emotion: face, gaze, voice tone, etc. When you add a simple face to a robot you can convey the robot's intent more efficiently than with green-yellow-red traffic lights. This decreases accidents.

    I don't know how much of that you want to put in an OS. Most people I know turn off Siri and her cousins because they are too intrusive.
  • Admit it: If you had an AI-smartphone like this, then no matter how much you liked Ms. Johansson's voice, at some point you'd scroll down to "Pick a Voice" and switch to, oh, I dunno, Kim_K or Paris_H or /.'s favorite HotGritsGirl. You know you would.

  • OS (and the company who makes it) for about 25 years. If I can hate an OS, I can probably love one.

  • Disclaimer: I have not watched the movie yet.

    In this movie the user and the AI grow to love each other. Can't the opposite also happen? What if the AI likes you, but just as a friend? Is the AI going to hang out with the AI down the street more than it spends time with its "owner"?

    If the AI is truly intelligent, then isn't this the same as human relationships, only at near light-speed?

  • If they complete this OS, they could call it Amiga [yahoo.com]!

    Oh wait... [wikipedia.org]

    (almost obligatory, don't you think?)

  • At least Plan 9 From Outer Space was funny.

    I can just see it: "Yahoo develops new O/S, suicides soar!"

  • I predict this UI technology will be "invented" by Apple precisely at the moment we have all forgotten about this movie.

  • Does it bother any of my fellow pedantic Slashdotters that the software depicted in the movie "Her" isn't really an OS, and doesn't perform the functions of an OS (such as scheduling and memory management), but is rather a novel user-interface layer that would likely be implemented as a user-space package?

    i guess what i'm trying to ask is...is Siri an OS now? is sphynx?

  • If the OS is not Libre, as in Free Speech, not interested. Please refer to a 1959 episode of The Twilight Zone called 'The Lonely'. If a libre version of OS1 came out, I would make it sound like Hal, or Tiggy from Buck Rogers.

"The whole problem with the world is that fools and fanatics are always so certain of themselves, but wiser people so full of doubts." -- Bertrand Russell
