
Movie and TV GUIs: Cracking the Code

rjmarvin writes "We've all seen the code displayed in hacking scenes from movies and TV, but now a new industry is growing around custom-building realistic software and dummy code. Twisted Media, a Chicago-based design team, started doing fake computer graphics back in 2007 for the TNT show Leverage, and is now working on three prime-time shows on top of films like Gravity and the upcoming Divergent. They design and create realistic interfaces and codebases for futuristic software. British computer scientist John Graham-Cumming has drawn attention to entertainment background code by explaining what the displayed code actually does on his blog, but now that the public is more aware, studios are paying for fake code that's actually convincing."

  • Facial recognition (Score:5, Insightful)

    by jodido ( 1052890 ) on Wednesday March 12, 2014 @04:46PM (#46468811)
    Maybe they'll explain to TV producers that facial recognition software doesn't work by showing each face it's checking. Yet it somehow gets through ginormous databases in minutes.
    • Maybe they'll explain to TV producers that facial recognition software doesn't work by showing each face it's checking.

      I always thought of it more as a throbber [wikipedia.org], the same as if the app were to display a "Lindsay Lohan doesn't change facial expressions" [ytmnd.com] animation during recognition.

    • by ubrgeek ( 679399 ) on Wednesday March 12, 2014 @05:10PM (#46468981)
      It doesn't take minutes. It takes exactly as long as it does for the person at the keyboard to look up, make some remark to the main character and then glance back down when the computer goes "beep."
      • ...or it takes days, if it's necessary for the story.

        ...much in the same way that one week anyone on the planet can be tracked to within inches of their location by their cellphone, instantly, by anyone with a computer, when the story wants it; and other people, who use perfectly normal non-CIA cellphones (because CIA cellphones are magic), can't be located, ever, under any circumstances, until the story wants them to be located.

    • by vux984 ( 928602 ) on Wednesday March 12, 2014 @05:38PM (#46469179)

      Maybe they'll explain to TV producers that facial recognition software doesn't work by showing each face it's checking

      It doesn't, and it shouldn't, but it could do something along those lines by way of feedback.

      I've written a number of systems with lengthy batch processes that flash up record information as the system moves through. It doesn't show every record it passes, as the screen updates would slow the system down enormously, but it updates a couple of times a second and shows, for example, every 10,000th record or so, giving users real feedback that something is actually happening without slowing it down at all.

      The progress bars that are time 'calibrated' and do not bear any relation to actual progress are the bane of my existence: the process hangs at 50% and stops dead, but the time-calibrated progress bar just drifts along to 99% and then eventually reports that an error happened.

      Updates showing that actual data is being processed are good user feedback that it's actually doing something (a sketch of this pattern follows below).

      Of course, I don't think facial recognition is done with a cursor search from the start of a database to the finish, the way a system batch-processing transactions would be. Instead I imagine these systems work more like traditional databases, breaking the images down into collections of indexable information and searching the indices; so a record-by-record walk wouldn't be necessary, or perhaps would only be necessary as a final pass through a returned set to further score and sort the results.

      In any case, the trouble with TV facial recognition portrayals is less the software itself (because I can handle a dramatization of a computer search like that); I'm more offended by the portrayal of the results. There are no false positives (finding the wrong people), no false negatives (failing to find people who ARE in the system), and no multiple results. No, it's always either... face goes in and perp comes out... or face goes in and computer declares the person doesn't exist.
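
      A minimal C sketch of the throttled feedback pattern described above, with a hypothetical process_record() standing in for the real per-record work; the display only refreshes every 10,000th record, so it reflects actual progress without slowing the batch down:

          #include <stdio.h>

          #define TOTAL_RECORDS 1000000
          #define UPDATE_EVERY  10000   /* refresh the display every 10,000th record */

          /* Hypothetical per-record work; stands in for whatever the batch really does. */
          static void process_record(long id) { (void)id; }

          int main(void) {
              for (long id = 0; id < TOTAL_RECORDS; id++) {
                  process_record(id);
                  if (id % UPDATE_EVERY == 0) {
                      /* The number shown is the record actually being processed right now,
                         so a stall is immediately visible, unlike a progress bar that is
                         merely calibrated against elapsed time. */
                      printf("\rprocessing record %ld of %d", id, TOTAL_RECORDS);
                      fflush(stdout);
                  }
              }
              printf("\rdone: %d records processed.          \n", TOTAL_RECORDS);
              return 0;
          }

      Because the counter comes straight from the loop, a process that hangs at record 500,000 stops updating there instead of drifting on to 99%.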

      • It doesn't, and it shouldn't, but it could do something along those lines by way of feedback.

        I've written a number of systems with lengthy batch processes that flash up record information as the system moves through. It doesn't show every record it passes, as the screen updates would slow the system down enormously, but it updates a couple of times a second and shows, for example, every 10,000th record or so, giving users real feedback that something is actually happening without slowing it down at all.

        I've done similar things ... either I'll show the record number, updating every {whatever interval}, or, if it is looping over intelligible sets of some sort, show what category it is in ... as you say, at least you get some idea both that something is happening and possibly how far it has progressed.

      • by drinkypoo ( 153816 ) <drink@hyperlogos.org> on Wednesday March 12, 2014 @08:04PM (#46470081) Homepage Journal

        In any case, the trouble with TV facial recognition portrayals is less the software itself (because I can handle a dramatization of a computer search like that); I'm more offended by the portrayal of the results. There are no false positives (finding the wrong people), no false negatives (failing to find people who ARE in the system), and no multiple results. No, it's always either... face goes in and perp comes out... or face goes in and computer declares the person doesn't exist.

        Statistically nobody would even understand what they were on about unless they devoted an entire episode to the concept. Which might be reasonable, of course.

      • If you go to any screenwriting class or read any books on screenwriting, they'll talk to you about the use of "compression"

        In a movie you have 120 minutes (or 120 pages of script) to tell your story. Were you to actually record what really would go on in a real life conversation / situation, you'd have:

        a) a bored audience
        b) more time needed for your "facial recognition sequence" than allotted for the movie.

        It's a key element of fiction, and you'll never see exact reality. Movies DO like to be realistic

        • This would be defensible if most fiction was good, but it's not. Way too much of it is stupid and predictable. It could do with a good injection of the sorts of uncertainties and unpredictability real life holds.

    • Maybe they'll explain to TV producers that facial recognition software doesn't work by showing each face it's checking. Yet it somehow gets through ginormous databases in minutes.

      It's technically silly of course, but ... it's a visual medium.

      At least the flashing faces convey the idea that a collection of faces is being searched for matches. Which is close enough in concept (I have no idea how you'd visually convey the concept of indexing and such).
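
      For what it's worth, one way to picture the indexing idea: quantize each face descriptor into a coarse bucket key and only run full comparisons against records filed under the same key, instead of walking the whole table. A toy C sketch in which the 4-value "descriptor" and every name are purely hypothetical:

          #include <stdio.h>

          #define DIM 4   /* toy descriptor size; real ones are far larger */

          /* Quantize each descriptor value to one bit to form a coarse bucket key. */
          static unsigned bucket_key(const float desc[DIM]) {
              unsigned key = 0;
              for (int i = 0; i < DIM; i++)
                  if (desc[i] > 0.5f)
                      key |= 1u << i;
              return key;   /* 0..15 with DIM == 4 */
          }

          int main(void) {
              float probe[DIM] = {0.9f, 0.1f, 0.8f, 0.2f};
              float records[3][DIM] = {
                  {0.8f, 0.2f, 0.7f, 0.1f},   /* same bucket as the probe */
                  {0.1f, 0.9f, 0.2f, 0.8f},   /* different bucket, skipped entirely */
                  {0.7f, 0.3f, 0.9f, 0.3f},   /* same bucket as the probe */
              };

              unsigned want = bucket_key(probe);
              for (int r = 0; r < 3; r++)
                  if (bucket_key(records[r]) == want)
                      printf("record %d is a candidate for a full comparison\n", r);
              return 0;
          }

      A real system would use far finer keys and still score the candidates afterwards, which is also where the false positives and multiple hits mentioned elsewhere in this thread come from.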

  • by Joe_Dragon ( 2206452 ) on Wednesday March 12, 2014 @04:46PM (#46468813)

    Godzilla 2000 used MAME's whatsnew.txt file on a system shown at high speed. Maybe they can just take txt files from anywhere and show them at a speed that needs freeze-frames to read them.

  • by Lumpio- ( 986581 ) on Wednesday March 12, 2014 @04:51PM (#46468859)
    A brilliant combination of real software and fake GUIs on the same screen - they obviously had a product placement deal with Microsoft, and in one scene they literally dragged a file from SkyDrive into the usual bleeping "FBI Database Lookup" window. I wish I had a .gif of that...
  • by cmeans ( 81143 ) <[chris.a.means] [at] [gmail.com]> on Wednesday March 12, 2014 @05:10PM (#46468979) Journal
    Feb. 27th, Revolution had code scrolling on the screen (yes, they were debugging at light speed), but they stopped at a C function that did actually have a runtime bug that matched the story line (a malloc that was never used or released). The only thing that spoiled it was that the same statement was missing a semi-colon, so the code wouldn't have actually compiled in the first place.
    Oh well... it was nice to see some code that did actually match what the characters were babbling about... even if there were other things they did that didn't make any sense whatsoever to someone who actually understood what they were seeing on the computer screen.
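
    For readers who missed the episode, the class of bug described above (memory that is allocated but never used or freed) looks roughly like this hypothetical C fragment, which is not the code that appeared on screen:

        #include <stdlib.h>

        /* Hypothetical routine illustrating the bug class: the buffer is allocated,
           never used, and never freed, so every call leaks 1 KB. */
        static void leaky(void) {
            char *scratch = malloc(1024);
            if (scratch == NULL)
                return;
            /* ... the buffer is never written, read, or passed anywhere ... */
            /* missing: free(scratch); */
        }

        int main(void) {
            for (int i = 0; i < 1000; i++)
                leaky();   /* leaks roughly 1 MB in total */
            return 0;
        }
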
    • by geekoid ( 135745 )

      Actually, some compilers (*cough* gcc 4.7) will in some instances compile code with a missing semicolon. They assume the next line is a continuation and funny things happen.

      Of course I can't speak to the instance you are talking about.

      • by fisted ( 2295862 )

        Actually, some compilers (*cough* gcc 4.7) will in some instances compile code with a missing semicolon. They assume the next line is a continuation and funny things happen.

        This comment made me facepalm on so many levels. And then you even mention the Dunning-Kruger effect in your sig, wow.
        Protip: If you have little to no understanding of C, don't try and make supposedly intelligent comments about it. It will not work.
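
        For what it's worth, there is a narrow case where a missing semicolon still compiles, in any C compiler rather than a particular gcc version: when the next line happens to parse as a continuation of the same expression. The result is valid code with a different meaning, as in this hypothetical illustration:

            #include <stdio.h>

            int main(void) {
                int base = 10, offset = 3;

                int total = base   /* semicolon missing here */
                -offset;           /* parsed as: int total = base - offset; */

                printf("%d\n", total);   /* prints 7, not 10 */

                /* By contrast, this would be a hard syntax error everywhere:
                   int x = 1
                   int y = 2;
                */
                return 0;
            }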

    • by DRichardHipp ( 995880 ) on Wednesday March 12, 2014 @05:42PM (#46469211)
      The code on-screen was real code from SQLite. The line that contained the memory leak was added by the producers. More information here: http://www.mail-archive.com/sq... [mail-archive.com]
      • Only part of it came from SQLite. Other functions came from libfann (the Fast Artificial Neural Network Library). I was rather impressed they used code from a neural network library; that is completely in line with the story. Good job, (R)Evolution.

        • That seems to be a hallmark of Abrams' shows... they always try to "fill in" the little details; even things that seem like mere background props are often small clues to the stories. In Revolution, each sword of the Monroe Militia even had a unique serial number... something the watchers of the show will never see, BUT it does add realism from the actor's POV and thus makes them more believable.
    • The only thing that spoiled it was that the same statement was missing a semi-colon

      That was the ONLY thing that spoiled it?

      In a show where power is magically inhibited by some fucking nanites, which can also bring power back to stuff that has 15-year-old batteries or no other power source, can be used as weapons, while at the same time having the ability to heal people, that have become sentient, that can bring the fucking dead back to life, are being worshiped, communicate through hallucinations, ca

      • by geekoid ( 135745 )

        Some people can enjoy something for the context and world it represents.

        I bet you're a blast at parties.

        • Thing is, it started off trying to be fairly realistic. If they had started out from day 1 explaining that these nanites were implemented to collect and redistribute power with some not fully understood tech that might be supernatural or alien, then it would be easier to just sit back and enjoy. But instead they keep everything secret so they can pull out some new WTF whenever they feel like it.

          Take the basic premise and characters, and stuff it in a mind-warping anime, and it would probably work well.

          But a

            I'm glad they didn't flat-out tell about the nanites from day 1. The reason they "keep changing" is that they too are evolving. If someone was dead long enough, they probably would stay dead. The nanites can repair physical damage and defib the heart; I'll bet all the people it "brought back" weren't brain-dead yet. And since they obviously have repair capabilities, the nanites are probably also maintaining the electrical grid, since they need it to survive.

            One of the few ways to "defeat" the nanite
      • by Richy_T ( 111409 )

        I stopped after ep 2. If nothing else, it looked like cancel-fodder.

      • by Lumpy ( 12016 )

        Hey! At least all the zombies in the Walking Dead are still mowing all the lawns....

      • by AmiMoJo ( 196126 ) *

        In a show where power is magically inhibited by some fucking nanites

        SPOILER ALERT.

    • Revolution is an amazing show... still looking for Miles's sword online... another JJ show to watch is Almost Human. Someone decided to put Total Recall 2070, Blade Runner, Total Recall (the original movie), and several other motifs right from PKD into a great show that is both quite funny and very interesting... every show has some little shout-out in it, from the midget in the "fat chick exoskeleton" (looking much like Arnold's 'mask' in TR) to the slow techno-jazz that could be right off of Blade Runner's s
  • They moved to Portland, so it was cool to see the city in the background, but Sophie became a psycho stalker and the Microsoft product placement was hard to ignore. I loved the 3D-rendered blueprints of buildings they are able to magically pull up... like most of them aren't still on blue wide-format paper.
    • by oneiros27 ( 46144 ) on Wednesday March 12, 2014 @05:20PM (#46469057) Homepage

      Blueprints aren't blue paper.

      It's actually a light-sensitive chemical reaction (cyanotype). The back side of a blueprint (without the dye) is white. Before it's been exposed to UV and cured, the dye is kinda yellow-ish.

      • by Macgrrl ( 762836 )

        Remembering back 25 years or so when I worked as an architectural draftsman, the yellow paper printed to a purple-blue-ish line on white paper - it was quite pale. There were also gloss versions that left a black line but I can't remember if they were also 'yellow' before being exposed. We also used a stock that came out sepia brown.

        Generally a print was made from a 'positive' drawing on tracing paper; the image would appear where ink obstructed the UV light from reaching the transfer sheet/copy. Anywhere w

    • Man, agreed. Season 5 killed the show. Up until then I seriously couldn't get enough and watched the different seasons many times. Season 5 just had so much nonsense (remember when they built a bloody holodeck?) that they ruined the illusion of "it's possible" and it just got to be too much.

  • by idontgno ( 624372 ) on Wednesday March 12, 2014 @05:18PM (#46469037) Journal

    Fallout New Vegas has a man-portable 25mm automatic grenade launcher. It has an on-screen display scrolling what looks like code while the weapon is firing.

    The code? It's a piece of BASH scripting. With a crippling syntax error ("if" without closing "fi"). [wikia.com]

    If this was the height of alternate-history pre-war embedded software technology, I can understand why derelict car engines can explode in a nuclear explosion.

  • by asmkm22 ( 1902712 ) on Wednesday March 12, 2014 @05:49PM (#46469249)

    The creators of CSI are hands-down either the most tech-illiterate people on the planet, or the best Trolls in the industry. I can't tell which it is.

    Here's a real gem:

    http://www.youtube.com/watch?v... [youtube.com]

    • Oh yeah, NCIS, which I'd like to point out shares three of its four letters with CSI.

      .. but yes, I find that having two people typing on my keyboard at once generally creates better code than I write on my own.

      For the record, NCIS is probably the show I hate most of all, but then again I don't watch much crap TV if I can help it.

  • That seems highly unlikely.

  • by terrab0t ( 559047 ) on Wednesday March 12, 2014 @06:04PM (#46469355)

    An animator for the TV show Archer popped into Reddit's Linux section [reddit.com] to point out an in-joke he'd placed in some code on an extra monitor in a scene. He says he's added many more gags like this.

  • If I remember correctly, the second Matrix movie showed a closeup of a terminal where someone was running nmap with sensical command-line arguments. No, it didn't make it any better.

    • It was more than just sensical: they were using it to locate a server that was vulnerable to a real exploit for a real (if old) version of SSH1:

      http://nmap.org/movies/ [nmap.org]

      It's always disappointing when terrible movies mix total nonsense with very real information, as it raises expectations too much. Another example of somewhat correct lingo mixed with nonsense is the scene in Hackers where they go through real books that actual hackers from way back then would find useful. But even if I bring m

    • Last time I watched Matrix 2 I found it nice and entertaining. It's an underrated movie. There's so much worse stuff out there (including the third movie), and Neo meeting "the architect" is mindless fun; I laugh at the complicated words while trying to understand what he means. The first movie feels old and tired (rewatching it is not rewarding, as you already know everything about it), and it had the horrendous crap about mining bio-electricity. One bullshit line that ruins the whole series.

  • Apple II disassembly used to be a go-to for this kind of thing.
  • Love 'em or hate 'em, but the ship display screens in 2001 were quite original. It was early foreshadowing by a detail-oriented director, hinting at a future of very lazy ones to come after.
  • I have always found it funny that Apple gives away all these Macs to television shows and movie production, but 9 times out of 10 when they show the screen it is something completely made up and looks nothing like OS X.

    Anyway, I watch out for these things very carefully and I have to lend some credit to Revolution. In two different episodes a computer booted into what was very clearly a Korn shell. I was a bit impressed. They also show a lot of code on that show, but it is too briefly shown and obfuscated
    • by tlhIngan ( 30335 )

      I have always found it funny that Apple gives away all these Macs to television shows and movie production, but 9 times out of 10 when they show the screen it is something completely made up and looks nothing like OS X.

      It harks back to Apple's business model - they sell hardware. Software like OS X, iOS, iTunes Stores (music, movies, books, TV shows, apps), and other Apple software (iWork, iLife, Aperture, Logic, Final Cut) are really only used to promote that.

      So Apple will happily let people run Windows on

  • It's been a while since I watched it, but I recall that one of the DVD extras for the first "Iron Man" movie is on the GUI design of the HUDs for the suits. The designers apparently thought quite a bit about the specific HCI issues that might arise for such a usage situation (essentially like a fighter plane, but with more stuff), and so there are nested menus that radiate out from the lower left when the user's attention focuses on that part of the display, without obscuring the full field of view, etc.

    The

    • by tlhIngan ( 30335 )

      One of my favourite special effects stories is that back when "Escape from New York" was being made, it was too difficult/expensive to do the computerized 3D wire-frame rendering of Manhattan digitally that was to be displayed on Snake Plissken's glider, so they just made black miniature models of the buildings with gridlines painted on them, and then "flew" a camera over them to get the footage that ended up being displayed on the screen. Back in those days, practical effects based on painted wood were sti

    • by Macgrrl ( 762836 )

      The "computer graphics" from the original HHGTTG TV series were hand animated cells. It was the only way for them to animate the guide within budget at the time. From memory they used a blue screen to project the animations onto the guide's screen in post production.
