
Programmer Debunks Source Code Shown In Movies and TV Shows 301

rjmarvin writes "Someone is finally pausing TV shows and movies to figure out whether the code shown on screen is accurate or not. British programmer and writer John Graham-Cumming started taking screenshots of source code from movies and shows such as Elysium, Swordfish and Doctor Who, and when the idea became popular, turned the concept into a blog. Source Code in TV and Films posts a new screenshot daily, proving, for example, that Tony Stark's first Iron Man suit was running code from a 1998 programmable Lego brick."

  • common and fun (Score:5, Interesting)

    by Speare ( 84249 ) on Tuesday January 14, 2014 @10:33AM (#45950567) Homepage Journal

    Doesn't everyone who can program do this? Just like gun fans identify and count shots for each weapon they see?

    From the (mistaken? wise?) use of a .300 octet in an IPv4 address in The Net, to the identification of some kind of 6502 assembly code in the Terminator's red overlay, it's always been something to try to do in the theater, without freeze-frame available.
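    That .300 jumps out because each octet in a dotted-quad IPv4 address has to fit in a byte, i.e. 0-255. A quick check in Python (the addresses below are made up for illustration, not the ones from the film):

        import ipaddress

        for candidate in ("75.748.86.91", "192.168.1.1"):
            try:
                ipaddress.ip_address(candidate)  # raises ValueError on a bad octet
                print(candidate, "-> valid")
            except ValueError:
                print(candidate, "-> invalid: each octet must be 0-255")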

  • by Bradmont ( 513167 ) on Tuesday January 14, 2014 @10:41AM (#45950667) Homepage
    So if the code is taken, used, and redistributed without acknowledgement, is that copyright infringement? I imagine tiny snippets would fall under fair use, but if a substantial block of code from, say, a GPLed project is reproduced without acknowledgement or the license attached, what are the chances the filmmakers could be held liable?
  • Re:oh duh (Score:5, Interesting)

    by aitikin ( 909209 ) on Tuesday January 14, 2014 @10:49AM (#45950751)

    For example, in two different films with Matthew Broderick, his modifying school records, assuming that he does indeed have credentials, is not implausible.

    An interesting factoid about those: as I recall, Broderick actually learned to code on the 8080 for his role in WarGames, and saved some time in filming because of it.

  • Re:oh duh (Score:4, Interesting)

    by TWiTfan ( 2887093 ) on Tuesday January 14, 2014 @10:53AM (#45950813)

    My favorite is when cracking/hacking is shown to be ridiculously easy. As in: leet hacker guy types a few characters, clicks one thing... and... WE'RE IN!

  • Re:thats crazy (Score:5, Interesting)

    by game kid ( 805301 ) on Tuesday January 14, 2014 @10:55AM (#45950825) Homepage

    Speaking of spaceships, I found it fun to contrast these fake code uses with one in the game Starbound (I got it a few days after it hit Steam as an Early Access game). When you gather enough fuel (like coal) on your current planet and send it back to your spaceborne ship, you can take it to another planet and enjoy a flashy warp sequence with code that scrolls on a screen. The code shown is that of... the warp sequence. [reddit.com] (Starbound is a C++ game, and you'll notice fun things in the display like uint64_t and class names.)

    Granted, it's almost certainly not a true quine [wikipedia.org], as it uses only a portion of the code; said code is in PNG form, not text; and I doubt the display will be updated for each patch, especially this early in development.
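    (For anyone unfamiliar with the term: a quine is a program whose output is exactly its own source. A minimal Python example, purely to illustrate the idea, nothing to do with Starbound's actual code:)

        # Prints its own source text, character for character.
        s = 's = %r\nprint(s %% s)'
        print(s % s)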

  • Re:oh duh (Score:5, Interesting)

    by drinkypoo ( 153816 ) <drink@hyperlogos.org> on Tuesday January 14, 2014 @11:34AM (#45951269) Homepage Journal

    Just like when an actor plays piano on-screen, you can tell the difference between real typing and fake typing when you watch it.

    There is a middle ground where the timing of the actor's keystrokes drives the display of the keystrokes. They don't have to hit the right keys, but it still helps. You can sync it after the fact with timecodes, or you can code it into the demo. The fact that so many movies fail at it, even though they have two perfectly good options for implementing it, is particularly pathetic.
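    The "code it into the demo" option is only a few lines of work. A rough sketch in Python, with made-up keystroke timings standing in for a recording of the actor:

        import sys
        import time

        # Hypothetical capture: (seconds since previous key, character typed).
        recorded = [(0.0, 'l'), (0.12, 's'), (0.3, ' '), (0.09, '-'), (0.07, 'l'), (0.5, '\n')]

        def replay(keystrokes):
            """Echo each character with the actor's original timing."""
            for delay, char in keystrokes:
                time.sleep(delay)
                sys.stdout.write(char)
                sys.stdout.flush()

        replay(recorded)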

  • Re:common and fun (Score:5, Interesting)

    by ledow ( 319597 ) on Tuesday January 14, 2014 @11:36AM (#45951283) Homepage

    Well, yes, but the point is that there's no need to do this.

    If you're making a film about cars, get someone who knows about cars to help produce/edit it, at least for glaring inaccuracies. If you're making a film about guns, the same. If you're making a film about computers, the same.

    To be honest, even the "555" phone number is enough to jolt me out of a movie I'm into - you're instantly reminded that what you're watching is fake (which is not what a film director should be doing to their captivated audience).

    I've always had this annoyance, too. I have it about computer movies, mathematics and science. A geneticist I live with has it about science and genetics in general (do not let her watch Gattaca or Jurassic Park!). My ex and her father (both black belts) have it about anything martial-arty. My dad (a mechanic) has it about cars and mechanics.

    I just don't see how hard it is to get someone who vaguely knows what they are doing to actually step back and say "hold on, that wouldn't happen". I don't expect perfection, but if you're qualified enough to teach, say, a film star kung fu over a year of filming, at least have the decency to make sure that the moves you teach are realistic and there's no "queue of baddies waiting to be beaten up, because they're too stupid to attack simultaneously" element. Same for computer graphics - SOMEONE with computer knowledge had to make them and display them, so just ask them what it would look like if the actors REALLY did what they're being asked to do.

    Same for cars, guns, planes, stunts, etc. You have an expert on the movie, ask them if it's at all realistic and, if not, change it. Artistic licence is fine so long as you KNOW that's why you're doing it but too often directors go OUT OF THEIR WAY to make things "pretty" when actually the real thing would be a lot more realistic, useful, interesting, less jarring, etc. (e.g. who the hell uses text-based displays nowadays, and why do you need to "fake" loading screens or password decryptions or whatever - everyone KNOWS what a computer looks like and how display windows work).

    You don't get this in theatre, except by accident. You don't get it in novels, because the amount of detail required means you can hide all the potential pitfalls behind the line "He logged on..." or similar.

    You only get it in Hollywood, and you must only get it through directors who think they know what LOOKS better - while a certain percentage of the audience can't stop laughing at the ridiculous methods used, or just screams "NO! That's NOT how it works" at the screen.

    I don't get why annoying your audience, at the expense of listening to the people you hired to be experts anyway, is supposed to be a good thing.

  • Re:oh duh (Score:4, Interesting)

    by jellomizer ( 103300 ) on Tuesday January 14, 2014 @11:53AM (#45951469)

    Back in the 1980s there was much more interest in programming.
    It was a topic taught in elementary schools, and the general conception was that the future of computing was one where everyone would program the computer to their own needs; nobody really thought about having a large supply of existing applications to pick and choose from.

    So I am not surprised by this. If people could read code like any other language, showing wrong code would be considered as silly as an actor talking in a garbled tongue and pretending to be a Frenchman.

    However, things have changed. Most people don't read code, and the code shown on screen is just there to make things look complicated; it's usually shown for only a few seconds, too short for even good coders to go back and say "oh, this code does this". In that period of time I may be able to identify the language they are using, or the OS, but for the most part I tune out and focus on the plot, not the detail of what is on the screen.

  • by tepples ( 727027 ) <tepples.gmail@com> on Tuesday January 14, 2014 @02:46PM (#45954491) Homepage Journal

    Back in the 1980s there was much more interest in programming.
    It was a topic taught in elementary schools, and the general conception was that the future of computing was one where everyone would program the computer to their own needs

    I know precisely what killed that: the introduction, in the mid-1980s, of home systems that ran only applications approved by the machine's manufacturer. The biggest culprits were the North American version of the Atari 7800, whose IPL used an RSA signature to verify that Atari had approved the program, and the North American and European versions of the Nintendo Entertainment System, which used a pair of synchronized CICs (checking integrated circuits, essentially pseudorandom number generators implemented on microcontrollers) in the Control Deck and Game Pak to verify that Nintendo had approved manufacturing of the PCB. (Later consoles, such as Microsoft's Xbox and Nintendo's Wii, would use an elaboration of Atari's method.) These cryptographically enforced walled gardens helped to erode elementary school students' interest in programming.
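    The 7800's actual scheme predates modern standards, but the idea is the same sign-then-verify gate. A toy sketch in Python using the cryptography package (modern RSA padding and SHA-256 standing in for Atari's original scheme; the ROM bytes are made up):

        from cryptography.exceptions import InvalidSignature
        from cryptography.hazmat.primitives import hashes
        from cryptography.hazmat.primitives.asymmetric import rsa, padding

        # The manufacturer holds the private key; the console ships with the public key.
        manufacturer_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
        console_pubkey = manufacturer_key.public_key()

        cartridge_rom = b"\xa9\x01\x8d\x00\x20"  # arbitrary placeholder program bytes

        # At the factory: sign each approved ROM.
        signature = manufacturer_key.sign(cartridge_rom, padding.PKCS1v15(), hashes.SHA256())

        # At boot: the console refuses anything whose signature doesn't check out.
        try:
            console_pubkey.verify(signature, cartridge_rom, padding.PKCS1v15(), hashes.SHA256())
            print("signature OK - running cartridge")
        except InvalidSignature:
            print("unapproved cartridge - refusing to boot")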
