
1977 Star Wars Computer Graphics

Noryungi writes "The interestingly named 'Topless Robot' has a real trip down memory lane: how the computer graphics of the original Star Wars movie were made. The article points to this YouTube video of a short documentary made by Larry Cuba, the original artist, that explains how he did it. In 1977."

http://www.youtube.com/watch?v=yMeSw00n3Ac

This discussion has been archived. No new comments can be posted.

  • by harmonise ( 1484057 ) on Wednesday November 18, 2009 @01:53PM (#30146216)

    Wow, that's nice to have the dials to manipulate 3D objects. Is there anything like that which someone can buy today?

  • by peter303 ( 12292 ) on Wednesday November 18, 2009 @02:08PM (#30146456)
    The spaceship consoles show CAD-style drawings of the ship aligning with the landing pads. The astronauts debugging the supposedly broken communication module also used graphics. Only, these were faked with hand-drawn animation cels, because computer graphics wasn't advanced enough in the 1960s to do this; there were only oscilloscope vector graphics then. But Kubrick and advisers like Minsky were anticipating better graphics in the future.
  • by Franklin Brauner ( 1034220 ) on Wednesday November 18, 2009 @02:10PM (#30146476)
    You could buy a vintage Battlezone, the first FPS, designed in 1979 by Ed Rotberg for Atari. It's an elegant design. No dials, but the control scheme of the sticks is beautiful to behold. Most of the XY technology used in those early Atari vector machines is nearly identical to the tech described in this video. The math required for real-time manipulation of XY displays is far simpler than what Jim Blinn was doing around the same time. He was a wizard for sure.
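
    The math in question really is small. Here's a minimal sketch of that kind of wireframe handling - rotate the vertices, then do a perspective divide to get XY beam deflections - assuming NumPy purely for convenience; this is illustrative only, not the arcade machine's actual code, which ran on far humbler hardware.

```python
import numpy as np

def rotation_y(theta):
    """Rotation about the vertical axis - the dominant motion in a tank game."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, 0.0, s],
                     [0.0, 1.0, 0.0],
                     [-s, 0.0, c]])

def project(points, theta, viewer_distance=4.0):
    """Rotate 3D wireframe vertices and project them to 2D beam coordinates."""
    rotated = points @ rotation_y(theta).T
    z = rotated[:, 2] + viewer_distance      # push the model out in front of the eye
    return rotated[:, :2] / z[:, None]       # perspective divide -> XY deflections

# A unit cube's vertices; its edges would be drawn as straight strokes between them.
cube = np.array([[x, y, z] for x in (-1, 1) for y in (-1, 1) for z in (-1, 1)], float)
print(project(cube, theta=0.3))
```
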
  • by theaveng ( 1243528 ) on Wednesday November 18, 2009 @02:14PM (#30146534)

    Yes, my ancient 1985 Amiga, running at just 7 megahertz, had a pirate demo like that: it showed a 3D rabbit, and you could spin it in any direction using just the mouse and the right button. It was impressive in the 80s.

  • by tverbeek ( 457094 ) on Wednesday November 18, 2009 @02:20PM (#30146592) Homepage

    That's right, kids: no computers were used in the making of "2001". Pretty remarkable.

    It's ironic: in "2001" (the movie) Kubrick had to use analog methods to simulate digital technology. But by 2001 (the year), filmmakers were using digital technology to simulate analog objects. [imdb.com]

  • by theaveng ( 1243528 ) on Wednesday November 18, 2009 @02:20PM (#30146596)

    Battlezone on the lowly 1 megahertz Atari VCS (1977) - I like the cool effects when the tank blows up. It's also very colorful for an ancient 70s game (128 colors).

    http://www.atariguide.com//ss/batlzone.gif [atariguide.com]

  • by Animats ( 122034 ) on Wednesday November 18, 2009 @02:36PM (#30146838) Homepage

    Wow, that's nice to have the dials to manipulate 3D objects. Is there anything like that which someone can buy today?

    Until about 2002 or so (about when SGI tanked), most of the high-end 3D systems supported MIDI devices as controllers. You could plug in a MIDI knob or slider box and connect it up to the joints of your character. For some reason, few people do that any more. Support for that never really caught on when 3D moved to the PC, even though MIDI devices were cheap.

    The Jurassic Park guys had a small dinosaur skeleton model with sensors at the joints wired up to a MIDI interface, so they could pose the thing and the animation would follow. That sort of thing was popular around 1995-2000 because it required little retraining for stop-motion animators.
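
    Wiring a MIDI knob box to a rig is still only a few lines today. A minimal sketch using the third-party mido library - the port name, the CC-to-joint mapping, and set_joint_angle are all made-up placeholders standing in for whatever your 3D package actually exposes:

```python
import mido  # third-party MIDI library: pip install mido python-rtmidi

# Hypothetical mapping from MIDI continuous-controller numbers to character joints.
CC_TO_JOINT = {20: "neck_pitch", 21: "jaw_open", 22: "tail_curl"}

def set_joint_angle(joint, degrees):
    # Placeholder: in practice this would call into your 3D package's rig API.
    print(f"{joint} -> {degrees:+.1f} deg")

with mido.open_input("Knob Box") as port:        # port name is an assumption
    for msg in port:                             # blocks, yielding messages as knobs turn
        if msg.type == "control_change" and msg.control in CC_TO_JOINT:
            angle = (msg.value / 127.0) * 180.0 - 90.0   # map 0..127 to -90..+90 degrees
            set_joint_angle(CC_TO_JOINT[msg.control], angle)
```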

  • by hardburn ( 141468 ) <hardburn@wumpus-ca[ ]net ['ve.' in gap]> on Wednesday November 18, 2009 @03:15PM (#30147332)

    From the article:

    And it reminds me of something -- when the Star Wars special editions were about to come out in '97, I was certain that Lucas was going to redo those computer effects, like from the Rebel briefing and on the Millennium Falcon's display during the TIE Fighter dogfight. Dead certain, because if anything dated the Star Wars movies (besides Hamill's hair) it was the computer effects.

    Quite true. In fact, the original model effects of the whole battle still look pretty good, but other parts of the movie are quite dated, and not all of them were changed in the new versions. Another example is Yoda's death scene, where the muppet disappears and the sheet slowly falls into the unoccupied space. It's an obvious piece of stop-motion animation, and I'm surprised Lucas didn't redo it in CGI in some of the newer remakes of Star Wars (the ones where Han shoots at the same time). He already had a Yoda computer model by then from the prequels, which is half the work done right there.

  • Analog Computers (Score:5, Interesting)

    by ei4anb ( 625481 ) on Wednesday November 18, 2009 @03:29PM (#30147482)
    John Whitney did use computers for the into-the-monolith scene, one of the first computer graphics scenes in movies. However, he used analog computers, and he has been credited with introducing computer effects into the film industry. Daisy was sung by a digital computer.

    The first digital computer I programmed was an IBM 1800 built in 1966 (and donated to our university in 1975, where I got my hands on it), so I know well the level of computing power available when 2001 ASO was filmed. Back then, analog computers were more suitable than digital computers for many real-world tasks. Anyone studying computer science then was expected to be able to build an analog circuit to solve differential equations, for example; that was faster than the digital methods of the time. It would have taken quite a while to render a movie scene with the 4K that was left of the 1800's RAM after the compiler/runtime was loaded.

    Now, where was I? Oh yes, Get off my lawn!

  • by elrous0 ( 869638 ) * on Wednesday November 18, 2009 @03:30PM (#30147492)
    My personal favorite substitute for expensive early computer graphics was in Escape from New York [wikipedia.org]. To do the sequence where Snake is gliding into New York, looking at a computer-generated wireframe of the city, James Cameron simply cut out a bunch of boxes, painted the lines on them with phosphorescent paint, and shot it in the dark.
  • by Anonymous Coward on Wednesday November 18, 2009 @03:41PM (#30147650)

    Computational power grew to the point where you could do real-time inverse kinematics. The artist just has to move the end of the 'chain' of joints and the math handles the rest. A lot easier than trying to manage all of the joints explicitly with external hardware interfaces - and cheaper!
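
    To make "the math handles the rest" concrete, here is a minimal 2D sketch of CCD (cyclic coordinate descent), about the simplest IK scheme there is - an illustration only, not what any production animation package actually ships:

```python
import math

def solve_ik_ccd(lengths, target, iterations=20):
    """Cyclic coordinate descent: repeatedly swing each joint so the chain's
    end effector moves toward the target. Returns joint angles and end point."""
    angles = [0.0] * len(lengths)   # relative joint angles, in radians

    def positions():
        pts, x, y, a = [(0.0, 0.0)], 0.0, 0.0, 0.0
        for length, ang in zip(lengths, angles):
            a += ang
            x, y = x + length * math.cos(a), y + length * math.sin(a)
            pts.append((x, y))
        return pts

    for _ in range(iterations):
        for i in reversed(range(len(lengths))):    # from the last joint back to the root
            pts = positions()
            jx, jy = pts[i]                        # the joint being adjusted
            ex, ey = pts[-1]                       # current end-effector position
            to_end = math.atan2(ey - jy, ex - jx)
            to_target = math.atan2(target[1] - jy, target[0] - jx)
            angles[i] += to_target - to_end        # swing this joint toward the target
    return angles, positions()[-1]

# The artist only specifies where the end of the chain should go; the solver
# works out all three joint angles.
angles, end = solve_ik_ccd([1.0, 1.0, 0.5], target=(1.2, 1.4))
print("end effector lands at", end)
```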

  • Re:Pretty Sweet (Score:1, Interesting)

    by Anonymous Coward on Wednesday November 18, 2009 @03:53PM (#30147768)

    Another cool thing about this particular Youtube channel is that some of the historical vids talk about the Zgrass hardware, which I believe was related to the Zgrass-32 Bally Astrocade add-on.

    I don't know if they ever actually released the Zgrass add-on though; it may have been vaporware.

  • by RedMage ( 136286 ) on Wednesday November 18, 2009 @03:54PM (#30147784) Homepage

    Larry said he wrote the software to do the combining of the primitives for the trench, but what was the hardware? I've used E&S consoles similar to those, but those were VAX driven, which wasn't an option in 1976. The terminal looked similar to a VT05, but that was just an impression while watching.

  • Re:Better Then CGI (Score:3, Interesting)

    by Quiet_Desperation ( 858215 ) on Wednesday November 18, 2009 @04:39PM (#30148276)

    I don't understand the anti-CGI attitude, and I'm in the older age group that is supposed to have it. I see a guy in a rubber suit/mask and *my* brain says nope and starts to laugh. They're *both* fake. Who cares, really, about the tool used to realize the fakeness? Yeah, there's some poorly integrated CGI out there, but there's also CGI most people don't even notice because it's so slickly done and depicts everyday objects.

    You don't have the same reaction if the whole film is CGI, do you?

  • by Tetsujin ( 103070 ) on Wednesday November 18, 2009 @04:46PM (#30148370) Homepage Journal

    ...even for todays standards...

    For today's standards? Seriously?

    Twenty years ago this was the kind of project a hobbyist could have taken on, working alone. A PC from that era would have been sufficient to do all the modeling and rendering work.

    These days, a $300 computer would be plenty for modeling and rendering a superior final product. It's not just about raw rendering power, either - it's also about having access to software (Blender, for instance) which makes the modeling and animation tasks a lot easier to manage...

    Don't get me wrong - I think early examples of CG work in movies are cool stuff, and I love seeing how it was done. It was impressive stuff by the standards of the day. But today? No... It's only impressive if you look at it in terms of what the guy had to work with.

    I gotta say, though, it's interesting that they chose to do that sequence with a computer. I would have thought that, since they were building models of everything anyway, it would have been easier to do the sequence as a set of model shots... With the right treatment and photographic process, a physical model could be used to create a shot that looks like a computer sequence... (Basically: paint it black, paint the edges white, light the hell out of it, and start filming... Or clear-cast a copy of the model parts, paint it black, sand off the paint on the edges, and light it from inside or behind... They could get a stark black/white shot out of that via photographic processes...) The downside, I guess, is that they'd have to have the models ready for this before shooting the briefing scene, and it would be a somewhat different look (more like a wireframe with occlusion, but shadows and such would probably blot out some of the edging, too...)

  • Buck Rogers (Score:5, Interesting)

    by McGregorMortis ( 536146 ) on Wednesday November 18, 2009 @04:46PM (#30148380)

    I remember seeing, hearing or reading something, a long time ago, from one of the effects guys on the Buck Rogers TV series (the Gil Gerard one). He was describing an effect in which they needed a 3-D wireframe model of a spaceship rotating on a computer monitor (much like you see here).

    He said that he spent a fair bit of time trying to program a computer to do it, but couldn't get it to work (he wasn't really a math or computer guy at all). In the end, he fell back on what he knew best: mechanical effects. He whipped up a wireframe model using actual wire, painted it day-glo orange, mounted it on a gimbal, and stuck the whole thing inside a hollowed-out computer monitor with the insides painted black.

    Sometimes the old ways are the best ways...

  • by Tetsujin ( 103070 ) on Wednesday November 18, 2009 @04:56PM (#30148524) Homepage Journal

    Yes. I think it's called a mouse.

    Really, you could just use the mouse wheel combined with a single modifier key (hold a key on the keyboard) that selects which axis (X/Y/Z) to rotate about when you spin the wheel.

    Mouse wheels have shitty resolution, though. They click to individual stops, they're sent to the host as if they were button presses...

    More to the point, I think, is the fact that in current software, when you rotate a model with the mouse, it normally rotates it relative to the position it's in. I'm not sure if this is how those dials worked, but I'm guessing not: probably it would have been simpler then to have each dial affect rotation parameters in a matrix, and then create the projection just by multiplying those matrices together - as opposed to using the dial to rotate (and then re-normalize) a rotation matrix... (See the sketch below.)

    I think the really neat thing about that system of dials was that it was so responsive. Naturally that's something we can still accomplish...
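
    A sketch of that one-dial-per-rotation-parameter idea, assuming NumPy for the matrix product (purely illustrative - no claim that GRASS or the Picture System did it this way internally): each dial sets one angle, the three single-axis matrices are rebuilt from the angles every frame, and their product is the view rotation, so nothing ever drifts or needs re-normalizing.

```python
import numpy as np

def rx(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]], float)

def ry(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]], float)

def rz(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]], float)

def view_rotation(dial_x, dial_y, dial_z):
    """Rebuild the full rotation from the three dial readings every frame.
    Recomputing from the angles means it never drifts and never needs
    re-orthonormalizing, unlike incrementally updating a single matrix."""
    return rz(dial_z) @ ry(dial_y) @ rx(dial_x)

# Example: three dial readings (radians) -> one orthonormal view rotation.
R = view_rotation(0.2, 1.1, -0.4)
print(np.round(R @ R.T, 6))   # prints the identity, confirming orthonormality
```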

  • Re:Analog Computers (Score:5, Interesting)

    by Tetsujin ( 103070 ) on Wednesday November 18, 2009 @05:09PM (#30148702) Homepage Journal

    Daisy was sung by a digital computer.

    That's "bicycle built for two" - and if we're talking about computers in movies, here - HAL's singing "bicycle built for two" was a human actor singing, of course. The inspiration for that bit was one of the early examples of computer music by Max Mathews [emf.org]...

    So basically I don't think that's really relevant as an example of computer-generated stuff in films.

  • Re:Better Then CGI (Score:5, Interesting)

    by skine ( 1524819 ) on Wednesday November 18, 2009 @05:10PM (#30148710)

    CGI is like make-up; it's good for covering blemishes, but if it's obvious then you're probably doing it wrong.

    The problem is that the film industry is to CGI what a ten-year-old girl is to make-up; nothing and no one is safe.

  • by slew ( 2918 ) on Wednesday November 18, 2009 @05:12PM (#30148732)

    The ironic part about this is that less is often more.

    If you had seen this movie in the 1970s with a 2009-level photorealistic computer rendering of the trench sequence - which is possible today on a typical desktop computer with a decent graphics card - you would probably have said the scene was obviously some model mockup, both because of the era's general idea of what a futuristic computer rendering looked like and because a photorealistic rendering would have been completely unexpected by the viewer.

    The fact that they stretched the technology of the time helps the total illusion of high tech. Anything higher-tech would have just given the impression of "magic", led to a completely different feeling for the moviegoer, and limited the suspension of disbelief.

    "Any sufficiently advanced technology is indistinguishable from magic." - Arthur C. Clarke, Profiles of the Future

  • by Tetsujin ( 103070 ) on Wednesday November 18, 2009 @05:15PM (#30148778) Homepage Journal

    My personal favorite substitute for expensive early computer graphics was in Escape from New York [wikipedia.org]. To do the sequence where Snake is gliding into New York, looking at a computer-generated wireframe of the city, James Cameron simply cut out a bunch of boxes, painted the lines on them with phosphorescent paint, and shot it in the dark.

    Yeah, I do tend to wonder why they did that sequence in Star Wars with computers when they could have used the models they were already building and faked a "computer display look" via photographic processes... Among other things I guess this would have meant delaying the briefing scene until they were done with all the Death Star trench parts (since the parts would need to be re-painted in order to do the phosphorescent lines trick) - and it would be a different effect, like wireframe with hidden surface removal... It seems like that guy had a pretty decent set-up for what he was doing, though, so maybe doing the briefing room in Star Wars as a model shot actually would have been more expensive in the end...

  • Re:Analog Computers (Score:4, Interesting)

    by cygnusx ( 193092 ) * on Wednesday November 18, 2009 @05:17PM (#30148804)

    John Whitney only proposed using computers for that sequence. Douglas Trumbull was inspired by his work and used the (analog) slit-scan technique [blogspot.com].

  • by Anonymous Coward on Wednesday November 18, 2009 @05:22PM (#30148854)

    Douglas Trumbull (who also did the visual effects for the original Close Encounters of the Third Kind) did the effects in the finale of 2001 using what is called slit-scan photography http://en.wikipedia.org/wiki/Slit-scan_photography [wikipedia.org], combined with high-speed photography of chemical reactions and solarization http://en.wikipedia.org/wiki/Solarisation [wikipedia.org] techniques on aerial photography. No computers were involved.

  • Re:Better Then CGI (Score:3, Interesting)

    by zoney_ie ( 740061 ) on Wednesday November 18, 2009 @06:04PM (#30149346)

    I would argue that it and other elements of LOTR that used CGI were of the calibre that they were because they relied on *real* stuff (in the case of Gollum, an actual actor having his actions/features captured, not just dialogue). So many wonderful settings, although augmented and given backdrops or details filled in by CGI, were actually created as sets and props. Even the obvious CGI looks better due to relying on real stuff (e.g. replicating orc hordes based on a sizable enough mini-horde of people dressed convincingly as orcs). Watching the making-of DVDs for LOTR is pretty mind-boggling in terms of seeing the amount of *non* CGI work that went in (and then the CGI on top of that was very cleverly done too). In fact, you only have to consider the creation of a mini Hobbiton as an example.

    Now I want to go watch it all again despite the length (although it's something like 4% shorter for us Europeans, about 30 mins across the three extended editions).

    I'd like to see more films done in such a way, although there are plenty now that use CGI not as the be-all and end-all but rather to seamlessly augment "reality" (the fairly decent sets etc. they start with).

  • by Mr. Protocol ( 73424 ) on Wednesday November 18, 2009 @06:22PM (#30149594)

    The machine was a PDP-11/45, running a one-of-a-kind graphics OS called GRASS, the Graphics Symbiosis System, written by Tom DeFanti, a professor at the University of Illinois at Chicago (then the University of Illinois at Chicago Circle). Tom's appointment then was to the Chemistry Dept.; the GRASS system was used primarily for molecular modeling. It drove an Evans & Sutherland Picture System, a giant $100,000 vector graphics engine worth five times what the PDP-11 was worth.

    Larry's work pushed the system to its limits. His work was done at night, on the QT, with Tom's permission. This was done by giving Larry his own disk pack with a copy of the system on it. Larry's use of the system worked around all sorts of bugs in that relatively early version of GRASS. The film was made by pointing a (film) camera at the E&S screen, and running a macro which would render a frame, click the camera, render a frame, click the camera... While the PDP-11 system could in fact render the Death Star trench in real time, by the time you included all the little bits and frobs, the E&S took long enough to draw it that the display flickered. Hence the need to do frame-by-frame. Also, there was no frame-sync hardware in the system; the camera and display were connected only by the solenoid that tripped the camera shutter.

    I played with that disk pack a year or two after the fact and it was a hoot to fly around the Death Star by hand. GRASS pioneered the interactive control of complex graphics, so all the position (and other) variables could trivially be tied to dials, etc. I was discouraged by one thing: the final version of the run had apparently been deleted from the disk. The only version I could find had the big "dish" directly on the equator of the Death Star, not at 45 degrees north latitude as in the film.

    Years after that, I happened to talk to Larry Cuba by phone about something else, and asked him about that. He said the version I saw WAS the final version. Years after that, when I went to my "farewell to Star Wars viewing of Star Wars", I saw he was right. The plans shown to the rebels show the dish on the equator. Obviously the plans were fake. Those rebels were all dead men.

  • Re:Better Then CGI (Score:4, Interesting)

    by mdarksbane ( 587589 ) on Wednesday November 18, 2009 @06:37PM (#30149818)

    I think most of the problem with CGI is that filmmakers trust it too much.

    Take the Burly Brawl scene in The Matrix: Reloaded. Amazing CGI work for most of the fight. Then they go to slow motion and you can see every mistake in plain view.

    If you did most CGI effects the same way they generally do them with models (bad lighting, odd angles, quick cuts) you'd never know it was CGI.

    Oddly enough, it's often the camera-work that gives it away. Some films are finally going away from this, but there's still a very stereotypical CGI camera movement that just doesn't feel natural.

    Well, that and the constant presence of over-animated impossible robotics. Old robots felt so much more realistic when they actually had to be driven by something to work instead of having random pieces pop out everywhere with no support structure.

  • by kaizokuace ( 1082079 ) on Wednesday November 18, 2009 @07:52PM (#30150812)
    I use a SpaceNavigator for 3D modeling and animation and I can't go back to not having it. Having a tactile device to control views speeds everything up so much and is just more natural to use than keyboard shortcuts and keyboard/mouse-click combos.
  • Lucasfilm VAX (Score:5, Interesting)

    by Bruce Perens ( 3872 ) * <bruce@perens.com> on Wednesday November 18, 2009 @08:33PM (#30151256) Homepage Journal
    I have the console from the Lucasfilm VAX 780. Just the top part with a few switches and lights, and a key lock, on display on the wall in my office. I removed it before Pixar (which had spun off from Lucasfilm) threw the VAX away. Apparently this is the machine used for the Genesis Effect (Star Trek) and perhaps some later Star Wars effects shot using the Evans and Sutherland Picture System 2 or 3. It would have been purchased in 1981 or later.
  • by flewp ( 458359 ) on Thursday November 19, 2009 @07:30AM (#30154594)
    Using a mouse today isn't exactly clumsy in terms of 3D modeling. In fact, I'd venture a guess that today's software and input methods are a lot less clumsy than all those dials. I'm a modeler (and texture artist/sometimes generalist) by trade, and it's pretty damn efficient and easy - and in no way clumsy. I think I'd much rather use a mouse and keyboard (and a tablet for sculpting) than all those dials and knobs. The mouse gives you a central control tool, and the keyboard lets you quickly and easily apply tools, modifiers, etc., for how and what the mouse is used for. (Or, of course, you can use the mouse to click on an icon instead of using hotkeys.)
