
Netflix Invents New Green-Screen Filming Method Using Magenta Light (newscientist.com) 36

An anonymous reader quotes a report from New Scientist: Netflix researchers have created a new type of AI-powered green-screen technology that can produce realistic visual effects for film and television in real time. Green-screen technology is routinely used to capture footage of actors that can then be inserted in the foreground of virtual or prerecorded scenes. To do this, actors are filmed against a bright green background, which is easily isolated and removed digitally. This process can be done automatically with reasonable accuracy, such as in television weather forecasts, but it can be thrown off by items of green clothing or by transparent or fine objects, like wisps of hair. When greater accuracy is needed in films or television series, specialist operators tweak settings manually, sometimes requiring hours to perfect a shot.

In a bid to create a technique that is both fast and accurate, Netflix has come up with a method it calls Magenta Green Screen (MGS). Actors are filmed against a background of bright green LEDs while being lit from the front with red and blue ones, which together create a magenta glow (see video, [here]). Because digital cameras work by taking an individual red, green and blue value for each pixel, this technique has the effect of creating a green channel that records only the background, with the foreground appearing black, and red and blue channels that record only the foreground, leaving the background looking black. Together these create the magenta and green look. Film editors can replace the green channel in real time, realistically and instantly placing the actors in the foreground of another scene, with even potentially tricky areas, such as transparent bottles or the area around strands of hair, working without problems.
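In code terms, the channel trick reduces to per-pixel arithmetic. A minimal sketch (hypothetical function, values normalised to [0, 1]; the real system of course operates on full frames, not single pixels):

```python
def mgs_composite_pixel(fg_px, bg_px):
    # Under MGS lighting the green value records only background coverage,
    # so it doubles as the matte: green = 1 on the screen, 0 on the actor.
    r, g, b = fg_px
    alpha = 1.0 - g  # foreground opacity
    # The foreground keeps its red and blue; its true green is unknown
    # at this stage and still has to be restored.
    fg = (r, 0.0, b)
    return tuple(f * alpha + bg_c * (1.0 - alpha)
                 for f, bg_c in zip(fg, bg_px))
```

A pure-background pixel (0, 1, 0) comes out as the new background; a foreground pixel keeps its red and blue, with the green channel still to be filled in.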

But there is a problem with the method. Because the foreground is only recorded in blue and red, it leaves the actors looking magenta-tinted. To solve this, Netflix uses artificial intelligence to put the full range of color back into the foreground, using a photograph of the actors lit normally as a reference to create a realistic-looking green channel. This AI works quickly, but not yet in real time, although fast techniques such as averaging the red and blue channels to create an approximation of a green channel work effectively enough for the director to monitor while filming.
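The quick monitoring approximation mentioned above is simple enough to write down. A sketch (hypothetical helper; this is the on-set stand-in, not the AI restoration itself):

```python
def monitor_restore_pixel(r, b):
    # Cheap stand-in for the missing green channel while filming:
    # average the red and blue values, as described in the article.
    g_approx = 0.5 * (r + b)
    return (r, g_approx, b)
```

Good enough for the director's monitor; the AI pass produces the realistic green channel later.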

  • Near UV? (Score:4, Interesting)

    by bill_mcgonigle ( 4333 ) * on Saturday July 08, 2023 @09:09AM (#63667995) Homepage Journal

    Why not illuminate the scene with UV just outside of visible and put a coating on the greenscreen that doesn't reflect UV?

    Wouldn't that create a perfect mask without AI?

    They could use a sensor with minor parallax, or prisms and mirrors with no parallax. Heck, even a CMOS sensor tuned for the purpose eventually.

    Given the DR of modern sensors they can probably afford a prism setup.

    • Re:Near UV? (Score:5, Informative)

      by Anubis IV ( 1279820 ) on Saturday July 08, 2023 @09:18AM (#63668019)

      I recall seeing that Industrial Light & Magic used a similar technique, but with IR instead of UV, when filming The Irishman. Sort of like an Xbox Kinect on steroids. They used a traditional camera for principal photography, but then had two IR sensitive cameras alongside it (along with some IR grid projectors), positioned for stereoscopic video capture, that way they could get 3D geometry for every scene without impacting the way the scene actually looked to anyone on set.

      Why they chose to go with magenta instead of a non-visible part of the light spectrum is baffling. Instead, they basically recreated a variation of the technique that allowed Disney to create that animated scene with the dancing penguins in the original Mary Poppins, which was done before green screens. That one also involved a special light and special film that they used to create a mask, though that work was obviously done in post.

      • As an aside, Corridor Crew covered the Mary Poppins “magic prism” that allowed them to pull off that trick (but which was subsequently lost) in this episode on YouTube, somewhere around the 10 minute mark. https://youtu.be/26b7uqZcXAY [youtu.be]

        • by mcsynk ( 896173 )

          That's super interesting. Thanks for the link. I dredged around in the comments and came up with this article:

          https://en.wikipedia.org/wiki/... [wikipedia.org]

          Apparently the lost sodium lamp prism thing is a cinematic urban myth. It's just much more expensive than green screen.

          The process is not very complicated in principle. An actor is filmed performing in front of a white screen that is lit with powerful sodium vapor lights. Such light has a narrow color spectrum that falls neatly into a chromatic notch between the various color sensitivity layers of the film, so the odd yellow color does not register on the red, green, or blue layers.[5] This allows the complete range of colors to be used, not only in costumes, but also in makeup and props. A camera with a beam-splitter prism is used to expose two separate film elements. The main element is regular color negative film that is not very sensitive to sodium light, and the other a fine-grain black-and-white film that is extremely sensitive to the specific wavelength produced by the sodium vapor.[5]

          This second film element is used to create a matte, as well as a counter-matte, for use during compositing on an optical printer.

      • Re:Near UV? (Score:5, Insightful)

        by dfghjk ( 711126 ) on Saturday July 08, 2023 @09:47AM (#63668089)

        "... though that work was obviously done in post."

        So is this, only with "AI". Anything done after exposure is literally post-exposure.

        The Netflix idea is absolutely terrible: they destroy the color rendering of the foreground data, the most important data in the scene, and then REPLACE it with artificially generated data. They give the greatest fidelity to the parts of the image they wish to remove, then replace everything else, preserving ZERO of the original scene. They must have consulted SuperKendall, master photographer.

        There is no reason to choose from among the three major components of tristimulus color and OBVIOUS reasons NOT to, yet they not only did the exact wrong thing, they chose to exclude the tristimulus color that the eye is MOST sensitive to and which contributes the majority of brightness perception. Truly a worst-case solution.

      • by ledow ( 319597 )

        This technique would work with standard RGB cameras, so you could put it in home / amateur software.

        That's literally its only advantage.

        • That depends on whether you need a massive LED video wall placed far behind the subjects (as in their examples), or whether you could use an 85" LED TV with some sort of trick to keep the magenta light from reflecting off it.
          • by jtgd ( 807477 )
            If the people in the foreground were in front of an 85" LED TV then you could just display what you wanted behind them and skip the green screen. No problem with wisps of hair there.
            • I'm not sure the palette would line up properly with that. With chroma key, even if it's off, it'll be aligned to the same incorrect value (poor accuracy, good precision). Another consideration is that you'd have to have baked background footage with the correct focus, etc., and it could not be changed.
      • Why go with a non-visible part of the spectrum when it is just as likely to be reflected off materials as the visible parts? When you choose the costumes, you'll have no idea until you bring them on set unless you remember to test beforehand.

      • by tlhIngan ( 30335 )

        The question is - do we need such technologies?

        We already have "virtual green screens" that work in real time that aren't perfect, but work really well. Next time you're on a Zoom call, try enabling it and using the simple webcam in your laptop, you're immediately green screened into a virtual background. Hell, every platform has it - Zoom, Google Meet, Teams, etc.

        Green screens work really well as well, given they've been used for decades now.

        Are we somehow not able to combine the techniques? I can imagine

        • The problems with the current method were listed right there in the summary. "but it can be thrown by items of green clothing or by transparent or fine objects, like wisps of hair."

          We've all seen the bad background replacement around Zoom-type background replacements. The constantly shifting edges around the perimeter of a person's face, especially if they are wearing a headset/mic combo. Hair and transparent objects are problematic because the green background shows through with a variability that cause

    • by RuudNL ( 6186070 )

      Sounds like you're trying to do the inverse of what I've described. Using a prism on the camera would allow you to quickly add an IR sensor as an add-on, though manufacturers would need to support this.

    • Because not everything reflects UV, some parts of the costumes will reflect the UV others won't. Never taken a UV photograph? UV is probably the worst choice given how many things fluoresce in UV. https://en.wikipedia.org/wiki/... [wikipedia.org]

    • Why not illuminate the scene with UV just outside of visible ...

      Not sure how good that would be for the actors' (or crew's) eyes and skin, especially if they're on set filming for hours.

  • Infrared (Score:3, Insightful)

    by RuudNL ( 6186070 ) on Saturday July 08, 2023 @09:21AM (#63668025)

    Sounds like illuminating the background with infrared (840 nm, 940 nm) would work, along with creating cameras that can record a fourth alpha channel (infrared).

    This basically solves all the problems with this technology, it even creates a channel that can be used to realistically light the actors from behind by replacing the infrared with the intended colors. It's kind of like the real life version of an alpha channel.

    If it's a problem that the recording environment is too dark as a result of this, white LEDs can be used to augment the infrared LEDs. Alpha is unaffected because it's a separate channel. The white light can be cut out using the alpha channel as guidance, similar to how this works for virtual production tech.

    The only potential problem is reflectivity of certain substances/fabrics which may be slightly different for infrared compared to visible light, but this can be compensated both pre-shooting and in post production.

    Another potential problem is contamination from other IR sources, like the iPhone Pro's VCSEL LIDAR, which fires at 940 nm while taking pictures, but this can be mitigated in post for sure.
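    The fourth-channel idea sketches out much like a regular alpha composite. A hypothetical RGBI pixel, with the IR value normalised so 1.0 means fully lit by the background IR wash:

```python
def ir_alpha_composite(px_rgbi, bg_px):
    # The IR value acts as a real-life alpha channel: the IR-lit
    # background reads 1.0, the (IR-dark) foreground reads 0.0.
    r, g, b, ir = px_rgbi
    alpha = 1.0 - ir  # foreground opacity
    return tuple(f * alpha + bg_c * (1.0 - alpha)
                 for f, bg_c in zip((r, g, b), bg_px))
```

    Note that, unlike the magenta scheme, the foreground keeps all three of its colour channels intact; nothing has to be reconstructed.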

    • Re:Infrared (Score:4, Insightful)

      by dfghjk ( 711126 ) on Saturday July 08, 2023 @10:01AM (#63668109)

      The entire reason to use green screen is because everything is illuminated using the same combination of light sources. Also, "green" is not magic, other colors have been used. I knew "blue screen" before "green screen".

      The moment you use different illumination to isolate the foreground, green is what you DON'T want to use. The idea you describe is one of a number of better approaches.

      The fact that Netflix doesn't understand this shows that they are using ignorant computer people to do this, not photography people. RGB is all they know, and they barely know that. And the idea that you DESTROY all the data you want to capture and replace it with AI-generated images is just absurd. As though CRI is totally unimportant (except during model training, of course!). Damn, it's so stupid.

      My area of photography expertise is underwater, where mixed light is complicated and varying. I always used complementary light and lens filters to manipulate background rendering in LENS, not in post. Other photographers didn't understand it, just like Netflix doesn't.

      • Yeah, technically it's referred to as chroma key in all my software suites. The most popular is that green shade, as it's less likely to appear otherwise. Blue was often an issue for weathermen because it would key on their flashy ties with blue in them. But you can chroma key on nearly any shade of any color.
        • The sodium vapor process [wikipedia.org] used by Disney for things like Mary Poppins used a prism in the camera to get real-time alphas.

          I remember its upsides being better punchouts without gaps and graded transparency of foreground images. Hair looks much better (if the lighting and coloration match well) and they didn't need to make the foreground transparent to hide the masking lines (see snowspeeders.) Personally, I like the better transparency of this process over just about any other process. I imagine that i
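      The "any shade of nearly any color" point holds because chroma keying is, at bottom, a per-pixel similarity test against the key colour. A toy sketch (hypothetical; done in RGB for brevity, whereas real keyers work in a chroma space and produce soft, graded mattes rather than a hard cut):

```python
def chroma_key_alpha(px, key, tol=0.3):
    # Hard-edged toy keyer: fully transparent within `tol` of the key
    # colour (Euclidean distance in RGB), fully opaque otherwise.
    dist = sum((a - k) ** 2 for a, k in zip(px, key)) ** 0.5
    return 0.0 if dist < tol else 1.0
```

      Swap the key argument and the same test keys on green, blue, or anything else.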

    • Well... pretty much exactly what I came here to post. Use an IR-reflective background and wash it with a bright narrow-frequency IR source. Humans won't see it or be damaged by it, but your cameras can image it and your computers can easily filter it.

      If you use a narrow enough frequency and make it bright enough, you really shouldn't have to worry too much about other IR sources causing an issue. Get close enough to the visual spectrum with your chosen frequency and you don't have to worry about heating

    • Filmmaker IQ has a good overview on the evolution of green screen compositing [youtube.com]. Disney had a "Yellow Screen [wikipedia.org]" system using low-pressure sodium vapor lights, but this takes extra equipment and work, and thus costs more.
    • Frame sync the LEDs such that every X frame is a true color frame. Use that frame to do the color correction. For real time processing, use it going forward, for post processing you may want to use a combination of forward and back frame adjustments.
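    A sketch of that interleaving (hypothetical; "frames" here are whatever the capture pipeline hands back, and the period would be tied to the LED sync):

```python
def split_interleaved(frames, period=4):
    # Every `period`-th frame is the full-spectrum reference frame used
    # for colour correction; the rest are magenta/green keying frames.
    refs, keyed = [], []
    for i, frame in enumerate(frames):
        (refs if i % period == 0 else keyed).append((i, frame))
    return refs, keyed
```

    In real time you would carry each reference forward; in post you could interpolate between the references on either side of a keyed frame.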

  • by williamyf ( 227051 ) on Saturday July 08, 2023 @09:25AM (#63668041)

    Like this, or like the "Simian Army" suite of tools. All of this will find uses among many industry players.

    It's good to see that not all the money is used to greenlight cockamamie projects, or to greenlight good projects that then get cancelled after just one season...

  • Sounds a lot cheaper to stick with the old method and just have bald actors who never wear green or glitter. Sinéad O'Connor and Patrick Stewart will reign.

  • It seems to me they could (rapidly) cycle the lighting between:
    (1) Just the green back-light (any color should do fine, actually)
    (2) Just the full-spectrum scene lighting (no back-light)
    (3) GoTo (1)
    The cameras for capturing outlines versus actors would need to be synchronized with the lighting cycles of course.
    • by ledow ( 319597 )

      Or just use infrared etc. and tuned sensors and thus avoid the problem entirely.

      Worst case, you have to have a non-IR-reflective wardrobe, but literally you wouldn't be able to tell the difference in the visible spectrum.

  • Just film everything using X-rays so you can get a complete picture, inside and out!

  • Green screens are a fairly recent thing; blue was the colour originally in common use for TV, and the process was called chroma keying. Turns out there is even a Wikipedia page covering the colours used https://en.wikipedia.org/wiki/... [wikipedia.org].
  • Lighting the subject with the color complement of the blue/green screen has been standard since the beginning. This is straightforward for TV studio talking heads.

    The issue on feature films is that it's cheaper to have a compositor spend a week (literally a week) to hand roto the actor than to take an extra twenty minutes to do the lighting on set. I have seen some absolute crap green screens that don't even cover the entire area behind the actor.

    For motion control model photography, they came up with
