[HN Gopher] Real-time, in-camera background compositing in The M...
       ___________________________________________________________________
        
       Real-time, in-camera background compositing in The Mandalorian
        
       Author : ashin
       Score  : 139 points
       Date   : 2020-02-20 21:12 UTC (1 hour ago)
        
 (HTM) web link (ascmag.com)
 (TXT) w3m dump (ascmag.com)
        
       | devindotcom wrote:
       | This is super interesting stuff and I've been following it for
       | some time. I just wrote it up with a bit more context:
       | 
       | https://techcrunch.com/2020/02/20/how-the-mandalorian-and-il...
       | 
       | It's not just ILM and Disney either; this is going to be
       | _everywhere_. It's complex to set up and run in some ways, but
       | the benefits are enormous for pretty much everyone involved. I
       | doubt there will be a major TV or film production that doesn't
       | use LED walls 5 years from now.
        
       | DonHopkins wrote:
       | "Once Upon a Time" (2011-2018, with Robert Carlyle as
       | Rumplestiltskin!) was shot on virtual chroma-keyed sets with real
       | time integrated pipeline tools to preview how it would look.
       | 
       | https://en.wikipedia.org/wiki/Once_Upon_a_Time_(TV_series)
       | 
       | The tech behind Once Upon a Time's Frozen adventures
       | 
       | https://www.fxguide.com/fxfeatured/the-tech-behind-once-upon...
       | 
       | "Once Upon a Time" TV Series VFX Breakdown
       | 
       | https://web.archive.org/web/20180623020817/http://www.animat...
        
       | web-cowboy wrote:
       | So we'll be able to play video game adaptations of the locations
       | in the episodes really easily/soon, right? ;)
        
       | huebomont wrote:
       | Fascinating, but this article needs a proofread, damn...
       | 
       | "The virtual world on the LED screen is fantastic for many uses,
       | but obviously an actor cannot walk through the screen, so an open
       | doorway doesn't work when it's virtual. Doors are an aspect of
       | production design that have to be physical. If a character walks
       | through a door, it can't be virtual, it must be real as the actor
       | can't walk through the LED screen."
       | 
       | Not to mention the multiple paragraphs that are basically re-
       | stated immediately afterwards. It's like they hit publish in the
       | middle of editing.
        
       | vsareto wrote:
       | Can you get a decent job just by knowing Unreal Engine well?
       | Maybe by just doing small POC projects?
        
       | rebuilder wrote:
       | The Mandalorian was a natural candidate for this kind of
       | approach, since it's essentially a western, meaning a lot of
       | wide landscape shots.
       | 
       | The LED screen approach works nicely for fairly uncomplicated
       | background geometry, like a plain. Try shooting Spiderman
       | climbing up walls on that, and things will get tricky fast.
       | 
       | As the article notes, slow camera moves are a plus as well. The
       | reason given is technical, but I also wonder how far you could
       | really push the camera motion even if tracking lag wasn't an
       | issue. The background is calculated to match the camera's
       | viewpoint, so I expect it would be very disorienting for the
       | actors if the camera was moving at high speeds.
        
       | [deleted]
        
       | cbhl wrote:
       | "The solution was ... a dynamic, real-time, photo-real background
       | played back on a massive LED video wall and ceiling ... rendered
       | with correct camera positional data."
       | 
       | Gee, that sounds a lot like a holodeck. We've come a long way
       | from using Wii Sensor Bars[0] for position tracking.
       | 
       | [0] https://www.youtube.com/watch?v=LC_KKxAuLQw
        
         | modeless wrote:
         | The "holodeck" version of this is called a CAVE and the first
         | one was built in 1992:
         | https://www.youtube.com/watch?v=aKL0urEdtPU
         | https://en.wikipedia.org/wiki/Cave_automatic_virtual_environ...
        
         | ragebol wrote:
         | For the VFX industry, the tracking had already been solved for
         | ages, with those reflective little balls on suits etc. in a
         | mocap system. The Wii sensor bar's appeal was that it was
         | really cheap.
         | 
         | But yes, damn close to a holodeck. But you can't see depth in
         | this setup, right?
        
           | whatshisface wrote:
           | They have polarized 3D screens, and with head tracking, you
           | have it.
        
           | rebuilder wrote:
           | Mocap systems haven't been able to produce deliverable
           | results without human intervention for very long. I'd argue
           | they're still not there: some filtering and cleanup of the
           | data is usually required. A lot of VFX is still about
           | throwing human labour at problems.
           | 
           | Edit: I should note I'm talking about motion capture for
           | characters etc. Capturing the motion of a rigid object like a
           | camera in a controlled environment is very doable.
        
           | nocut12 wrote:
           | If it's perspective corrected for the camera, it would
           | probably look very distorted for anyone else on set --
           | whether there's depth or not
           | 
           | And that's certainly not the goal with this. Something along
           | those lines has been around for a while:
           | https://en.wikipedia.org/wiki/Cave_automatic_virtual_environ...
           | This system seems specifically targeted at solving challenges
           | for film production, as it probably should be.
           | 
           | I am pretty impressed that real time rendering has gotten
           | good enough to use for these purposes. I certainly wouldn't
           | have expected that those backgrounds in the show were coming
           | out of a video game engine.
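Perspective correction for a single tracked camera is the classic off-axis (asymmetric) projection problem: the wall acts as a fixed window, and the view frustum is rebuilt every frame from the tracked camera position. A minimal, dependency-free sketch of that construction; the helper name, wall dimensions, and camera position below are illustrative assumptions, not anything from the article:

```python
import math

def sub(a, b): return [x - y for x, y in zip(a, b)]
def dot(a, b): return sum(x * y for x, y in zip(a, b))
def norm(a):
    n = math.sqrt(dot(a, a))
    return [x / n for x in a]
def cross(a, b):
    return [a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0]]

def wall_frustum(pa, pb, pc, pe, near):
    """Asymmetric frustum extents (l, r, b, t) at the near plane
    for a flat wall acting as a window.

    pa, pb, pc: lower-left, lower-right, upper-left wall corners;
    pe: tracked camera position (all in world coordinates).
    """
    vr = norm(sub(pb, pa))      # wall right axis
    vu = norm(sub(pc, pa))      # wall up axis
    vn = norm(cross(vr, vu))    # wall normal, toward the camera
    va, vb, vc = sub(pa, pe), sub(pb, pe), sub(pc, pe)
    d = -dot(va, vn)            # camera-to-wall distance
    l = dot(vr, va) * near / d
    r = dot(vr, vb) * near / d
    b = dot(vu, va) * near / d
    t = dot(vu, vc) * near / d
    return l, r, b, t

# Camera 2 m in front of a 4 m x 2 m wall, offset 1 m right and 1 m up:
# the frustum comes out asymmetric, which is what keeps the on-screen
# parallax correct for that one camera (and distorted for everyone else).
print(wall_frustum([-2, 0, 0], [2, 0, 0], [-2, 2, 0], [1, 1, 2], 0.1))
```

Feeding the resulting (l, r, b, t) into a standard frustum projection each frame is what makes the flat wall read as a deep background through the lens.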
        
             | RaptorJ wrote:
             | From the video posted upthread, around 3:40 it looks like
             | they're doing just that: https://youtu.be/gUnxzVOs3rk?t=220
        
             | miohtama wrote:
             | They mention they cannot push enough GPU juice to the
             | screens, so they only render the camera's focus area in
             | full resolution. There is also a 12-frame lag, which
             | prevents moving the camera too fast.
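The 12-frame lag mentioned above puts a rough ceiling on camera speed: during the latency window the wall still shows the background for the camera's old position. A back-of-envelope sketch, assuming a 24 fps pipeline; the 5 cm misregistration tolerance is an invented, illustrative threshold:

```python
# Back-of-envelope: how fast can the camera move before rendering
# lag becomes a visible misregistration of the background?
FPS = 24             # assumed cinema frame rate
LAG_FRAMES = 12      # latency quoted in the comment above
TOLERANCE_M = 0.05   # assumed acceptable positional error, metres

lag_s = LAG_FRAMES / FPS            # latency in seconds (0.5 s)
max_speed = TOLERANCE_M / lag_s     # camera speed keeping error in bounds
print(f"lag = {lag_s:.2f} s, max camera speed ~ {max_speed:.2f} m/s")
```

Under these assumed numbers the camera is limited to roughly walking-dolly speeds, which fits the slow, deliberate moves the article describes.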
        
       | en4bz wrote:
       | I think the demise of Lytro was a huge missed opportunity for the
       | film industry. They had this and a number of other features in
       | their cinema camera before they became defunct a few years ago.
       | 
       | https://www.youtube.com/watch?v=4qXE4sA-hLQ
        
       | oseibonsu wrote:
       | Here is the Unreal Engine tech they are using:
       | https://www.unrealengine.com/en-US/spotlights/unreal-engine-... .
       | This is a video of it in action:
       | https://www.youtube.com/watch?v=bErPsq5kPzE&feature=emb_logo .
        
         | KineticLensman wrote:
         | Unreal Engine is also used by the BBC to create virtual studios
         | for football punditry programmes. This uses a simpler green
         | screen technology, but it demonstrates how Epic are moving away
         | from their gaming roots.
        
         | jahlove wrote:
         | Here's a video of it in action on The Mandalorian set:
         | 
         | https://www.youtube.com/watch?v=gUnxzVOs3rk
        
       ___________________________________________________________________
       (page generated 2020-02-20 23:00 UTC)