Digital Media Net - Your Gateway To Digital Media Creation. News and information on Digital Video, VR, Animation, Visual Effects, Mac-based Media, Post Production, CAD, Sound and Music

How LiveFX helped create the story of an exceptionally thrilling crime!

The mini-series The Helicopter Heist is based on a true event that took place in Stockholm. A cash depot was robbed when a helicopter landed on the roof, and the thieves smashed a window before climbing down the atrium to steal the cash. They escaped with $5 million.

The Trailer


https://youtu.be/98Bbyt_-Co8?si=g8qAzHgtn-4Zqjyq

We spoke with Emma Berthelsen, who was the Virtual Production Supervisor on the project. She has a strong foundation in film production, supported by an MSc in Medialogy, and over the past decade has specialized in real-time engines and immersive experiences. Today, she works as a freelance VP supervisor specializing in content development and integration with LED walls for ICVFX.

Her experience includes work on major projects for Amazon Studios, NBC Universal, Netflix, and European VP productions such as Carmen Curlers, Oxen, and The Helicopter Heist. Previously, she ran a virtual and augmented reality startup that delivered story-driven experiences for learning, tourism, and fiction.

In Virtual Production, she collaborates with cinematographers, production designers, and VFX specialists to develop innovative solutions to technical challenges. The goal is always to ensure that the best possible content is created for the production.

More about Emma can be found on her website at https://dahliareality.com/home

Q&A

Why did you choose Assimilate’s LiveFX? 

We looked at many different solutions for this project but ultimately decided we wanted to move forward with existing hardware – and not rent a new media server for this production. 

So it had to be a software solution. It then came down to Pixera and Assimilate, and at that time Pixera wasn't ready with the features I wanted, so I reached out to Assimilate's Mazze, who helped us determine whether our requests were even possible.

Another deciding factor was that we were in the same time zone, so the ability to reach out to Peter (Assimilate's CTO) and Mazze (Director of Product Experience) for support was, and still is, absolutely amazing. I've never experienced anything like their support from any other vendor!

How did it aid in the project? Can you tell me about any special challenges that LiveFX solved?

For context, we had two screens: one on the floor, extending the set downwards, running Unreal Engine through nDisplay, and the big LED wall playing video plates from the actual location in Stockholm.

On the big LED wall we had to play back a 32K video (split across two LiveFX systems with framelock), and we had to track the video, which we did with a dome projection, so that when the camera moved around the actors we would get a bit of a parallax effect from the flat panorama image.

LiveFX was used to play the video content for the large LED screen in Episode 6, as well as for the helicopter scenes in Episodes 5 and 8. It is worth noting that Episode 6 was filmed as a one-shot.

What challenges were you confronted with, and why did they lead you to use LiveFX over the other possible solutions?

We wanted to use existing hardware as much as possible, as it would be easier to troubleshoot one system compared to multiple systems. We planned to run both LiveFX for video on one wall setup and Unreal Engine with nDisplay on another wall in the same set. Therefore, using the same hardware across the board made sense. Additionally, after finishing the shoot for Episode 6 in one studio, we needed to dismantle the screen and rebuild it in a new configuration at the studio space next door within 2–3 days to enable 360-degree video playback for the helicopter scenes. This meant the setup and mapping process had to be fast.

We explored other options to achieve this, narrowing it down to Pixera and LiveFX. While I preferred an IP-based solution with multiple render nodes like Pixera, at the time Pixera lacked the projection and tracking capabilities required for the shoot. I reached out to Mazze to discuss our concerns about running a 32K x 3.8K resolution wall. Together, we concluded that it was feasible using the sync functionality and two machines equipped with sync cards to frame-lock the A6000 GPUs.

We split the 32K panorama footage of the skyline into two ~15K NotchLC clips. These were then wrapped into an EQ-to-wall node in Assimilate, allowing us to project the content as a dome with the distance set to 1000. We also added tracking, which created a significant parallax shift when moving around the set, giving the illusion of the content being much farther away.
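The dome trick described above can be pictured with a little geometry: treating the flat panorama as if it were painted on a dome at distance 1000 means a sideways camera move produces only a tiny angular shift, which the eye reads as the content being far away. The following Python sketch is purely illustrative (the function and numbers are assumptions, not LiveFX's actual EQ-to-wall math) and shows why a larger dome distance yields less parallax.

```python
import math

# Illustrative sketch only: model a panorama pixel as a point on a dome of
# a given radius, and compute the apparent horizontal angle from a camera
# offset sideways from the dome's center. Not LiveFX's real implementation.

def apparent_angle(point_angle_deg, dome_radius, camera_offset_x):
    """Angle at which a dome point appears from a camera shifted along x."""
    a = math.radians(point_angle_deg)
    # Dome point in world space (top-down view; y is depth, x is sideways)
    px = dome_radius * math.sin(a)
    py = dome_radius * math.cos(a)
    # Direction from the offset camera position to that point
    return math.degrees(math.atan2(px - camera_offset_x, py))

# A point straight ahead on a dome at distance 1000, camera moved 2 units:
shift_far = apparent_angle(0.0, 1000.0, 2.0)   # tiny shift: reads as distant
# The same camera move against a dome only 10 units away:
shift_near = apparent_angle(0.0, 10.0, 2.0)    # large shift: reads as close
```

With the dome distance set to 1000, a 2-unit camera move shifts the point by roughly a tenth of a degree, while the same move against a dome 10 units away shifts it by over eleven degrees, which is the parallax cue the crew exploited.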

We had Peter and Mazze's amazing support all the way through and could not have been as successful without their help. The fast response times and problem solving are something we haven't encountered anywhere else, and something we and the Netflix VP department praise Assimilate for, especially when doing VP at this scale.

Both Unreal Engine and LiveFX received the same tracking data from the Ncam server simultaneously.
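One way to picture feeding two render systems identical tracking data is a simple fan-out: every tracking packet from the server is sent to both consumers. The Python sketch below is a generic illustration under assumed endpoints and packet contents; Ncam's real server uses its own protocol and addresses.

```python
import socket

# Illustrative sketch: fan one stream of tracking packets out to two
# consumers, the way a tracking server can feed both an Unreal Engine node
# and a LiveFX machine in lockstep. Endpoints and packet layout are made up.

CONSUMERS = [
    ("127.0.0.1", 6301),  # hypothetical Unreal/nDisplay render node
    ("127.0.0.1", 6302),  # hypothetical LiveFX machine
]

def broadcast_tracking(packet: bytes) -> int:
    """Send the same tracking packet to every registered consumer."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    try:
        for addr in CONSUMERS:
            sock.sendto(packet, addr)  # same bytes, same frame, both systems
    finally:
        sock.close()
    return len(CONSUMERS)  # number of consumers sent this frame
```

Because both systems receive the same packet for each frame, the camera pose driving the Unreal floor screen and the pose driving the LiveFX dome projection stay consistent.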

For the helicopter scenes, we used full 360-degree 16K NotchLC files, projected in a spherical format. Some VFX elements, such as police activity on the ground, were added to the plates. To enhance the framing from inside the helicopter, we made slight animations to pivot its position, ensuring the action was visible through the windows.

This was our first time using LiveFX on a production beyond testing and pushing its limits, but since then, we’ve implemented it on several fiction and commercial shoots based on these experiences.

About Live FX

Live FX is a media server in a box. It can play back 2D and 2.5D content at any resolution or frame rate to any sized LED wall from a single machine. It supports camera tracking, projection mapping, image-based lighting and much more. 

Native implementation of Notch Blocks and the USD file format allows Live FX to be used with 3D environments as well – but it doesn't stop there! Through a Live Link connection and GPU texture sharing, Live FX can also be combined with Unreal Engine and Unity.

Live FX’s extensive grading and compositing features enable users to quickly implement DoP feedback in real time and even allow for set extensions beyond the physical LED wall.

In a nutshell – what does it do?

  • Projection Mapping for LED wall stages
  • Image-based Lighting (IBL) via pixel-mapping
  • Virtual Set Extensions
  • Camera tracking: Mo-Sys, NCAM, Intel, HTC & more
  • Volume Grading
  • Green / blue screen keying & garbage masking
  • Live Compositing with 2D and 3D Environments
  • Support for 3D Notch Blocks and USD (Universal Scene Description)
  • LED wall control
  • DMX lighting control based on LED wall image content
  • Recording of video & metadata
  • Metadata & Comp prep for VFX post
  • Live Link to Unreal Engine and Unity
  • SDI/NDI capture & output
  • Direct GPU texture sharing with other apps


Price and Availability

Starting at $345.00 (USD), Assimilate Live FX 9.8 is available immediately for purchase and download at www.assimilateinc.com.

About Assimilate

As the leading provider of cutting-edge post-production software and tools, Assimilate® develops and offers advanced, real-time tools for on-set and post-production workflows that deliver groundbreaking features, speed, performance, flexibility, and stability for industry professionals. The recently launched Live FX Studio is a state-of-the-art live compositor, shipping with projection mapping, camera tracking, DMX control for image-based lighting and the latest green-screen keying technology for virtual production. Live Assist is the multi-cam VTR tool for video assist, offering support for any number of cameras, any resolution, with easy green-screen comp and local-clip server functionality. Live Looks is the optimum tool for live grading and direct-look creation with an instant connection to Assimilate's Scratch software for live streaming and keying capabilities. Used by DITs and post artists worldwide, Scratch VR is known for its advanced on-set and post-production tools that offer stability, speed and metadata support for any camera format, including 2D/3D/VR. Play Pro Studio is the go-to solution for a professional player for VFX reviews, ProRes RAW QC and genuine ProRes transcoding on Windows. Learn more at www.assimilateinc.com.
