
Behind the Scenes of the Netflix Hit “Secrets We Keep”: A Virtual Production Deep Dive
Interview By Lou Wallace
“Secrets We Keep” (Reservatet) has taken the world by storm, topping Netflix charts globally with its chilling suburban mystery. Produced by Uma Film, Secrets We Keep is a Netflix limited series that follows Cecilie as she becomes entangled in the disappearance of her neighbor’s au pair, Ruby, in an affluent Copenhagen suburb.
The investigation reveals dark secrets within the seemingly perfect community and forces Cecilie to confront uncomfortable truths about her own life and those around her. The series explores themes of privilege, complicity, and the lies families tell themselves.
The limited series stands out for its use of virtual production (VP) techniques. We sat down with Theis Emma Berthelsen, Virtual Production Supervisor, to explore how the team brought the story’s setting to life through cutting-edge filmmaking.
About Theis Emma Berthelsen

With over a decade of experience in immersive storytelling and virtual production, Theis Emma Berthelsen holds an M.Sc. in Medialogy and brings a rare blend of technical skill and creative insight. Her portfolio includes work with Amazon Studios, NBC Universal, Netflix, and high-profile European productions such as Sur la route de papa and The Helicopter Heist.
She currently serves as CTO at Norse Theme Parks while working freelance as a VFX/VP supervisor, specializing in integrating content with LED wall setups for in-camera VFX (ICVFX). Her goal is always to ensure the highest quality content delivery by bridging creative vision with technological feasibility.
“In virtual production, my role is about more than supervision—it’s about translating creative intent into a technically sound pipeline that works under the pressure of real-time filming,” she explains.
A 2.5D Solution for Stability and Flexibility
Virtual production was used to extend the set of the Winther-Jensen residence — a modern home with expansive panoramic views of a garden, forest, and nearby lake. The location is real and was scheduled for on-site filming later in the production, with the series intercutting between interior and exterior scenes. The interior scenes were among the first to be filmed, shot on a studio soundstage featuring a full-scale set build of the residence’s interior. Filming took place in early spring, even though the story is set during the Danish summer.
This is where virtual production played a critical supporting role. LED walls were installed outside the panoramic windows of the set to simulate the garden view. This setup provided full control over lighting, weather, and time of day — essential at a time of year when trees in Denmark are still bare. Beyond continuity, the virtual environment also became a narrative tool: the weather and light conditions displayed on the LED walls were used to enhance the emotional tone inside the house, aligning the exterior atmosphere with the story’s mood.
For the content, the production opted for a 2.5D setup — a decision born of stability needs and scheduling constraints. The team had 19 total shooting days on the LED stage and roughly one and a half months for the stage build and execution, leaving no room for production halts or technical issues. That is why the team turned to a 2.5D video plate pipeline.
The production used CG-rendered video plates. The reasoning was straightforward: the garden and forested exterior scenes featured dynamic elements like wind-blown trees and a storm sequence. These are difficult to simulate in real time, especially without extended pre-production or a dedicated on-set Virtual Art Department to optimize every weather scenario.
“We needed control and stability,” Berthelsen says. “An Unreal Engine setup simply wasn’t feasible for what we needed given the time and resource constraints. The 2.5D setup allowed us to simulate rendered wind scenarios while ensuring stable and fast playback.”
The team at Copenhagen Visual recreated the real garden views for the LED screens using extensive reference material from the actual location, including LiDAR scans to ensure accurate tree placement. This was crucial for maintaining visual continuity when cutting between virtual production scenes and on-location footage. The garden environment was split into distance-based layers to create parallax movement as the camera moved on set. This layering also enabled optimized performance, with higher resolution allocated to foreground elements and lower resolution to background elements. All the video layers were rendered with alpha channels in NotchLC, a GPU-based codec, allowing the Assimilate Live FX Studio media servers, powered by NVIDIA A6000 GPUs, to run the eight concurrent video layers smoothly.
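To make the distance-based layering concrete, here is a minimal Python sketch of the underlying parallax geometry. It is not Copenhagen Visual’s actual pipeline code; the wall distance, layer names, depths, and resolutions are all illustrative assumptions. The key idea: content sitting on the wall plane never moves as the camera translates, while content placed far behind the wall must shift almost one-to-one with the camera to stay world-locked.

```python
from dataclasses import dataclass

WALL_DISTANCE_M = 6.0  # assumed camera-to-LED-wall distance (illustrative)

@dataclass
class PlateLayer:
    name: str
    depth_behind_wall_m: float  # how far behind the LED wall the layer "lives"
    resolution: tuple           # foreground layers get the larger pixel budget

# Illustrative back-to-front stack; the production's real depths and
# resolutions were project-specific and are not public.
LAYERS = [
    PlateLayer("sky_and_lake", depth_behind_wall_m=200.0, resolution=(960, 270)),
    PlateLayer("treeline",     depth_behind_wall_m=40.0,  resolution=(1920, 540)),
    PlateLayer("garden_mid",   depth_behind_wall_m=12.0,  resolution=(3840, 1080)),
    PlateLayer("hedge_fg",     depth_behind_wall_m=2.0,   resolution=(7680, 2160)),
]

def wall_shift_m(layer: PlateLayer, camera_dx_m: float) -> float:
    """Lateral shift (in meters, on the wall) that keeps a layer world-locked.

    A point at total distance D = wall + depth projects onto the wall with a
    shift of dx * (1 - wall/D) when the camera translates by dx, so a layer
    on the wall plane (depth 0) never moves and distant layers track the
    camera almost one-to-one.
    """
    total = WALL_DISTANCE_M + layer.depth_behind_wall_m
    return camera_dx_m * (1.0 - WALL_DISTANCE_M / total)

if __name__ == "__main__":
    for layer in LAYERS:
        print(f"{layer.name:12s} shifts {wall_shift_m(layer, 0.5):.3f} m "
              f"for a 0.5 m camera move")
```

Under these assumed numbers, the foreground hedge layer shifts only 0.125 m for a half-meter camera move while the sky layer shifts nearly the full 0.5 m, which is exactly the depth cue the layered plates were built to deliver.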
Technical Breakdown
LED Volume Setup:
- 158 square meters of LED wall
- Two LED walls (20 x 4.5 meters): one for the main window and another for the side windows of the set
- Each wall was powered by a dedicated Live FX machine with NVIDIA A6000 GPUs
- Synchronized using Live FX Studio SyncPlayer and NVIDIA Quadro Sync
Weather and Time Variations:
- 19 different lighting/weather scenarios were created to simulate changing conditions
2.5D Parallax System:
The 2.5D playback system was set up and composited in Live FX, with NotchLC video layers organized into timelines based on different weather scenarios. This allowed the stage team to switch between setups in seconds. Each scenario’s video layers were stacked in Z-depth and tested with camera tracking to ensure full screen coverage and realistic parallax movement. Using Live FX also gave the stage team complete grading control, enabling real-time adjustments to individual layers or applying an overall grade across the entire LED wall to help blend the virtual background with the physical set easily.
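As a rough illustration of that organization, the sketch below models weather scenarios as named stacks of NotchLC layer clips with per-layer and overall grade handles. The class names, file names, and the `load_stack` call are hypothetical stand-ins; Live FX’s internal project structure is not exposed here.

```python
from dataclasses import dataclass

@dataclass
class LayerClip:
    file: str                   # NotchLC clip with alpha, one per depth layer
    depth_behind_wall_m: float  # Z position used for the parallax offset
    gain: float = 1.0           # per-layer grade handle

@dataclass
class WeatherScenario:
    name: str
    layers: list              # ordered back to front
    global_gain: float = 1.0  # overall wall grade to blend with set lighting

# Two of the nineteen scenarios, with purely illustrative file names.
SCENARIOS = {
    "sunny_still": WeatherScenario("sunny_still", [
        LayerClip("sky_sunny.mov", 200.0),
        LayerClip("trees_sunny_calm.mov", 40.0),
        LayerClip("garden_sunny_calm.mov", 12.0),
    ]),
    "storm": WeatherScenario("storm", [
        LayerClip("sky_storm.mov", 200.0),
        LayerClip("trees_storm_wind.mov", 40.0),
        LayerClip("garden_storm_wind.mov", 12.0),
    ], global_gain=0.8),
}

def switch_scenario(player, name: str) -> None:
    """Swap the whole layer stack in one operation, so a weather change
    takes seconds instead of rebuilding the composite by hand."""
    scenario = SCENARIOS[name]
    player.load_stack(scenario.layers, gain=scenario.global_gain)  # hypothetical API
```

Keeping each scenario as a pre-built, self-contained stack is what makes the seconds-fast switching plausible: nothing is recomputed at switch time, only swapped.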
Camera Tracking Challenges and Solutions
Since the entire set was a fully built house with white ceilings, camera tracking posed a significant challenge. Outside-in tracking systems were not feasible, as there was no practical way to mount external cameras with full coverage of the space. The team opted for an inside-out tracking approach, which introduced its own complications. They used the Ncam system (now ZEISS CinCraft Scenario) to address the coverage limitations. However, the white ceiling combined with constantly changing lighting created a problematic environment for inside-out tracking, which relies on high-contrast features to generate a stable point cloud.
To overcome this, the team installed infrared markers across the ceiling, ensuring consistent tracking performance regardless of lighting conditions or camera position. As a backup, fiducial markers were placed strategically throughout the set, allowing the tracking team to quickly reinitialize the point cloud in case of signal loss — minimizing any risk of production delays.
“The solution was to integrate the Ncam tracking markers into the design of the set’s ceiling, so they appeared as an integrated part of the production design. This way, we avoided generating a new point cloud for tracking each lighting setup,” says VP Technical Supervisor Mark Stig Bertelsen.
The tracking system integrated seamlessly with Live FX, enabling operators to apply live tracking data to the 2.5D setup. All video layers adjusted in real time, with no performance loss on playback or interaction.
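A simplified picture of that real-time path is sketched below, assuming layer objects like those in the earlier sketches. Here `tracker` and `set_layer_offset` are hypothetical stand-ins for the Ncam pose feed and the Live FX layer interface, which are not public. The point is the cost model: each frame, the latest camera pose drives one cheap 2D offset per layer, so live tracking adds almost nothing on top of video playback.

```python
import time
from dataclasses import dataclass

WALL_DISTANCE_M = 6.0  # assumed camera-to-wall distance, as in the earlier sketch

@dataclass
class Pose:
    x: float  # lateral camera position in meters
    y: float  # vertical camera position in meters

def run_parallax_loop(tracker, layers, set_layer_offset, fps: float = 25.0):
    """Per-frame update: pull the newest tracked pose and re-offset every
    2.5D layer. Each update is one multiply-add per layer, which is why
    playback and interaction stay real time."""
    frame_time = 1.0 / fps
    while True:
        pose: Pose = tracker.latest_pose()  # inside-out tracking feed (stand-in)
        for layer in layers:
            total = WALL_DISTANCE_M + layer.depth_behind_wall_m
            factor = 1.0 - WALL_DISTANCE_M / total
            set_layer_offset(layer, pose.x * factor, pose.y * factor)
        time.sleep(frame_time)
```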
Production and Collaboration
The project was a collaborative effort between several key players:
- Production Company: Uma Film
- VFX & VP Content: Copenhagen Visual
FilmGEAR Rental delivered the comprehensive technical infrastructure for the production, from foundational lighting and grip gear to advanced tracking systems, LED screens, and motion-reactive lighting setups, all seamlessly integrated (www.filmgexr.com).
For more on her work, visit: Dahlia Reality
IMDb: https://www.imdb.com/name/nm9708582
About Assimilate

As the leading provider of cutting-edge post-production software and tools, Assimilate® develops and offers advanced, real-time tools for on-set and post-production workflows that deliver groundbreaking features, speed, performance, flexibility, and stability for industry professionals. Their Live FX Studio is a state-of-the-art live compositor, shipping with projection mapping, camera tracking, DMX control for image-based lighting and the latest green-screen keying technology for virtual production. Live Assist is the multi-cam VTR tool for video assist, offering support for any number of cameras, any resolution, with easy green-screen comp and local-clip server functionality. Live Looks is the optimum tool for live grading and direct-look creation with an instant connection to Assimilate’s Scratch software for live streaming and keying capabilities. Used by DITs and post artists worldwide, Scratch VR is known for its advanced on-set and post-production tools that offer stability, speed and metadata support for any camera format, including 2D/3D/VR. Play Pro Studio is the go-to solution for a professional player for VFX-reviews, ProRes RAW QC and genuine ProRes transcoding on Windows. Learn more at www.assimilateinc.com.