Thursday, Week 7. The XR Stage awaited transformation.
Two teams—Production Design and VFX—converged with a shared mission: make the physical set and virtual environment indistinguishable from each other.
Production Design arrived early to begin assembling the physical platform, railway tracks, and practical props.
The set included a 6 m × 4 m raised platform matching the virtual floor height, weathered marble tile flooring, platform edges with recessed lighting, and three meters of practical railway track with rusted metal rails and weathered concrete ties.
Simultaneously, the VFX team worked at the XR Stage control desk, loading the Unreal environment and calibrating the LED walls.
Colors on the LED panels appeared different from how they looked on the monitors, requiring immediate real-time adjustments.
The team spent hours color-matching the virtual materials to the physical set samples brought by Production Design.
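The underlying idea can be reduced to a simple numeric comparison. The sketch below is purely illustrative, not our pipeline: it assumes an averaged linear-RGB reading of a physical sample (from a reference photo) and of the corresponding material as shown on the wall, and derives per-channel gains to pull the virtual version toward the physical one.

```python
# Minimal sketch of the color-matching idea (hypothetical helper, not our pipeline).
# Both readings are assumed to be averaged patches in linear RGB, range 0.0-1.0.

def channel_gains(physical_rgb, virtual_rgb, eps=1e-4):
    """Per-channel gain that moves the virtual patch toward the physical reference."""
    return tuple(p / max(v, eps) for p, v in zip(physical_rgb, virtual_rgb))

def apply_gains(rgb, gains):
    """Apply the gains and clamp back to displayable range."""
    return tuple(min(max(c * g, 0.0), 1.0) for c, g in zip(rgb, gains))

# Example values: a marble tile sample vs. its on-wall counterpart.
physical_marble = (0.62, 0.58, 0.52)   # measured from a reference photo
virtual_marble  = (0.70, 0.57, 0.45)   # sampled from the LED wall feed

gains = channel_gains(physical_marble, virtual_marble)
print("suggested base-color gains:", [round(g, 3) for g in gains])
print("corrected patch:", [round(c, 3) for c in apply_gains(virtual_marble, gains)])
```

Per-channel gains are the crudest possible correction; proper grading uses LUTs, which came into play once the cameras were rolling.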
The Blending Zone
The critical one-meter overlap between physical and digital required meticulous attention.
Physical debris piles were placed at the boundary, while photoscanned versions of the same debris extended seamlessly into the virtual environment.
Lighting temperature was precisely matched across both worlds, and shadows were carefully aligned to maintain visual continuity.
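Color temperature differences are easiest to reason about in mireds (one million divided by the temperature in Kelvin), because a given mired shift looks roughly equally noticeable across the usable range. A tiny check of that kind, with invented numbers:

```python
# Mired-shift check between a practical fixture and its virtual counterpart.
# Values are invented; the ~10-mired threshold is a common rule of thumb,
# not a figure measured on our stage.

def mired(kelvin: float) -> float:
    return 1_000_000.0 / kelvin

def mired_shift(kelvin_a: float, kelvin_b: float) -> float:
    return abs(mired(kelvin_a) - mired(kelvin_b))

practical_key = 4300.0   # Kelvin, hypothetical reading from a physical fixture
virtual_sun   = 4100.0   # Kelvin, hypothetical setting on the virtual directional light

shift = mired_shift(practical_key, virtual_sun)
print(f"mired shift: {shift:.1f}")
if shift > 10.0:
    print("warning: temperature mismatch is likely visible on camera")
```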
Technical Challenges
The day was not without setbacks.
Unreal Engine crashed three times during setup, forcing emergency optimization.
The team removed unused assets, simplified particle systems, and reduced draw distances to stabilize performance.
Each crash reminded us that the XR Stage demanded not only creative ambition but also technical discipline.
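Some of that discipline can be scripted. The fragment below sketches one way such an optimization pass can be driven from Unreal's editor Python bindings; the cvars and values are assumptions about the approach rather than a record of what we actually ran, and availability varies between engine versions.

```python
# Rough sketch of a scripted optimization pass in the Unreal editor (Python bindings).
# The cvar values here are placeholders, not the numbers we settled on.
import unreal

world = unreal.EditorLevelLibrary.get_editor_world()

emergency_cvars = [
    "r.ScreenPercentage 75",      # render at reduced resolution and upscale
    "r.ViewDistanceScale 0.7",    # pull in draw distances
    "r.Streaming.PoolSize 3000",  # cap the texture streaming pool (MB)
]

for cmd in emergency_cvars:
    unreal.SystemLibrary.execute_console_command(world, cmd)
    unreal.log(f"applied: {cmd}")
```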
Setup Timeline
8:00 AM - Load-in and safety check
10:00 AM - Physical platform assembly begins; in parallel, Unreal project loading and camera tracking calibration
10:30 AM - Physical-virtual boundary refinement
10:40 AM - First camera tests and actor blocking
11:00 AM - Final adjustments and rehearsal
11:10 AM - Setup complete
By the end of Thursday, the stage was ready.
The fusion of physical craftsmanship and digital precision had created a world that felt lived-in, forgotten, and hauntingly real.
The XR Stage is unforgiving.
What works in the Unreal editor often fails under the scrutiny of camera lenses and LED panels.
Real-time adjustments became the rhythm of our shoot days.
Color Correction on the Fly
The first major discovery: colors shift dramatically from Unreal to LED to camera sensor.
A warm amber light in the editor appeared cold and greenish through the camera.
The VFX team made continuous LUT adjustments, tweaking post-process volumes between takes to match the director's vision.
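A full LUT workflow is beyond the scope of this post, but the flavor of those between-take tweaks can be shown with a toy lift/gamma/gain grade. The numbers below are invented for illustration; they are not the grade we used.

```python
# Toy lift/gamma/gain grade, a crude stand-in for a proper LUT (illustrative only).
import numpy as np

def grade(frame, lift, gamma, gain):
    """frame: float array in [0, 1], shape (H, W, 3); lift/gamma/gain per channel."""
    frame = np.clip(frame, 0.0, 1.0)
    graded = gain * (frame + lift * (1.0 - frame))   # lift shadows, scale highlights
    graded = np.power(np.clip(graded, 0.0, 1.0), 1.0 / gamma)
    return np.clip(graded, 0.0, 1.0)

# Example: warm the image slightly and lift the shadows a touch.
frame = np.random.rand(4, 4, 3)                      # stand-in for a camera frame
warmed = grade(
    frame,
    lift=np.array([0.02, 0.01, 0.00]),
    gamma=np.array([1.00, 1.00, 1.05]),
    gain=np.array([1.05, 1.00, 0.95]),
)
print(warmed.shape)
```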
Lighting Adjustments
Practical lights spilled onto the LED walls, creating unexpected color contamination.
The gaffer and VFX team worked in tandem—adjusting physical light positions while simultaneously modifying virtual light intensities.
Some compromises were necessary: accepting LED spill as "ambient environmental light" rather than fighting it.
Material Tweaks
Reflective surfaces behaved unpredictably.
The marble floor, designed to be semi-glossy, reflected the LED panels too intensely.
The material artist raised the floor material's roughness in real time, finding the balance between visual appeal and technical reality.
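When the tweak lives on a material instance parameter, it can also be driven from editor scripting. The snippet below is only a sketch: the asset path and the "Roughness" parameter name are assumptions about how the material might be authored, and the MaterialEditingLibrary calls may differ between engine versions.

```python
# Hypothetical sketch: nudging a floor material's roughness from editor Python.
# The asset path and the "Roughness" parameter name are assumptions, not the
# project's real names.
import unreal

FLOOR_MI_PATH = "/Game/Set/Materials/MI_MarbleFloor"   # hypothetical path

mi = unreal.EditorAssetLibrary.load_asset(FLOOR_MI_PATH)
if mi:
    unreal.MaterialEditingLibrary.set_material_instance_scalar_parameter_value(
        mi, "Roughness", 0.55   # higher roughness = softer, dimmer reflections
    )
    unreal.MaterialEditingLibrary.update_material_instance(mi)
    unreal.EditorAssetLibrary.save_loaded_asset(mi)
```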
Performance Optimization
Frame rate fluctuations threatened several takes.
The systems engineer monitored GPU performance constantly, calling out potential bottlenecks.
When fog density pulled the frame rate down to 85 FPS, the density was dialed back immediately.
When a light configuration proved too expensive, it was simplified between setups.
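The judgment calls were human, but the underlying arithmetic is simple: turn the target frame rate into a millisecond budget and flag anything creeping up on it. The target and the sample timings below are illustrative assumptions, not measurements from the stage.

```python
# Sketch of a frame-budget check (illustrative target and thresholds).

TARGET_FPS = 60.0                       # assumed render target for this example
BUDGET_MS = 1000.0 / TARGET_FPS         # ~16.7 ms per frame

def over_budget(frame_time_ms: float, headroom: float = 0.9) -> bool:
    """Flag frames that use more than `headroom` of the budget, before they become drops."""
    return frame_time_ms > BUDGET_MS * headroom

samples_ms = [13.2, 14.1, 16.2, 13.8]   # hypothetical GPU frame times on a fog-heavy take
for i, t in enumerate(samples_ms):
    if over_budget(t):
        print(f"frame {i}: {t:.1f} ms - consider thinning fog or simplifying lights")
```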
The camera in virtual production is more than a recording device—it's the bridge between physical and digital realities.
Camera Specifications
The camera was chosen for its sensor size, color science, and compatibility with the XR Stage's tracking system.
Lens selection balanced focal length, depth of field, and the LED volume's spatial constraints.
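Those trade-offs are easy to put numbers on. The sketch below computes horizontal field of view and depth-of-field limits from standard thin-lens formulas; the sensor width, focal length, aperture, and circle of confusion are example values, not the specs of the camera package we shot with.

```python
# Field of view and depth of field from standard formulas (example values only).
import math

def horizontal_fov_deg(sensor_width_mm: float, focal_length_mm: float) -> float:
    return math.degrees(2.0 * math.atan(sensor_width_mm / (2.0 * focal_length_mm)))

def dof_limits_m(focal_mm: float, f_stop: float, coc_mm: float, subject_m: float):
    """Near/far focus limits via the hyperfocal distance (thin-lens approximation)."""
    s_mm = subject_m * 1000.0
    hyper_mm = focal_mm ** 2 / (f_stop * coc_mm) + focal_mm
    near_mm = hyper_mm * s_mm / (hyper_mm + (s_mm - focal_mm))
    far_mm = (hyper_mm * s_mm / (hyper_mm - (s_mm - focal_mm))
              if s_mm < hyper_mm else float("inf"))
    return near_mm / 1000.0, far_mm / 1000.0

# Example: Super 35-ish sensor, 35 mm lens at T2.8, subject 3 m from the camera.
print(f"horizontal FOV: {horizontal_fov_deg(24.89, 35.0):.1f} deg")
near, far = dof_limits_m(35.0, 2.8, 0.029, 3.0)
print(f"roughly in focus from {near:.2f} m to {far:.2f} m")
```

Keeping the LED wall outside that in-focus band is also part of what hides moiré between the panel's pixel grid and the camera sensor.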
Tracking System Integration
The camera was equipped with tracking markers, allowing the XR Stage system to calculate its position and orientation in real time.
This data fed directly into Unreal Engine, updating the virtual camera's perspective to match the physical camera's movement frame by frame.
The synchronization had to be perfect—any perceptible lag would break the illusion.
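Conceptually, each tracked sample is a timestamped pose that must land on the matching render frame. The toy loop below illustrates that with a fixed latency offset; the helper names and numbers are invented, and on the real stage this is handled by the tracking and genlock pipeline rather than hand-rolled code.

```python
# Toy model of tracked-camera sync (hypothetical helpers, invented numbers).
from dataclasses import dataclass

@dataclass
class Pose:
    timecode_s: float          # capture time of the tracking sample
    position: tuple            # meters, stage space
    rotation: tuple            # roll/pitch/yaw in degrees

def pose_for_render_time(samples, render_time_s, latency_s):
    """Pick the tracking sample closest to (render time minus known system latency)."""
    target = render_time_s - latency_s
    return min(samples, key=lambda p: abs(p.timecode_s - target))

samples = [
    Pose(0.000, (0.00, 0.00, 1.60), (0.0, 0.0, 0.0)),
    Pose(0.004, (0.00, 0.01, 1.60), (0.0, 0.1, 0.0)),
    Pose(0.008, (0.00, 0.02, 1.60), (0.0, 0.2, 0.0)),
]
chosen = pose_for_render_time(samples, render_time_s=0.010, latency_s=0.004)
print("apply to virtual camera:", chosen)
```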
Virtual Camera Matching
Before the shoot, virtual cameras were created in Unreal Engine to match the physical camera's specifications:
- Sensor size
- Focal length
- Aspect ratio
- Lens distortion characteristics
This allowed the team to pre-visualize shots in the Unreal editor with accurate framing and perspective.
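In editor-scripting terms, most of that matching comes down to pushing the physical package's numbers onto the CineCamera. The snippet below is a hedged sketch using Unreal's Python bindings: property names such as "filmback" and "current_focal_length" follow recent engine versions and may need adjusting, and the values are examples rather than the production camera's actual specs.

```python
# Sketch: pushing physical camera specs onto CineCameraActors via editor Python.
# Property names follow recent engine versions and may differ; values are examples.
import unreal

world = unreal.EditorLevelLibrary.get_editor_world()
cams = unreal.GameplayStatics.get_all_actors_of_class(world, unreal.CineCameraActor)

for cam in cams:
    comp = cam.get_component_by_class(unreal.CineCameraComponent)
    filmback = comp.get_editor_property("filmback")
    filmback.set_editor_property("sensor_width", 24.89)    # mm, example Super 35 width
    filmback.set_editor_property("sensor_height", 14.00)   # mm, keeps a ~16:9 aspect
    comp.set_editor_property("filmback", filmback)
    comp.set_editor_property("current_focal_length", 35.0) # mm, example prime
```

Lens distortion is the piece a snippet like this does not cover; it is usually handled through a separate lens calibration step.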
Lighting in an LED volume environment operates on different principles from those of traditional cinematography.
The LED walls themselves are massive light sources, requiring a completely rethought approach.
Practical Lighting Setup
Despite the LED walls' contribution, practical lighting remained essential:
Key Light
Soft LED panels provided directional key light, sculpting the actors' faces and adding dimension.
Fill Light
Subtle fill counteracted harsh LED spill and softened shadows.
Rim/Edge Light
Backlight separated actors from the background, preventing them from blending into the virtual environment.
Practical Fixtures
Physical light fixtures (desk lamps, overhead lights) added realism and motivated virtual lighting.
Virtual Lighting in Unreal
The virtual environment's lighting was carefully designed to complement practical lighting:
Directional Light (Sun/Key)
A virtual directional light simulated sunlight streaming through the station's skylights, motivated by the story's time of day.
Point Lights and Spotlights
Virtual light fixtures—overhead lamps, recessed floor lights—added atmosphere and depth to the scene.
Volumetric Fog
God rays and atmospheric haze added dimension, making light feel tangible.
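For readers curious what that looks like in scripting terms, the sketch below blocks in a virtual sun and height fog from editor Python. It is illustrative only: intensities, angles, and fog density are placeholder values, and the spawning and property APIs can differ between engine versions.

```python
# Sketch: blocking in the virtual lighting described above from editor Python.
# Intensity, rotation, and fog density values are placeholders, not our final look.
import unreal

# Virtual "sun" raking in through the station skylights.
sun = unreal.EditorLevelLibrary.spawn_actor_from_class(
    unreal.DirectionalLight, unreal.Vector(0, 0, 500), unreal.Rotator(0, -35, 40)
)
sun_comp = sun.get_component_by_class(unreal.DirectionalLightComponent)
sun_comp.set_editor_property("intensity", 8.0)        # placeholder value

# Atmospheric haze so the light shafts read on camera.
fog = unreal.EditorLevelLibrary.spawn_actor_from_class(
    unreal.ExponentialHeightFog, unreal.Vector(0, 0, 0)
)
fog_comp = fog.get_component_by_class(unreal.ExponentialHeightFogComponent)
fog_comp.set_editor_property("fog_density", 0.03)     # placeholder density
# Enabling the component's Volumetric Fog option is what produces the god rays.
```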
Behind every polished frame lies a chaotic, beautiful process of problem-solving, collaboration, and relentless iteration.
The Moments Between Takes
Between "Cut" and "Action," the real work happened.
Script supervisors reviewed continuity notes.
The VFX team tweaked light intensities in Unreal.
The gaffer adjusted a practical light's angle by a few degrees.
Actors hydrated and reviewed their blocking.
The director and DP discussed the next shot's emotional tone.
These moments—unseen in the final film—were where creative decisions crystallized.