“The Unity of Dual Essences”
Unreal Engine Scene Builder · Real-time Rendering Operator · On-Set Production
A mixed reality short film that blends virtual and physical worlds in real-time using Unreal Engine and LED volume technology.
The Unity of Dual Essences is a mixed-reality short film that blends virtual and physical worlds in real time using Unreal Engine and LED volume technology. The story unfolds across several settings, and I was responsible for Part 3: The Mountain Temple Scene, where virtual lighting and spatial atmosphere play a critical role in mood and narrative.
We used Live Link and real-time rendering workflows to bridge virtual scenes with live-action footage. The final output was captured in-camera, simulating a unified world with synchronized physical props, LED walls, and UE-rendered backdrops.
Crew members: Difei, Nan Yang, Wu Wei, Amanda Yerby
We aimed for a visual style that merges the neon-drenched aesthetics of a Cyberpunk city with the elegance and symbolism of traditional Chinese Huadan (花旦) opera characters. This juxtaposition of futuristic urban landscapes and classical theatrical elements creates a surreal sense of duality—both temporal and cultural. Through this contrast, we explore themes of identity, memory, and belonging across two parallel worlds: one of steel and circuits, the other of flowers and silk. The images selected reflect this hybrid atmosphere, inspired by 1980s city pop palettes, Blade Runner-esque environments, and the emotional lyricism in Makoto Shinkai’s cinematic compositions.
Scene Development
Virtual Production Workflow
We conducted a real-time virtual production shoot, seamlessly integrating physical performances with Unreal Engine-rendered environments. Using Live Link and a Blackmagic camera system, we captured the actor’s live performance while compositing it in real time against the UE-generated background. This allowed for immediate visualization of final shots and enabled real-time creative decisions on set.
To ensure coherence between the digital and physical elements, we carefully designed the set with foreground props (a vintage telephone, a bouquet, and a tent) that matched their virtual counterparts. These physical elements anchored the actors in the scene and strengthened the sense of immersion.
We implemented synchronized lighting strategies using adjustable LED light beams that mirrored the virtual scene’s lighting setup. This alignment ensured consistent shadows, color temperature, and overall atmosphere between the real and virtual components, enhancing visual believability.
As part of the on-set technical workflow, I operated the Unreal Engine Listener to manage Open Sound Control (OSC) communication between local UE projects, the rendering workstation, and the camera tracking system. Upon initializing the session, I connected the Listener to the active Unreal project file on my local virtual desktop. This connection enabled synchronized communication with the render node responsible for driving content to the LED wall in real time.
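To illustrate the OSC traffic described above, here is a minimal, dependency-free sketch of the OSC 1.0 message layout that tools like the UE Listener exchange. The address `/light/intensity` and the argument values are hypothetical examples for illustration, not the project's actual OSC mappings:

```python
import struct

def _pad(b: bytes) -> bytes:
    # OSC strings are null-terminated and padded to a 4-byte boundary
    return b + b"\x00" * (4 - len(b) % 4)

def osc_message(address: str, *args) -> bytes:
    """Encode a minimal OSC 1.0 message (float, int, and string args only)."""
    tags = ","          # type tag string always starts with a comma
    payload = b""
    for a in args:
        if isinstance(a, float):
            tags += "f"
            payload += struct.pack(">f", a)   # 32-bit big-endian float
        elif isinstance(a, int):
            tags += "i"
            payload += struct.pack(">i", a)   # 32-bit big-endian int
        elif isinstance(a, str):
            tags += "s"
            payload += _pad(a.encode())
        else:
            raise TypeError(f"unsupported OSC argument type: {type(a)}")
    return _pad(address.encode()) + _pad(tags.encode()) + payload

# Hypothetical example: a lighting trigger sent to the render node
msg = osc_message("/light/intensity", 0.75)
```

In practice such a datagram would be sent over UDP to the render node's OSC port; the encoding above is the wire format, independent of transport.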
I also ensured stable integration with the Live Link pipeline, which was connected to a Blackmagic camera rig. This setup allowed the camera feed and positional data to be streamed directly into Unreal, enabling accurate compositing and real-time visualization from the correct perspective. This real-time feedback loop was crucial for aligning physical camera motion with virtual camera perspectives, allowing our team to validate shots on set and make immediate adjustments.
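The per-frame data flowing through that feedback loop can be sketched as a simple transform packet. To be clear, this is not Epic's actual Live Link wire format (Live Link uses its own plugin protocol inside Unreal); it is only an illustration of the kind of frame-stamped camera transform a tracking system streams each frame, with a placeholder host and port:

```python
import json
import socket
import time

def make_camera_packet(frame: int, location, rotation) -> bytes:
    """Serialize one frame of camera tracking data as JSON.

    Illustrative only -- NOT Live Link's real wire format. `location`
    is (x, y, z) and `rotation` is (pitch, yaw, roll), mirroring how
    Unreal describes transforms."""
    return json.dumps({
        "frame": frame,
        "timestamp": time.time(),
        "location": {"x": location[0], "y": location[1], "z": location[2]},
        "rotation": {"pitch": rotation[0], "yaw": rotation[1], "roll": rotation[2]},
    }).encode()

# Hypothetical usage: stream one packet to a render node on UDP 54321
# sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
# sock.sendto(make_camera_packet(1, (0.0, 0.0, 180.0), (0.0, 90.0, 0.0)),
#             ("192.168.1.10", 54321))
```

The point of the sketch is the shape of the loop: every rendered frame consumes the most recent transform packet, so the virtual camera tracks the physical one with at most one frame of lag.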
Throughout the session, I monitored OSC responses to verify input stability, adjusted parameters as needed for lighting or animation triggers, and collaborated closely with the rendering team to maintain frame synchronization and minimize latency across machines.
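A minimal sketch of the kind of round-trip check used for that latency monitoring, assuming a node that echoes UDP datagrams back (the echo behavior and port are hypothetical, not part of the actual pipeline):

```python
import socket
import struct
import time

def measure_round_trip_ms(host: str, port: int, timeout: float = 0.5):
    """Send a timestamped UDP datagram and measure round-trip latency.

    Assumes the remote node echoes the datagram back -- a hypothetical
    setup used only to illustrate on-set latency monitoring. Returns
    latency in milliseconds, or None if no reply arrives in time."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.settimeout(timeout)
    try:
        sent = time.perf_counter()
        sock.sendto(struct.pack(">d", sent), (host, port))
        sock.recvfrom(64)  # wait for the echoed datagram
        return (time.perf_counter() - sent) * 1000.0
    except OSError:        # timed out or node unreachable
        return None
    finally:
        sock.close()
```

Sampling this periodically makes latency spikes visible before they show up as a visible offset between the physical camera move and the LED-wall background.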
Final Comp