Metaverse Virtual Production Platform

Where reality and the digital realm intersect: using the latest PBR rendering, ray-traced chroma keying, and intelligent camera tracking, we create metaverse visual effects that transcend the real world.

01. INTEGRATION

Camera tracking and shooting, virtual aerial photography, and centralized control of multi-camera switching.

The software UI is designed to meet the requirements of professional broadcast live production. Directing, title graphics, packaging, shooting control, and lighting control are all laid out together on the top-level interface, so one person can smoothly run program production.

Streamlined Directing Workflow

Real-time chroma keying across multiple cameras: each camera can simulate at least 27 independent virtual positions, and these positions can be linked into independent motion trajectories.
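
As a rough illustration of how stored virtual positions can be chained into a motion trajectory (the pose fields and interpolation below are hypothetical, not the platform's actual API):

```python
from dataclasses import dataclass
from typing import Iterator

@dataclass
class VirtualPosition:
    """A stored virtual camera pose for one chroma-keyed camera (illustrative only)."""
    pan: float   # degrees
    tilt: float  # degrees
    zoom: float  # focal length in mm
    x: float     # virtual-space position in metres
    y: float
    z: float

def lerp(a: float, b: float, t: float) -> float:
    """Linear interpolation between two scalar values."""
    return a + (b - a) * t

def trajectory(p0: VirtualPosition, p1: VirtualPosition, steps: int) -> Iterator[VirtualPosition]:
    """Yield intermediate poses so two presets form one smooth virtual camera move."""
    for i in range(steps + 1):
        t = i / steps
        yield VirtualPosition(
            lerp(p0.pan, p1.pan, t), lerp(p0.tilt, p1.tilt, t), lerp(p0.zoom, p1.zoom, t),
            lerp(p0.x, p1.x, t), lerp(p0.y, p1.y, t), lerp(p0.z, p1.z, t),
        )
```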

Seamless connection between camera tracking and virtual aerial photography

On the software director console, the camera tracking process is assigned to different camera-position switching points, which control the tracking behavior of each tracked camera position.

At the same time, the active camera can be given independent virtual positions to achieve wide-range aerial shots, and the two can be combined for one-click, seamless stitching from an aerial shot into tracked shooting.

The advantage is that no camera operator is needed at any point in the process, and each camera position can be restored with one click.

Hardware Switching Panel

Tailored so that hardware controls correspond exactly to software functions. The large display can be used for monitoring during switching, split into multiple views, or used to preview the output feed.

LED backlighting built into the buttons highlights the current switching status of each target, making it easier for program producers to operate in low light.

02. STEP TIMELINE

Intelligent Studio Control Based on a Stepping Timeline

In the stepping-timeline design, each track is assigned to an independently controlled device, and a keyframe on the timeline is defined as a SHOT.

As a sequence-control unit, a SHOT drives the synchronized operation of the different device tracks. You can step back and forth between SHOTs, or jump to any SHOT with one key, which resets the contents of every device track to that SHOT.

In addition, a single item on a device track can span multiple sub-SHOTs, allowing the on-air duration and switching points of each device track to be scheduled flexibly.
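
As a rough illustration of this data model (all class and field names below are hypothetical, not the platform's actual API), here is a minimal Python sketch of device tracks whose items can span several sub-SHOTs, with a one-key jump that resets every track:

```python
from dataclasses import dataclass, field
from typing import Dict, List, Optional

@dataclass
class TrackItem:
    """One piece of content on a device track; it may span several sub-SHOTs."""
    content: str
    start_shot: int   # index of the sub-SHOT where it begins
    end_shot: int     # index of the sub-SHOT where it ends

@dataclass
class DeviceTrack:
    """A timeline track bound to one independently controlled device."""
    device: str                              # e.g. "camera_1", "lighting", "graphics"
    items: List[TrackItem] = field(default_factory=list)

    def state_at(self, shot_index: int) -> Optional[TrackItem]:
        """Return the item that should be active at the given SHOT, if any."""
        for item in self.items:
            if item.start_shot <= shot_index <= item.end_shot:
                return item
        return None

@dataclass
class SteppingTimeline:
    """SHOT-driven timeline: jumping to a SHOT resets every device track to it."""
    tracks: Dict[str, DeviceTrack] = field(default_factory=dict)
    current_shot: int = 0

    def goto_shot(self, shot_index: int) -> Dict[str, Optional[TrackItem]]:
        """Jump to any SHOT and return the content each device track resets to."""
        self.current_shot = shot_index
        return {name: track.state_at(shot_index) for name, track in self.tracks.items()}
```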

Intelligent camera control

Easily control camera pan/tilt heads, camera tracks, and jib arms.

Online packaging and title graphics control

Packaging and title graphics are assigned to separate device tracks, with corresponding execution modes under each sub-SHOT.

Precisely control the on-screen entry and exit points of packaging and title graphics.

Packaging and title graphics can stay on air across multiple SHOTs, giving them multi-SHOT life cycles.

The packaging and title-graphics tracks each have independent switching points, so packaging and title graphics can be controlled independently within the same life cycle.
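
Continuing the hypothetical timeline sketch above, a title-graphics item that stays on air across several SHOTs simply spans those sub-SHOT indices, while its entry and exit points live on its own track rather than on the SHOTs:

```python
timeline = SteppingTimeline()
timeline.tracks["graphics"] = DeviceTrack(
    device="graphics",
    # A lower-third that enters at SHOT 2 and exits after SHOT 5,
    # independent of when the camera or packaging tracks switch.
    items=[TrackItem(content="lower_third_guest_name", start_shot=2, end_shot=5)],
)
print(timeline.goto_shot(3))  # the graphics item is still on air at SHOT 3
print(timeline.goto_shot(6))  # the graphics track has already cleared by SHOT 6
```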

03. PRODUCTION WORKFLOW

Integrating the PBR engine into the workflow

Using the hyper-realistic 3D rendering of the "meta" rendering engine, movable objects, character animation, lighting effects, particle effects, and camera views can be displayed and controlled frame by frame inside the engine.

External video sources and data sources (such as weather or financial feeds) can be fed into the rendering engine in real time and turned into live 3D graphics such as bar charts, pie charts, and line charts, meeting the data-visualization needs of program production.
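
As a rough illustration of wiring an external data feed into a render engine for live charts (the endpoint URL, message format, and engine interface below are hypothetical; a real integration would go through the engine's own SDK or network protocol):

```python
import json
import time
import urllib.request

def fetch_series(url: str) -> list:
    """Pull a numeric series (e.g. weather or financial data) from an external feed."""
    with urllib.request.urlopen(url, timeout=5) as resp:
        return json.load(resp)["values"]   # assumes the feed returns {"values": [...]}

def bar_chart_message(values: list) -> dict:
    """Describe a 3D bar chart in a simple message format (illustrative only)."""
    return {
        "type": "bar_chart_3d",
        "bars": [{"index": i, "height": v} for i, v in enumerate(values)],
        "timestamp": time.time(),
    }

# Example polling loop (placeholder URL and transport):
# while True:
#     msg = bar_chart_message(fetch_series("https://example.com/api/weather"))
#     engine_socket.send(json.dumps(msg).encode())   # engine_socket is hypothetical
#     time.sleep(1.0)
```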

PBR performance of U-Meta Engine

From material textures and roughness adjustment to light absorption, reflection, and Fresnel angle settings, physically based rendering delivers a highly realistic 3D scene experience.
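
For reference, physically based renderers commonly approximate the Fresnel reflectance term with Schlick's approximation (a standard formula, not necessarily the U-Meta Engine's exact implementation), where $F_0$ is the material's reflectivity at normal incidence and $\theta$ is the angle between the view direction and the surface normal:

$$F(\theta) = F_0 + (1 - F_0)\,(1 - \cos\theta)^5$$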

Particle Effects

The rendering engine provides a rich library of particle effects.

Particle and material controls can be arranged on the stepping timeline and set independently.

VR Insertion

VR-inserted content is controlled by the rendering engine: it not only keeps the correct perspective relationship with the presenter and the 3D scene, but also supports effects such as light-and-shadow interaction and reflections.

04. CAMERA STABILIZATION AND ANTI-SHAKE

Computer vision and image recognition

An in-house real-time ArUco pose-estimation algorithm detects the pose of the ArUco markers relative to the camera on every frame, computes the camera's real-time pose, and intelligently fuses it with mechanical sensor data to stabilize the camera and suppress shake in real time.

Accurately locating ArUco markers

Multiple sets of ArUco binary square marker matrices are used to determine the camera's shooting range and related parameters such as deformation, displacement, scaling, and pose, providing fast measurement and feedback. The core of stable anti-shake is to eliminate shake noise while processing and matching the changing data, so that the virtual scene stays synchronized and stably tracked.
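
A minimal sketch of marker-based camera pose estimation, assuming OpenCV 4.7+ with the ArUco module; the marker size, dictionary choice, and the simple blending weight used to fuse optical and mechanical-sensor poses are illustrative assumptions, not the platform's actual algorithm:

```python
import cv2
import numpy as np

MARKER_LENGTH = 0.10  # marker side length in metres (assumed)

# 3D corner coordinates of one square marker, centred at its own origin.
OBJ_POINTS = np.array([
    [-MARKER_LENGTH / 2,  MARKER_LENGTH / 2, 0],
    [ MARKER_LENGTH / 2,  MARKER_LENGTH / 2, 0],
    [ MARKER_LENGTH / 2, -MARKER_LENGTH / 2, 0],
    [-MARKER_LENGTH / 2, -MARKER_LENGTH / 2, 0],
], dtype=np.float32)

def estimate_camera_pose(frame, camera_matrix, dist_coeffs):
    """Detect ArUco markers in a frame and solve the camera pose from the first one."""
    dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
    detector = cv2.aruco.ArucoDetector(dictionary, cv2.aruco.DetectorParameters())
    corners, ids, _rejected = detector.detectMarkers(frame)
    if ids is None or len(corners) == 0:
        return None
    ok, rvec, tvec = cv2.solvePnP(OBJ_POINTS, corners[0].reshape(4, 2),
                                  camera_matrix, dist_coeffs)
    return (rvec, tvec) if ok else None

def fuse(optical_pose, sensor_pose, alpha=0.8):
    """Blend optical and mechanical-sensor poses with a simple complementary
    filter; the fixed weight alpha is a placeholder for the real fusion step."""
    return tuple(alpha * o + (1.0 - alpha) * s for o, s in zip(optical_pose, sensor_pose))
```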

Tracking Effect

The camera's movement pose and the actual tracking result.

With the support of this stable tracking system, any camera-control device can be used, such as a robotic arm, a track robot, or a camera support system.

Motion tracking on a handheld PTZ rig is the most demanding tracking environment.

05. INTELLIGENT I/O CARD

Broadcast- and cinema-grade 4K, with multi-channel 12G-SDI input and output

The I/O card's built-in embedded system supports edge computing: it can capture and play back multiple input signals simultaneously, which facilitates later editing and refinement and minimizes the risk of switching errors.


Dual rendering engine

Efficient, low-latency processing: efficient image-processing and compression algorithms improve rendering speed and enable real-time dual rendering in complex scenes.

Dual channel real-time output

An independent rendering host simultaneously processes PGM and PVW, delivering real-time dual-channel HD-SDI output.