CN219802409U - XR virtual film-making real-time synthesis system - Google Patents

XR virtual film-making real-time synthesis system

Info

Publication number
CN219802409U
CN219802409U (application CN202320705680.1U)
Authority
CN
China
Prior art keywords
real-time, camera, virtual, screen
Prior art date
Legal status
Active
Application number
CN202320705680.1U
Other languages
Chinese (zh)
Inventor
徐筱烨
马帅威
雷华根
Current Assignee
Guangxi Yucheng Media Technology Co., Ltd.
Original Assignee
Guangxi Yucheng Media Technology Co., Ltd.
Priority date
2023-04-03
Filing date
2023-04-03
Publication date
2023-10-03
Application filed by Guangxi Yucheng Media Technology Co., Ltd.
Priority to CN202320705680.1U
Application granted
Publication of CN219802409U

Landscapes

  • Processing Or Creating Images (AREA)

Abstract

The utility model provides an XR virtual film-making real-time synthesis system comprising a real camera and a virtual camera. The real camera is connected to a camera tracking system, and the real camera and the virtual camera are jointly connected to a synchronous phase-locked (genlock) system. The real camera is arranged in a performance area, in which a light simulation screen and a real-time effect display screen are arranged. The real camera and the virtual camera are jointly connected to a real-time rendering server, which is connected to a broadcast control server. The broadcast control server is connected to a light synchronization system and a sound synchronization system, and also to an LED screen processor; the LED screen processor is connected to an LED screen controller, and the LED screen controller is connected to an LED display screen. The utility model has the advantage that the LED display screen presents images covering a large field of view, giving the audience a degree of immersion that a single flat video cannot achieve.

Description

XR virtual film-making real-time synthesis system
Technical Field
The utility model relates to the technical field of XR, in particular to an XR virtual film-making real-time synthesis system.
Background
Extended reality (XR) refers to computer-created virtual environments that combine the real and the virtual and support human-machine interaction; it is also a collective term for technologies such as AR, VR, and MR. By integrating the visual interaction technologies of all three, XR gives the experiencer a sense of "immersion" with seamless transition between the virtual world and the real world.
Current-stage XR production generally relies on green-screen shooting: a uniformly green shooting space is built, actors perform in it without real props, and the footage is post-processed, with the solid-colour background removed by chroma keying and a virtual background and special effects added, so that virtual and real elements are combined. However, shooting against a solid-colour background makes the ambient light in the shooting environment unnatural, and reflective objects in the scene introduce stray light into the footage, which post-production staff must spend considerable effort to remove. In addition, performing without real props requires actors to keep track of prop positions and imagine their actions, which is difficult and inefficient. An XR virtual production real-time synthesis system is therefore proposed as an improvement.
Disclosure of Invention
The object of the present utility model is to solve at least one of the technical drawbacks described above.
To this end, an object of the present utility model is to provide an XR virtual production real-time synthesis system that solves the problems mentioned in the background art and overcomes the shortcomings of the prior art.
To achieve the above object, an embodiment of one aspect of the present utility model provides an XR virtual production real-time synthesis system including a real camera and a virtual camera. The real camera is connected to a camera tracking system, and the real camera and the virtual camera are jointly connected to a synchronous phase-locked (genlock) system. The real camera is arranged in a performance area, in which a light simulation screen and a real-time effect display screen are arranged. The real camera and the virtual camera are jointly connected to a real-time rendering server, which is connected to a broadcast control server. The broadcast control server is connected to a light synchronization system and a sound synchronization system, and also to an LED screen processor; the LED screen processor is connected to an LED screen controller, and the LED screen controller is connected to an LED display screen.
In any of the above schemes, it is preferable that the real camera records video of the performance area and the virtual camera performs three-dimensional laser scanning of the performance props.
In any of the above schemes, it is preferable that the camera tracking system tracks the pose of the real camera, and the genlock system sends a synchronization signal to the real camera and the virtual camera.
With this technical scheme: the real camera shoots the prop-free performance in the performance area; the virtual camera scans the performance props to obtain an image environment that replaces the image background of the prop-free performance, and this environment, together with the actors' movements recorded by the real camera, forms the finished footage. The camera tracking system acquires the pose data of the real camera and sends the camera's spatial position data to the server. The genlock system is connected to the camera tracking system, the real camera, and the virtual camera, and sends synchronization signals to all three.
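For illustration only, the following minimal Python sketch (not part of the utility model; the field names, the helper objects, and the 25 fps sync rate are all our assumptions) shows the kind of pose packet such a tracking system might forward, and how a genlock-derived frame number keeps the real and virtual cameras on the same frame:

```python
import time
from dataclasses import dataclass

# Hypothetical tracking packet; field names are illustrative, not from the patent.
@dataclass
class CameraPose:
    frame_id: int                           # genlock frame count shared by both cameras
    position: tuple[float, float, float]    # (x, y, z) in studio space, metres
    rotation: tuple[float, float, float]    # (pan, tilt, roll) in degrees
    focal_length_mm: float                  # lens data so the virtual camera matches the real one

GENLOCK_FPS = 25  # assumed house sync rate

def current_frame(start_time: float) -> int:
    """Derive the shared frame number from the genlock clock."""
    return int((time.time() - start_time) * GENLOCK_FPS)

def send_pose(tracker, server, start_time: float) -> None:
    """Forward the tracked pose of the real camera to the rendering server,
    which applies the same pose to the virtual camera for that frame.
    `tracker` and `server` are hypothetical stand-ins for the hardware."""
    pose = CameraPose(frame_id=current_frame(start_time),
                      position=tracker.position(),
                      rotation=tracker.rotation(),
                      focal_length_mm=tracker.focal_length())
    server.apply_to_virtual_camera(pose)
```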
In any of the above schemes, it is preferable that the light simulation screen surrounds the performance area and the real-time effect display screen is arranged in front of the performance area.
In any of the above schemes, it is preferable that the real-time rendering server uses one of the Notch, Unity3D, or Unreal 4 engines.
With this technical scheme: the real camera shoots the actors' prop-free performance in the performance area to obtain footage of their movements. The light simulation screen uses an LED screen instead of a pure green background to provide a lighting environment for the prop-free performance; because it reproduces natural light better, the lighting in the performance environment is more natural. The real-time effect display screen provides the performers with a real-time performance image: the real-time rendering server composites and renders the images transmitted by the real camera and the virtual camera, and the resulting image is projected onto the real-time effect display screen so that performers can check themselves, which effectively improves performance quality and, in turn, shooting efficiency.
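A minimal sketch of the per-frame composite such a rendering server performs, assuming NumPy arrays for the camera feeds and a per-pixel matte isolating the performers (the feed objects and the matte source are our illustrative stand-ins, not details from the utility model):

```python
import numpy as np

def composite_frame(real_frame: np.ndarray,
                    virtual_frame: np.ndarray,
                    matte: np.ndarray) -> np.ndarray:
    """Blend the real-camera layer over the virtual background for one frame.
    matte is a per-pixel alpha in [0, 1] isolating the performers."""
    alpha = matte[..., None]  # H x W -> H x W x 1, broadcast over RGB channels
    out = alpha * real_frame + (1.0 - alpha) * virtual_frame
    return out.astype(np.uint8)

# Per-frame use (the feed objects are hypothetical stand-ins):
#   frame = composite_frame(real_cam.read(), renderer.render(pose), keyer.matte())
#   effect_display.show(frame)   # performer self-check
#   playout.send(frame)          # on to the broadcast control server
```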
In any of the above schemes, it is preferable that the broadcast control server transmits image information to the LED screen processor and projects the image onto the real-time effect display screen.
Preferably, in any of the above schemes, the LED display screen is an immersive five-sided LED display screen.
With this technical scheme: the broadcast control server controls the playout program. In cooperation with the light synchronization system and the sound synchronization system, it matches the camera footage with the corresponding light and sound and transmits the resulting file to the LED screen processor; the LED screen processor processes the image file and passes it to the LED screen controller, which drives the LED display screen to play it. The immersive five-sided LED display screen presents images covering at least 120 degrees (horizontal) x 70 degrees (vertical) of the human field of view, so viewers receive surrounding, multidirectional audiovisual information from where they stand and experience a degree of immersion that a single flat video cannot provide.
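As a rough plausibility check on that 120 x 70 degree figure, the angle a flat wall subtends for a centred viewer is 2*atan(size / (2*distance)). The sketch below uses screen dimensions and a viewing distance we have assumed purely for illustration (the patent specifies none); the side, ceiling, and floor panels of a five-sided volume extend coverage well beyond this single front wall:

```python
import math

def subtended_angle_deg(size_m: float, distance_m: float) -> float:
    """Viewing angle subtended by a flat screen of the given extent
    for a viewer centred at the given distance from it."""
    return math.degrees(2 * math.atan(size_m / (2 * distance_m)))

# Assumed dimensions, purely for illustration: a 12 m wide x 5 m tall front
# wall viewed from 3 m already exceeds the 120 (H) x 70 (V) degree target.
print(subtended_angle_deg(12.0, 3.0))  # ~126.9 degrees horizontal
print(subtended_angle_deg(5.0, 3.0))   # ~79.6 degrees vertical
```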
Compared with the prior art, the utility model has the following advantages and beneficial effects:
1. By placing a real-time effect display screen connected to the broadcast control server in front of the performance area, this XR virtual production real-time synthesis system lets performers check themselves on the screen while performing and correct elements such as movements and expressions during the performance. This makes the performance better, improves recording efficiency, and reduces the manpower, resources, and time consumed during recording.
2. By placing a light simulation screen around the performance area, this XR virtual production real-time synthesis system projects light onto the performance area as required, simulating real ambient light as closely as possible. The lighting of the composited footage is therefore more natural, which also lowers the difficulty of post-processing and reduces the time it takes. By adopting an immersive five-sided LED display screen, the displayed image covers a large field of view, so the audience receives surrounding, multidirectional audiovisual information from where they stand and experiences a degree of immersion that a single flat video cannot provide.
Additional aspects and advantages of the utility model will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the utility model.
Drawings
The foregoing and/or additional aspects and advantages of the utility model will become apparent and may be better understood from the following description of embodiments taken in conjunction with the accompanying drawings in which:
FIG. 1 is a schematic flow chart of the present utility model.
Detailed Description
Embodiments of the present utility model are described in detail below, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to like or similar elements or elements having like or similar functions throughout. The embodiments described below by referring to the drawings are illustrative and intended to explain the present utility model and should not be construed as limiting the utility model.
In the present utility model, unless explicitly specified and limited otherwise, the terms "mounted," "connected," "secured," and the like are to be construed broadly: a connection may be fixed, detachable, or integral; mechanical or electrical; direct, or indirect through an intermediate medium; or a communication between two elements. The specific meaning of these terms in the present utility model can be understood by those of ordinary skill in the art according to the specific circumstances.
As shown in FIG. 1, the utility model comprises a real camera 1 and a virtual camera 2. The real camera 1 is connected to a camera tracking system 3, and the real camera 1 and the virtual camera 2 are jointly connected to a genlock system 4. The real camera 1 is arranged in a performance area 5, in which a light simulation screen 6 and a real-time effect display screen 7 are arranged. The real camera 1 and the virtual camera 2 are jointly connected to a real-time rendering server 8, which is connected to a broadcast control server 9. The broadcast control server 9 is connected to a light synchronization system 10 and a sound synchronization system 11, and also to an LED screen processor 12; the LED screen processor 12 is connected to an LED screen controller 13, and the LED screen controller 13 is connected to an LED display screen 14.
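The connection topology of FIG. 1 can be captured as a simple adjacency map; the sketch below is our own restatement of the wiring just described (the data structure and helper are illustrative, not part of the utility model):

```python
# The wiring of FIG. 1 as an adjacency map: each key feeds the components it
# lists. Labels follow the reference numerals above; the structure is ours.
SYSTEM_GRAPH: dict[str, list[str]] = {
    "camera_tracking_system_3":     ["real_camera_1"],
    "genlock_system_4":             ["real_camera_1", "virtual_camera_2"],
    "real_camera_1":                ["real_time_rendering_server_8"],
    "virtual_camera_2":             ["real_time_rendering_server_8"],
    "real_time_rendering_server_8": ["broadcast_control_server_9"],
    "broadcast_control_server_9":   ["light_sync_system_10",
                                     "sound_sync_system_11",
                                     "led_screen_processor_12",
                                     "real_time_effect_display_screen_7"],
    "led_screen_processor_12":      ["led_screen_controller_13"],
    "led_screen_controller_13":     ["led_display_screen_14"],
}

def downstream(component: str) -> list[str]:
    """Components fed directly by the given component."""
    return SYSTEM_GRAPH.get(component, [])
```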
Example 1: The real camera 1 records video of the performance area 5, and the virtual camera 2 performs three-dimensional laser scanning of the performance props. The camera tracking system 3 tracks the pose of the real camera 1, and the genlock system 4 sends a synchronization signal to the real camera 1 and the virtual camera 2. The real camera 1 shoots the prop-free performance in the performance area 5; the virtual camera 2 scans the performance props to obtain an image environment that replaces the image background of the prop-free performance, and this environment, together with the actors' movements recorded by the real camera 1, forms the finished footage. The camera tracking system 3 acquires the pose data of the real camera 1 and sends the camera's spatial position data to the server. The genlock system 4 is connected to the camera tracking system 3, the real camera 1, and the virtual camera 2, and sends synchronization signals to all three.
Example 2: The light simulation screen 6 surrounds the performance area 5, and the real-time effect display screen 7 is arranged in front of the performance area 5. The real-time rendering server 8 uses one of the Notch, Unity3D, or Unreal 4 engines. The performance area 5 is where the actors give their prop-free performance, and the real camera 1 shoots it to obtain footage of the actors' movements. The light simulation screen 6 uses an LED screen instead of a pure green background to provide a lighting environment for the prop-free performance; because it reproduces natural light better, the lighting in the performance environment is more natural. The real-time effect display screen 7 provides the performers with a real-time performance image: the real-time rendering server 8 composites and renders the images transmitted by the real camera 1 and the virtual camera 2, and the resulting image is projected onto the real-time effect display screen 7 so that performers can check themselves, which effectively improves performance quality and, in turn, shooting efficiency.
Example 3: The broadcast control server 9 transmits the image information to the LED screen processor 12 and projects the image onto the real-time effect display screen 7. The LED display screen 14 is an immersive five-sided LED display screen. The broadcast control server 9 controls the playout program: in cooperation with the light synchronization system 10 and the sound synchronization system 11, it matches the camera footage with the corresponding light and sound and transmits the resulting file to the LED screen processor 12; the LED screen processor 12 processes the image file and passes it to the LED screen controller 13, which drives the LED display screen 14 to play it. The immersive five-sided LED display screen 14 presents images covering at least 120 degrees (horizontal) x 70 degrees (vertical) of the human field of view, so viewers receive surrounding, multidirectional audiovisual information from where they stand and experience a degree of immersion that a single flat video cannot provide.
The working principle of the utility model is as follows:
s1, shooting a performance area 5 by using a real camera 1, and performing three-dimensional laser scanning on performance props by using a virtual camera 2. Simulating real environment light to perform lamplight projection on the performance area 5 by using a lamplight simulation screen 6;
s2, the performer performs non-physical performance in the performance area 5, the real camera 1 and the virtual camera 2 transmit the obtained images to the real-time rendering server 8, and the real-time rendering server 8 renders pictures and transmits the pictures to the playing control server 9. Under the cooperation of the light synchronization system 10 and the sound synchronization system 11, the broadcasting control server 9 matches corresponding light and sound with the photographing picture and transmits a result file to the LED screen body processor 12 and the real-time effect display screen 7;
s3, the performer can perform self-checking according to the images displayed by the real-time effect display screen 7. The completed image file is played on the LED display screen 14 through the LED screen controller 13.
Compared with the prior art, the utility model has the following beneficial effects:
1. By placing a real-time effect display screen 7 connected to the broadcast control server 9 in front of the performance area 5, this XR virtual production real-time synthesis system lets performers check their performance on the screen while performing in the performance area 5 and correct elements such as movements and expressions during the performance. This makes the performance better, improves recording efficiency, and reduces the manpower, resources, and time consumed during recording.
2. By placing a light simulation screen 6 around the performance area 5, this XR virtual production real-time synthesis system uses the screen to project light onto the performance area 5 as required, simulating real ambient light as closely as possible. The lighting of the composited footage is therefore more natural, which also lowers the difficulty of post-processing and reduces the time it takes. By adopting an immersive five-sided LED display screen, the LED display screen 14 presents images covering a large field of view, so the audience receives surrounding, multidirectional audiovisual information from where they stand and experiences a degree of immersion that a single flat video cannot provide.

Claims (7)

1. An XR virtual film-making real-time synthesis system comprising a real camera (1) and a virtual camera (2); characterized in that the real camera (1) is connected to a camera tracking system (3); the real camera (1) and the virtual camera (2) are jointly connected to a synchronous phase-locked system (4); the real camera (1) is arranged in a performance area (5), in which a light simulation screen (6) and a real-time effect display screen (7) are arranged; the real camera (1) and the virtual camera (2) are jointly connected to a real-time rendering server (8); the real-time rendering server (8) is connected to a broadcast control server (9); the broadcast control server (9) is connected to a light synchronization system (10) and a sound synchronization system (11); the broadcast control server (9) is connected to an LED screen processor (12); the LED screen processor (12) is connected to an LED screen controller (13); and the LED screen controller (13) is connected to an LED display screen (14).
2. The XR virtual production real-time synthesis system of claim 1, wherein: the real camera (1) records video of the performance area (5), and the virtual camera (2) performs three-dimensional laser scanning of the performance props.
3. The XR virtual production real-time synthesis system of claim 2, wherein: the camera tracking system (3) tracks the posture of the real camera (1), and the synchronous phase-locked system (4) sends synchronous signals to the real camera (1) and the virtual camera (2).
4. The XR virtual production real-time synthesis system of claim 3, wherein: the light simulation screen (6) surrounds the performance area (5), and the real-time effect display screen (7) is arranged in front of the performance area (5).
5. The XR virtual production real-time synthesis system of claim 4, wherein: the real-time rendering server (8) uses one of the Notch, Unity3D, or Unreal 4 engines.
6. The XR virtual production real-time synthesis system of claim 5, wherein: the broadcasting control server (9) transmits image information to the LED screen body processor (12) and projects an image picture to the real-time effect display screen (7).
7. The XR virtual production real-time synthesis system of claim 6, wherein: the LED display screen (14) adopts an immersion type five-sided LED display screen.
CN202320705680.1U 2023-04-03 2023-04-03 XR virtual film-making real-time synthesis system Active CN219802409U (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202320705680.1U CN219802409U (en) 2023-04-03 2023-04-03 XR virtual film-making real-time synthesis system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202320705680.1U CN219802409U (en) 2023-04-03 2023-04-03 XR virtual film-making real-time synthesis system

Publications (1)

Publication Number Publication Date
CN219802409U 2023-10-03

Family

ID=88182163

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202320705680.1U Active CN219802409U (en) 2023-04-03 2023-04-03 XR virtual film-making real-time synthesis system

Country Status (1)

Country Link
CN (1) CN219802409U (en)


Legal Events

Date Code Title Description
GR01 Patent grant