CN109639933B - Method and system for making 360-degree panoramic program in virtual studio

Info

Publication number
CN109639933B
CN109639933B (application CN201811497814.5A)
Authority
CN
China
Prior art keywords
virtual
panoramic
studio
program
dimensional scene
Prior art date
Legal status
Expired - Fee Related
Application number
CN201811497814.5A
Other languages
Chinese (zh)
Other versions
CN109639933A (en)
Inventor
李智鹏
颜庆聪
胡彦雷
孙阳
周华
Current Assignee
Beijing Makemagic Technology Development Co ltd
Original Assignee
Beijing Makemagic Technology Development Co ltd
Priority date
Filing date
Publication date
Application filed by Beijing Makemagic Technology Development Co ltd filed Critical Beijing Makemagic Technology Development Co ltd
Priority to CN201811497814.5A
Publication of CN109639933A
Application granted
Publication of CN109639933B
Expired - Fee Related
Anticipated expiration

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00 Details of television systems
    • H04N5/222 Studio circuitry; Studio devices; Studio equipment
    • H04N5/262 Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects; Cameras specially adapted for the electronic generation of special effects
    • H04N5/268 Signal distribution or switching

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)

Abstract

The invention relates to a method and a system for making a 360-degree panoramic program in a virtual studio. The system comprises two locators, a VR head display, two controllers, at least one camera, and a virtual studio host. The method comprises: setting the movement range of the VR head display and the controllers by using the locators; designing a 360-degree panoramic virtual three-dimensional scene according to the program type; previewing the rendered 360-degree panoramic virtual three-dimensional scene in the VR head display in real time; shooting a high-definition foreground signal with a camera under green-box (or blue-box) and studio lighting and, after image matting processing, inserting the foreground signal with its alpha channel into a host frame in the virtual three-dimensional scene; and compositing and rendering a panoramic video in real time and recording it to generate a panoramic program. The invention can produce professional panoramic programs and thereby enrich VR panoramic content; the scene provides usable scenery all around in the horizontal direction; and if the program needs to be modified, only the 360-degree panoramic virtual three-dimensional scene needs to be redesigned.

Description

Method and system for making 360-degree panoramic program in virtual studio
Technical Field
The invention relates to the technical field of digital audio and video production, in particular to a method and a system for producing a 360-degree panoramic program in a virtual studio.
Background
Despite the development of VR technology, panoramic content remains scarce. Panoramic material is usually shot with a panoramic camera and a wireless microphone and then assembled into a panoramic program in post-production. This approach places high demands on the shooting equipment, the shooting environment and the post-production skills involved; it has produced special panoramic shoots and field reports, but is rarely used in studios.
Panoramic shooting requires extensive preparation: the panoramic camera, the audio acquisition equipment, the mobile shooting rigs and so on must all be set up and debugged. Because a panoramic camera captures a full 360-degree picture, shooting parameters such as ISO, color temperature and exposure have to be adjusted to the on-site lighting so that the overall brightness and color temperature of the panorama reach the best effect. The relative position between the panoramic camera and the subject must also be arranged carefully so that the subject does not sit on the stitching seam between lenses. Panoramic cameras also have a safety distance, which usually requires the subject to be more than 1.5 meters from the camera. Even outside the safety distance, if the subject lies on a lens seam, the capture, warping and stitching steps inevitably introduce some distortion. The seam therefore exists objectively, cannot be completely eliminated, and cannot be repaired in post-editing.
In addition, live sound must be collected during panoramic shooting. Wireless microphones are typically used so that audio cables do not enter the picture, and microphone levels have to be tested and adjusted to a suitable range beforehand. The shooting environment is also demanding: unwanted clutter must be kept out of the picture on site, or a large number of virtual objects have to be inserted in post-editing to mask it.
A panoramic camera is generally used to shoot special programs or to make on-site reports, and the resulting panoramic program gives an immersive feeling when watched through a VR head display. However, this approach is not suitable for panoramic shooting in a studio, mainly for two reasons. First, the studio environment is complex, and a panoramic camera will capture the studio lights, the ordinary cameras and other clutter. Second, because a panoramic camera uses fixed-focus lenses, close-ups of people cannot be obtained by zooming, and local detail in the captured panoramic video is not very sharp. Panoramic programs shot with a panoramic camera in a live-action studio are therefore rarely seen, and the panoramic-shooting approach is easily limited by objective conditions.
Besides shooting in a live-action studio, another way to produce studio programs is virtual studio technology. Compared with a live-action studio, a virtual studio involves no rebuilding or redecorating of a physical set. A professional green box (or blue box) is used together with professional studio lighting, and an ordinary camera shoots a high-definition foreground signal. The foreground signal is keyed and composited with a virtual three-dimensional scene, the scene is rendered into an ordinary studio program, and the composited signal can be output directly or recorded. With this approach little clutter is introduced, and any that does exist, such as overhead lighting or the areas beyond the floor and the two side walls of the green box, can be removed with a software algorithm. A further advantage of the virtual studio is that if the program is revised, only the virtual three-dimensional scene needs to be redesigned, so the impact is small and the cost is low.
Virtual studio technology thus provides a practical way to produce studio programs: it keeps clutter out of the picture and gives good results, and revising a program costs far less than in a live-action studio. However, an ordinary virtual studio is designed to record flat programs; its virtual three-dimensional scenes are only partial sets, and the rest of the space is left open. A news scene, for example, usually contains an anchor desk and a large background screen: the anchor-desk set is used to record one- or two-person seated programs, the large-screen set is used to record standing presentations, and the two together can record interactive programs that combine sitting and standing. Such virtual three-dimensional scenes are designed for flat program production and cannot meet the requirements of producing a studio panoramic program.
To record a panoramic program on the virtual studio principle, two conditions must therefore be met. First, a 360-degree panoramic virtual three-dimensional scene must be designed; the scene is closed and provides rich scenery all around in the horizontal direction. Second, the 360-degree panoramic virtual three-dimensional scene must be previewed so that the size of the scene and the position of the viewpoint can be adjusted. Compositing the 360-degree panoramic virtual three-dimensional scene with the keyed foreground signal and rendering a panoramic video in real time avoids the stitching-seam problem of panoramic cameras. The sharpness of close-ups in the panoramic video can be guaranteed by adjusting the distance between the viewing viewpoint and the host frame into which the foreground signal is inserted, that is, by adjusting the shot presented by the virtual camera. Furthermore, any clutter carried in the foreground signal can be removed by an algorithm, so the shooting environment never enters the picture and no post-editing is needed.
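A rough calculation illustrates why moving the viewing viewpoint closer to the host frame preserves close-up sharpness. In the sketch below, the host-frame width, the viewing distances and the 3840-pixel panorama width are illustrative assumptions, not values taken from the invention.

```python
import math

def host_frame_pixels(frame_width_m: float, distance_m: float, pano_width_px: int = 3840) -> float:
    """Approximate number of horizontal panorama pixels covering a host frame of the given
    width seen from the given distance, in a panorama spanning 360 degrees horizontally."""
    subtended_deg = math.degrees(2 * math.atan(frame_width_m / (2 * distance_m)))
    return subtended_deg * pano_width_px / 360.0

# A 3.2 m wide host frame (assumed size); closer viewpoints give the foreground more pixels.
for d in (6.0, 4.0, 2.5):
    print(f"viewpoint at {d} m -> ~{host_frame_pixels(3.2, d):.0f} px across the host frame")
```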
In summary, a panoramic camera is not suitable for shooting in a studio: the stitching seams exist objectively, local picture sharpness is low, and clutter entering the picture cannot currently be overcome or avoided. Producing the panoramic program on the virtual studio principle instead, with a 360-degree panoramic virtual three-dimensional scene, the foreground signal shot by an ordinary camera is keyed, composited with the virtual three-dimensional scene, rendered in real time as a panoramic video and recorded, which solves the problems panoramic shooting cannot. Of course, the rendered 360-degree panoramic virtual three-dimensional scene must be previewed with VR positioning and display equipment to complete the debugging work before the program is recorded. This provides a complete and feasible technical scheme for producing panoramic programs in a studio.
Disclosure of Invention
To solve the above problems, the invention relates to a method and a system that use VR positioning technology together with a 360-degree panoramic virtual three-dimensional scene to make panoramic programs in a studio, providing a fast solution for producing VR panoramic content. A 360-degree panoramic virtual three-dimensional scene is designed according to the program type. VR spatial positioning equipment is used to set the movement range of the VR head display and the controllers, and by wearing the VR head display the rendered 360-degree panoramic virtual three-dimensional scene can be previewed in real time. Under professional green-box (or blue-box) and studio lighting, the high-definition foreground signal shot by the camera is keyed and inserted into a host frame in the virtual three-dimensional scene, composited and rendered into a panoramic video in real time, and recorded to generate a panoramic program. In this way large amounts of studio panoramic material can be produced and edited together with panoramic material shot outdoors, producing professional panoramic programs and enriching VR panoramic content.
Specifically, according to one aspect of the present invention, there is provided a system for 360-degree panoramic program production in a virtual studio, comprising:
two locators, a VR head display, two controllers, at least one camera, and a virtual studio host;
wherein the VR head display is connected to the virtual studio host through a USB cable and an HDMI cable, and each camera is connected to the virtual studio host through an HD-SDI/HDMI cable.
Preferably, the system further comprises a sound console connected to the virtual studio host through a USB cable.
Preferably, the USB cable is used to transmit real-time spatial-positioning data, including the spatial positions and orientations of the locators, the VR head display and the controllers; the HDMI cable is used to output the 360-degree panoramic virtual three-dimensional scene rendered by the virtual studio host to the screen of the VR head display.
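For concreteness, the real-time positioning data mentioned here might look like the record below; the field names, coordinate convention and sample values are assumptions for illustration, not the actual data format used by the system.

```python
from dataclasses import dataclass

@dataclass
class TrackedPose:
    """One plausible record in the real-time positioning stream (illustrative only)."""
    device_id: str           # e.g. "locator_a", "hmd", "controller_1" (names are made up)
    position_m: tuple        # (x, y, z) position in meters, in studio coordinates
    orientation_deg: tuple   # (yaw, pitch, roll) in degrees

# One update as it might be carried over the USB link (assumed layout and values).
update = [
    TrackedPose("locator_a",    (0.0, 0.0, 2.2), (0.0, -35.0, 0.0)),
    TrackedPose("hmd",          (1.2, 2.5, 1.7), (90.0, 0.0, 0.0)),
    TrackedPose("controller_1", (1.0, 2.3, 1.1), (45.0, -10.0, 0.0)),
]
print(update[1].position_m)
```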
Preferably, the locators and the controllers exchange real-time data with the VR head display wirelessly.
Preferably, the two locators are fixed at the same height, more than 2 meters above the ground, on the outer edges of the two side walls of the green box or blue box, and the VR head display and the controllers are placed within the space covered by the locators.
Preferably, the two controllers are used together with the VR head display to provide interactive control.
According to another aspect of the present invention, there is also provided a method for 360-degree panoramic program production in a virtual studio, comprising:
setting the movement range of the VR head display and the controllers by using the locators;
designing a 360-degree panoramic virtual three-dimensional scene according to the program type;
previewing a rendered 360-degree panoramic virtual three-dimensional scene in a VR head display in real time;
shooting a high-definition foreground signal with a camera under green-box or blue-box and studio lighting and, after image matting processing, inserting the foreground signal with its alpha channel into a host frame in the virtual three-dimensional scene;
and compositing and rendering in real time to obtain a panoramic video, and recording the panoramic video to generate a panoramic program, or outputting it directly.
Preferably, setting the movement range of the VR head display and the controllers by using the locators comprises:
holding down the trigger of a controller and moving it along the edge of the active area, the motion track of the controller being automatically closed into a quadrilateral when the trigger is released near the starting position;
placing the controller flat on the floor of the defined area, the locators obtaining the current spatial position of the floor by scanning.
Preferably, the 360-degree panoramic virtual three-dimensional scene provides a plurality of sets all around in the horizontal direction, with a consistent overall color tone and style; the top is closed; and the floor extends seamlessly to the edge of the scene.
Preferably, the plurality of sets includes an anchor desk, a large screen, a triple screen, a commentary area, an interview area, a static display area and an open office area.
The invention has the following beneficial effects. Using the virtual studio principle, a 360-degree panoramic virtual three-dimensional scene covering rich shooting sets is designed and used to record panoramic programs in the studio. In a professional green-box (or blue-box) and studio-lighting environment, the foreground signal shot by the camera is keyed into a host frame in the 360-degree panoramic virtual three-dimensional scene, composited and rendered in real time, and recorded as panoramic material. Compared with shooting with a panoramic camera, there are no stitching seams between lenses; the sharpness of close-ups in the panoramic video is guaranteed by adjusting the distance from the viewing viewpoint to the host frame carrying the foreground signal; and clutter carried in the foreground signal can be removed by an algorithm, so the shooting environment does not enter the picture and no post-editing is needed. The rendered 360-degree panoramic virtual three-dimensional scene is previewed and debugged with VR positioning and display equipment: the viewing viewpoint of the panoramic program is set, the size of the virtual three-dimensional scene is adjusted, and the position of the host frame is changed. With one 360-degree panoramic virtual three-dimensional scene, multiple programs can be produced, and revising a program is easy because only the virtual three-dimensional scene needs to be redesigned. With this method, studio panoramic programs that are rich in form and striking in effect can be produced.
Drawings
Various other advantages and benefits will become apparent to those of ordinary skill in the art upon reading the following detailed description of the preferred embodiments. The drawings are only for purposes of illustrating the preferred embodiments and are not to be construed as limiting the invention. Also, like reference numerals are used to refer to like parts throughout the drawings. In the drawings:
FIG. 1 is a system diagram of the method for making a 360-degree panoramic program in a virtual studio according to the present invention;
fig. 2 is a schematic diagram of a logical relationship of the virtual studio host function module according to an embodiment of the present invention.
Detailed Description
Exemplary embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While exemplary embodiments of the present disclosure are shown in the drawings, it should be understood that the present disclosure may be embodied in various forms and should not be limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the disclosure to those skilled in the art.
To solve the problem of making panoramic programs in a studio, the invention provides a method for making 360-degree panoramic programs in a virtual studio. The system comprises two locators, a VR head display, two controllers, and a virtual studio host. The system is set up as follows. First, the above devices are connected: the VR head display is connected to the virtual studio host through a USB cable and an HDMI cable. The USB cable transmits real-time spatial-positioning data, including the spatial positions and orientations of the locators, the VR head display and the controllers; the HDMI cable outputs the 360-degree panoramic virtual three-dimensional scene rendered by the virtual studio host to the screen of the VR head display. The locators and the controllers exchange real-time data with the VR head display wirelessly.
Next, the locators are used to set the movement range of the VR head display and the controllers. The two locators are usually fixed at the same height, more than 2 meters above the ground, on the outer edges of the two side walls of the professional green box (or blue box), so that the line of sight between them is not blocked. Wall mounting is the best fixing method because it keeps ground vibration from disturbing the locators and degrading the stability of the spatial positioning. The angle of each locator can be adjusted with a dedicated wall mount so that it tilts downward at a suitable angle, generally 30 to 45 degrees, toward the center of the active area; once the angle is set, the attached locking ring is tightened. The scanning field of view of each locator is 150 degrees horizontally and 110 degrees vertically, and for accurate spatial positioning the distance between their projections on the ground must be less than 8 meters. For the locators, the VR head display and the controllers to work properly, the VR head display and the controllers must be placed within the space covered by the locators. The two controllers are used together with the VR head display to provide interactive control.
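The mounting figures above (a height of more than 2 meters, a downward tilt of 30 to 45 degrees, a 150 by 110 degree scanning field) can be sanity-checked with simple trigonometry. The sketch below estimates the strip of floor one locator sees; the 2.2 m mounting height is an assumed example value, and the model ignores the horizontal field of view.

```python
import math

def floor_coverage(height_m: float, tilt_deg: float, vertical_fov_deg: float = 110.0):
    """Near and far edges (meters out from the wall) of the floor strip a locator can see,
    given its mounting height and its downward tilt below the horizontal."""
    half = vertical_fov_deg / 2.0
    lower = tilt_deg + half          # lower edge of the field of view, degrees below horizontal
    upper = tilt_deg - half          # upper edge of the field of view, degrees below horizontal
    near = 0.0 if lower >= 90.0 else height_m / math.tan(math.radians(lower))
    far = math.inf if upper <= 0.0 else height_m / math.tan(math.radians(upper))
    return near, far

for tilt in (30.0, 45.0):
    near, far = floor_coverage(2.2, tilt)   # 2.2 m mounting height (assumed example)
    far_txt = "beyond the horizon" if math.isinf(far) else f"{far:.2f} m"
    print(f"tilt {tilt:.0f} deg: floor covered from ~{near:.2f} m out to {far_txt}")
```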
When the movement range of the VR head display and the controllers is set for the first time, a controller is required (with the VR head display and both controllers powered on): hold down its trigger and move it along the edge of the active area, and release the trigger when back near the starting position, so that the motion track of the controller is automatically closed into a quadrilateral. Then the controller is placed flat on the floor of the defined area, and the locators obtain the spatial position of the floor by scanning. After these operations are completed, the active area of the VR head display and the controllers has been set, and they are kept within that space.
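The two calibration steps just described can be pictured with the toy sketch below: closing the traced controller path into a quadrilateral, and reading the floor height from a controller lying flat on the ground. The sample coordinates and the corner-picking rule are assumptions made only for illustration.

```python
import numpy as np

def close_play_area(trace_xy: np.ndarray) -> np.ndarray:
    """Reduce a traced controller path (N x 2 floor coordinates) to a closed quadrilateral by
    keeping four extreme points; a crude stand-in for whatever corner fitting the real system uses."""
    s = trace_xy.sum(axis=1)                 # x + y
    d = trace_xy[:, 0] - trace_xy[:, 1]      # x - y
    corners = trace_xy[[s.argmin(), d.argmax(), s.argmax(), d.argmin()]]
    return np.vstack([corners, corners[:1]])  # repeat the first corner to close the loop

def floor_height(controller_poses_xyz: np.ndarray) -> float:
    """Floor height taken as the mean height of the controller lying flat inside the area."""
    return float(controller_poses_xyz[:, 2].mean())   # z is the vertical axis in this sketch

# Made-up trace roughly following the edge of a 3 m x 2 m active area.
trace = np.array([[0.1, 0.0], [1.5, -0.05], [3.0, 0.1], [3.05, 1.0], [2.9, 2.0],
                  [1.4, 2.1], [0.0, 1.9], [-0.05, 1.0]])
print(close_play_area(trace))
print(floor_height(np.array([[1.0, 1.0, 0.03], [1.2, 0.9, 0.02]])))
```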
The next step is also a focus of the invention: designing a 360-degree panoramic virtual three-dimensional scene according to the program type. Compared with an ordinary virtual three-dimensional scene, this scene adopts a 360-degree panoramic design, providing rich sets all around in the horizontal direction with a consistent overall color tone and style; the top is closed, unlike the open or transparent tops common in ordinary virtual three-dimensional scenes; and the floor extends seamlessly to the edge of the scene. Different sets can be designed around the 360-degree panoramic virtual three-dimensional scene, covering an anchor desk, a large screen, a triple screen, a commentary area, an interview area, a static display area and even an open office area, so that different types of panoramic programs can be recorded and the panoramic studio presented is all the more striking. The 360-degree panoramic virtual three-dimensional scene is composited with the keyed foreground signal and rendered in real time into a panoramic video, which can be output directly or recorded as panoramic material.
The first embodiment is as follows:
the invention grasps the characteristics of immersion and presence of VR panoramic programs, and provides a method for making 360-degree panoramic programs in a virtual studio by applying the principle of the virtual studio, thereby providing a convenient and rapid solution for making panoramic programs in the studio.
This embodiment is a specific implementation of the invention, and fig. 1 is a system diagram of the method for making a 360-degree panoramic program in a virtual studio. The whole system is described here taking an L-shaped green box (or blue box) as an example. The spatial positioning equipment comprises two locators, locator A and locator B, a VR head display, and two controllers, controller 1 and controller 2. First, the VR head display is connected to the virtual studio host: the USB cable transmits real-time spatial-positioning data, including the spatial positions and orientations of the VR head display and the controllers, and the HDMI (or DP) cable transmits the rendered 360-degree panoramic virtual three-dimensional scene to the VR head display, which can then be worn. Then the locators are installed and the movement range of the VR head display and the controllers is set. As shown in fig. 1, the two locators are fixed at the same height, more than 2 meters above the ground, on the outer edges of the two side walls of the professional green box (or blue box), making sure that the line of sight between them is not blocked. In practice, professional studio lights hang above the green box; if the lights are low and may block the view between the two locators, a dedicated synchronization data cable can connect locator A and locator B to keep them communicating normally. This scheme fixes the locators on the wall to avoid ground vibration disturbing the stability of the spatial positioning. Each locator sits on a dedicated wall mount, and its angle is adjusted so that it tilts downward at a suitable angle, generally 30 to 45 degrees, toward the center of the active area as the situation requires; once the angle is set, the attached locking ring is tightened. The scanning field of view of each locator is 150 degrees horizontally and 110 degrees vertically, and to guarantee accurate spatial positioning the distance between their projections on the ground is kept under 8 meters.
Then the movement range of the VR head display and the controllers is set. During initial setup, either controller is held in the hand, its trigger is held down while it is moved along the edge of the intended range, and the trigger is released when it returns to the starting point, so that the motion track of the controller is automatically closed into a quadrilateral. The floor then has to be located with the controller: the controller is placed flat on the floor inside the defined area, and the two locators obtain the spatial position of the floor accurately by infrared scanning. After these operations, the VR head display must be kept within the movement range for the locators to work properly. The two controllers are mainly used with the VR head display for interactive control and can be left switched off when not in use.
A panoramic program in the studio is usually made with one to four camera positions, shooting the people and props, i.e., the foreground signal, from different angles in the professional green-box (or blue-box) and studio-lighting environment. As shown in fig. 1, the shooting equipment may be a professional video camera, a consumer DV, a single-lens reflex (or mirrorless) camera and the like, as long as it can output a high-definition foreground signal to the virtual studio host through an HD-SDI/HDMI cable.
Audio processing is also an indispensable part of the virtual studio. On the shooting set, the voices of the host and guests pass through microphones and the background audio through a professional player; both are fed into the sound console, mixed, and output to the virtual studio host through a USB cable. The host then performs audio capture, mixing, delay and similar processing, which is not the focus of the invention and is not elaborated here.
The implementation steps of the method in this embodiment are as follows:
Fig. 2 is a schematic diagram of the logical relationships among the functional modules of the virtual studio host according to an embodiment of the present invention. The virtual studio host comprises functional modules for video I/O, the real three-dimensional scene, parameter setting, director switching, image matting, composite rendering, recording, VR terminal communication, external audio processing and the like. A specific flow of 360-degree panoramic program production is described next.
The shooting environment is still a professional green box (or blue box) with professional studio lighting. Usually two to four cameras (a plurality, at least one) shoot high-definition foreground signals, which are output to the virtual studio host. The video I/O module captures the foreground signals of the different camera positions and passes them frame by frame to the image matting module for professional keying. The image matting module keys out the pure color (green or blue) in each frame of the foreground signal and then sends the foreground signal with its alpha channel, frame by frame, to the composite rendering module.
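A minimal green-screen key plus alpha composite conveys what the image matting and composite rendering modules do frame by frame. The thresholds, channel layout and test values below are assumptions; a production keyer also handles spill suppression, soft edges and so on.

```python
import numpy as np

def key_green(frame_rgb: np.ndarray) -> np.ndarray:
    """Attach an alpha channel: 0 where a pixel looks like green-box background, 255 elsewhere."""
    r = frame_rgb[..., 0].astype(np.int16)
    g = frame_rgb[..., 1].astype(np.int16)
    b = frame_rgb[..., 2].astype(np.int16)
    background = (g > 100) & (g > r + 40) & (g > b + 40)       # crude, assumed thresholds
    alpha = np.where(background, 0, 255).astype(np.uint8)
    return np.dstack([frame_rgb, alpha])                        # RGBA foreground, one frame

def composite_over(foreground_rgba: np.ndarray, background_rgb: np.ndarray) -> np.ndarray:
    """Alpha-blend the keyed foreground over the rendered virtual background."""
    a = foreground_rgba[..., 3:4].astype(np.float32) / 255.0
    blended = foreground_rgba[..., :3].astype(np.float32) * a + background_rgb.astype(np.float32) * (1.0 - a)
    return blended.astype(np.uint8)

# Tiny synthetic 2 x 2 frame: two pixels are pure green-box background.
frame = np.array([[[200, 30, 40], [20, 220, 30]],
                  [[180, 170, 160], [10, 200, 20]]], dtype=np.uint8)
virtual_background = np.full((2, 2, 3), 90, dtype=np.uint8)
print(composite_over(key_green(frame), virtual_background))
```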
The real three-dimensional scene module imports the 360-degree panoramic virtual three-dimensional scene needed for producing the studio panoramic program and is compatible with scene files in obj, x and 3ds formats generated by 3D design software. The 360-degree panoramic virtual three-dimensional scene, customized for the program type, contains rich sets all around in the horizontal direction. When the scene is designed, host frames into which the high-definition foreground signals can be inserted are preset according to the planned shots, each with a 16:9 aspect ratio. For each host frame, a virtual camera is preset in the 360-degree panoramic virtual three-dimensional scene to shoot the host frame and the scenery behind it.
After the 360-degree panoramic virtual three-dimensional scene is loaded, the composite rendering module renders it in real time and outputs it to the VR head display through the HDMI (or DP) cable; wearing the VR head display places the operator inside the 360-degree panoramic virtual three-dimensional scene for an immersive preview. The initial viewpoint is at the center of the scene, and all preset host frames are hidden by default. The spatial extent of the 360-degree panoramic virtual three-dimensional scene can then be perceived intuitively, and its size adjusted through the parameter setting module. Scene design is not specifically limited: a 360-degree panoramic virtual three-dimensional scene can be designed to feel very grand or very distinctive. The operator can roam freely in the virtual three-dimensional scene, moving with the direction keys on the controller touchpad to walk through the virtual sets all around the horizontal direction.
Next, the parameters for panoramic program production are set. Based on the preview in the VR head display, the 360-degree panoramic virtual three-dimensional scene is adjusted to a suitable size. Then, for the set selected for the panoramic program, the preset host frame of that set is shown, and with reference to the shooting angle presented by the corresponding virtual camera position, the foreground signal shot by a real camera position is inserted into the host frame; the signal source of the actual shooting equipment is selected for that virtual camera position from a drop-down list. The foreground signal in the host frame is captured through the video I/O module, professionally keyed by the image matting module and, carrying its alpha channel, composited directly with the virtual background and rendered. The same procedure inserts the foreground signal of every real camera position into the host frame of its corresponding virtual camera position. Then, according to the composition of foreground signal and virtual background, the host frame is scaled and moved forward or backward; the viewpoint of the virtual camera position can also be moved along the X, Y and Z axes; and once the spatial position of the virtual camera position is fixed, its lens orientation and focal length can be changed to give the best composite effect. Considering the viewing viewpoint of the panoramic program, its height should roughly match the height of the VR head display when actually worn, so the virtual camera position should be neither too high nor too low. The view angle of the virtual camera position serves as the initial view angle of the panoramic program at its viewing viewpoint. By adjusting the spatial positions of the host frame and the virtual camera position, the foreground signal is presented clearly in front of the viewing viewpoint of the panoramic program, guaranteeing the sharpness of the local picture. If clutter such as the lights above the green box appears in the foreground signal during actual shooting, it is removed by an algorithm in the parameter setting module, which is possible because the real camera is fixed while the program is being made.
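The virtual camera adjustments described above (moving the viewpoint along the X, Y and Z axes, aiming the lens at the host frame and choosing a focal length) boil down to a look-at computation and the usual focal-length-to-field-of-view relation. The positions, sensor width and focal lengths in the sketch are illustrative assumptions.

```python
import math
import numpy as np

def look_at_angles(camera_pos, target_pos):
    """Yaw and pitch (degrees) for a virtual camera at camera_pos to aim at target_pos.
    Coordinates are (x, y, z) in meters with z pointing up."""
    d = np.asarray(target_pos, dtype=float) - np.asarray(camera_pos, dtype=float)
    yaw = math.degrees(math.atan2(d[1], d[0]))
    pitch = math.degrees(math.atan2(d[2], math.hypot(d[0], d[1])))
    return yaw, pitch

def horizontal_fov_deg(focal_length_mm: float, sensor_width_mm: float = 36.0) -> float:
    """Horizontal field of view implied by a focal length on an assumed 36 mm wide sensor."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_length_mm)))

camera_pos = (0.0, 0.0, 1.7)        # roughly the height of the worn VR head display (assumed)
host_frame_pos = (3.0, 1.0, 1.6)    # center of the host frame carrying the foreground signal (assumed)
print(look_at_angles(camera_pos, host_frame_pos))          # lens orientation toward the host frame
print(horizontal_fov_deg(35.0), horizontal_fov_deg(50.0))  # a wider vs. a tighter framing
```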
After the preparation for recording the panoramic program is finished, all configuration parameters are saved as a project file that can be loaded directly next time. The panoramic program can then be recorded: the director switching module previews the composite picture of each virtual camera position in flat form in real time, and switching between the virtual camera positions is scheduled according to the situation on set. The composite signal of the current virtual camera position is output directly or recorded as the PGM signal. Transition effects commonly used in panoramic video, such as fade-in and fade-out, can be superimposed on the director switch.
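The fade transition mentioned for director switching amounts to a per-frame blend between the outgoing and incoming virtual camera composites; the duration and frame rate below are assumed values.

```python
import numpy as np

def crossfade(outgoing: np.ndarray, incoming: np.ndarray, duration_s: float = 1.0, fps: int = 25):
    """Yield PGM frames that fade from one composite picture to another of the same size."""
    steps = max(1, int(duration_s * fps))
    for i in range(steps + 1):
        t = i / steps                                          # 0.0 = outgoing, 1.0 = incoming
        frame = outgoing.astype(np.float32) * (1.0 - t) + incoming.astype(np.float32) * t
        yield frame.astype(np.uint8)

# Flat test frames standing in for the composites of two virtual camera positions.
cam_a = np.full((4, 4, 3), 20, dtype=np.uint8)
cam_b = np.full((4, 4, 3), 200, dtype=np.uint8)
print(sum(1 for _ in crossfade(cam_a, cam_b)))   # 26 frames for a 1-second fade at 25 fps
```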
While the panoramic program is being recorded, for the virtual camera position currently selected by the director switch, the composite rendering module takes that camera's view angle as the initial view angle, composites the foreground signal with the virtual background, and renders a panoramic video at a resolution of 3840 × 2160 in real time. The panoramic video can be output directly through the video I/O module or sent frame by frame to the recording module and recorded locally to generate a 4K panoramic program. The composited and rendered panoramic video follows the live director switching, so its viewing viewpoint and initial view angle change with the virtual camera position.
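How a viewing direction (for instance the initial view angle of the current virtual camera position) relates to a pixel in the 3840 x 2160 panoramic frame can be sketched with a linear, equirectangular-style mapping. The text does not specify the projection actually used; mapping the full sphere onto a 16:9 frame, as assumed here, gives different pixel densities per degree horizontally and vertically.

```python
def direction_to_pixel(yaw_deg: float, pitch_deg: float, width: int = 3840, height: int = 2160):
    """Map a view direction to a (column, row) pixel in a panorama that spans the full sphere.
    yaw: -180..180 degrees, 0 at the center column; pitch: -90..90 degrees, 0 on the horizon row."""
    col = (yaw_deg + 180.0) / 360.0 * (width - 1)
    row = (90.0 - pitch_deg) / 180.0 * (height - 1)
    return col, row

print(direction_to_pixel(0.0, 0.0))     # the initial view angle lands at the middle of the frame
print(direction_to_pixel(90.0, 10.0))   # a host frame 90 degrees to the right, slightly above the horizon
```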
In addition, audio must be processed while the panoramic program is recorded in the studio: the voices of the host and guests pass through microphones and the live background sound effects through a dedicated player, both are fed into the sound console, and after mixing the audio is sent to the virtual studio host through a USB cable. The external audio processing module performs audio capture, delay and mute processing and then sends the audio to the composite rendering module, which combines the panoramic video with the live audio.
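The capture, delay and mute handling of the external audio processing module can be sketched as a simple delay-and-mix over raw sample buffers; the sample rate, delay, gains and test tones are assumptions chosen only to show how the audio is held back to stay in sync with the rendered panoramic video.

```python
import numpy as np

def delay_and_mix(mic: np.ndarray, background: np.ndarray, delay_ms: float = 80.0,
                  sample_rate: int = 48000, mic_gain: float = 1.0, bg_gain: float = 0.4,
                  mute: bool = False) -> np.ndarray:
    """Mix the microphone and background tracks to mono, optionally mute, and prepend
    silence so the audio lines up with the delayed video rendering."""
    n = min(len(mic), len(background))
    mix = mic_gain * mic[:n] + bg_gain * background[:n]
    if mute:
        mix[:] = 0.0
    silence = np.zeros(int(sample_rate * delay_ms / 1000.0), dtype=mix.dtype)
    return np.concatenate([silence, mix])

# One second of made-up audio at 48 kHz.
t = np.linspace(0.0, 1.0, 48000, endpoint=False)
voice = 0.5 * np.sin(2 * np.pi * 220.0 * t)       # host and guest microphones
music = 0.2 * np.sin(2 * np.pi * 440.0 * t)       # background sound effect from the player
print(delay_and_mix(voice, music).shape)          # (51840,): 80 ms of alignment padding added
```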
The benefits of the method described in this embodiment are significant. Using the virtual studio principle, a 360-degree panoramic virtual three-dimensional scene covering rich shooting sets is designed, with which panoramic programs can be recorded in the studio. In the professional green-box (or blue-box) and studio-lighting environment, the foreground signal shot by the camera is keyed into a host frame placed in a suitable set within the 360-degree panoramic virtual three-dimensional scene and recorded as a panoramic program through real-time composite rendering. This method avoids the stitching seams between the lenses of a panoramic camera; irrelevant clutter is not introduced by the actual shooting; and the low sharpness of local close-ups in panoramic video is solved by adjusting the distance between the host frame and the viewing viewpoint. The rendered 360-degree panoramic virtual three-dimensional scene is previewed and roamed with VR positioning and head-display equipment, traversing all the virtual sets designed in it. With one 360-degree panoramic virtual three-dimensional scene, multiple programs can be produced, and revising a program is easy because only the virtual three-dimensional scene needs to be redesigned. With this method, studio panoramic programs that are rich in form and striking in effect can be produced.
The above description is only for the preferred embodiment of the present invention, but the scope of the present invention is not limited thereto, and any changes or substitutions that can be easily conceived by those skilled in the art within the technical scope of the present invention are included in the scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the appended claims.

Claims (4)

1. A system for 360-degree panoramic program production in a virtual studio, comprising:
two locators, a VR head display, two controllers, at least one camera, and a virtual studio host;
wherein the two locators and the controllers exchange real-time data with the VR head display wirelessly; the two locators are fixed at the same height, more than 2 meters above the ground, on the outer edges of the two side walls of the green box or blue box; the VR head display and the controllers are placed within the space covered by the locators; and the two controllers are used together with the VR head display to provide interactive control;
the VR head display is connected to the virtual studio host through a USB cable and an HDMI cable, and the camera is connected to the virtual studio host through an HD-SDI/HDMI cable;
the virtual studio host comprises the following functional modules: a video I/O module, a real three-dimensional scene module, a parameter setting module, a director switching module, an image matting module, a composite rendering module, a recording module, a VR terminal communication module and an external audio processing module;
the video I/O module is used to capture the foreground signals of different camera positions and pass them frame by frame to the image matting module for professional keying;
the image matting module is used to key out the pure color in each frame of the foreground signal and then send the foreground signal with its alpha channel, frame by frame, to the composite rendering module;
the real three-dimensional scene module is used to import the 360-degree panoramic virtual three-dimensional scene required for the studio panoramic program; after loading, the composite rendering module renders the loaded 360-degree panoramic virtual three-dimensional scene in real time and outputs it to the VR head display through an HDMI or DP cable;
the trigger of a controller is held down while the controller is moved along the edge of the active area, and when the trigger is released near the starting position, the motion track of the controller is automatically closed into a quadrilateral; and
the controller is placed flat on the floor of the defined area, and the locators obtain the current spatial position of the floor by scanning.
2. The system of claim 1, further comprising:
a sound console connected to the virtual studio host through a USB cable.
3. The system of claim 1, wherein
the USB cable is used to transmit real-time spatial-positioning data, including the spatial positions and orientations of the locators, the VR head display and the controllers; and the HDMI cable is used to output the 360-degree panoramic virtual three-dimensional scene rendered by the virtual studio host to the screen of the VR head display.
4. A method for making a 360-degree panoramic program in a virtual studio, comprising:
connecting two locators with a dedicated synchronization data cable, and using the two locators to set the movement range of the VR head display and the controllers, including:
holding down the trigger of a controller and moving it along the edge of the active area, the motion track of the controller being automatically closed into a quadrilateral when the trigger is released near the starting position;
placing the controller flat on the floor of the defined area, the locators obtaining the current spatial position of the floor by scanning;
designing a 360-degree panoramic virtual three-dimensional scene according to the program type;
previewing the rendered 360-degree panoramic virtual three-dimensional scene in the VR head display in real time;
shooting a high-definition foreground signal with a camera under green-box or blue-box and studio lighting and, after image matting processing, inserting the foreground signal with its alpha channel into a host frame in the virtual three-dimensional scene;
compositing and rendering in real time into a panoramic video, and recording the panoramic video to generate a panoramic program or outputting it directly; wherein in the 360-degree panoramic virtual three-dimensional scene, a plurality of sets are provided all around in the horizontal direction with a consistent overall color tone and style, the top is closed, and the floor extends seamlessly to the edge of the scene;
the plurality of sets comprising an anchor desk, a large screen, a triple screen, a commentary area, an interview area, a static display area and an open office area;
and recording the panoramic program, wherein a director switching module previews the composite picture of each virtual camera position in flat form in real time, switching between the virtual camera positions is scheduled according to the situation on set, the composite signal of the current virtual camera position is output directly or recorded as a PGM signal, transition effects can be superimposed on the director switch before direct output or recording, and, after the preparation for recording the panoramic program is finished, all configuration parameters are saved as a project file that can be loaded directly next time.
CN201811497814.5A 2018-12-07 2018-12-07 Method and system for making 360-degree panoramic program in virtual studio Expired - Fee Related CN109639933B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811497814.5A CN109639933B (en) 2018-12-07 2018-12-07 Method and system for making 360-degree panoramic program in virtual studio

Publications (2)

Publication Number Publication Date
CN109639933A CN109639933A (en) 2019-04-16
CN109639933B (en) 2021-12-24

Family

ID=66072074

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811497814.5A Expired - Fee Related CN109639933B (en) 2018-12-07 2018-12-07 Method and system for making 360-degree panoramic program in virtual studio

Country Status (1)

Country Link
CN (1) CN109639933B (en)

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110290290A (en) * 2019-06-21 2019-09-27 深圳迪乐普数码科技有限公司 Implementation method, device, computer equipment and the storage medium of the studio cloud VR
CN110517346B (en) * 2019-08-30 2021-06-18 腾讯科技(深圳)有限公司 Virtual environment interface display method and device, computer equipment and storage medium
CN110866978A (en) * 2019-11-07 2020-03-06 辽宁东智威视科技有限公司 Camera synchronization method in real-time mixed reality video shooting
CN111399653A (en) * 2020-03-24 2020-07-10 北京文香信息技术有限公司 Virtual interaction method, device, equipment and computer storage medium
CN112383679A (en) * 2020-11-02 2021-02-19 北京德火科技有限责任公司 Remote same-screen remote interview mode of AR immersive panoramic simulation system at different places and control method thereof
CN112312112A (en) * 2020-11-02 2021-02-02 北京德火科技有限责任公司 Multi-terminal control system of AR immersion type panoramic simulation system and control method thereof
CN113489920B (en) * 2021-06-29 2024-08-20 维沃移动通信(杭州)有限公司 Video synthesis method and device and electronic equipment
CN114173020A (en) * 2021-12-31 2022-03-11 北京德火科技有限责任公司 Foreground-background separation method and system applied to multiple virtual studios
CN114173021B (en) * 2022-02-14 2022-06-24 中国传媒大学 Virtual broadcasting method and system based on high-definition multi-screen
CN114845136B (en) * 2022-06-28 2022-09-16 北京新唐思创教育科技有限公司 Video synthesis method, device, equipment and storage medium

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11665308B2 (en) * 2017-01-31 2023-05-30 Tetavi, Ltd. System and method for rendering free viewpoint video for sport applications

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106663411A (en) * 2014-11-16 2017-05-10 易欧耐特感知公司 Systems and methods for augmented reality preparation, processing, and application
CN205622745U (en) * 2016-05-10 2016-10-05 倪宏伟 Real -time synthesis system of virtual reality true man
CN106657719A (en) * 2017-01-04 2017-05-10 海南大学 Intelligent virtual studio system
CN207460313U (en) * 2017-12-04 2018-06-05 上海幻替信息科技有限公司 Mixed reality studio system

Also Published As

Publication number Publication date
CN109639933A (en) 2019-04-16


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20211224