CN109769082B - Virtual studio construction system and recording method based on VR tracking - Google Patents


Info

Publication number
CN109769082B
CN109769082B (Application CN201811513909.1A)
Authority
CN
China
Prior art keywords
virtual studio
real
controller
virtual
tracker
Prior art date
Legal status
Active
Application number
CN201811513909.1A
Other languages
Chinese (zh)
Other versions
CN109769082A (en)
Inventor
李智鹏
颜庆聪
胡彦雷
孙阳
周华
Current Assignee
Beijing Makemagic Technology Development Co ltd
Original Assignee
Beijing Makemagic Technology Development Co ltd
Priority date
Filing date
Publication date
Application filed by Beijing Makemagic Technology Development Co ltd filed Critical Beijing Makemagic Technology Development Co ltd
Priority to CN201811513909.1A
Publication of CN109769082A
Application granted; publication of CN109769082B
Legal status: Active

Landscapes

  • Processing Or Creating Images (AREA)
  • Studio Devices (AREA)

Abstract

The invention relates to the field of virtual studio construction, and in particular to a virtual studio construction system and recording method based on VR tracking. The construction system comprises: at least one camera, trackers matching the number of cameras, receivers matching the number of trackers, two locators, two controllers, a VR display controller, and an optional sound console. The recording method comprises: under the control of parameter setting and director switching, the virtual studio host renders and composites the matted video with the following real three-dimensional scene, then completes the recording of the virtual studio by adding the audio from external audio processing. The invention acquires the shooting angle of the captured video in real time to drive the real three-dimensional scene to follow, with no need for post-processing; compared with mechanical-rail or trackless virtual studio systems, it achieves shots from multiple angles and styles more simply, making the presentation of the video richer.

Description

Virtual studio construction system and recording method based on VR tracking
Technical Field
The invention relates to the field of virtual studio construction, in particular to a virtual studio construction system and a virtual studio recording method based on VR tracking.
Background
Compared with a live-action studio, a virtual studio requires no set construction, makes program revisions inexpensive, and is little affected by its environment, so it has been adopted to a certain extent. Virtual studio systems can be divided into mechanical-rail tracking systems and trackless systems. The latter does not support camera-position tracking: the camera must remain fixed while recording, including its spatial position, lens angle, and lens focal length. When multi-angle shooting is needed, software algorithms are generally used to create virtual cameras and adjust their positions, performing moves between different virtual positions, virtual jib sweeps, virtual zooms, and so on; the multi-angle shooting requirement is then met through rendering and compositing. Programs produced this way leave the presenter relatively static, unable to move forward or backward; moreover, the simulated effects of the algorithmic virtual jib and virtual zoom place the foreground video relatively far from the virtual camera position in the virtual three-dimensional scene, so the sharpness of the rendered composite picture is mediocre.
Compared with a trackless system, a virtual studio system supporting mechanical-rail tracking solves these problems well. Horizontal panning, up-and-down tilting, and lens zooming of a real camera can be achieved via the guide rail, though the camera position still cannot move freely. Typically, three groups of sensors acquire the angular offsets along the X, Y, and Z directions of three-dimensional space; these are then converted to control the change of shooting angle and focal length in the virtual three-dimensional scene, and the simulated program-shooting effect is finally achieved through rendering and compositing. However, a mechanical-rail tracking virtual studio system is expensive to build, complex, difficult to shoot with, and technically demanding for operators.
Disclosure of Invention
To solve the above problems, according to one aspect of the present invention, a virtual studio construction system based on VR tracking is disclosed, comprising: at least one camera, trackers matching the number of cameras, receivers matching the number of trackers, two locators, two controllers, and a VR display controller. The VR display controller is connected to a virtual studio host; the at least one camera is each connected to the virtual studio host; each tracker is mounted on its corresponding camera; and each receiver is connected to the virtual studio host. The two locators and the two controllers each communicate with the VR display controller, and the two locators communicate with each other.
Preferably, the virtual studio construction system further includes a sound console connected to the virtual studio host through a USB cable.
Preferably, the VR display controller is configured to receive real-time data containing spatial information from the locators and the controllers and to forward it to the virtual studio host; the receivers communicate wirelessly with the trackers to obtain real-time data containing the trackers' sensing information and send it to the virtual studio host.
More preferably, the virtual studio host calculates the spatial position of each tracker in real time from the received real-time data containing spatial information and sensing information.
Preferably, the two locators are fixed at the same height, more than 2 meters above the ground, on the outer edges of the vertical faces on either side of the green or blue box; the VR display controller and the controllers are placed within the area covered by the locators to enable interactive control.
Preferably, the virtual studio host provides multiple functions, including: video I/O, loading of the real three-dimensional scene, parameter setting, director switching, matting (chroma keying), external audio processing, and the composite rendering and recording realized with these functions.
Preferably, the at least one camera comprises: professional video cameras, consumer-grade DV camcorders, single-lens reflex cameras, and/or mirrorless cameras.
According to another aspect of the invention, a virtual studio recording method based on VR tracking is disclosed. The method is based on the above construction system and specifically comprises: setting the range of motion of the tracker using the controller, the locator, and the VR display controller; designing a real three-dimensional scene simulating a real environment according to the requirements of the virtual studio; shooting with the at least one camera carrying a tracker in a green-box or blue-box environment fitted with studio lights and locators, thereby obtaining the captured video; the virtual studio host, on one hand, acquiring the captured video in real time and matting it, and on the other hand driving the real three-dimensional scene to follow according to the tracker's spatial position calculated in real time; and, under the control of parameter setting and director switching, the virtual studio host rendering and compositing the matted video with the following real three-dimensional scene and completing the recording of the virtual studio by adding the audio from external audio processing.
Further, the composite rendering includes: compositing the foreground video, with its alpha channel, into the presenter area of the real three-dimensional scene.
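As an illustration of this compositing step, the standard per-pixel "over" operation blends the keyed foreground onto the rendered background using the alpha channel. This is a minimal sketch under the assumption that matting has already produced an alpha value per foreground pixel; the function name and value ranges are illustrative, not taken from the patent:

```python
def composite_over(fg_rgb, fg_alpha, bg_rgb):
    """'Over' composite of one keyed foreground pixel onto the background:
    out = alpha * fg + (1 - alpha) * bg, applied per color channel.
    fg_rgb/bg_rgb are (r, g, b) tuples in [0, 1]; fg_alpha is in [0, 1]."""
    return tuple(fg_alpha * f + (1.0 - fg_alpha) * b
                 for f, b in zip(fg_rgb, bg_rgb))
```

With alpha 1 the foreground fully covers the scene pixel; with alpha 0 the rendered scene shows through, which is what lets the presenter appear embedded in the three-dimensional set.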
Further, setting the range of motion of the tracker using the controller, locator, and VR display controller includes: holding down the controller's trigger while moving along the edge of the range of motion, and releasing the trigger after returning to the starting position; computing the controller's movement track from the real-time data of the locator and the controller, the track being automatically closed into a closed quadrilateral; and placing the controller on the ground within the range of motion, thereby obtaining the spatial position of the range's reference plane.
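The closed quadrilateral obtained by this procedure can later be used to reject tracker samples that fall outside the range (as the embodiments describe, out-of-range data is treated as invalid). A minimal sketch, assuming the range is stored as a list of (x, y) corner points in the reference plane; all names are illustrative:

```python
def point_in_polygon(pt, polygon):
    """Ray-casting test: is the 2-D point inside the closed polygon?"""
    x, y = pt
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Edge crosses the horizontal line through pt; count crossings to the right.
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

def tracker_sample_valid(sample_xy, range_polygon):
    # Samples outside the defined tracking range are discarded as invalid.
    return point_in_polygon(sample_xy, range_polygon)
```

An odd number of edge crossings means the sample lies inside the quadrilateral; the test works for any simple polygon, so the auto-closed track need not be a perfect rectangle.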
The advantages of the invention are: compared with a mechanical-rail tracking virtual studio system, the shooting angle of the captured video is acquired in real time to drive the real three-dimensional scene to follow, so the viewing angle of the virtual scene need not be changed in post-processing; production is simpler and the shooting effect more lifelike. In addition, guide rails are avoided, greatly reducing operational complexity, operator requirements, and system cost. Whether compared against rail-tracking or trackless virtual studio systems, the system of the invention achieves shots from multiple angles and styles simply, making the presentation of the video richer.
Drawings
Various other advantages and benefits will become apparent to those of ordinary skill in the art upon reading the following detailed description of the preferred embodiments. The drawings are only for purposes of illustrating the preferred embodiments and are not to be construed as limiting the invention. Also, like reference numerals are used to refer to like parts throughout the drawings. In the drawings:
fig. 1 shows a schematic composition diagram of the virtual studio building system of the present invention.
Fig. 2 shows a functional diagram of a virtual studio host of the present invention.
Detailed Description
Exemplary embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While exemplary embodiments of the present disclosure are shown in the drawings, it should be understood that the present disclosure may be embodied in various forms and should not be limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the disclosure to those skilled in the art.
Fig. 1 is a schematic diagram of the composition of the virtual studio construction system of the present invention. The system comprises: at least one camera, trackers matching the number of cameras, receivers matching the number of trackers, two locators, two controllers, and a VR display controller. The VR display controller is connected to the virtual studio host through USB and HDMI cables; the at least one camera is each connected to the virtual studio host through an HD-SDI/HDMI cable; the trackers, matching the number of cameras, are mounted on their corresponding cameras; and the receivers, matching the number of trackers, are each connected to the virtual studio host through a USB cable. In addition, the system further comprises a sound console connected to the virtual studio host through a USB cable.
Specifically, each locator comprises: an infrared LED array, a first infrared laser emitter rotating about a horizontal axis, and a second infrared laser emitter rotating about a vertical axis. The infrared LED array emits infrared flashes at a set frequency; the first and second laser emitters rotate with a set period and emit laser lines as they rotate. Each tracker comprises a plurality of photosensitive sensors, which sense the infrared flashes emitted by the LED array and/or the laser lines emitted by the first and second emitters. The VR display controller receives real-time data containing spatial information from the locators and the controllers and sends it to the virtual studio host; each tracker is mounted on a corresponding camera, the number of cameras being determined by the shooting requirements. The receivers communicate wirelessly with the trackers, obtaining real-time data containing the trackers' sensing information and sending it to the virtual studio host. The virtual studio host calculates the spatial position of each tracker in real time from the received spatial information and sensing information. Each controller comprises photosensitive sensors and is used to delimit the shooting area by sensing the infrared flashes emitted by the locators' LED arrays. The two locators are fixed at the same height, more than 2 meters above the ground, on the outer edges of the vertical faces on either side of the green or blue box; the VR display controller and the controllers are placed within the located area to enable interactive control.
Fig. 2 is a functional diagram of the virtual studio host of the present invention. The host includes multiple functional modules: video I/O, the real three-dimensional scene, parameter setting, director switching, matting, external audio processing, and the composite rendering and recording realized with these functions. Using the construction system and the virtual studio host, the invention realizes virtual studio recording; the specific method comprises: setting the range of motion of the tracker using the controller, locator, and VR display controller; designing a real three-dimensional scene simulating a real environment according to the requirements of the virtual studio; shooting with the at least one camera carrying a tracker in a green-box or blue-box environment fitted with studio lights and locators, thereby obtaining the captured video; the host, on one hand, acquiring the captured video in real time and matting it, and on the other hand driving the real three-dimensional scene to follow according to the tracker's spatial position calculated in real time; and, under the control of parameter setting and director switching, rendering and compositing the matted video with the following real three-dimensional scene and completing the recording by adding the audio from external audio processing. The composite rendering includes compositing the foreground video, with its alpha channel, into the presenter area of the real three-dimensional scene.
Setting the range of motion of the tracker using the controller, locator, and VR display controller includes: holding down the controller's trigger while moving along the edge of the range, and releasing the trigger after returning to the starting position; computing the controller's movement track from the real-time data of the locator and the controller, the track being automatically closed into a closed quadrilateral; and placing the controller on the ground within the range, thereby obtaining the spatial position of the range's reference plane. The virtual studio and/or its recording method place few demands on camera equipment: professional video cameras, consumer-grade DV camcorders, single-lens reflex cameras, and/or mirrorless cameras can all complete the shooting work. The construction system and/or recording method of the present invention are further disclosed in the following embodiments:
first embodiment (virtual studio construction system for virtual studio recording)
The two locators are fixed at the same height, more than 2 meters above the ground, on the outer edges of the vertical faces on either side of a professional green box (or blue box), so that the line of sight between them is unobstructed. In practice, professional studio lights are hung above the green box, and when mounted too low they can block the view between the two locators; locators A and B are therefore preferably connected with a dedicated synchronization data cable to guarantee normal communication between them. The locators are preferably wall-mounted to keep ground vibration from disturbing the stability of spatial positioning. Using a dedicated wall bracket, each locator is angled downward toward the center of the camera tracking area by a suitable amount, typically 30 to 45 degrees. In this embodiment, each locator scans a field of view of 150 degrees horizontally and 110 degrees vertically; to ensure accurate spatial positioning, the scanning projection distance to the ground is kept under 8 meters.
The virtual studio construction system of the invention is then set up, including fixing a tracker on the cold shoe or hot shoe of each camera. Next, the VR display controller and the controllers are placed within the location coverage area. Holding either of the two controllers, the operator squeezes the trigger to start it, moves along the edge of the tracking range, and releases the trigger after returning to the starting point. The VR display controller acquires the controller's position information in real time and sends it to the virtual studio host, so the irregular track formed by the movement is automatically closed into a closed quadrilateral, delimiting the tracking range. When a camera is outside the tracking range, the data transmitted by the tracker mounted on it is treated as invalid. Then, after the equipment, including the trackers, has been calibrated, formal shooting and recording of the virtual studio begins. During shooting, the infrared LED array in each locator flashes at the set frequency. On one hand, the VR display controller contains several photosensitive sensors that detect the flashes, checking the locators' working state and judging whether they operate at the set parameters; on the other hand, the controllers and trackers sense the flashes through their own photosensitive sensors and thereby synchronize with the locators. In this embodiment, each of the 2 lasers in an emitter takes 10 ms per revolution, and 20 ms constitutes one cycle. At the start of a cycle the infrared LED array flashes; during the next 10 ms the laser rotating about the horizontal axis sweeps the whole space while the vertical-axis laser stays dark; during the following 10 ms the vertical-axis laser sweeps the whole space while the horizontal-axis laser stays dark. Inside the tracker, the VR display controller, and the controllers, a sufficient number of photosensitive sensors are used, with the locator's infrared LED flash as the synchronization signal. Each photosensitive sensor determines the times at which the horizontal-axis and vertical-axis lasers reach it; these are also the times at which the lasers have rotated to particular angles, which yield the sensor's angles relative to the locator's horizontal and vertical axes. The tracker's position and trajectory can then be calculated from the position differences among its sensors. In the same way, the positions or trajectories of the tracker, VR display controller, and controllers are obtained, so a tracker above a camera can supply the camera's spatial dynamics in real time, and the resulting spatial data, comprising spatial position (x, y, z) and spatial angle (pitch, yaw, roll), is used to drive the real three-dimensional scene to follow. The virtual studio host acquires the spatial data with a Tracker data receiver, which collects it into a list of the corresponding spatial information for convenient use and management. In addition, while acquiring the spatial information in real time and driving the real three-dimensional scene to follow, the virtual studio host also performs solid-color-keyed matting on the camera's captured video, renders and composites the matted video with the following real three-dimensional scene under the control of parameter setting and director switching, then completes the recording of the virtual studio by adding the audio from external audio processing and outputs the PGM signal.
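The timing-to-angle step described above can be sketched as follows, using the 10 ms-per-revolution, 20 ms-per-cycle figures of this embodiment. The function names, and the assumption that the vertical sweep occupies the second half of the cycle and that the rotor angle grows linearly with time, are illustrative simplifications, not the patent's exact implementation:

```python
import math

SWEEP_PERIOD_S = 0.010  # each laser rotor completes one revolution in 10 ms (per the embodiment)

def sweep_angle(t_sync, t_hit):
    """Rotor angle (radians) at the moment its laser plane crossed a sensor.

    t_sync: time of the infrared LED sync flash that marks the sweep start.
    t_hit:  time the photosensitive sensor registered the laser line.
    """
    return 2.0 * math.pi * ((t_hit - t_sync) / SWEEP_PERIOD_S)

def sensor_direction(t_sync, t_hit_horizontal, t_hit_vertical):
    """Angles of a sensor relative to the locator's horizontal and vertical
    axes; the vertical sweep is assumed to occupy the cycle's second 10 ms."""
    az = sweep_angle(t_sync, t_hit_horizontal)
    el = sweep_angle(t_sync + SWEEP_PERIOD_S, t_hit_vertical)
    return az, el
```

Given these per-sensor angle pairs from each locator, the positions of several sensors with known mutual spacing over-determine the tracker's pose, which is how the position differences mentioned above yield position and trajectory.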
Second embodiment (system debugging)
Debugging the system is very simple and requires no physical measurement. Only the tracking and positioning range needs to be set, the mapping between the tracker above the real camera and the virtual camera position established, the scale of the virtual three-dimensional scene adjusted, and a suitable scene selected; the project file is then saved for later use, after which the virtual studio host can shoot and produce programs simply by loading the saved project. Compared with a virtual studio system supporting mechanical tracking, the virtual three-dimensional scene in the invention does not need to be built 1:1 to the shooting environment (the actual environment need not be measured during scene design, nor its structural proportions observed), so scene design is more flexible and not limited by objective conditions. The shooting equipment can be taken away for other work at any time; once the tracker is remounted after use, normal operation resumes immediately. No special demands are placed on the camera operator. The system debugging process of this embodiment mainly comprises parameter setting and system calibration, wherein
the parameter setting comprises the following steps:
setting the proportional relation between foreground and background, where the foreground is the video shot by the camera and the background is the real three-dimensional scene; with this proportion set, the real three-dimensional scene need not be built 1:1 to the real shooting environment, and the system program automatically matches the background to the foreground, simplifying production of the real three-dimensional scene;
and setting the position of the virtual three-dimensional scene, and rotating counterclockwise or clockwise by a certain angle according to the selected shooting scene, wherein the adjustment range is-180 to +180 degrees.
setting the horizontal and front-to-back offsets of the virtual camera position relative to the virtual three-dimensional scene, placing it at the optimal shooting angle; the height of the virtual camera position need not be set, as the tracker's Z data is taken as the reference;
binding the signal source to a virtual camera: the motion of the real camera is displayed visually through the virtual camera, and by numbering the virtual cameras and connecting each to its corresponding video I/O interface, the binding of signal source to virtual camera is achieved;
setting the camera's shooting focal length: to prevent positional drift when fusing the captured video with the real three-dimensional scene, the shooting focal length is expressed as an equivalent 35mm focal length; the 35mm focal length serves as the focal-length standard, and real three-dimensional scenes are generally produced at 35mm-focal-length proportions. The conversion is: equivalent 35mm focal length = actual focal length x focal-length conversion coefficient, where the coefficient is the ratio of the 35mm full-frame diagonal to the diagonal of the actual camera's image sensor (CCD or CMOS);
setting the director-switching operation objects: director switching can switch between real three-dimensional scenes and virtual camera positions (virtual cameras); in addition, director switching also governs the insertion of, for example, streaming media signals, local video, three-dimensional animation, pictures, PPT slides, background music (BGM), and so on.
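The 35mm-equivalence rule in the parameter settings above can be sketched numerically. This is a minimal sketch, assuming the sensor's width and height in millimeters are known; the function names are illustrative:

```python
# Diagonal of a 36 x 24 mm full-frame ("35mm") sensor.
FULL_FRAME_DIAGONAL_MM = (36.0 ** 2 + 24.0 ** 2) ** 0.5

def crop_factor(sensor_width_mm, sensor_height_mm):
    """Focal-length conversion coefficient: full-frame diagonal / sensor diagonal."""
    diagonal = (sensor_width_mm ** 2 + sensor_height_mm ** 2) ** 0.5
    return FULL_FRAME_DIAGONAL_MM / diagonal

def equivalent_35mm_focal_length(actual_focal_mm, sensor_width_mm, sensor_height_mm):
    """Equivalent 35mm focal length = actual focal length x conversion coefficient."""
    return actual_focal_mm * crop_factor(sensor_width_mm, sensor_height_mm)
```

For a full-frame sensor the coefficient is 1, so the actual and equivalent focal lengths coincide; smaller sensors yield a coefficient above 1, which is why the scene proportions are standardized to 35mm before fusion.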
The system calibration comprises the following steps:
tracker calibration: the tracker is placed on the ground and its sensed spatial height coordinate Z is zeroed at that moment. In addition, because the tracker is fixed to the camera's cold shoe (or hot shoe) by a connector, the tracker's center point deviates noticeably from the spatial position of the camera's image sensor, so the tracker must be corrected: the spatial distance between the tracker and the image sensor is measured and the actual axis offsets (X', Y', Z') are entered. The virtual camera then coincides with the center point of the real camera in the virtual three-dimensional scene, eliminating the viewpoint misalignment between the foreground video signal and the virtual three-dimensional scene when the real camera moves.
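Applying the measured (X', Y', Z') offset at runtime means rotating it by the tracker's current orientation before adding it to the tracker's position. A minimal sketch, assuming Z-Y-X (yaw-pitch-roll) Euler angles in radians; the convention and names are illustrative assumptions, not the patent's stated math:

```python
import math

def rotation_matrix(pitch, yaw, roll):
    """Rotation matrix R = Rz(yaw) @ Ry(pitch) @ Rx(roll) from Euler angles (radians)."""
    cp, sp = math.cos(pitch), math.sin(pitch)
    cy, sy = math.cos(yaw), math.sin(yaw)
    cr, sr = math.cos(roll), math.sin(roll)
    return [
        [cy * cp, cy * sp * sr - sy * cr, cy * sp * cr + sy * sr],
        [sy * cp, sy * sp * sr + cy * cr, sy * sp * cr - cy * sr],
        [-sp,     cp * sr,                cp * cr],
    ]

def camera_sensor_position(tracker_pos, tracker_angles, offset):
    """Tracker position plus the measured (X', Y', Z') offset rotated into world space,
    giving the estimated position of the camera's image sensor."""
    R = rotation_matrix(*tracker_angles)
    return tuple(
        tracker_pos[i] + sum(R[i][j] * offset[j] for j in range(3))
        for i in range(3)
    )
```

Because the offset is rotated with the tracker, the virtual camera stays aligned with the real sensor no matter how the operator tilts or pans the rig.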
synchronization calibration: during actual shooting, the foreground signal of the captured video must be matted, which is computationally heavy and introduces some processing delay; the tracker's tracking data therefore needs a corresponding adjustment, for example a delay of 10 frames, handled according to the actual situation. This keeps the foreground signal and the virtual camera position (virtual camera) moving in step as the camera moves. Finally, the fusion-rendered video is displayed through the VR display controller.
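The frame-delay adjustment described above is naturally a fixed-length buffer: each new tracker sample goes in, and the sample from N frames earlier comes out to drive the virtual camera. A minimal sketch; the class name and the 10-frame default (taken from the example in the text) are illustrative:

```python
from collections import deque

class TrackingDelayLine:
    """Delays tracker samples by a fixed number of frames so that virtual-camera
    motion stays in step with the matted (and therefore delayed) foreground video."""

    def __init__(self, delay_frames=10):  # 10 frames is the text's example; tune per setup
        self.buffer = deque(maxlen=delay_frames + 1)

    def push(self, sample):
        """Add the newest sample and return the sample from `delay_frames` ago
        (or the oldest available one while the buffer is still filling)."""
        self.buffer.append(sample)
        return self.buffer[0]
```

Using a bounded deque keeps memory constant and makes the delay a single tunable parameter, matching the "handled according to the actual situation" guidance.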
The above description is only for the preferred embodiment of the present invention, but the scope of the present invention is not limited thereto, and any changes or substitutions that can be easily conceived by those skilled in the art within the technical scope of the present invention are included in the scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the appended claims.

Claims (1)

1. A VR tracking based virtual studio construction system, comprising:
the system comprises at least one camera, trackers matched with the number of cameras, receivers matched with the number of trackers, two locators, two controllers and a VR display controller;
the VR display controller is connected to a virtual studio host; the at least one camera is each connected to the virtual studio host; each tracker is mounted on its corresponding camera; and each receiver is connected to the virtual studio host; the two locators and the two controllers are each in communicating connection with the VR display controller; the two locators communicate with each other; the two locators are fixed at the same height, more than 2 meters above the ground, on the outer edges of the vertical faces on either side of the green or blue box, and the VR display controller and the controllers are placed within the area covered by the locators to enable interactive control;
the VR display controller is configured to receive the real-time data containing spatial information and sensing information from the locators and the controllers and to send it to the virtual studio host; the receiver communicates wirelessly with the tracker, obtaining real-time data containing the tracker's sensing information and sending it to the virtual studio host;
setting the range of motion of the tracker using the controller, the locator, and the VR display controller includes:
holding down the controller's trigger while moving along the edge of the range of motion, and releasing the trigger after returning to the starting position;
computing the controller's movement track from the real-time data of the locator and the controller, the track being automatically closed into a closed quadrilateral;
placing the controller on the ground within the range of motion, thereby obtaining the spatial position of the range's reference plane;
designing a real three-dimensional scene simulating a real environment according to the construction requirement of a virtual studio;
shooting with the at least one camera carrying the tracker in a green-box or blue-box environment fitted with studio lights and locators, thereby obtaining the captured video;
the virtual studio host has multiple functions, including: video I/O, loading of the real three-dimensional scene, parameter setting, director switching, matting, external audio processing, and the composite rendering and recording realized with these functions; the virtual studio host, on one hand, acquires the captured video in real time and mattes it, and on the other hand calculates the tracker's spatial position in real time from the acquired real-time data containing the sensing information to drive the real three-dimensional scene to follow; under the control of parameter setting and director switching, it renders and composites the matted video with the following real three-dimensional scene, and completes the recording of the virtual studio by adding the audio from external audio processing;
further comprising: a sound console connected to the virtual studio host through a USB cable.
CN201811513909.1A 2018-12-11 2018-12-11 Virtual studio construction system and recording method based on VR tracking Active CN109769082B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811513909.1A CN109769082B (en) 2018-12-11 2018-12-11 Virtual studio construction system and recording method based on VR tracking

Publications (2)

Publication Number Publication Date
CN109769082A CN109769082A (en) 2019-05-17
CN109769082B true CN109769082B (en) 2021-07-20

Family

ID=66450411

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811513909.1A Active CN109769082B (en) 2018-12-11 2018-12-11 Virtual studio construction system and recording method based on VR tracking

Country Status (1)

Country Link
CN (1) CN109769082B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112312112A (en) * 2020-11-02 2021-02-02 北京德火科技有限责任公司 Multi-terminal control system of AR immersion type panoramic simulation system and control method thereof
CN112383679A (en) * 2020-11-02 2021-02-19 北京德火科技有限责任公司 Remote same-screen remote interview mode of AR immersive panoramic simulation system at different places and control method thereof

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN205622745U (en) * 2016-05-10 2016-10-05 倪宏伟 Real -time synthesis system of virtual reality true man
CN106937128A (en) * 2015-12-31 2017-07-07 幸福在线(北京)网络技术有限公司 A kind of net cast method, server and system and associated uses
CN208094659U (en) * 2018-02-05 2018-11-13 广州必威易微播科技有限责任公司 A kind of video capture device

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105072314A (en) * 2015-08-13 2015-11-18 黄喜荣 Virtual studio implementation method capable of automatically tracking objects
US11665308B2 (en) * 2017-01-31 2023-05-30 Tetavi, Ltd. System and method for rendering free viewpoint video for sport applications

Also Published As

Publication number Publication date
CN109769082A (en) 2019-05-17

Similar Documents

Publication Publication Date Title
JP6131403B2 (en) System and method for 3D projection mapping with robot controlled objects
US10580153B2 (en) Optical navigation and positioning system
WO2021238804A1 (en) Mixed reality virtual preview photographing system
CN106780601B (en) Spatial position tracking method and device and intelligent equipment
JP6551392B2 (en) System and method for controlling an apparatus for image capture
AU2020417796B2 (en) System and method of capturing and generating panoramic three-dimensional images
CN108830906B (en) Automatic calibration method for camera parameters based on virtual binocular vision principle
US9881377B2 (en) Apparatus and method for determining the distinct location of an image-recording camera
CN109769082B (en) Virtual studio construction system and recording method based on VR tracking
WO2015039911A1 (en) Method for capturing the 3d motion of an object by means of an unmanned aerial vehicle and a motion capture system
CN104657970B (en) A kind of scaling method and calibration system of full-automatic binocular endoscope
US20220067972A1 (en) Multi-presence detection for performance capture
JP2018527575A (en) Device and method for finding a measurement point using an image capture device
CN212231547U (en) Mixed reality virtual preview shooting system
US11055922B2 (en) Apparatus and associated methods for virtual reality scene capture
CN111147840A (en) Automatic control and communication system for video and audio acquisition of 3D camera rocker arm
CN117527995A (en) Simulated live-action shooting method and system based on space simulated shooting
CN115103169A (en) Projection picture correction method, projection picture correction device, storage medium and projection equipment
CN202856914U (en) Photographic equipment of synthesis photography system based on computer virtual three-dimensional scene database
CN116368350A (en) Motion capture calibration using targets

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant