CN110866978A - Camera synchronization method in real-time mixed reality video shooting - Google Patents

Camera synchronization method in real-time mixed reality video shooting

Info

Publication number
CN110866978A
CN110866978A (application CN201911083958.0A)
Authority
CN
China
Prior art keywords
camera
virtual
scene
video
reality
Prior art date
Legal status
Pending
Application number
CN201911083958.0A
Other languages
Chinese (zh)
Inventor
杨晓春
王斌
冯策
Current Assignee
Liaoning Dongzhiweishi Technology Co Ltd
Original Assignee
Liaoning Dongzhiweishi Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Liaoning Dongzhiweishi Technology Co Ltd filed Critical Liaoning Dongzhiweishi Technology Co Ltd
Priority to CN201911083958.0A priority Critical patent/CN110866978A/en
Publication of CN110866978A publication Critical patent/CN110866978A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • G06T19/006Mixed reality
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/04Synchronising

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention discloses a camera synchronization method for real-time mixed reality video shooting, in the technical field of mixed reality video production, comprising the following steps: S1, establishing a virtual scene model; S2, importing the virtual scene model into a virtual reality development platform and adding a scene virtual camera to it; S3, establishing a communication connection between the virtual reality equipment and the virtual reality development platform; S4, building an indoor shooting environment, configuring lighting and a video camera, and establishing communication between the video camera and the virtual reality development platform; S5, configuring the optical parameters of the video camera and the scene virtual camera so that they remain consistent, and synchronizing the motion poses of the two cameras; S6, compositing the pictures obtained by the scene virtual camera and the video camera into the final special-effect video frame. The method improves the realism of mixed reality video, to the point that the virtual is indistinguishable from the real.

Description

Camera synchronization method in real-time mixed reality video shooting
Technical Field
The invention relates to the technical field of mixed reality video shooting, in particular to a camera synchronization method in real-time mixed reality video shooting.
Background
Virtual Reality (VR) technology is a computer simulation technology for creating and experiencing a virtual world. A computer generates a virtual environment, a systematic simulation of multi-source information fusion and of interactive three-dimensional dynamic views and physical behaviors, and immerses the user in that environment.
Virtual reality technology immerses a user in a computer-generated three-dimensional virtual environment, isolated from the real one, and requires corresponding hardware; the current mainstream devices are the HTC Vive and Oculus Rift. Augmented Reality (AR) and Mixed Reality (MR) technologies are derived from virtual reality. Augmented reality is a fusion technology that adds or removes computer-generated interactive virtual objects or information in a real environment in real time, superimposing virtual content on the real world; the current mainstream devices are Microsoft HoloLens and Google Glass. Mixed reality holographically mixes the real environment and the virtual environment with each other; it integrates VR and AR technology, and video shot with advanced mixed reality technology can make the virtual indistinguishable from the real.
Virtual reality technology provides a rich virtual experience to the player wearing the headset, but that experience can only be shared as first-person video, and spectators not wearing a headset cannot obtain a satisfying viewing experience. Mixed reality technology fuses the virtual scene with the real player and captures the result as highly immersive video. Mixed reality video shares the virtual reality scene more intuitively and also enables fast, efficient special-effect filming: in traditional film production, actors in effects scenes perform in front of a green or blue screen, after which enormous effort is spent on post-production compositing and lengthy per-frame rendering. Mixed reality technology can produce special-effect footage in real time, greatly shortening the shooting cycle and reducing the shooting cost.
A prominent problem in existing mixed reality video shooting is synchronizing the camera of the virtual scene with that of the real environment. For real-time shooting, the translation, rotation and optical parameters of the camera in the virtual scene must stay consistent with those of the real camera used for shooting in the real environment. High-precision synchronization improves the realism of the mixed reality video; conversely, inaccurate synchronization produces unconvincing pictures.
In current movie production, special-effect films are ever more widespread, and digital effects can realize many pictures that are difficult to shoot in reality, even fantastical scenes that do not exist at all. Combining virtual scenes with live blue/green-screen footage of actors yields strongly realistic pictures, to the point of being indistinguishable from reality. However, current special-effect film production, especially at the Hollywood level, consumes enormous manpower and time across live-action shooting, virtual scene modeling, post-production compositing and rendering. To address this, a camera synchronization method for real-time mixed reality video shooting is provided.
Disclosure of Invention
The present invention is directed to a camera synchronization method for real-time mixed reality video shooting that solves the problems raised in the background art.
To achieve the above purpose, the invention provides the following technical scheme:
a camera synchronization method in real-time mixed reality video shooting comprises the following steps:
s1, establishing a virtual scene model;
s2, importing the obtained virtual scene model into a virtual reality development platform, and adding a scene virtual camera for the virtual scene model;
s3, establishing communication connection between the virtual reality equipment and the virtual reality development platform;
s4, building an indoor shooting environment, configuring light and a camera, and building communication between the camera and a virtual reality development platform;
s5, configuring optical parameters of the video camera and the scene virtual camera, keeping the optical parameters of the video camera and the scene virtual camera consistent, and synchronizing the motion poses of the video camera and the scene virtual camera;
and S6, synthesizing the pictures obtained by the scene virtual camera and the video camera to form a final special effect video picture.
As a further scheme of the invention: in step S5, the method of synchronizing the motion poses of the video camera and the scene virtual camera includes:
the camera shares the position and posture information thereof to the scene virtual camera, so that the scene virtual camera maintains the same position and posture information as the camera.
As a still further scheme of the invention: the method comprises the steps of installing a tracking module used for obtaining position and posture information of virtual reality equipment on a video camera, establishing communication between the tracking module and a virtual reality development platform, obtaining signals of the tracking module by the virtual reality development platform in real time, and controlling a scene virtual camera and the video camera to keep synchronization of motion poses.
As a still further scheme of the invention: in step S6, the scene obtained by the scene virtual camera is a four-screen image, which is displayed in the foreground, the alpha channel of the foreground, the background, and the virtual reality device, respectively.
As a still further scheme of the invention: in step S6, the actor picture obtained by the camera is scratched to remove the blue-green screen background in the picture, and then the background, the scratched actor picture and the foreground are superimposed and synthesized in sequence to form a final special-effect video picture.
As a still further scheme of the invention: the communication between the virtual reality equipment and the virtual reality development platform and the communication between the camera and the virtual reality development platform are realized through HDMI high-definition data lines.
Compared with the prior art, the invention has the following beneficial effects: the position and posture information of the scene virtual camera and the video camera are kept fully synchronized, improving the realism of the mixed reality video to the point that the virtual is indistinguishable from the real. Mixed reality video can be shot, and even live-streamed, in real time, greatly shortening the production cycle of special-effect video and reducing production cost.
Drawings
Fig. 1 is a flow chart of a method of camera synchronization in real-time mixed reality video capture.
Fig. 2 is a schematic diagram of an indoor shooting environment.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The implementations described in the exemplary embodiments below are not intended to represent all implementations consistent with the present disclosure. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the present disclosure, as detailed in the appended claims.
Example 1
Referring to fig. 1-2, in an embodiment of the present invention, a camera synchronization method in real-time mixed reality video shooting includes the following steps:
s1, establishing a virtual scene model, wherein the virtual scene model can be realized through three-dimensional modeling software, the virtual scene model needs to comprise various auxiliary materials which can enhance the reality of a three-dimensional scene, such as light shadow, a mapping, a particle effect and the like besides a general three-dimensional model, the quality of the three-dimensional scene model determines the effect of real-time synthesis in the later stage, and the currently mainstream three-dimensional modeling software comprises 3dsMax, Maya, Cinema 4D and the like;
s2, importing the obtained virtual scene model into a virtual reality development platform, where the virtual reality development platform is actually a Unity3D development platform running on a high-performance computer, Unity3D is a currently widely applied virtual reality development platform, and adding a corresponding scene virtual camera, a driving script, and the like after importing the virtual scene model;
s3, establishing communication connection between virtual reality equipment and a virtual reality development platform, wherein the virtual reality equipment can be HTCVive and is connected with a high-performance computer running a Unity3D development platform through an HDMI high-definition data line, namely, communication between the virtual reality equipment and the virtual reality development platform is realized, and in addition, an HTC official virtual reality development library, namely a SteamVR library file, can be added to the Unity3D development platform and is used for improving the development efficiency of the virtual reality development platform;
s4, building an indoor shooting environment, configuring light and a camera, and building communication between the camera and a virtual reality development platform;
specifically, the method comprises the following steps: the method comprises the steps of establishing a shooting background environment of a blue curtain or a green curtain, generally adopting the blue-green curtain as a background in special-effect movie shooting, wherein the blue color and the green color are colors greatly different from human skin, the human skin mainly comprises yellow, red and orange colors, and the contrast is strong, so that the later-stage character matting can achieve a better effect, and meanwhile, the curtain needs to be uniformly distributed with light, so that an obvious shadow is prevented from being formed in the curtain; the curtain of the blue curtain or the green curtain needs to be uniformly light-distributed, so that an obvious shadow is prevented from being formed in the curtain;
a video camera is then configured and connected to the high-performance computer through a video capture card and an HDMI data line, realizing communication between the camera and the virtual reality development platform.
S5, configuring optical parameters of the video camera and the scene virtual camera, keeping the optical parameters of the video camera and the scene virtual camera consistent, and synchronizing the motion poses of the video camera and the scene virtual camera;
To synchronize the virtual scene with the real one, the optical parameters and motion pose of the scene virtual camera must be kept fully consistent with those of the camera shooting the actors.
The video camera shares its position and posture information with the scene virtual camera, so that the scene virtual camera maintains the same position and posture as the video camera.
To this end, a tracking module capable of acquiring position and posture information from the virtual reality equipment is designed and mounted on the video camera; communication is established between the tracking module and the virtual reality development platform, which acquires the module's signal in real time and keeps the motion poses of the scene virtual camera and the video camera synchronized.
The tracking module can locate its own three-dimensional spatial coordinates in real time within the tracked volume; it is the medium that keeps the video camera and the scene virtual camera synchronized.
Here the tracking module is implemented as a handle (identical to the two default handles of the virtual reality device HTC Vive, and referred to as the third handle). Because the HTC Vive supports only two handles by default, the third handle must be connected to the high-performance computer through a USB 3.0 interface to enable it; the scene virtual camera in the Unity3D development platform is then bound to the model of the third handle. In this way, the movement of the video camera is transferred to the scene virtual camera in real time through the third handle.
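Binding the scene virtual camera to the third handle amounts to composing the handle's tracked pose with a fixed camera-to-tracker mounting offset every frame. The patent does this inside the Unity3D platform; the Python sketch below (using numpy, with an assumed x-y-z Euler rotation order) only illustrates the underlying transform, not the actual Unity binding:

```python
import numpy as np

def euler_to_matrix(rx, ry, rz):
    """Rotation matrix from Euler angles in degrees.
    The rotation order (x after y after z) is an assumption; engines differ."""
    rx, ry, rz = np.radians([rx, ry, rz])
    cx, sx = np.cos(rx), np.sin(rx)
    cy, sy = np.cos(ry), np.sin(ry)
    cz, sz = np.cos(rz), np.sin(rz)
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    return Rx @ Ry @ Rz

def virtual_camera_pose(tracker_pos, tracker_rot, offset_pos, offset_rot):
    """Compose the tracked handle pose with the fixed camera-to-tracker
    mounting offset to get the scene virtual camera pose."""
    pos = tracker_pos + tracker_rot @ offset_pos  # offset is in tracker space
    rot = tracker_rot @ offset_rot
    return pos, rot
```

With the offsets from the .cfg file described below supplied as `offset_pos` and `offset_rot`, calling `virtual_camera_pose` once per tracked frame keeps the two poses locked together.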
A .cfg configuration file is created. To make the image rendered by the scene virtual camera coincide exactly with the actor footage shot by the real video camera, the two cameras must be synchronized in real time and their optical parameters must agree.
The .cfg file contains the following parameters:
x=0.0
y=0.002
z=-0.02
rx=76
ry=0
rz=0
fov=60
near=0.1
far=100
sceneResolutionScale=0.5
where x, y, z are the three-dimensional offset of the real video camera from the virtual camera represented by the third handle; rx, ry, rz are the rotation angles of the real camera relative to that virtual camera; fov is the vertical field of view of the scene virtual camera; near and far are the near and far clipping distances of the rendered background; and sceneResolutionScale controls the rendered picture quality.
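The configuration file is plain key=value text, so loading it needs only a few lines. A minimal Python sketch follows; the function name and the comment handling are illustrative assumptions, not taken from the patent:

```python
def parse_camera_cfg(text):
    """Parse key=value lines of the camera offset .cfg into a dict of floats."""
    params = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip blank lines and (assumed) comments
        key, _, value = line.partition("=")
        params[key.strip()] = float(value.strip())
    return params

# The parameter values listed above:
cfg_text = """\
x=0.0
y=0.002
z=-0.02
rx=76
ry=0
rz=0
fov=60
near=0.1
far=100
sceneResolutionScale=0.5
"""
params = parse_camera_cfg(cfg_text)
```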
In practical application, the scene virtual camera renders its output as a four-way split screen: the foreground, the foreground's alpha channel, the background, and the picture displayed in the virtual reality device.
The display device showing the four-way split screen must support 4K resolution, so that each quadrant of the split reaches the 1080p resolution required for video compositing; the foreground and background pictures are then used for the final composite.
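Each quadrant of a 3840x2160 (4K UHD) frame is exactly 1920x1080, so extracting the four pictures reduces to array slicing. A Python sketch with numpy follows; which quadrant carries which picture is an assumption, since the patent only names the four streams:

```python
import numpy as np

def split_quadrants(frame):
    """Split a 4K composite frame into its four 1080p quadrants.
    Quadrant assignment (assumed): top-left foreground, top-right
    foreground alpha, bottom-left background, bottom-right VR view."""
    h, w = frame.shape[:2]
    hh, hw = h // 2, w // 2
    return {
        "foreground": frame[:hh, :hw],
        "foreground_alpha": frame[:hh, hw:],
        "background": frame[hh:, :hw],
        "vr_view": frame[hh:, hw:],
    }
```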
S6, the actor picture shot by the video camera is chroma-keyed to remove the blue/green-screen background, and then the background, the keyed actor picture and the foreground are superimposed in sequence to form the final special-effect video frame.
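Step S6 can be sketched in Python with numpy as a simple green-dominance key followed by the background/actor/foreground layering. This is a deliberately minimal keyer for illustration; the patent does not specify its keying algorithm, and production chroma keyers handle spill, soft edges and semi-transparency far more carefully:

```python
import numpy as np

def chroma_key_green(rgb, threshold=40):
    """Alpha matte from green dominance: a pixel counts as 'screen' when its
    green channel exceeds max(red, blue) by more than `threshold` (0-255).
    Assumes RGB channel order; the threshold value is an assumption."""
    r = rgb[..., 0].astype(int)
    g = rgb[..., 1].astype(int)
    b = rgb[..., 2].astype(int)
    screen = (g - np.maximum(r, b)) > threshold
    return np.where(screen, 0.0, 1.0)  # 1.0 = keep (actor), 0.0 = transparent

def composite(background, actor, actor_alpha, foreground, foreground_alpha):
    """Layer in the order the description gives: background at the bottom,
    then the keyed actor, then the foreground on top."""
    a = actor_alpha[..., None]
    out = actor * a + background * (1 - a)
    f = foreground_alpha[..., None]
    return foreground * f + out * (1 - f)
```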
It should be particularly noted that in this technical solution the position and posture information of the scene virtual camera and the video camera are kept fully synchronized, which improves the realism of the mixed reality video to the point that the virtual is indistinguishable from the real. Mixed reality video can be shot, and even live-streamed, in real time, greatly shortening the production cycle of special-effect video shooting and reducing production cost.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.

Claims (6)

1. A camera synchronization method in real-time mixed reality video shooting is characterized by comprising the following steps:
s1, establishing a virtual scene model;
s2, importing the obtained virtual scene model into a virtual reality development platform, and adding a scene virtual camera for the virtual scene model;
s3, establishing communication connection between the virtual reality equipment and the virtual reality development platform;
s4, building an indoor shooting environment, configuring light and a camera, and building communication between the camera and a virtual reality development platform;
s5, configuring optical parameters of the video camera and the scene virtual camera, keeping the optical parameters of the video camera and the scene virtual camera consistent, and synchronizing the motion poses of the video camera and the scene virtual camera;
and S6, synthesizing the pictures obtained by the scene virtual camera and the video camera to form a final special effect video picture.
2. The method for synchronizing the cameras in the real-time mixed reality video shooting process according to claim 1, wherein in step S5, the method for synchronizing the motion poses of the video camera and the scene virtual camera comprises:
the camera shares the position and posture information thereof to the scene virtual camera, so that the scene virtual camera maintains the same position and posture information as the camera.
3. The camera synchronization method for real-time mixed reality video shooting according to claim 2, wherein a tracking module for acquiring position and posture information from the virtual reality device is installed on the video camera, communication is established between the tracking module and the virtual reality development platform, and the virtual reality development platform acquires the tracking module's signal in real time and keeps the motion poses of the scene virtual camera and the video camera synchronized.
4. The camera synchronization method for real-time mixed reality video shooting according to claim 1, wherein in step S6 the output of the scene virtual camera is a four-way split-screen frame, consisting of the foreground, the alpha channel of the foreground, the background, and the frame displayed in the virtual reality device, respectively.
5. The method as claimed in claim 4, wherein in step S6 the actor frame captured by the video camera is matted to remove the blue/green-screen background, and then the background, the matted actor frame and the foreground are superimposed in sequence to form the final special-effect video frame.
6. The method for synchronizing the cameras in the real-time mixed reality video shooting process according to claim 1, wherein the communication between the virtual reality device and the virtual reality development platform and the communication between the video camera and the virtual reality development platform are realized through HDMI high definition data lines.
CN201911083958.0A 2019-11-07 2019-11-07 Camera synchronization method in real-time mixed reality video shooting Pending CN110866978A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911083958.0A CN110866978A (en) 2019-11-07 2019-11-07 Camera synchronization method in real-time mixed reality video shooting


Publications (1)

Publication Number Publication Date
CN110866978A true CN110866978A (en) 2020-03-06

Family

ID=69653336

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911083958.0A Pending CN110866978A (en) 2019-11-07 2019-11-07 Camera synchronization method in real-time mixed reality video shooting

Country Status (1)

Country Link
CN (1) CN110866978A (en)

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111970415A (en) * 2020-09-01 2020-11-20 太仓中科信息技术研究院 Real-time synthesis preview system and method for realizing miniature scene and live-action shooting
CN111970453A (en) * 2020-09-01 2020-11-20 太仓中科信息技术研究院 Virtual shooting system and method for camera robot
CN111988535A (en) * 2020-08-10 2020-11-24 山东金东数字创意股份有限公司 System and method for optically positioning fusion picture
CN112181139A (en) * 2020-09-17 2021-01-05 东北大学 Cooperative control interaction method for virtual reality and mixed reality
CN112837425A (en) * 2021-03-10 2021-05-25 西南交通大学 Mixed reality illumination consistency adjusting method
CN112929627A (en) * 2021-02-22 2021-06-08 广州博冠信息科技有限公司 Virtual reality scene implementation method and device, storage medium and electronic equipment
CN113284233A (en) * 2021-06-17 2021-08-20 知守科技(杭州)有限公司 Visual monitoring method, device, system, electronic device and storage medium
CN114760458A (en) * 2022-04-28 2022-07-15 中南大学 Method for synchronizing tracks of virtual camera and real camera of high-reality augmented reality studio
CN115103138A (en) * 2022-07-11 2022-09-23 北京梦想绽放科技有限公司 Method and system for generating virtual-real fusion image based on space-time consistency
CN115150555A (en) * 2022-07-15 2022-10-04 北京字跳网络技术有限公司 Video recording method, device, equipment and medium
CN115802165A (en) * 2023-02-10 2023-03-14 成都索贝数码科技股份有限公司 Lens moving shooting method applied to live connection of different places and same scenes
CN116506559A (en) * 2023-04-24 2023-07-28 江苏拓永科技有限公司 Virtual reality panoramic multimedia processing system and method thereof

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104836938A (en) * 2015-04-30 2015-08-12 江苏卡罗卡国际动漫城有限公司 Virtual studio system based on AR technology
CN105407259A (en) * 2015-11-26 2016-03-16 北京理工大学 Virtual camera shooting method
CN106980369A (en) * 2017-03-01 2017-07-25 广州市英途信息技术有限公司 The synthesis of 3rd multi-view video of virtual reality project and output system and method
CN107158695A (en) * 2017-06-16 2017-09-15 苏州蜗牛数字科技股份有限公司 A kind of reality mixes the method and system showed with virtual scene
CN107976811A (en) * 2017-12-25 2018-05-01 河南新汉普影视技术有限公司 A kind of simulation laboratory and its emulation mode based on virtual reality mixing
US20190110004A1 (en) * 2017-10-09 2019-04-11 Tim Pipher Multi-Camera Virtual Studio Production Process
CN109639933A (en) * 2018-12-07 2019-04-16 北京美吉克科技发展有限公司 A kind of method and system of 360 degree of panorama program makings of virtual studio
CN109727314A (en) * 2018-12-20 2019-05-07 初速度(苏州)科技有限公司 A kind of fusion of augmented reality scene and its methods of exhibiting



Similar Documents

Publication Publication Date Title
CN110866978A (en) Camera synchronization method in real-time mixed reality video shooting
US11076142B2 (en) Real-time aliasing rendering method for 3D VR video and virtual three-dimensional scene
Matsuyama et al. 3D video and its applications
US8358332B2 (en) Generation of three-dimensional movies with improved depth control
WO2018121333A1 (en) Real-time generation method for 360-degree vr panoramic graphic image and video
CN113099204B (en) Remote live-action augmented reality method based on VR head-mounted display equipment
US20140306995A1 (en) Virtual chroma keying in real time
WO2006047610A2 (en) Method and apparatus for a virtual scene previewing system
EP2476259A1 (en) Virtual insertions in 3d video
CN106331521A (en) Film and television production system based on combination of network virtual reality and real shooting
US20070247518A1 (en) System and method for video processing and display
CN213461894U (en) XR-augmented reality system
CN115118880A (en) XR virtual shooting system based on immersive video terminal is built
CN107862718A (en) 4D holographic video method for catching
US11615755B1 (en) Increasing resolution and luminance of a display
Schreer et al. Advanced volumetric capture and processing
CN108632538B (en) CG animation and camera array combined bullet time shooting system and method
CN116320363B (en) Multi-angle virtual reality shooting method and system
CN116320364B (en) Virtual reality shooting method and display method based on multi-layer display
CN112003999A (en) Three-dimensional virtual reality synthesis algorithm based on Unity 3D
CN202171927U (en) Phantom imaging system
CN104202589A (en) Multichannel three-dimensional film video synchronous playing method
CN114885147B (en) Fusion production and broadcast system and method
Zhou et al. RGBD-based real-time volumetric reconstruction system: Architecture design and implementation
KR20230018571A (en) Image photographing solution of extended reality based on virtual production system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication
Application publication date: 20200306