CN115720250A - Synchronization method applied to augmented reality - Google Patents

Synchronization method applied to augmented reality

Info

Publication number
CN115720250A
Authority
CN
China
Prior art keywords
real
virtual
camera
identification pattern
delay
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211437455.0A
Other languages
Chinese (zh)
Inventor
王立新 (Wang Lixin)
盛果 (Sheng Guo)
李栋 (Li Dong)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Lan Jing Technology Co ltd
Original Assignee
Beijing Lan Jing Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Lan Jing Technology Co., Ltd.
Priority to CN202211437455.0A
Publication of CN115720250A
Legal status: Pending

Abstract

The invention discloses a synchronization method applied to augmented reality, comprising the following steps. S1, measure the delay between a virtual path and a real path: check, on the display of the compositing server, whether the real identification pattern coincides with the virtual identification pattern; if not, keep shaking the camera while adjusting the delay time applied to the tracking data until the two patterns coincide. The delay time set at that point is the delay between the two paths, and compensation software then compensates for this delay so that the pictures of the two paths stay consistent. S2, during actual shooting, the compensation software in the compositing server uses the delay measured in S1 to keep the pictures in the two data streams consistent, enabling interaction between real actors and virtual characters. With this technical scheme, the actors stay synchronized with the virtual characters even when the camera moves rapidly.

Description

Synchronization method applied to augmented reality
Technical Field
The invention relates to the technical field of augmented reality, in particular to a synchronization method applied to augmented reality.
Background
Augmented Reality (AR) is a technology that seamlessly integrates real-world and virtual-world information: it overlays a virtual world onto the real world on a screen and lets the two interact. Here it refers to a virtual environment created with various hardware devices such as an LED display screen, a camera, tracking equipment, and a virtual-engine server. At present, augmented reality is widely applied in film and television shooting, with the following main application scenario:
the 3D digital scene manufactured by a virtual Engine (an unknown Engine or a Notch Engine, etc.) is rendered in real time and transmitted to a synthesis server, a camera shoots a real image (such as an actor), and the synthesis server receives the shooting content of the camera through an acquisition card and synthesizes the shooting content with a virtual picture rendered by the virtual Engine to obtain a final picture.
The hard problem in this scenario is that synchronization between the picture shot by the camera and the picture rendered in real time cannot be guaranteed: when the camera moves rapidly, actors and virtual characters are difficult to keep in sync, so the speed at which the camera operator may move the camera has to be limited, which hurts the shooting effect.
To this end, we propose a synchronization method applied to augmented reality to solve the above problem.
Disclosure of Invention
The present invention is directed to a synchronization method applied to augmented reality to solve the problems set forth in the background art.
In order to achieve the purpose, the invention provides the following technical scheme:
a synchronization method applied to augmented reality comprises the following steps:
S1, measure the delay between a virtual path and a real path: the shooting camera carries a tracking device that collects the camera's tracking data and transmits it to a compositing server, and the compositing server generates a virtual identification pattern from the tracking data. The camera shoots a real picture of the performers and the display device, on which the real identification pattern is shown, and the real picture is transmitted to the compositing server over the real path. On the compositing server's display, check whether the real identification pattern coincides with the virtual identification pattern; if not, keep shaking the camera while adjusting the delay time applied to the tracking data until the two patterns coincide. The delay time set at that point is the delay between the two paths, and compensation software then compensates for this delay so that the pictures of the two paths stay consistent;
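The measurement in S1 amounts to finding the time shift at which the two identification-pattern tracks coincide. A minimal sketch of that search follows; it assumes the pattern position can be extracted from each frame as a single coordinate (the extraction itself, and the function name, are assumptions outside the patent).

```python
def measure_delay(real_track, virtual_track, max_delay):
    """Return the delay, in frames, that best aligns the two tracks.

    real_track[t] is the identification-pattern position seen in the
    camera frame at frame t; virtual_track[t] is the position rendered
    from the (uncompensated) tracking data. While the camera is shaken,
    the real track is a time-shifted copy of the virtual one, and the
    shift with the smallest misalignment is the path delay.
    """
    def misalignment(delay):
        # Compare the real frame at t + delay with the virtual frame at t.
        return sum((r - v) ** 2
                   for r, v in zip(real_track[delay:], virtual_track))
    return min(range(max_delay + 1), key=misalignment)
```

The patent performs this search manually, by eye, adjusting the delay until the patterns butt together; the sketch only shows why a unique best shift exists while the camera is moving.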
S2, during actual shooting, the tracking device transmits the tracking data to the virtual engine, which renders images in real time from the tracking data and transmits them to the compositing server; meanwhile, the picture of the display device and the performers shot by the camera is also transmitted to the compositing server. Using the delay measured in S1, the compensation software in the compositing server keeps the pictures in the two data streams consistent, enabling interaction between real actors and virtual characters. The camera, the compositing server, and the tracking device are all genlocked (synchronously phase-locked), which guarantees that the delay of the whole system stays fixed.
In a further embodiment, the tracking device is a RedSpy infrared camera tracker, fixed on top of the video camera and moving with it.
In a further embodiment, the tracking data includes the camera's position (x, y, z), attitude (pan, tilt, roll), and field-of-view (fov) information, and the virtual engine's rendering process generates, from the tracking data, an image consistent with the camera's position, attitude, and field of view.
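One frame of such tracking data could be modelled as follows. This is an illustrative assumption: the patent lists the fields but specifies no wire format, units, or axis conventions, so the names and units here are hypothetical.

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class TrackingSample:
    """One genlocked frame of tracker output, per the fields listed above."""
    x: float     # camera position (axis convention and units assumed)
    y: float
    z: float
    pan: float   # camera attitude, degrees
    tilt: float
    roll: float
    fov: float   # horizontal field of view, degrees
```

The virtual engine would configure its virtual camera from one such sample per frame, so that the rendered view matches the real camera's position, attitude, and field of view.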
In a further embodiment, the real identification pattern and the virtual identification pattern are each three vertical lines arranged side by side: the upper part shows the three real vertical lines and the lower part the three virtual vertical lines. Picture consistency means that the three upper vertical lines butt exactly against the three lower vertical lines.
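Once the on-screen x positions of the six lines are known, the "butt exactly" condition can be checked mechanically. A sketch follows; the pixel-extraction step, the tolerance value, and the function name are assumptions, since the patent describes a visual check.

```python
def lines_butted(real_xs, virtual_xs, tolerance_px=1.0):
    """True when each upper (real) vertical line butts against the
    lower (virtual) line directly beneath it, within a pixel tolerance."""
    if len(real_xs) != len(virtual_xs):
        return False
    return all(abs(r - v) <= tolerance_px
               for r, v in zip(sorted(real_xs), sorted(virtual_xs)))
```

In practice the operator judges this by eye on the compositing server's display while adjusting the delay time.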
In a further embodiment, the display device comprises an LED screen or a curtain.
In a further embodiment, the virtual engine is Unreal Engine or Notch.
Compared with the prior art, the invention has the beneficial effect that the actors stay synchronized with the virtual characters, even when the camera moves rapidly.
Drawings
FIG. 1 is a schematic view of an actual shooting process of the present patent;
FIG. 2 is a schematic diagram of a measurement delay flow of the present patent;
fig. 3 is a test chart of actual measurement delay of this patent.
Detailed Description
In the description of the present invention, it should be noted that, unless otherwise explicitly specified or limited, the terms "mounted," "connected," and "coupled" are to be construed broadly: for example, as a fixed connection, a removable connection, or an integral connection; as a mechanical or an electrical connection; as a direct connection or an indirect connection through an intermediate medium; or as internal communication between two elements. The specific meaning of these terms in the present invention can be understood by those of ordinary skill in the art according to the specific situation.
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
As shown in fig. 1 to 3, the present invention provides a synchronization method applied to augmented reality, which includes the following steps:
s1, measuring the delay between a virtual path and a real path: the camera for shooting is provided with a tracking device, the tracking device collects tracking data of the camera, the tracking data is transmitted to a synthesis server, the synthesis server generates a virtual identification pattern according to the tracking data, the camera shoots real pictures of performers and display equipment, the real identification pattern is displayed on the display equipment, the real pictures are transmitted to the synthesis server through a real path, a display of the synthesis server checks whether the real identification pattern is consistent with the virtual identification pattern, if not, the camera is continuously shaken and the delay time of the tracking data is adjusted until the real identification pattern is consistent, the delay time set at the moment is the delay between the two paths, and then compensation software is used for compensating the delay, so that the picture consistency of the two paths can be realized, wherein the camera is shaken mainly to make the virtual and real pictures move so as to conveniently adjust the delay time, finding out the time point when the virtual identification pattern is consistent with the real identification pattern, the principle is that the camera shoots a real picture at a certain position or posture, the virtual engine also has a virtual camera to shoot a virtual scene, the virtual camera of the virtual engine shoots a virtual image according to the position and posture of the real camera which is transmitted by tracking, the virtual image and the real image are synthesized together, then the virtual image and the real image which are synthesized into the frame are required to be at the same position and posture, if the camera is static, the position and posture are fixed, therefore, the camera must move, the position and posture of the virtual image and the real image which are synthesized each time are the same, if 
the virtual image is a sofa, a person sits on the sofa, and when the camera moves, if the synthesized virtual image and the real image are not the generated image at the same position and posture, the person and the sofa have relative motion phenomena, the person and the sofa are not relatively static, the picture looks like double images and is out of section, in conclusion, not only a certain frame is consistent, but also each frame is consistent after the delay time is adjusted, and the reality of a virtual image is ensured;
s2, in the actual shooting process, the tracking device transmits tracking data to the virtual engine, the virtual engine renders images in real time according to the tracking data and transmits the images to the synthesis server, meanwhile, pictures of the display device and the performer shot by the camera are also transmitted to the synthesis server, compensation software in the synthesis server enables the pictures in the two sets of data to be consistent according to the delay in the S1, interaction between a real actor and a virtual character is achieved, the camera, the synthesis server and the tracking device are all genlock synchronous, namely synchronous phase locking is achieved, and therefore the delay of the whole system is guaranteed to be fixed. In step S2, the delay result tested in step S1 is used to ensure synchronization between the virtual frame and the real frame.
In this embodiment, the tracking device is a RedSpy infrared camera tracker, fixed on top of the camera and moving with it. The tracking data includes the camera's position (x, y, z), attitude (pan, tilt, roll), and field-of-view (fov) information, and the virtual engine's rendering process generates, from the tracking data, an image consistent with the camera's position, attitude, and field of view.
In this embodiment, the real identification pattern and the virtual identification pattern are each three vertical lines arranged side by side: the upper part shows the three real vertical lines and the lower part the three virtual vertical lines, and a consistent picture means that the three upper vertical lines butt exactly against the three lower vertical lines.
The display device comprises an LED screen or a curtain. The virtual engine is Unreal Engine or Notch.
It will be evident to those skilled in the art that the invention is not limited to the details of the foregoing illustrative embodiments, and that the present invention may be embodied in other specific forms without departing from the spirit or essential attributes thereof. The present embodiments are therefore to be considered in all respects as illustrative and not restrictive, the scope of the invention being indicated by the appended claims rather than by the foregoing description, and all changes which come within the meaning and range of equivalency of the claims are therefore intended to be embraced therein. Any reference sign in a claim should not be construed as limiting the claim concerned.
Furthermore, it should be understood that although the present description refers to embodiments, not every embodiment may contain only a single embodiment, and such description is for clarity only, and those skilled in the art should integrate the description, and the embodiments may be combined as appropriate to form other embodiments understood by those skilled in the art.

Claims (6)

1. A synchronization method applied to augmented reality is characterized by comprising the following steps:
S1, measuring the delay between a virtual path and a real path: the shooting camera carries a tracking device that collects the camera's tracking data and transmits it to a compositing server, and the compositing server generates a virtual identification pattern from the tracking data; the camera shoots a real picture of the performers and the display device, on which the real identification pattern is shown, and the real picture is transmitted to the compositing server over the real path; on the compositing server's display, checking whether the real identification pattern coincides with the virtual identification pattern, and if not, continuously shaking the camera while adjusting the delay time applied to the tracking data until the two patterns coincide, the delay time set at that point being the delay between the two paths; and then compensating for the delay with compensation software, so that the pictures of the two paths stay consistent;
S2, during actual shooting, the tracking device transmits the tracking data to the virtual engine, which renders images in real time from the tracking data and transmits them to the compositing server; meanwhile, the picture of the display device and the performers shot by the camera is also transmitted to the compositing server; using the delay measured in S1, the compensation software in the compositing server keeps the pictures in the two data streams consistent, enabling interaction between real actors and virtual characters; and the camera, the compositing server, and the tracking device are all genlocked (synchronously phase-locked), which guarantees that the delay of the whole system stays fixed.
2. The synchronization method applied to augmented reality according to claim 1, wherein: the tracking device is a RedSpy infrared camera tracker, fixed on top of the video camera and moving with it.
3. The synchronization method applied to augmented reality according to claim 1, wherein: the tracking data comprises the camera's position (x, y, z), attitude (pan, tilt, roll), and field-of-view (fov) information, and the virtual engine's image rendering process comprises generating, from the tracking data, an image consistent with the camera's position, attitude, and field of view.
4. The synchronization method applied to augmented reality according to claim 1, wherein: the real identification pattern and the virtual identification pattern are each three vertical lines arranged side by side, the upper part showing the three real vertical lines and the lower part the three virtual vertical lines, and picture consistency means that the three upper vertical lines butt exactly against the three lower vertical lines.
5. The synchronization method applied to augmented reality according to claim 1, wherein: the display device comprises an LED screen or a curtain.
6. The synchronization method applied to augmented reality according to claim 1, wherein: the virtual engine is Unreal Engine or Notch.
CN202211437455.0A 2022-11-16 2022-11-16 Synchronization method applied to augmented reality Pending CN115720250A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211437455.0A CN115720250A (en) 2022-11-16 2022-11-16 Synchronization method applied to augmented reality

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211437455.0A CN115720250A (en) 2022-11-16 2022-11-16 Synchronization method applied to augmented reality

Publications (1)

Publication Number Publication Date
CN115720250A true CN115720250A (en) 2023-02-28

Family

ID=85255374

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211437455.0A Pending CN115720250A (en) 2022-11-16 2022-11-16 Synchronization method applied to augmented reality

Country Status (1)

Country Link
CN (1) CN115720250A (en)

Similar Documents

Publication Publication Date Title
EP2161925B1 (en) Method and system for fusing video streams
CN107341832B (en) Multi-view switching shooting system and method based on infrared positioning system
US6084979A (en) Method for creating virtual reality
Gibbs et al. Virtual studios: An overview
Kawanishi et al. Generation of high-resolution stereo panoramic images by omnidirectional imaging sensor using hexagonal pyramidal mirrors
US5479597A (en) Imaging system for producing a sequence of composite images which combine superimposed real images and synthetic images
US5835133A (en) Optical system for single camera stereo video
US6738073B2 (en) Camera system with both a wide angle view and a high resolution view
US7173672B2 (en) System and method for transitioning between real images and virtual images
CA2201680C (en) Processing image data
US5737031A (en) System for producing a shadow of an object in a chroma key environment
US20060165310A1 (en) Method and apparatus for a virtual scene previewing system
CN110866978A (en) Camera synchronization method in real-time mixed reality video shooting
WO2012046371A1 (en) Image display device, and image display method
WO2002096096A1 (en) 3d instant replay system and method
EP1843581A2 (en) Video processing and display
US5886747A (en) Prompting guide for chroma keying
CN213461894U (en) XR-augmented reality system
CN107862718A (en) 4D holographic video method for catching
CN115118880A (en) XR virtual shooting system based on immersive video terminal is built
JP2007501950A (en) 3D image display device
EP0878099B1 (en) Chroma keying studio system
CN114125301B (en) Shooting delay processing method and device for virtual reality technology
CA3134758A1 (en) System for capturing and projecting images, use of the system, and method for capturing, projecting and inserting images
CN213126145U (en) AR-virtual interactive system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination