CN213461894U - XR-augmented reality system - Google Patents
- Publication number
- CN213461894U (Application CN202021997017.6U)
- Authority
- CN
- China
- Prior art keywords
- camera
- server
- real
- augmented reality
- disguise
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Landscapes
- Processing Or Creating Images (AREA)
Abstract
The utility model relates to an XR-augmented reality system, comprising: a camera connected to a camera tracking system; a genlock system connected to the camera tracking system, the camera, and the server; and a camera, camera tracking system, and real-time rendering engine all connected to the server. The server composites the images captured by the camera in real time with the rendered content output by the real-time rendering engine and outputs the result, thereby extending the original real scene and improving the user's visual experience.
Description
Technical Field
The utility model relates to the technical field of extended reality, and in particular to an XR-augmented reality system.
Background
The main image-presentation technologies on the market today are:
1. Conventional shooting for live broadcast and video recording (the output matches the actual on-site effect).
2. Conventional shooting combined with post-production editing, used for recorded programs and videos (packaging makes the final effect more polished, but post-production costs time and money, and moving-camera shots cannot be packaged).
3. AR compositing, in which virtual imagery is superimposed on the live camera feed and the output picture is composited in real time.
4. Blue-screen or green-screen keying (the picture must be composited in post-production; participants have no sense of the environment; performers and props may not contain blue or green during shooting; shadow and lighting effects must all be finished later, placing high demands on post-production).
Summary of the Utility Model
In view of this, the purpose of the utility model is to provide an XR-augmented reality system that extends the original real scene and improves the user's visual experience.
To achieve the above object, the utility model provides the following scheme:
an XR-augmented reality system, comprising:
a server for compositing and outputting images and rendered content;
a camera connected to the server, for sending images captured in real time to the server;
a camera tracking system connected to the server and bound to the camera, for acquiring pose data of the camera and sending the camera's spatial position data to the server;
a genlock system connected to the camera tracking system, the camera, and the server respectively, for sending synchronization signals to the camera tracking system, the camera, and the server;
and a real-time rendering engine connected to the server, for sending rendered content to the server.
Optionally, the system further comprises a display carrier, and the display carrier is used for displaying the images synthesized by the server.
Optionally, the display carrier includes an LED screen, a projector, a television, and a mobile terminal.
Optionally, the server is a media server.
Optionally, the media server is a disguise 2x4, disguise 4x4pro, disguise gx1, disguise gx2, disguise gx2c, or disguise vx4.
According to the specific embodiments provided by the utility model, the utility model achieves the following technical effects:
The XR-augmented reality system of the utility model comprises: a camera connected to a camera tracking system; a genlock system connected to the camera tracking system, the camera, and the server; and a camera, camera tracking system, and real-time rendering engine all connected to the server. The server composites the images captured by the camera in real time with the rendered content output by the real-time rendering engine and outputs the result, thereby extending the original real scene and improving the user's visual experience.
Drawings
To illustrate the embodiments of the utility model or the technical solutions in the prior art more clearly, the drawings needed in the embodiments are briefly introduced below. Obviously, the drawings described below are only some embodiments of the utility model, and those skilled in the art can derive other drawings from them without inventive effort.
Fig. 1 is a schematic structural diagram of an XR-augmented reality system according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the utility model are described below clearly and completely with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the utility model. All other embodiments obtained by those skilled in the art based on these embodiments without creative work fall within the protection scope of the utility model.
The purpose of the utility model is to provide an XR-augmented reality system that extends the original real scene and improves the user's visual experience.
To make the above objects, features, and advantages of the utility model more comprehensible, the utility model is described in detail below with reference to the accompanying drawings and specific embodiments.
The terms used in this embodiment are explained below:
Virtual Reality (VR): a completely virtual personal experience.
Augmented Reality (AR): virtual imagery superimposed on a real-world basis, with no interaction (completely passive).
Mixed Reality (MR): virtual imagery on a real-world basis that the user can actively interact with.
XR-augmented reality is a new set of technical concepts, where X stands both for "extended" (Xtended) and for the unknown variable x, and R stands for reality. XR covers the concepts of Virtual Reality (VR), Augmented Reality (AR), and Mixed Reality (MR) and applies them together across different scenes and industries, blending the virtual and real worlds in various combinations and opening up more creative possibilities. In the video field, AR and MR technologies are used simultaneously: using the motion of the camera and lens tracking, virtual-environment imagery fills and covers part of the real environment, supplementing and enlarging the scene.
XR-augmented reality technology is a combination of real space and virtual space.
Keying (chroma key): post-production software removes a specific region of the footage shot on site, or a specific color (blue or green) or brightness within the picture. The matched pixels are pulled out of the picture and become transparent, forming an alpha channel; the result is then composited and layered with a digital CG background or other footage to produce the desired final picture.
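As a rough per-pixel illustration of the keying and alpha-compositing described above (the color-distance threshold, function names, and soft-edge behavior are illustrative choices, not taken from any production keyer):

```python
import math

def chroma_key_alpha(pixel, key_color=(0, 255, 0), tolerance=60.0):
    """Alpha for one RGB pixel: 0.0 = keyed out (matches the key color),
    1.0 = fully opaque, with a soft edge near the threshold."""
    distance = math.dist(pixel, key_color)            # Euclidean color distance
    return min(max(distance / tolerance, 0.0), 1.0)

def composite_pixel(fg, bg, alpha):
    """Blend a keyed foreground pixel over a CG background pixel."""
    return tuple(round(f * alpha + b * (1.0 - alpha)) for f, b in zip(fg, bg))
```

A pure green pixel keys out completely (alpha 0), so the CG background shows through; pixels far from the key color stay opaque.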
Real-time rendering (rendering engine): drawing three-dimensional data into two-dimensional bitmaps according to graphics algorithms and displaying them in real time. In essence, image data is computed and output in real time: each picture must be rendered and displayed within a short time while the next one is being rendered. This cannot be achieved by the CPU alone and requires a graphics card.
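The core of turning three-dimensional data into a two-dimensional bitmap can be hinted at with a minimal pinhole projection; the focal length and resolution here are arbitrary example values, not parameters from any engine named in this document:

```python
def project_point(point3d, focal_length=800.0, width=1920, height=1080):
    """Project a camera-space 3D point onto a 2D pixel grid.

    Assumes the camera looks down +Z; points with z <= 0 sit behind
    the lens. Returns (u, v) pixel coordinates, or None if off-screen.
    """
    x, y, z = point3d
    if z <= 0:
        return None
    u = round(focal_length * x / z + width / 2)    # perspective divide, then
    v = round(focal_length * y / z + height / 2)   # shift to the image center
    if 0 <= u < width and 0 <= v < height:
        return (u, v)
    return None
```

A real engine runs this kind of transform (on the GPU) for millions of vertices every frame, which is why a graphics card is required.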
Camera tracking: the camera tracking system (e.g. stype, mosys, ncam) is mainly responsible for the spatial positioning of the camera and sends the camera's pose information to the hub server in real time. The main purpose is to synchronize the three-dimensional spatial data of the real camera with the virtual camera in real time, so that the real-time rendering engine can render the virtual environment accordingly.
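One tracking sample sent to the hub server might look like the sketch below. The field names and JSON encoding are purely illustrative; stype, mosys, and ncam each use their own proprietary wire protocols, which this does not reproduce:

```python
import json
import time

def pose_packet(position, rotation, frame_id):
    """Serialize one camera-tracking sample (illustrative format only)."""
    x, y, z = position
    pan, tilt, roll = rotation
    return json.dumps({
        "frame": frame_id,                     # genlocked frame counter
        "timestamp": time.time(),
        "position": {"x": x, "y": y, "z": z},  # stage-space coordinates
        "rotation": {"pan": pan, "tilt": tilt, "roll": roll},
    })
```

The hub server applies each sample to its virtual camera so the rendered view stays locked to the real camera's motion.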
Fig. 1 is a schematic structural diagram of the XR-augmented reality system of the utility model. As shown in fig. 1, the XR-augmented reality system comprises: a Camera 101, a Camera Tracking system 102, a GenLock system 103, a Real-Time Rendering Engine 104, and a server 105.
The camera 101 is typically a broadcast-grade or cinema-grade camera; the camera tracking system 102 may be a stype, mosys, or ncam system; the real-time rendering engine 104 may be Unreal, Notch, or Unity.
The camera 101 sends the images it captures in real time to the server 105.
The camera tracking system 102 is bound to the camera 101; it acquires the pose data of the camera 101 and sends the camera's spatial position data to the server 105.
The genlock system 103 sends synchronization signals to the camera tracking system 102, the camera 101, and the server 105.
The real-time rendering engine 104 sends rendered content to the server 105.
The server 105 composites the images and the rendered content and outputs the result.
The system further comprises a display carrier 106, said display carrier 106 being adapted to display the image synthesized by said server 105.
The display carrier 106 includes a first display carrier (Display Screen), which displays real-world content, and a second display carrier (Combination Display), which displays the augmented-reality composite.
The display carrier 106 specifically comprises an LED screen, a projector, a television set and a mobile terminal.
The server 105 is a media server (Media Server).
The media server 105 is a disguise 2x4, 4x4pro, gx1, gx2, gx2c, or vx4.
In this embodiment, the workflow of the XR-augmented reality system is as follows:
The camera 101, as the front-most device in the system, is mainly responsible for capturing images in real time. The camera tracking system 102, based on algorithms such as infrared emission or reflection recognition, graphic image recognition, and mechanical data sensing, is bound to the real camera 101 and sends the sensor's spatial position data to the media server 105 in real time. After processing by the media server 105, the camera tracking system 102, the real-time rendering engine 104, and the real spatial coordinates are unified, so that the real world and the virtual world overlap completely. The genlock system 103 sends synchronization signals to keep the frequencies of all parts of the system consistent. At this point, the real camera 101 is matched to the virtual-world camera and placed inside the virtual content built by the real-time rendering engine 104; at the same time, the camera 101 captures the real-world picture in real time. After compositing by the media server 105, the picture contains both virtual and real elements, and once output to the composite display carrier 106 it presents the augmented-reality effect.
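The genlocked per-frame loop described above can be sketched as follows; the four callables stand in for the camera feed, tracking system, rendering engine, and media-server compositor, and all names are illustrative rather than any vendor's API:

```python
def run_frame(frame_id, capture, track, render, composite):
    """Run one genlocked frame of the XR pipeline.

    The synchronization signal is modeled simply as the shared frame_id
    handed to every stage, keeping camera, tracker, and server in step.
    """
    live = capture(frame_id)         # camera captures the real-world image
    pose = track(frame_id)           # tracking system reports the camera pose
    virtual = render(pose)           # engine renders the matching virtual view
    return composite(live, virtual)  # media server outputs the blended frame
```

The design point is that every stage consumes the same frame identifier, so the live image and the virtual view it is blended with were produced for the same instant.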
A virtual-world camera is a camera manually added in the virtual scene and virtual space corresponding to the real world; it simulates the function and real-time pose of the real camera. In the XR-augmented reality system there are two virtual cameras: one manually added in the media server 105, and the other manually added in the real-time rendering engine 104.
The XR-augmented reality system of the utility model uses a display medium such as an LED screen or projector as its basis. Through computer-vision algorithms, combined with the camera position tracking system and the real-time rendering engine, it fills virtual content in real time into the area beyond the display medium, covering the real environment outside the display screen and letting a limited real space extend without bound. In film and television shooting, it solves the problem of creating large scenes within a limited physical space, removes the viewing problems participants face with traditional blue/green-screen keying, reduces the post-production workload, and lowers shooting and production costs.
In addition, the XR-augmented reality system of the utility model compares with keying technology as follows:
1. No keying is required, reducing post-production work: what you see is what you get.
2. Lighting effects such as spotlights, shadows, and reflected light are real.
3. There are no on-site restrictions on the colors of costumes and props. (In a green-screen studio, costumes and props cannot be green/blue.)
4. Actors can see the scenes and the objects they interact with, so they do not have to imagine the space and gain a genuine sense of presence.
5. Because what you see is what you get, the director can adjust the scene on the spot, avoiding reshoots caused by imperfect results discovered only after keying and compositing.
The XR-augmented reality workflow therefore embodies the concept of what you see is what you get: augmented reality is added in real time during shooting, greatly reducing post-production work. The director and audience can see the final effect directly, making possible much that the industry previously could not do.
The embodiments in this specification are described progressively; each embodiment focuses on its differences from the others, and the same or similar parts among the embodiments can be referred to one another.
The principle and implementation of the utility model are explained here with specific examples; the above description of the embodiments is only intended to help understand the core idea of the utility model. Meanwhile, those skilled in the art may make changes to the specific implementation and scope of application according to the idea of the utility model. In summary, the content of this specification should not be construed as limiting the utility model.
Claims (5)
1. An XR-augmented reality system, the system comprising:
a server for compositing and outputting images and rendered content;
a camera connected to the server, for sending images captured in real time to the server;
a camera tracking system connected to the server and bound to the camera, for acquiring pose data of the camera and sending the camera's spatial position data to the server;
a genlock system connected to the camera tracking system, the camera, and the server respectively, for sending synchronization signals to the camera tracking system, the camera, and the server;
and a real-time rendering engine connected to the server, for sending rendered content to the server.
2. The XR-augmented reality system of claim 1, further comprising a display carrier for displaying the server-composited imagery.
3. The XR-augmented reality system of claim 2, wherein the display carrier comprises an LED screen, a projector, a television, and a mobile terminal.
4. The XR-augmented reality system of claim 1, wherein the server is a media server.
5. The XR-augmented reality system of claim 4, wherein the media server is a disguise 2x4, disguise 4x4pro, disguise gx1, disguise gx2, disguise gx2c, or disguise vx4.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202021997017.6U CN213461894U (en) | 2020-09-14 | 2020-09-14 | XR-augmented reality system |
Publications (1)
Publication Number | Publication Date |
---|---|
CN213461894U true CN213461894U (en) | 2021-06-15 |
Family
ID=76323719
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202021997017.6U Active CN213461894U (en) | 2020-09-14 | 2020-09-14 | XR-augmented reality system |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN213461894U (en) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113570731A (en) * | 2021-08-03 | 2021-10-29 | 凡创(上海)文化传播股份有限公司 | XR terminal system based on 5G era |
CN115883858A (en) * | 2021-09-29 | 2023-03-31 | 深圳市奥拓电子股份有限公司 | Immersive live broadcasting method, device and system for 3D scene |
WO2023050533A1 (en) * | 2021-09-29 | 2023-04-06 | 歌尔股份有限公司 | Control method for head-mounted device, and picture rendering method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
GR01 | Patent grant | ||