CN114581634A - Naked eye 3D (three-dimensional) holographic interactive experience system with variable perspective automatic tracking - Google Patents
Info
- Publication number
- CN114581634A (application CN202210191159.0A)
- Authority
- CN
- China
- Prior art keywords
- module
- active stereo
- naked eye
- active
- real
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/006—Mixed reality
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B30/00—Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images
- G02B30/20—Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes
- G02B30/22—Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes of the stereoscopic type
- G02B30/24—Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes of the stereoscopic type involving temporal multiplexing, e.g. using sequentially activated left and right shutters
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/013—Eye tracking input arrangements
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Optics & Photonics (AREA)
- Human Computer Interaction (AREA)
- Computer Graphics (AREA)
- Computer Hardware Design (AREA)
- Software Systems (AREA)
- Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)
Abstract
The invention discloses a naked eye 3D stereoscopic holographic interactive experience system with automatic tracking of a variable perspective point. The system comprises a monitoring module, a somatosensory motion-capture module, a server, a projection module and a display module. The motion-capture module captures the eye position of the experiencer, analyzes it with a real-time algorithm, and simulates continuous-displacement real-time rendering from the first-person viewpoint; the server processes the signals; the projection module projects the images and is used together with active stereoscopic glasses. The invention uses an active stereo projector to project the picture, forming an active stereoscopic image on a plane that, viewed through the active stereoscopic glasses, appears as a floating stereoscopic picture. Multi-channel synchronous active stereo fusion is supported: when the frequency of the active stereo projector exactly matches the active stereo glasses, the left and right eyes each see a different channel, simulating the natural viewing effect of human eyes and forming a stereoscopic image with depth information on a virtual image plane.
Description
Technical Field
The invention relates to the technical field of holographic interaction, and in particular to a naked eye 3D stereoscopic holographic interactive experience system with automatic tracking of a variable perspective point.
Background
Holographic display is a composite display technique that combines images reflected from a holographic coating with real objects seen through it. By combining the amplitude, phase and other information of the image with the actual scene, it produces a floating-image effect. It is mainly applied in industrial simulation, teaching and training, science museums and holographic theatres. Compared with conventional imaging modes such as monitors and projection it increases immersion, and the floating image is more intuitive and more readily accepted by audiences. However, common holographic displays have the following obvious drawbacks:
1. The floating image is a reflection in a virtual plane, not a true three-dimensional picture; the picture therefore lacks stereoscopic depth, looks flat, and cannot display multiple depths.
2. The viewing angle cannot be computed in real time. When the experiencer stands at a specific preset position, the image plane and the real object coincide and a virtual-real fusion effect is produced; but as soon as the experiencer moves, the image and the real object become misaligned and the picture drifts, and when the displacement is large the effect is lost entirely.
3. Interactivity is weak: at the interaction level an ordinary holographic display is no different from a normal monitor, and offers no advanced interaction such as motion sensing or voice control.
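The misalignment in defect 2 follows from simple parallax geometry: when the viewer steps sideways by Δx, a point at depth d appears to shift by roughly Δx/d radians, so a virtual image plane and a real object at different depths drift apart. The sketch below is illustrative only; the function name and the example numbers are not from the patent.

```python
# Illustrative parallax geometry (not from the patent): angular error
# between a fixed-depth virtual image plane and the real object it
# should overlay, after a lateral viewer shift (small-angle approx.).
import math

def misalignment_deg(viewer_shift_m, image_depth_m, object_depth_m):
    """A point at depth d appears to shift ~viewer_shift/d radians
    when the viewer moves sideways; different depths shift by
    different amounts, so the overlay drifts by the difference."""
    return math.degrees(
        viewer_shift_m * abs(1.0 / image_depth_m - 1.0 / object_depth_m)
    )

# A 0.5 m side-step, image plane at 2 m, real object at 3 m:
err = misalignment_deg(0.5, 2.0, 3.0)  # roughly 4.8 degrees of drift
```

This is why per-viewer re-rendering is needed: only by moving the rendering viewpoint with the viewer can the drift be cancelled at every position.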
Disclosure of Invention
The invention aims to provide a naked eye 3D stereoscopic holographic interactive experience system with automatic tracking of a variable perspective point, so as to solve the problems identified above.
To achieve this aim, the invention provides the following technical scheme. The naked eye 3D stereoscopic holographic interactive experience system with variable-perspective-point automatic tracking comprises:
the monitoring module, used for monitoring the pose of the experiencer;
the somatosensory motion-capture module, used for capturing the eye position of the experiencer, performing real-time algorithmic analysis, and simulating continuous-displacement real-time rendering from the first-person viewpoint;
the server, used for processing the signals of the monitoring module and the motion-capture module;
the projection module, used for projecting images and used together with active stereoscopic glasses;
and the display module, used for displaying the image projected by the projection module.
Preferably, the monitoring module consists of a somatosensory depth camera and a microphone array.
Preferably, the somatosensory depth camera is embedded in the device; it detects the experiencer's position and orientation information in real time, and the experience can be operated through body motion.
Preferably, the microphone array is embedded; it collects voice information and, after recognition by a voice cloud engine, sends instructions to the system, completing voice operation.
Preferably, the somatosensory motion-capture module is a real-time position-tracking element that determines the experiencer's eyeball position in real time. Under high-speed processing by the server, the positions of the two scene-rendering virtual cameras are synchronized and moved in real time with the positions of the experiencer's eyes, so that the virtual scene is rendered in real time in the perspective of the first-person viewpoint and left- and right-channel pictures are output.
Preferably, the display module comprises a dual-channel display screen and a holographic display screen, and the dual-channel display screen alternately displays the pictures of the two channels at a frequency of 120 Hz.
Preferably, the projection module is an active stereo projector used for projecting the picture; it forms an active stereoscopic image on a plane, presents a floating stereoscopic picture when used with the active stereo glasses, supports multi-channel synchronous active stereo fusion, and displays an oversized picture.
Preferably, when the frequency of the active stereo projector is matched to that of the active stereo glasses, the experiencer's left and right eyes each see the different pictures of the two channels, simulating the natural viewing effect of human eyes.
Preferably, a plurality of the active stereo projectors can be used simultaneously to project multiple pictures interactively.
The invention has the technical effects and advantages that:
(1) The invention uses an active stereo projector instead of an LCD display to project the picture, forming an active stereoscopic image on a plane; viewed through active stereoscopic glasses it appears as a floating stereoscopic picture, supports multi-channel synchronous active stereo fusion, and displays an oversized picture. When the frequency of the projector exactly matches the glasses, the left and right eyes each see a different channel, simulating the natural viewing effect of human eyes and forming a stereoscopic image with depth information on a virtual image plane;
(2) The invention uses the embedded somatosensory depth camera and microphone array to detect the experiencer's pose information in real time and to support operation of the experience; at the same time, the microphone array collects voice information and, after recognition by a voice cloud engine, sends instructions to the system, completing voice operation, so that the holographic equipment supports both somatosensory pose interaction and voice-cloud interaction;
(3) The invention uses the real-time position-tracking element in the somatosensory motion-capture module to determine the experiencer's eyeball position in real time; under high-speed processing by the server, the positions of the two scene-rendering virtual cameras are synchronized and moved in real time with the positions of the experiencer's two eyes, so that the virtual scene is rendered in real time in the perspective of the first-person viewpoint and left- and right-channel pictures are output.
Drawings
FIG. 1 is a schematic diagram of the overall structure of the system of the present invention.
FIG. 2 is a schematic diagram of a monitoring module according to the present invention.
FIG. 3 is a schematic diagram of a display module according to the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The invention provides, as shown in FIGS. 1-3, a naked eye 3D stereoscopic holographic interactive experience system with variable-perspective-point automatic tracking, which comprises a monitoring module used for monitoring the pose of the experiencer;
the somatosensory motion-capture module, used for capturing the eye position of the experiencer, performing real-time algorithmic analysis, and simulating continuous-displacement real-time rendering from the first-person viewpoint;
the server, used for processing the signals of the monitoring module and the motion-capture module;
the projection module, used for projecting images and used together with active stereoscopic glasses;
the display module, used for displaying the image projected by the projection module;
the monitoring module consists of a somatosensory depth camera and a microphone array;
the somatosensory depth camera is embedded; it detects the experiencer's pose information in real time, and the experience can be operated through body motion, giving the holographic equipment a somatosensory pose-interaction function;
the microphone array is embedded; it collects voice information and, after recognition by a voice cloud engine, sends instructions to the system, giving holographic equipment that otherwise has no voice operation a voice-cloud interaction function. Using time-domain beamforming and spatial filtering, the direction of the received sound source and its changes can be analyzed in the frequency response, and the analysis can display the strength and angle of the voice signal as a beam on a polar plot. At the same time, the microphone-array arrangement addresses noise suppression, echo suppression, dereverberation, localization of single or multiple sound sources, estimation of the number of sources, source separation and the cocktail-party effect, further improving the voice-cloud interaction of the holographic equipment;
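The time-domain beamforming described above can be sketched with a minimal delay-and-sum beamformer. This is an assumed illustration, not the patent's actual signal processing: a uniform linear array localizes one sinusoidal source by scanning steering angles and picking the angle that maximizes the summed output power — the peak that would be drawn on the polar plot.

```python
# Minimal delay-and-sum beamformer sketch (assumed, not the patent's
# DSP): a uniform linear microphone array localizes a single
# sinusoidal source by scanning candidate steering angles.
import numpy as np

FS = 16_000          # sample rate (Hz)
C = 343.0            # speed of sound (m/s)
SPACING = 0.04       # mic spacing (m), below half the 1 kHz wavelength
N_MICS = 6

def simulate(angle_deg, freq=1000.0, n=2048):
    """Far-field plane wave hitting the array from angle_deg
    (0 = broadside); each mic sees the wave with its own delay."""
    t = np.arange(n) / FS
    delays = np.arange(N_MICS) * SPACING * np.sin(np.radians(angle_deg)) / C
    return np.stack([np.sin(2 * np.pi * freq * (t - d)) for d in delays])

def estimate_angle(x):
    """Scan steering angles; the delay set that best re-aligns the
    channels maximizes the power of their sum."""
    t = np.arange(x.shape[1]) / FS
    best, best_p = 0.0, -1.0
    for ang in np.arange(-90, 91, 1.0):
        delays = np.arange(N_MICS) * SPACING * np.sin(np.radians(ang)) / C
        # fractional-delay alignment via linear interpolation
        aligned = [np.interp(t, t - d, ch) for d, ch in zip(delays, x)]
        p = np.mean(np.sum(aligned, axis=0) ** 2)
        if p > best_p:
            best, best_p = ang, p
    return best

est = estimate_angle(simulate(25.0))  # peak near the true 25-degree source
```

Noise suppression, dereverberation and source separation build on the same align-and-combine principle with more sophisticated weighting.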
the image on a conventional virtual image plane has a fixed viewing angle, so its alignment with the real scenery can only be designed for that fixed angle: when the experiencer stands at the optimal viewpoint the effect is best, but when the experiencer moves, the real scene has actual front-to-back depth while the holographic picture carries only fixed-depth floating-plane information, so the hologram and the real scene become misaligned and the effect of the holographic equipment degrades. The somatosensory motion-capture module is a real-time position-tracking element that determines the experiencer's eyeball position in real time; under high-speed processing by the server, the positions of the two scene-rendering virtual cameras are synchronized and moved in real time with the positions of the experiencer's eyes, so that the virtual scene is rendered in real time in the perspective of the first-person viewpoint and left- and right-channel pictures are output;
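The twin-camera update driven by eye tracking can be sketched as follows. The names and the 63 mm default interpupillary distance are assumptions, not values from the patent: each frame, the tracked mid-eye position is split into two virtual-camera positions, one per rendered channel.

```python
# Hypothetical sketch (names and IPD default assumed): offset two
# virtual cameras from the tracked mid-eye position so each channel
# is rendered from the matching eye's viewpoint.
from dataclasses import dataclass

IPD_M = 0.063  # typical adult interpupillary distance, metres

@dataclass
class EyePair:
    left: tuple
    right: tuple

def camera_positions(mid_eye, right_dir, ipd=IPD_M):
    """Offset the twin virtual cameras +/- ipd/2 along the head's
    right vector; rendering each channel from its own camera keeps
    the first-person perspective correct as the viewer moves."""
    half = ipd / 2.0
    left = tuple(m - half * r for m, r in zip(mid_eye, right_dir))
    right = tuple(m + half * r for m, r in zip(mid_eye, right_dir))
    return EyePair(left, right)

# Tracker reports eyes at 1.7 m height, 2 m from the screen:
eyes = camera_positions((0.0, 1.7, 2.0), (1.0, 0.0, 0.0))
```

Called once per tracked frame, this keeps the rendered parallax locked to the experiencer's actual eye positions.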
the display module comprises a dual-channel display screen and a holographic display screen. The dual-channel screen alternately displays the pictures of the two channels at a frequency of 120 Hz; the holographic coating does not affect the frequency characteristic of the active stereoscopic image, so after holographic reflection the picture information of the two channels is still presented in the virtual image plane, while synchronization information is sent in real time to the LCD-shutter glasses;
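The frame-sequential timing described above can be modeled with a toy scheduler. The generator below is illustrative, not the patent's driver logic; the 120 Hz figure comes from the text, and at that refresh rate each eye receives 60 images per second while the sync signal keeps the open shutter matched to the channel on screen.

```python
# Toy model of frame-sequential active stereo timing (illustrative):
# the display alternates left/right channel frames at the refresh
# rate, and the shutter glasses open the matching eye each frame.
def frame_schedule(n_frames, refresh_hz=120.0):
    """Yield (time_s, channel, shutter_open) per refresh; the open
    shutter always matches the channel being displayed."""
    period = 1.0 / refresh_hz
    for i in range(n_frames):
        channel = "L" if i % 2 == 0 else "R"
        yield (i * period, channel, channel)

frames = list(frame_schedule(4))
per_eye_hz = 120.0 / 2  # each eye is refreshed at half the display rate
```

If projector and glasses drift out of phase, an eye starts seeing fragments of the wrong channel, which is why the synchronization signal must be sent in real time.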
the projection module is an active stereo projector used for projecting the picture; it forms an active stereoscopic image on a plane, presents a floating stereoscopic picture when used with the active stereo glasses, supports multi-channel synchronous active stereo fusion and displays an oversized picture;
when the frequency of the active stereo projector is matched to that of the active stereo glasses, the experiencer's left and right eyes each see the different pictures of the two channels, simulating the natural viewing effect of human eyes; meanwhile, on occasions where wearing the active stereo glasses is inconvenient, the content can be shown directly in ordinary holographic form;
several active stereo projectors can be used at the same time to project multiple pictures interactively, making the projected content richer.
Finally, it should be noted that: although the present invention has been described in detail with reference to the foregoing embodiments, it will be apparent to those skilled in the art that modifications may be made to the embodiments or portions thereof without departing from the spirit and scope of the invention.
Claims (9)
1. A naked eye 3D stereoscopic holographic interactive experience system with variable-perspective-point automatic tracking, characterized by comprising:
the monitoring module is used for monitoring the pose of the experiencer;
the body sensing motion capture module is used for capturing the eye position of an experiencer, performing real-time algorithm analysis and simulating continuous displacement real-time rendering of a first visual angle;
the server is used for processing signals of the monitoring module and the body motion capture module;
the projection module is used for projecting images and is matched with active three-dimensional glasses for use;
and the display module is used for displaying the image projected by the projection module.
2. The system for naked eye 3D stereoscopic holographic interactive experience with the variable perspective point automatic tracking according to claim 1, wherein the monitoring module is composed of a somatosensory depth camera and a microphone array.
3. The naked eye 3D stereoscopic holographic interactive experience system with variable-perspective-point automatic tracking according to claim 1, wherein the somatosensory depth camera is embedded, detects the pose information of the experiencer in real time, and allows the experience to be operated through body motion.
4. The system for the naked eye 3D stereoscopic holographic interactive experience with the perspective point capable of automatically tracking and changing according to claim 2, wherein the microphone array is arranged in an embedded mode, collects voice information, is identified by a voice cloud engine, and sends an instruction to the system so as to complete voice operation.
5. The naked eye 3D stereoscopic holographic interactive experience system with variable-perspective-point automatic tracking according to claim 1, wherein the somatosensory motion-capture module is a real-time position-tracking element that determines the eyeball position of the experiencer in real time; under high-speed processing by the server, the positions of the two scene-rendering virtual cameras are synchronized and moved in real time with the positions of the experiencer's two eyes, so that the virtual scene is rendered in real time in the perspective of the first-person viewpoint and left- and right-channel pictures are output.
6. The system for naked eye 3D stereoscopic holographic interactive experience with the variable perspective point automatic tracking according to claim 1, wherein the display module comprises a dual-channel display screen and a holographic display screen, and the dual-channel display screen alternately displays pictures of two channels through a frequency of 120 Hz.
7. The system of claim 1, wherein the projection module is an active stereo projector, the active stereo projector is used for image projection, forms an active stereo image on a plane, can present a stereo floating image by matching with active stereo glasses, supports multi-channel synchronous active stereo fusion, and displays a super-large image.
8. The system for naked eye 3D holographic interactive experience with the variable perspective point automatic tracking according to claim 7, wherein when the frequency of the active stereo projector is matched with that of the active stereo glasses, the left and right eyes of an experiencer can respectively see different pictures of two channels, so as to simulate the actual watching effect of human eyes.
9. The system of claim 7, wherein a plurality of the active stereo projectors can be used simultaneously to project multiple pictures interactively.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210191159.0A CN114581634A (en) | 2022-03-01 | 2022-03-01 | Naked eye 3D (three-dimensional) holographic interactive experience system with variable perspective automatic tracking |
Publications (1)
Publication Number | Publication Date |
---|---|
CN114581634A true CN114581634A (en) | 2022-06-03 |
Family
ID=81772215
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202210191159.0A Pending CN114581634A (en) | 2022-03-01 | 2022-03-01 | Naked eye 3D (three-dimensional) holographic interactive experience system with variable perspective automatic tracking |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN114581634A (en) |
- 2022-03-01: application CN202210191159.0A filed; publication CN114581634A (status: pending)
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106980368A (en) * | 2017-02-28 | 2017-07-25 | 深圳市未来感知科技有限公司 | A kind of view-based access control model calculating and the virtual reality interactive device of Inertial Measurement Unit |
CN206961066U (en) * | 2017-02-28 | 2018-02-02 | 深圳市未来感知科技有限公司 | A kind of virtual reality interactive device |
CN112929643A (en) * | 2019-12-05 | 2021-06-08 | 北京芯海视界三维科技有限公司 | 3D display device, method and terminal |
US20220408077A1 (en) * | 2019-12-05 | 2022-12-22 | Beijing Ivisual 3d Technology Co., Ltd. | 3d display device, method and terminal |
CN114035682A (en) * | 2021-10-29 | 2022-02-11 | 王朋 | Naked eye 3D interactive immersive virtual reality CAVE system |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US6583808B2 (en) | Method and system for stereo videoconferencing | |
US11010958B2 (en) | Method and system for generating an image of a subject in a scene | |
Beck et al. | Immersive group-to-group telepresence | |
CN101771830B (en) | Three-dimensional panoramic video stream generating method and equipment and video conference method and equipment | |
US20050264857A1 (en) | Binaural horizontal perspective display | |
US20170237941A1 (en) | Realistic viewing and interaction with remote objects or persons during telepresence videoconferencing | |
KR100904505B1 (en) | Communications system | |
CN102084650A (en) | Telepresence system, method and video capture device | |
CN110324553B (en) | Live-action window system based on video communication | |
CN110324554B (en) | Video communication apparatus and method | |
US10645340B2 (en) | Video communication device and method for video communication | |
US10972699B2 (en) | Video communication device and method for video communication | |
CN210605808U (en) | Three-dimensional image reconstruction system | |
CN114581634A (en) | Naked eye 3D (three-dimensional) holographic interactive experience system with variable perspective automatic tracking | |
US10701313B2 (en) | Video communication device and method for video communication | |
CN113891063A (en) | Holographic display method and device | |
CN106886289B (en) | Double-user interaction holographic display device | |
CN108957769B (en) | Naked eye 3D image automatic generation display system | |
KR20240134899A (en) | Autostereoscopic display device that presents three-dimensional views and three-dimensional sounds | |
KR20240132320A (en) | Scaling of three-dimensional content for display on stereoscopic display devices |
Legal Events
Date | Code | Title | Description
---|---|---|---
2022-06-03 | PB01 | Publication | Application publication date: 20220603
| SE01 | Entry into force of request for substantive examination | |
| RJ01 | Rejection of invention patent application after publication | |