CN212163542U - Three-dimensional interactive display system - Google Patents
Three-dimensional interactive display system
- Publication number
- CN212163542U (application CN202020973632.7U)
- Authority
- CN
- China
- Prior art keywords
- unit
- display
- dimensional
- space
- presenter
- Prior art date
- Legal status: Active (the listed status is an assumption, not a legal conclusion)
Classifications
- Processing Or Creating Images (AREA)
Abstract
The utility model discloses a three-dimensional interactive display system. A forward rendering unit and a forward display unit are provided for the viewer's perspective, and a reverse rendering unit and a reverse display unit for the presenter's perspective; a light splitting unit keeps the two display directions from interfering with each other. The presenter can monitor the composite image of himself and the virtual scene in real time, and the reverse rendering unit can superimpose prompt information or a virtual scene, generating a three-dimensional imaging visual effect for the presenter and better supporting the presenter's presentation or performance. Meanwhile, the presenter's motion information also feeds into the forward rendering unit, controlling in real time the three-dimensional scene and content it outputs so that they match the presentation or performance and can be adjusted at any time, guaranteeing that the viewer's perspective obtains a vivid and natural three-dimensional imaging effect. The system achieves excellent presentation results for stage performance, remote teaching and product launches.
Description
Technical Field
The utility model relates to three-dimensional presentation and interaction technology, and in particular to a three-dimensional interactive display system.
Background
Humans live in a stereoscopic world and perceive it through stereoscopic vision. Many ways of expressing this world have been proposed and developed, and images are the most intuitive. However, most display devices can only realize 2D (two-dimensional) display: they express the content of a scene but discard depth information, so people can only judge the front-back relationship between objects through experience accumulated in daily life and through cues such as shadows in 2D images. In the information and digital age, 2D display is gradually failing to meet human needs as society develops, and 3D display has become a new research target and a new development trend in the display field. As 3D display has been intensively researched, various technologies have been proposed to realize various 3D display modes. Among them, aerial imaging is a three-dimensional technology that has attracted attention for its simple principle and good effect, and is widely used in exhibition and teaching display systems. With the development of spatial sensing and real-time computer rendering, a presenter can interact with objects in a digital environment by means of suitable equipment. With well-designed visual effects, the audience sees the real presenter superimposed on a scene generated by real-time rendering, producing a strong sense of three-dimensional space and a striking visual experience.
However, in a typical display method, the presenter cannot see his or her own motion and the displayed content in real time, making it difficult to respond in real time, which limits the application of such systems.
Disclosure of Invention
To address the above problems in the prior art, the utility model provides a three-dimensional interactive display system.
The three-dimensional interactive display system of the utility model includes: a display space, a sensing unit, a reverse rendering unit, a reverse display unit, a presenter, a viewing space, a camera acquisition unit, a forward rendering unit, a forward display unit, a viewer and a light splitting unit. The display space is a three-dimensional space; the presenter, the reverse display unit and the sensing unit are located in the display space, with the reverse display unit on the horizontal top wall. The viewing space faces the display space, and the light splitting unit is arranged between them; it forms a 45-degree angle with the horizontal ground and adopts a semi-transparent, semi-reflective film. The viewer, the forward display unit and the camera acquisition unit are located in the viewing space, the camera acquisition unit sharing the viewer's perspective; the forward display unit is on the horizontal bottom wall of the viewing space. The reverse display unit and the forward display unit are located on the two sides of the light splitting unit, each at 45 degrees to it. The sensing unit is connected to the reverse rendering unit, which is connected to the reverse display unit; the sensing unit is also connected to the forward rendering unit, which is connected to the forward display unit; the camera acquisition unit is connected to the reverse rendering unit; and the forward rendering unit and the reverse rendering unit are connected to each other.
The sensing unit collects the spatial information and motion information of the presenter in the display space and transmits them to the forward rendering unit in real time. This spatial and motion information triggers or controls the forward rendering unit to generate a corresponding rendering result, which is output to the forward display unit; the forward display unit displays the rendering result. The display content of the forward display unit is reflected by the light splitting unit to form a forward image. The forward image is a virtual image, appearing to the viewer as an upright image on a vertical plane within the display space. The presenter's performance in the display space is transmitted through the light splitting unit, so the viewer sees the performance superimposed on the forward image within the display space, realizing a vivid and natural three-dimensional imaging effect. The camera acquisition unit captures images from the viewer's perspective in real time, that is, images of the forward display unit's forward image superimposed on the presenter's performance, and transmits them to the reverse rendering unit in real time. Meanwhile, the sensing unit collects the presenter's spatial and motion information in the display space and transmits it to the reverse rendering unit in real time. The reverse rendering unit generates corresponding prompt content according to the spatial and motion information, superimposes the prompt content on the image captured from the viewer's perspective, renders the result, and outputs the rendered image to the reverse display unit; the forward rendering unit and the reverse rendering unit synchronize their rendering in real time. The reverse display unit displays the viewer-perspective image with the superimposed prompt content. The display content of the reverse display unit is reflected by the light splitting unit to form a reverse image shown to the presenter; the reverse image is a virtual image, an upright image on a vertical plane in the viewing space, so the presenter can view his or her own display effect and the prompt content added by the reverse rendering unit in real time.
The display space is the interior of a built three-dimensional enclosure. Every surface of the space except the side facing the viewing space is covered with black light-absorbing cloth; the sensing units and the presenter can be arranged inside the space, and a personnel entrance and exit is provided on a side surface. The side facing the viewing space is provided with the light splitting unit.
The sensing unit adopts a tracking and positioning sensor for human-computer interaction between the presenter and the three-dimensional interactive display system and for perceptual transfer between the presenter and the virtual scene, acquiring spatial information from three-dimensional spatial tracking and motion information from limb-motion interaction behaviors. Using a three-dimensional spatial tracking and positioning method, the tracking and positioning device captures the presenter's spatial and motion information, giving the presenter an interaction space in which to move freely and improving the flexibility of interactive operation; the motion information includes the presenter's head position and angle and limb information. Tracking and positioning sensors are divided into active and passive types. An active tracking and positioning sensor has a transmitter and a receiver and determines the presenter's motion information through the physical relationship between the transmitted and received signals. A passive tracking and positioning sensor has no active signal source; the receiver alone measures changes in the received signal to determine the position and posture of the tracked object. The tracking and positioning method is one of laser positioning, optical positioning, infrared active optical, and visible-light active optical methods. The sensing unit transmits the collected motion information of the presenter to the forward rendering unit.
The camera acquisition unit comprises an acquisition camera and a transmission line. The acquisition camera has suitable resolution, field of view, exposure time, white balance, color correction, image cropping area, Bayer conversion type and so on, and can capture low-latency images under the set lighting conditions, so the reverse display for the presenter works well. According to the specification and size of the three-dimensional display screen of the reverse display unit, the distance and shooting position of the acquisition camera are calculated so that the captured image of the presenter fuses well with the image of the reverse display unit, achieving a good imaging effect.
In order to reduce the processing time of the whole system and raise its frame rate, data acquisition in the utility model uses a camera with a high-speed interface such as USB 3.0: a multi-core CPU is used to acquire data from the physical camera, and the acquired data are then sent to the reverse rendering unit, which improves the system's operating frame rate considerably.
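A minimal sketch of the low-latency capture scheme described above, assuming a hypothetical blocking `read_frame()` camera call (real industrial cameras expose vendor-specific SDKs): a dedicated thread drains the high-speed camera into a small bounded queue, dropping stale frames so the reverse rendering unit always receives the newest image.

```python
import queue
import threading

def start_capture(read_frame, out_queue, stop_event):
    """Capture thread: pull frames from the camera as fast as the
    high-speed link delivers them, dropping stale frames so the
    consumer always gets the most recent image."""
    def worker():
        while not stop_event.is_set():
            frame = read_frame()  # blocking camera read (hypothetical API)
            try:
                out_queue.put_nowait(frame)
            except queue.Full:
                out_queue.get_nowait()       # discard the oldest frame
                out_queue.put_nowait(frame)  # keep end-to-end latency low
    thread = threading.Thread(target=worker, daemon=True)
    thread.start()
    return thread

# Stub "camera" that yields incrementing frame ids instead of images.
frame_ids = iter(range(1_000_000))
frames = queue.Queue(maxsize=2)
stop = threading.Event()
start_capture(lambda: next(frame_ids), frames, stop)
first = frames.get(timeout=1.0)  # the rendering side consumes frames here
stop.set()
```

In a real system the consumer would be the reverse rendering unit's render loop; the bounded queue is the design choice that trades completeness for freshness.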
The forward rendering unit stores a three-dimensional digital virtual scene and preset rule paths. According to the presenter's spatial and motion information collected by the sensing unit, a virtual-scene response is triggered or controlled: the rendering content of the forward rendering unit is set according to the position and angle of the presenter's head, and the image of the virtual scene the presenter is interacting with is rendered and sent to the forward display unit for display. The presenter can thus drive different three-dimensional virtual scenes with his or her own motion and interact with them, producing an immersive display experience. The viewpoint of the image in the forward rendering unit changes with the head position and angle; the presenter corrects limb movements according to the image on the reverse display unit, and the forward display unit corrects and adjusts the displayed virtual scene according to the presenter's movement. This makes the display more vivid and connects the presenter's visual system with the motion-perception system, so the experience feels more lifelike.
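The head-pose-driven viewpoint update can be illustrated with a standard look-at view matrix. This is a generic graphics sketch, not the patent's actual implementation; the right-handed coordinate convention and the world-up vector are assumptions.

```python
import numpy as np

def look_at(eye, target, up=(0.0, 1.0, 0.0)):
    """Build a right-handed 4x4 view matrix so the virtual scene can be
    re-rendered from the presenter's tracked head position each frame."""
    eye, target, up = (np.asarray(v, dtype=float) for v in (eye, target, up))
    f = target - eye
    f /= np.linalg.norm(f)          # forward axis (toward the scene)
    s = np.cross(f, up)
    s /= np.linalg.norm(s)          # right axis
    u = np.cross(s, f)              # corrected up axis
    m = np.eye(4)
    m[0, :3], m[1, :3], m[2, :3] = s, u, -f
    m[:3, 3] = -m[:3, :3] @ eye     # translate the eye to the origin
    return m

# Head tracked at (0, 1.6, 2) m, looking at the scene origin:
view = look_at((0.0, 1.6, 2.0), (0.0, 0.0, 0.0))
```

Each sensing-unit update would supply a new `eye` (and, from the head angle, a new `target`), and the forward rendering unit would re-render with the resulting matrix.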
The reverse rendering unit stores prompt content, comprising the language content and action content corresponding to the presenter's spatial and motion information; that is, it serves as a teleprompter, prompting the presenter about the lines or actions to present next.
The forward rendering unit and the reverse rendering unit communicate through a data transmission line and render data synchronously, reducing the delay between the display content of the forward and reverse display units.
The forward display unit and the reverse display unit both adopt three-dimensional display screens.
The utility model has the advantages that:
A forward rendering unit and a forward display unit are provided for the viewer's perspective, and a reverse rendering unit and a reverse display unit for the presenter's perspective; the light splitting unit keeps the two display directions from interfering with each other. The presenter can monitor the composite image of himself and the virtual scene in real time, and the reverse rendering unit can superimpose prompt information or a virtual scene, generating a three-dimensional imaging visual effect for the presenter and better supporting the presentation or performance. Meanwhile, the presenter's motion information also feeds into the forward rendering unit, controlling in real time the three-dimensional scene and content it outputs so that they match the presentation or performance and can be adjusted at any time, guaranteeing the viewer's perspective a vivid and natural three-dimensional imaging effect. Used for stage performance, remote teaching, product launches and the like, the system achieves excellent presentation results.
Drawings
Fig. 1 is a structural diagram of the three-dimensional interactive display system of the utility model;
fig. 2 is a schematic diagram of an embodiment of the three-dimensional interactive display system of the utility model.
Detailed Description
The utility model is further described below through specific embodiments with reference to the drawings.
As shown in figs. 1 and 2, the three-dimensional interactive display system of the present embodiment includes: a display space 6, a sensing unit 1, a reverse rendering unit 4, a reverse display unit 8, a presenter P, a viewing space, a camera acquisition unit 2, a forward rendering unit 3, a forward display unit 7, a viewer L and a light splitting unit 5. The display space 6 is a three-dimensional space; the presenter P, the reverse display unit 8 and the sensing unit 1 are located in the display space 6, with the reverse display unit 8 on the horizontal top wall. The viewing space faces the display space 6, and the light splitting unit 5 is arranged between them; it forms a 45-degree angle with the horizontal ground and adopts a semi-transparent, semi-reflective film. The viewer L, the forward display unit 7 and the camera acquisition unit 2 are located in the viewing space, the camera acquisition unit 2 sharing the viewer's perspective; the forward display unit 7 is on the horizontal bottom wall of the viewing space. The reverse display unit 8 and the forward display unit 7 are located on the two sides of the light splitting unit 5, each at 45 degrees to it. The sensing unit 1 is connected to the reverse rendering unit 4, which is connected to the reverse display unit 8; the sensing unit 1 is also connected to the forward rendering unit 3, which is connected to the forward display unit 7; the camera acquisition unit 2 is connected to the reverse rendering unit 4; and the forward rendering unit 3 and the reverse rendering unit 4 are connected to each other.
The display space is the interior of a built cuboid; every interior surface except the one facing the viewing space is covered with black light-absorbing cloth.
1. Sensing unit
In this embodiment, the sensing unit adopts the Lighthouse indoor positioning technology, a laser-scanning positioning technology that determines the position of a moving object with lasers and photosensors. Two laser emitters are arranged at opposite corners, forming a rectangular tracking area of adjustable size up to a maximum of 4.5 × 4.5 m. Two rows of fixed LED lamps inside each emitter flash 6 times per second. Each laser emitter contains two scanning units, which sweep laser light across the display space alternately in the horizontal and vertical directions.
In the laser-scanning positioning method, several photosensors are worn on the presenter's head and body, and several laser emitters are arranged in the display space. The scanning units in the laser emitters alternately sweep laser light across the display space; the photosensors receive the laser and pass their measurements to a computing unit, which distinguishes the different sensors on the head and body and computes, for each, the time at which the laser was received, thereby determining the positions of the head and body separately. From the laser-reception times, the precise position of each sensor relative to the laser emitters is calculated, and with multiple photosensors the position and orientation of the head, the body movements and the hand movements can all be detected. Note that with this laser positioning method, each photosensor's ID is transmitted to the computing unit along with its data, so the computing unit can directly distinguish the sensors; from the position of each sensor fixed on the body and related information, it finally constructs a three-dimensional model of the body's movement, obtaining the presenter's motion information.
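The core timing computation can be sketched as follows; the 1/60 s sweep period and the angle convention are illustrative assumptions, not values from the patent. Each photosensor converts the delay between the synchronization flash and the moment the rotating laser plane hits it into an angle, and the two orthogonal sweeps together fix the sensor's bearing from the emitter.

```python
import math

SWEEP_PERIOD = 1.0 / 60.0  # assumed time for one full rotor sweep, seconds

def sweep_angle(hit_delay):
    """Angle (radians) of the rotating laser plane at the moment it hits
    a photosensor, derived from the delay since the sync flash."""
    return 2.0 * math.pi * (hit_delay / SWEEP_PERIOD)

def sensor_bearing(horizontal_delay, vertical_delay):
    """The horizontal and vertical sweeps yield two angles that together
    define the sensor's bearing as seen from the laser emitter."""
    return sweep_angle(horizontal_delay), sweep_angle(vertical_delay)

# A sensor hit a quarter of the way through each sweep:
azimuth, elevation = sensor_bearing(SWEEP_PERIOD / 4, SWEEP_PERIOD / 4)
```

Bearings from two emitters at known positions can then be triangulated into a 3D sensor position, which is how the body model described above is assembled.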
2. Camera acquisition unit
The camera acquisition unit adopts an industrial camera mounted on a camera bracket, so that the shooting direction and position can be adjusted conveniently.
The specifications of the industrial camera and the lens in the camera acquisition unit are respectively as follows:
Table 1. Camera specification parameters
Table 2. Lens specification parameters
3. Rendering unit
The forward rendering unit and the reverse rendering unit are both computers. The forward rendering unit triggers or controls the response of the three-dimensional virtual scene according to the motion information acquired by the sensing unit, sets the state of the virtual scene in real time, and renders images for output to the forward display unit. The reverse rendering unit generates corresponding prompt content according to the spatial and motion information and superimposes it on the image captured from the viewer's perspective for rendering, producing the effect of a teleprompter.
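A toy sketch of the teleprompter compositing step, assuming the prompt content has already been rasterized into a boolean pixel mask (the real reverse rendering unit would render full text and graphics): the prompt pixels are alpha-blended onto the viewer-perspective camera frame.

```python
import numpy as np

def overlay_prompt(frame, prompt_mask, color=(0, 255, 0), alpha=0.6):
    """Alpha-blend rasterized prompt pixels over the camera frame
    captured from the viewer's perspective."""
    out = frame.astype(float)                 # work in float, copy the frame
    tint = np.asarray(color, dtype=float)
    out[prompt_mask] = (1.0 - alpha) * out[prompt_mask] + alpha * tint
    return out.astype(np.uint8)

# 4x4 black frame; the prompt occupies only the top-left pixel.
frame = np.zeros((4, 4, 3), dtype=np.uint8)
mask = np.zeros((4, 4), dtype=bool)
mask[0, 0] = True
composite = overlay_prompt(frame, mask)
```

The composite is what the reverse display unit would show, so the presenter sees both the viewer's view and the prompt in one image.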
4. Display unit
The forward display unit and the reverse display unit are 65-inch LCDs with a resolution of 1920 × 1080 per panel and a refresh rate of 60 FPS, arranged at the positions shown in the figures. Depending on the actual scene, a projector or an LED display screen may also be used as the display unit.
5. Light splitting unit
The light splitting unit adopts a semi-transparent, semi-reflective film stretched flat by a mechanical structure, ensuring a reflectance and a transmittance of 50% each, and provides reflective imaging for both the forward and reverse display units. The display content of the forward display unit is reflected by the light splitting unit to form the forward image R; the display content of the reverse display unit is reflected by the light splitting unit to form the reverse image R'.
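The photometric effect of the half-mirror on the viewer side can be modeled as a weighted sum of the transmitted presenter scene and the reflected display content. This is an illustrative model under the stated 50%/50% split, ignoring polarization and angle-of-incidence effects.

```python
import numpy as np

def beamsplit_view(transmitted, reflected, t=0.5, r=0.5):
    """What the viewer sees through the half-mirror: the presenter scene
    transmitted through the film plus the display content reflected off it."""
    mix = t * transmitted.astype(float) + r * reflected.astype(float)
    return np.clip(mix, 0, 255).astype(np.uint8)

# Uniform test images: presenter scene at level 200, forward display at 100.
presenter_scene = np.full((2, 2, 3), 200, dtype=np.uint8)
forward_display = np.full((2, 2, 3), 100, dtype=np.uint8)
seen = beamsplit_view(presenter_scene, forward_display)
```

The same model, with the roles swapped, describes what the presenter sees of the reverse display unit through the other side of the film.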
The display method of the three-dimensional interactive display system of this embodiment is divided into a viewer part and a presenter part according to the viewing object:
(I) Viewer part:
1) the sensing unit collects the presenter's spatial information and motion information in the display space and transmits them to the forward rendering unit in real time;
2) the presenter's spatial and motion information triggers or controls the forward rendering unit to generate a corresponding rendering result, which is output to the forward display unit;
3) the forward display unit displays the rendering result;
4) the display content of the forward display unit is reflected by the light splitting unit to form a forward image; the forward image is a virtual image, appearing to the viewer as an upright image on a vertical plane within the display space; the presenter's performance in the display space is transmitted through the light splitting unit, so the viewer sees the performance superimposed on the forward image within the display space, realizing a vivid and natural three-dimensional imaging effect;
(II) Presenter part:
1) the camera acquisition unit captures images from the viewer's perspective in real time, i.e., images of the forward display unit's forward image superimposed on the presenter's performance;
2) the camera acquisition unit transmits the viewer-perspective image to the reverse rendering unit in real time; meanwhile, the sensing unit collects the presenter's spatial and motion information in the display space and transmits them to the reverse rendering unit in real time;
3) the reverse rendering unit generates corresponding prompt content from the spatial and motion information and superimposes it on the viewer-perspective image for rendering; the forward and reverse rendering units synchronize their rendering in real time; the reverse rendering unit outputs the rendered image to the reverse display unit;
4) the reverse display unit displays the viewer-perspective image with the superimposed prompt content;
5) the display content of the reverse display unit is reflected by the light splitting unit to form a reverse image shown to the presenter; the reverse image is a virtual image, an upright image on a vertical plane in the viewing space, so the presenter can view his or her own display effect and the prompt content added by the reverse rendering unit in real time.
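The steps above can be sketched as one iteration of a per-frame loop. All six stage functions are hypothetical stand-ins wired with stubs, just to show the data flow between the units.

```python
def run_frame(sense, render_forward, capture, render_reverse,
              show_forward, show_reverse):
    """One frame of the display loop: sense the presenter, render the
    viewer-facing scene, capture the viewer's perspective, render the
    presenter-facing prompt image, and drive both displays."""
    pose = sense()                                   # viewer part, step 1
    forward_img = render_forward(pose)               # viewer part, step 2
    show_forward(forward_img)                        # viewer part, steps 3-4
    viewer_view = capture()                          # presenter part, step 1
    reverse_img = render_reverse(viewer_view, pose)  # presenter part, steps 2-3
    show_reverse(reverse_img)                        # presenter part, steps 4-5
    return forward_img, reverse_img

# Stub wiring: each stage just tags the data it produced.
shown = []
fwd, rev = run_frame(
    sense=lambda: "pose",
    render_forward=lambda p: f"scene({p})",
    capture=lambda: "camera",
    render_reverse=lambda v, p: f"prompt({v},{p})",
    show_forward=shown.append,
    show_reverse=shown.append,
)
```

Note how the sensed pose feeds both rendering units, which is the synchronization path the description emphasizes.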
Finally, it should be noted that the disclosed embodiments are intended to aid further understanding of the utility model, but those skilled in the art will understand that various substitutions and modifications are possible without departing from the spirit and scope of the utility model and the appended claims. Therefore, the utility model should not be limited to the disclosed embodiments; its protection scope is defined by the appended claims.
Claims (7)
1. A three-dimensional interactive display system, characterized in that the three-dimensional interactive display system comprises: a display space, a sensing unit, a reverse rendering unit, a reverse display unit, a presenter, a viewing space, a camera acquisition unit, a forward rendering unit, a forward display unit, a viewer and a light splitting unit; the display space is a three-dimensional space, and the presenter, the reverse display unit and the sensing unit are located in the display space; the reverse display unit is located on the horizontal top wall in the display space; the viewing space faces the display space, and the light splitting unit is arranged between the viewing space and the display space; the light splitting unit forms an angle of 45 degrees with the horizontal ground and adopts a semi-transparent, semi-reflective film; the viewer, the forward display unit and the camera acquisition unit are located in the viewing space, with the camera acquisition unit positioned at the same visual angle as the viewer; the forward display unit is located on the horizontal bottom wall in the viewing space; the reverse display unit and the forward display unit are respectively located on the two sides of the light splitting unit and each form an angle of 45 degrees with it; the sensing unit is connected to the reverse rendering unit, and the reverse rendering unit is connected to the reverse display unit; the sensing unit is also connected to the forward rendering unit, and the forward rendering unit is connected to the forward display unit; the camera acquisition unit is connected to the reverse rendering unit; and the forward rendering unit and the reverse rendering unit are connected to each other.
2. The three-dimensional interactive display system according to claim 1, wherein the display space is the internal space of a built three-dimensional enclosure, each interior surface of the enclosure except the one facing the viewing space is covered with black light-absorbing cloth, the sensing unit and the presenter are arranged inside the enclosure, and a personnel entrance and exit is provided on a side surface of the enclosure.
3. The three-dimensional interactive display system of claim 1, wherein the sensing unit employs a tracking and positioning sensor.
4. The three-dimensional interactive display system of claim 1, wherein the camera acquisition unit comprises an acquisition camera and a transmission line.
5. The three-dimensional interactive display system of claim 1, wherein the forward rendering unit stores therein a three-dimensional digitized virtual scene and a preset rule path.
6. The three-dimensional interactive display system according to claim 1, wherein the backward rendering unit stores prompt contents including language contents and action contents corresponding to the spatial information and the action information of the presenter.
7. The three-dimensional interactive display system of claim 1, wherein the forward display unit and the backward display unit both use three-dimensional display screens.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202020399438 | 2020-03-25 | ||
CN2020203994382 | 2020-03-25 |
Publications (1)
Publication Number | Publication Date |
---|---|
CN212163542U (en) | 2020-12-15 |
Family
ID=73702492
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202020973632.7U (Active) | Three-dimensional interactive display system | 2020-03-25 | 2020-06-01 |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN212163542U (en) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114660918A (en) * | 2020-12-24 | 2022-06-24 | 腾讯科技(深圳)有限公司 | Holographic three-dimensional image display device, method, device and medium |
CN114660918B (en) * | 2020-12-24 | 2023-05-23 | 腾讯科技(深圳)有限公司 | Holographic stereoscopic image display device, method, device and medium |
EP4194957A4 (en) * | 2020-12-24 | 2024-05-08 | Tencent Tech Shenzhen Co Ltd | Device, method and apparatus for generating holographic stereoscopic images, and medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
GR01 | Patent grant | ||