CN114866757B - Stereoscopic display system and method - Google Patents
- Publication number
- CN114866757B (application number CN202210431861.XA / CN202210431861A)
- Authority
- CN
- China
- Prior art keywords
- information
- unit
- data
- stereoscopic display
- model
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
- H—ELECTRICITY; H04—ELECTRIC COMMUNICATION TECHNIQUE; H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION; H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/117—Transformation of image signals corresponding to virtual viewpoints, the virtual viewpoint locations being selected by the viewers or determined by viewer tracking (under H04N13/10—Processing, recording or transmission of stereoscopic or multi-view image signals; H04N13/106—Processing image signals; H04N13/111—Transformation of image signals corresponding to virtual viewpoints, e.g. spatial image interpolation)
- H04N13/20—Image signal generators
- H04N13/275—Image signal generators from 3D object models, e.g. computer-generated stereoscopic image signals (under H04N13/20)
- H04N13/398—Synchronisation thereof; Control thereof (under H04N13/30—Image reproducers)
Abstract
The invention discloses a stereoscopic display system and a stereoscopic display method. The stereoscopic display system comprises an information collecting unit, an information processing unit and a stereoscopic display unit. The information collecting unit collects human eye information and human body information of a user. The information processing unit constructs a virtual space, builds a human eye model and a human body model corresponding to the user in the virtual space according to the human eye information and human body information, acquires stereoscopic display data in the virtual space through the human eye model, and forms interactive display data through the human body model. The stereoscopic display unit displays a stereoscopic picture according to the stereoscopic display data and the interactive display data. Because the collected human eye information keeps the virtual space synchronized with the real space, a viewer perceives the two as coincident and sees a vivid stereoscopic picture with motion parallax; and because the human body model carries out the user's interactive operations in the virtual space according to the human body information, virtual interaction is achieved without external or handheld equipment.
Description
Technical Field
The invention relates to the field of display technology, and in particular to a stereoscopic display system and method.
Background
At present, 3D (stereoscopic) display attracts wide attention. Compared with ordinary 2D display, 3D technology makes the picture appear stereoscopic: the image is no longer confined to the screen plane and seems able to come out of the screen, giving the viewer an immersive, on-the-scene feeling.
However, in existing stereoscopic display systems the interactive subsystem still follows a 2D scheme. Either external devices such as a keyboard and mouse are used for interaction, so the user's vision must switch frequently between 2D and 3D and eye fatigue results; or a handheld device (such as a controller handle) is required, so the hands are never fully freed and the realism of the 3D display is reduced.
Disclosure of Invention
In view of these shortcomings of the prior art, the invention aims to provide a stereoscopic display system and a stereoscopic display method that realize stereoscopic display and virtual interaction without external or handheld equipment.
To achieve the above object, the present invention provides a stereoscopic display system, at least comprising:
an information collection unit for collecting eye information and body information of a user;
the information processing unit is used for constructing a virtual space, constructing a human eye model and a human body model corresponding to the user in the virtual space according to the human eye information and human body information, acquiring stereoscopic display data in the virtual space through the human eye model, and forming interactive display data through the human body model;
and the stereoscopic display unit is used for displaying stereoscopic display pictures according to the stereoscopic display data and the interactive display data.
Optionally, the information processing unit is further configured to preset a virtual object model in the virtual space, and the interactive display data further includes interactive data of the human body model and the virtual object model.
Optionally, the information collecting unit further includes:
the first collecting unit is used for collecting eye information of a user, wherein the eye information comprises eye coordinate data and interpupillary distance data;
and the second collecting unit is used for collecting human body information of the user, wherein the human body information comprises hand coordinate data and gesture information.
Optionally, the information collecting unit further includes:
the first acquisition unit is used for acquiring a human eye color chart and a human body color chart of a user;
and the second acquisition unit is used for acquiring the human eye depth map and the human body depth map of the user.
Optionally, the information processing unit further includes:
a first construction unit configured to construct a virtual space;
the second building unit is used for building a human eye model and a human body model in the virtual space according to human eye information and human body information;
the data processing unit is used for acquiring stereoscopic display data in the virtual space through the human eye model and forming interactive display data through the human body model.
Optionally, the data processing unit further comprises:
the format conversion unit is used for converting the format of the stereoscopic display data into a format matched with the stereoscopic display unit according to the parameters of the stereoscopic display unit;
and the data transmission unit is used for transmitting the stereoscopic display data with the converted format to the stereoscopic display unit.
The invention also provides a stereoscopic display method, applied to the information processing unit of a stereoscopic display system that further comprises an information collecting unit and a stereoscopic display unit. The stereoscopic display method comprises at least the following steps:
constructing a virtual space;
receiving the human eye information and the human body information of the user sent by the information collecting unit;
according to the human eye information and the human body information, constructing a human eye model and a human body model corresponding to a user in a virtual space;
and transmitting the stereoscopic display data acquired by the human eye model in the virtual space and the human body interaction data of the human body model to the stereoscopic display unit, so that the stereoscopic display unit displays a stereoscopic picture.
Optionally, the step of transmitting the stereoscopic display data acquired by the human eye model in the virtual space and the human body interaction data of the human body model to the stereoscopic display unit further includes:
and interacting with a virtual object model preset in the virtual space through the human body model to form human body interaction data.
Optionally, the human eye information includes human eye coordinate data and interpupillary distance data, the human body information includes human hand coordinate data and gesture information, and the step of receiving the human eye information and human body information of the user sent by the information collecting unit further includes:
receiving the human eye coordinate data, interpupillary distance data, human hand coordinate data and gesture information of the user sent by the information collecting unit.
Optionally, the step of transmitting the stereoscopic display data acquired by the human eye model in the virtual space and the human body interaction data of the human body model to the stereoscopic display unit further includes:
and converting the format of the stereoscopic display data into a format matched with the stereoscopic display unit according to the parameters of the stereoscopic display unit.
Compared with the prior art, the invention has the following beneficial effects. The stereoscopic display system comprises an information collecting unit, an information processing unit and a stereoscopic display unit. The information collecting unit collects the user's human eye information and human body information. The information processing unit constructs a virtual space, builds a human eye model and a human body model corresponding to the user in the virtual space according to that information, acquires stereoscopic display data in the virtual space through the human eye model, and forms interactive display data through the human body model. The stereoscopic display unit displays a stereoscopic picture according to the stereoscopic display data and the interactive display data. From the collected human eye information and human body information, the eyes and body in the real space are mapped into the virtual space as a human eye model and a human body model. Because the information in the real space stays synchronized with the information in the virtual space, the viewer perceives the virtual space as coinciding with the real space and sees a vivid stereoscopic picture with motion parallax; and because the human body model carries out the user's interactive operations in the virtual space according to the human body information, virtual interaction is achieved without any external or handheld equipment.
Drawings
To illustrate the embodiments of the invention or the technical solutions in the prior art more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below show only some embodiments of the invention; a person skilled in the art can derive other drawings from them without inventive effort.
FIG. 1 is a block diagram of a stereoscopic display system according to an embodiment of the invention;
FIG. 2 is a first block diagram of an information collecting unit according to an embodiment of the invention;
FIG. 3 is a second block diagram of an information collecting unit according to an embodiment of the invention;
FIG. 4 is a block diagram of an information processing unit according to an embodiment of the invention;
FIG. 5 is a block diagram of a data processing unit according to an embodiment of the invention;
FIG. 6 is a flowchart of a stereoscopic display method according to an embodiment of the invention.
Detailed Description
The following description of the embodiments refers to the accompanying drawings, which illustrate specific embodiments in which the invention may be practiced. In the description of the invention, it should be noted that, unless explicitly specified and limited otherwise, the terms "mounted", "connected" and "coupled" are to be construed broadly: a connection may be fixed, detachable or integral; it may be mechanical or electrical, or the elements may communicate with each other; it may be direct or indirect through an intermediate medium, and it may be an internal communication between two elements or an interaction between them. Likewise, the terms "first" and "second" are used for description only and are not to be understood as indicating or implying relative importance or the number of the technical features concerned; a feature qualified by "first" or "second" may explicitly or implicitly include one or more such features. In the description of the invention, "a plurality" means two or more unless explicitly defined otherwise. The specific meanings of the above terms in the invention can be understood by a person of ordinary skill in the art according to the circumstances.
The embodiment of the invention provides a stereoscopic display system, as shown in fig. 1, which comprises an information collecting unit 100, an information processing unit 200 and a stereoscopic display unit 300; wherein, the information collection unit 100 is used for collecting human eye information and human body information of a user; the information processing unit 200 is configured to construct a virtual space a, construct a human eye model A1 and a human body model A2 corresponding to a user in the virtual space a according to human eye information and human body information, collect stereoscopic display data in the virtual space a through the human eye model A1, and form interactive display data through the human body model A2; the stereoscopic display unit 300 is used for displaying stereoscopic display pictures according to the stereoscopic display data and the interactive display data.
The stereoscopic display system of the embodiment maps human eyes and human bodies in the real space B into the virtual space a according to the collected human eye information and human body information to form a human eye model and a human body model. The information in the real space B is synchronous with the information in the virtual space A, so that a viewer can feel that the virtual space A coincides with the real space B, and a vivid stereoscopic picture with motion parallax can be seen; and the human body model A2 realizes the interactive operation of the user in the virtual space A through human body information, and virtual interaction can be realized without external equipment or handheld equipment.
In one embodiment, the information processing unit 200 is further configured to preset a virtual object model A3 in the virtual space A, and the interactive display data further includes interaction data between the human body model A2 and the virtual object model A3. For example, the virtual object may be a virtual piano: the user performs piano-playing motions, and the human body model A2 plays the virtual piano in the virtual space A, realizing contactless interaction between the user and the virtual piano. The human body model A2 may also collide or come into contact with the virtual object model A3, triggering a corresponding special effect.
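The contact-triggered effects described above can be sketched as a simple bounding-box overlap test between the hand model and a virtual object; the `AABB` helper and the callback convention are illustrative assumptions, not part of the patented system.

```python
from dataclasses import dataclass

@dataclass
class AABB:
    """Axis-aligned bounding box in virtual-space coordinates."""
    min_pt: tuple  # (x, y, z)
    max_pt: tuple

    def intersects(self, other: "AABB") -> bool:
        # Boxes overlap iff their intervals overlap on all three axes
        return all(self.min_pt[i] <= other.max_pt[i] and
                   other.min_pt[i] <= self.max_pt[i] for i in range(3))

def check_interaction(hand_box: AABB, object_box: AABB, on_contact) -> bool:
    """Fire the object's special effect when the hand model touches it."""
    if hand_box.intersects(object_box):
        on_contact()
        return True
    return False

# A hand model overlapping a virtual object (e.g. a piano key) triggers its effect.
hand = AABB((0.0, 0.0, 0.0), (0.1, 0.1, 0.1))
key = AABB((0.05, 0.05, 0.05), (0.2, 0.2, 0.2))
events = []
check_interaction(hand, key, lambda: events.append("key pressed"))
```

In a full system the same test would run every frame against each preset virtual object model A3.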
In one embodiment, as shown in fig. 2, the information collecting unit 100 further includes a first collecting unit 110 and a second collecting unit 120, wherein:
the first collecting unit 110 is configured to collect eye information of a user, where the eye information includes eye coordinate data and pupil distance data; the specific position of the human eye model A1 in the virtual space A can be positioned by the human eye coordinate data, and the relative distance (namely the pupil distance) of the human eye model A1 is as accurate as possible so as to realize the superposition effect of the subsequent three-dimensional display picture and reality and improve the vivid effect of three-dimensional display.
The second collecting unit 120 is configured to collect the user's human body information, which includes human hand coordinate data and gesture information. The human body model A2 may include a human hand model: the hand coordinate data locate the specific position and size of the hand model in the virtual space A, and the gesture information gives the hand model the corresponding pose, for example hand motions such as making a fist or waving.
In this embodiment, the ratio between the sizes of the human eye model A1 and human body model A2 and the sizes of the real eyes and body is related to the size of the stereoscopic display screen.
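The screen-size-dependent ratio just mentioned can be sketched as one uniform scale factor; the screen-width-based convention below is an assumption for illustration, not the patent's formula.

```python
def real_to_virtual(point, screen_width_m, virtual_screen_width):
    """Map a real-space coordinate (metres) into virtual-space units using a
    single uniform scale tied to the display size (illustrative convention)."""
    scale = virtual_screen_width / screen_width_m
    return tuple(c * scale for c in point)

# A 0.5 m wide physical screen spanning 1.0 virtual unit gives scale factor 2.
mapped = real_to_virtual((0.1, 0.2, 0.3), screen_width_m=0.5, virtual_screen_width=1.0)
```

Applying the same factor to eye and hand coordinates keeps the models' proportions consistent with the screen.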
In one embodiment, as shown in fig. 3, the information collecting unit 100 further includes a first acquiring unit 130 and a second acquiring unit 140, wherein:
a first acquiring unit 130, configured to acquire a human eye color chart and a human body color chart of a user;
the second obtaining unit 140 is configured to obtain a human eye depth map and a human body depth map of a user.
Specifically, the first acquisition unit 130 includes a color (RGB) camera, and the second acquisition unit 140 may be an infrared camera. This improves the accuracy of the information collected by the information collecting unit 100.
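One common way to turn a depth-map pixel into the 3-D coordinates the models need is pinhole back-projection; this is a generic sketch, and the intrinsics used below are made-up calibration values rather than the patent's parameters.

```python
def deproject(u, v, depth_m, fx, fy, cx, cy):
    """Back-project a depth-map pixel (u, v) into a 3-D camera-space point via
    the pinhole model; fx, fy, cx, cy come from the depth camera's calibration."""
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    return (x, y, depth_m)

# The principal-point pixel at 1 m depth lands on the optical axis.
point = deproject(320, 240, 1.0, fx=600.0, fy=600.0, cx=320.0, cy=240.0)
```

The color map aligned with the depth map then supplies appearance for the reconstructed eye and hand positions.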
In one embodiment, as shown in fig. 4, the information processing unit 200 further includes a first construction unit 210, a second construction unit 220, and a data processing unit 230, where:
the first construction unit 210 is configured to construct a virtual space a;
the second construction unit 220 is configured to construct a human eye model A1 and a human body model A2 in the virtual space a according to human eye information and human body information;
the data processing unit 230 is configured to collect stereoscopic display data in the virtual space a through the human eye model A1 and form interactive display data through the human body model A2.
In one embodiment, as shown in fig. 5, the data processing unit 230 further includes a format conversion unit 231 and a data transmission unit 232, wherein:
the format conversion unit 231 is configured to convert the format of the stereoscopic display data into a format matched with the stereoscopic display unit 300 according to the parameters of the stereoscopic display unit 300; parameters of the stereoscopic display unit 300 may be input in the data processing unit 230 in advance, that is, a play format of the stereoscopic display data may be adjusted.
The data transmission unit 232 is configured to transmit the stereoscopic display data after the format conversion to the stereoscopic display unit 300.
Provided that the human eye coordinates and interpupillary distance are captured and modelled accurately, the stereoscopic effect seen by the eyes is consistent with the perception of the real space B. The real hand and the hand model coincide in the field of view, so the real hand can trigger interactions and special effects in the virtual space A.
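The format conversion step can be illustrated with a toy packer that turns a left/right parallax pair into a side-by-side or column-interleaved frame; real lenticular panels use panel-specific subpixel maps, so both modes here are simplified assumptions.

```python
def pack_stereo(left, right, mode="side_by_side"):
    """Convert a left/right parallax pair (row-major lists of pixel rows of
    equal size) into a display-ready frame. Column interleaving alternates
    the source view across output columns (one assumed panel convention)."""
    h, w = len(left), len(left[0])
    if mode == "side_by_side":
        return [lrow + rrow for lrow, rrow in zip(left, right)]
    if mode == "column_interleaved":
        return [[(left if x % 2 == 0 else right)[y][x] for x in range(w)]
                for y in range(h)]
    raise ValueError(f"unknown mode: {mode}")

# Toy 2x4 views: 'L' pixels from the left eye render, 'R' from the right.
left = [["L"] * 4 for _ in range(2)]
right = [["R"] * 4 for _ in range(2)]
sbs = pack_stereo(left, right)
inter = pack_stereo(left, right, mode="column_interleaved")
```

The chosen mode would be selected from the display parameters entered into the data processing unit in advance.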
In one embodiment, the information collecting unit 100 may be an external collection device, or may be integrated into the stereoscopic display unit 300. When integrated, its shooting range must cover at least the positions a viewer is likely to occupy in front of the screen of the stereoscopic display unit 300, and the larger the range, the better. The information collecting unit 100 obtains a color map, shot by the RGB camera, and a depth map, shot by the infrared camera. The information processing unit 200 is a PC (personal computer) or an embedded system capable of performing the information processing described above.
In one embodiment, the stereoscopic display unit 300 adopts binocular-parallax stereoscopic display technology. It may be a naked-eye 3D display or a vision-aided 3D display device (including 3D glasses and head-mounted devices); since 3D glasses are more convenient to use than a head-mounted device, 3D glasses are the preferred vision-aided device in this embodiment. Naked-eye 3D displays include parallax-barrier 3D displays, lenticular-lens grating 3D displays and directional-light-source 3D displays; vision-aided 3D display devices include color-separation 3D devices, polarization 3D devices and shutter 3D devices.
In one embodiment, the stereoscopic display unit 300 is a lenticular-lens 3D display; the information collecting unit 100 includes a human eye tracking unit and a gesture floating-touch unit, where the eye tracking unit is a RealSense depth camera and the gesture unit is a Leap Motion (gesture controller); the information processing unit 200 is a PC host that ties the hardware together, with the algorithms implemented partly by eye-tracking software and partly by a plug-in based on Unity (a 3D engine). A virtual space A is created in Unity, and in it are created a virtual object (a cube), a virtual human hand (the hand model) and virtual human eyes (the human eye model A1, implemented as cameras in Unity). The virtual eyes and hand correspond to and move in step with the real eyes and hand, so the stereoscopic display effect is vivid and the user's sense of virtual interaction is realistic.
With the stereoscopic display system of this embodiment, a user can use gestures to touch content that appears to pop out of the screen of the stereoscopic display unit 300, which enhances the interactivity of the stereoscopic display and provides an immersive experience.
In one embodiment, the stereoscopic display system includes an information collecting unit 100, an information processing unit 200, and a stereoscopic display unit 300.
The information collecting unit 100 collects real-space information, including the human eye coordinates and interpupillary distance and the human hand coordinates and gestures; this collection can be performed with a color camera plus an infrared camera. The real-space information is then sent to the information processing unit 200.
The information processing unit 200 is a PC (personal computer) or an embedded system capable of performing the information processing described above, and realizes two functions.
First, it creates and models the virtual space A. The virtual space A may contain several preset virtual objects. After receiving the real-space information, the information processing unit 200 builds models of the real objects in the virtual space A, with coordinates that correspond to reality in real time. The ratio of model size to actual size is related to the display screen size. The hand model carries interactive functions, such as colliding with a virtual object or triggering special effects. The human eye model A1 is also established; its coordinates and relative distance (i.e., the interpupillary distance) should be as accurate as possible to strengthen the coincidence of the subsequent stereoscopic picture with reality.
Second, it converts the parallax images captured by the eye model A1 in the virtual space A into a format the stereoscopic display unit 300 can play. The parameters of the stereoscopic display unit 300 are entered into the information processing unit 200 in advance, so the play format can be adjusted. Provided the eye coordinates and interpupillary distance are captured and modelled accurately, the stereoscopic effect seen by the eyes is consistent with the perception of the real space B, and the real hand coincides with the hand model in the field of view, allowing the real hand to trigger interactions and special effects in the virtual space A.
The stereoscopic display unit 300 adopts binocular-parallax stereoscopic display technology and may be a naked-eye 3D display or a vision-aided 3D display device (but not a head-mounted one). Naked-eye 3D displays include parallax-barrier 3D displays, lenticular-lens grating 3D displays and directional-light-source 3D displays; vision-aided 3D display devices include color-separation 3D devices, polarization 3D devices and shutter 3D devices.
The information collecting unit 100 may be an external device or may be integrated into the screen; in either case its shooting range must cover at least the positions a viewer may occupy in front of the screen, and the larger the better. The cameras produce a color image, shot by the RGB camera, and a depth image, shot by the infrared camera.
The embodiment of the invention provides a stereoscopic display method which is applied to an information processing unit 200 of a stereoscopic display system, wherein the stereoscopic display system further comprises an information collecting unit 100 and a stereoscopic display unit 300; as shown in fig. 6, the stereoscopic display method at least includes step S1, step S2, step S3 and step S4, specifically:
step S1, constructing a virtual space A.
Step S2, receiving the user's human eye information and human body information sent by the information collecting unit 100. The human eye information includes human eye coordinate data and interpupillary distance data; the human body information includes human hand coordinate data and gesture information.
Specifically, this step receives the user's human eye coordinate data, interpupillary distance data, human hand coordinate data and gesture information sent by the information collecting unit 100.
Step S3, constructing a human eye model A1 and a human body model A2 corresponding to the user in the virtual space A according to the human eye information and human body information, i.e., mapping the user's eyes and body from the real space B into the virtual space A.
Step S4, transmitting the stereoscopic display data acquired by the human eye model A1 in the virtual space A and the human body interaction data of the human body model A2 to the stereoscopic display unit, so that the stereoscopic display unit 300 displays a stereoscopic picture.
In one embodiment, step S4 specifically includes converting the format of the stereoscopic display data into a format matching the stereoscopic display unit 300 according to the parameters of the stereoscopic display unit 300.
In one embodiment, step S4 further comprises:
human body interaction data are formed by interaction between the human body model A2 and a virtual object model A3 preset in the virtual space A.
The stereoscopic display method of the present embodiment is applied to the stereoscopic display system provided in the above embodiment, and the stereoscopic display system includes an information collecting unit 100, an information processing unit 200, and a stereoscopic display unit 300; wherein, the information collection unit 100 is used for collecting human eye information and human body information of a user; the information processing unit 200 is configured to construct a virtual space a, construct a human eye model A1 and a human body model A2 corresponding to a user in the virtual space a according to human eye information and human body information, collect stereoscopic display data in the virtual space a through the human eye model A1, and form interactive display data through the human body model A2; the stereoscopic display unit 300 is used for displaying stereoscopic display pictures according to the stereoscopic display data and the interactive display data.
The stereoscopic display system of the embodiment maps human eyes and human bodies in the real space B into the virtual space a according to the collected human eye information and human body information to form a human eye model and a human body model. The information in the real space B is synchronous with the information in the virtual space A, so that a viewer can feel that the virtual space A coincides with the real space B, and a vivid stereoscopic picture with motion parallax can be seen; and the human body model A2 realizes the interactive operation of the user in the virtual space A through human body information, and virtual interaction can be realized without external equipment or handheld equipment.
The present invention is not limited to the above-mentioned embodiments, and any changes or substitutions that can be easily understood by those skilled in the art within the technical scope of the present invention are intended to be included in the scope of the present invention. Therefore, the protection scope of the present invention should be subject to the protection scope of the claims.
Claims (7)
1. A stereoscopic display system, comprising at least:
an information collection unit for collecting eye information and body information of a user; the information collecting unit comprises a first collecting unit and a second collecting unit; the first collecting unit is used for collecting eye information of the user, and the eye information comprises eye coordinate data and interpupillary distance data; the second collecting unit is used for collecting human body information of the user, and the human body information comprises hand coordinate data and gesture information;
the information processing unit is used for constructing a virtual space, constructing a human eye model and a human body model corresponding to the user in the virtual space according to the human eye information and the human body information, acquiring stereoscopic display data in the virtual space through the human eye model, and forming interactive display data through the human body model; the information processing unit completes information processing by adopting eye tracking software and a Unity-based plug-in; the information processing unit comprises a first construction unit, a second construction unit and a data processing unit; the first construction unit is used for constructing the virtual space; the second construction unit is used for constructing the human eye model and the human body model in the virtual space according to the human eye information and the human body information; the data processing unit is used for acquiring stereoscopic display data in the virtual space through the human eye model and forming interactive display data through the human body model; the human eye coordinate data is used for positioning the specific position of the human eye model in the virtual space, and the interpupillary distance data is used for realizing the effect of overlapping the subsequent stereoscopic picture with reality; the human hand coordinate data is used for positioning the specific position and size of the human hand model in the virtual space, and the gesture information is used for enabling the human hand model to present the corresponding hand gesture;
and the stereoscopic display unit is used for displaying a stereoscopic display picture according to the stereoscopic display data and the interactive display data.
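Claim 1's human eye model positions a viewpoint in the virtual space from tracked eye coordinates and derives a stereo pair from the interpupillary distance. A minimal sketch of that idea follows; the `EyeModel` class, its fields, and the camera-placement rule are illustrative assumptions, not the patent's actual implementation:

```python
from dataclasses import dataclass

@dataclass
class EyeModel:
    """Hypothetical human-eye model: tracked eye-midpoint coordinates
    plus interpupillary distance (IPD), in virtual-space units."""
    x: float
    y: float
    z: float
    ipd: float  # interpupillary distance, same units as the coordinates

    def camera_positions(self):
        """Place the left/right virtual cameras half an IPD to either
        side of the tracked midpoint, along the x axis."""
        half = self.ipd / 2.0
        left = (self.x - half, self.y, self.z)
        right = (self.x + half, self.y, self.z)
        return left, right

# 64 mm IPD, with the head 1.6 m up and 0.5 m in front of the screen plane
eyes = EyeModel(x=0.0, y=1.6, z=-0.5, ipd=0.064)
left_cam, right_cam = eyes.camera_positions()
```

Rendering the scene once from each of the two camera positions yields the left/right views that the stereoscopic display unit later fuses into one picture.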
2. The stereoscopic display system according to claim 1, wherein the information processing unit is further configured to establish a preset virtual object model in the virtual space, and the interactive display data further includes interactive data of the human body model and the virtual object model.
3. The stereoscopic display system of claim 1, wherein the information collection unit further comprises:
a first acquisition unit for acquiring a human eye color image and a human body color image of the user;
and a second acquisition unit for acquiring a human eye depth map and a human body depth map of the user.
4. The stereoscopic display system of claim 1, wherein the data processing unit further comprises:
a format conversion unit for converting the stereoscopic display data into a format matched with the stereoscopic display unit according to parameters of the stereoscopic display unit;
and a data transmission unit for transmitting the format-converted stereoscopic display data to the stereoscopic display unit.
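Claim 4's format conversion maps the rendered per-eye images into whatever layout the display panel expects. A common such layout is side-by-side packing; the sketch below assumes images represented as plain lists of pixel rows, which is a simplification for illustration only:

```python
def to_side_by_side(left, right):
    """Pack two per-eye images (lists of pixel rows) into one
    side-by-side frame, a layout many stereoscopic displays accept."""
    if len(left) != len(right):
        raise ValueError("per-eye images must have the same height")
    # Each output row is the left-eye row followed by the right-eye row.
    return [l_row + r_row for l_row, r_row in zip(left, right)]

left_img = [[1, 1], [1, 1]]    # 2x2 left-eye image
right_img = [[2, 2], [2, 2]]   # 2x2 right-eye image
frame = to_side_by_side(left_img, right_img)
```

Other targets (top-bottom, row-interleaved for polarization displays, frame-sequential for shutter glasses) would substitute a different packing function chosen from the display unit's parameters.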
5. A stereoscopic display method, characterized in that it is applied to an information processing unit of a stereoscopic display system, wherein the information processing unit completes information processing by adopting eye-tracking software and a Unity-based plug-in, and comprises a first construction unit, a second construction unit and a data processing unit; the stereoscopic display system further comprises an information collection unit and a stereoscopic display unit, and the information collection unit comprises a first collection unit and a second collection unit;
the stereoscopic display method at least comprises the following steps:
the first construction unit constructs a virtual space;
the second construction unit receives human eye information of a user collected by the first collection unit and human body information of the user collected by the second collection unit; the human eye information comprises human eye coordinate data and interpupillary distance data, and the human body information comprises human hand coordinate data and gesture information;
the second construction unit constructs a human eye model and a human body model corresponding to the user in the virtual space according to the human eye information and the human body information; the human eye coordinate data are used for locating the position of the human eye model in the virtual space, and the interpupillary distance data are used for aligning the subsequently displayed stereoscopic picture with reality; the human hand coordinate data are used for locating the position and size of the hand model in the virtual space, and the gesture information is used for making the hand model present the corresponding hand gesture;
the data processing unit acquires stereoscopic display data and interactive display data in the virtual space through the human eye model, wherein the interactive display data comprise human body interaction data formed through the human body model;
and the data processing unit transmits the stereoscopic display data and the human body interaction data to the stereoscopic display unit, so that the stereoscopic display unit displays a stereoscopic display picture.
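The steps of claim 5 form a simple pipeline: build the space, instantiate the two models from the collected information, then derive display and interaction data. The sketch below uses dictionaries as stand-in data structures; every function and field name here is a hypothetical placeholder, not terminology from the patent:

```python
def build_virtual_space():
    """Step 1 (first construction unit): an empty virtual space."""
    return {"models": {}}

def add_eye_model(space, eye_info):
    """Steps 2-3 (second construction unit): eye coordinates locate the
    model; the IPD aligns the stereoscopic picture with reality."""
    space["models"]["eyes"] = {"pos": eye_info["coords"], "ipd": eye_info["ipd"]}

def add_body_model(space, body_info):
    """Hand coordinates fix position/size; gesture sets the hand pose."""
    space["models"]["hand"] = {"pos": body_info["hand_coords"],
                               "gesture": body_info["gesture"]}

def process(space):
    """Data processing unit: derive stereoscopic display data and
    human body interaction data from the two models."""
    display_data = {"viewpoint": space["models"]["eyes"]["pos"]}
    interaction_data = {"gesture": space["models"]["hand"]["gesture"]}
    return display_data, interaction_data

space = build_virtual_space()
add_eye_model(space, {"coords": (0.0, 1.6, -0.5), "ipd": 0.064})
add_body_model(space, {"hand_coords": (0.1, 1.2, -0.3), "gesture": "pinch"})
display_data, interaction_data = process(space)
```

Both outputs would then be handed to the stereoscopic display unit, per the final transmitting step.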
6. The stereoscopic display method according to claim 5, wherein the method further comprises:
making the human body model interact with a virtual object model preset in the virtual space to form the human body interaction data.
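One simple way the hand model of claim 6 could interact with a preset virtual object is a proximity test between the two models' positions. This is an assumed mechanism for illustration; the function name and grab radius are invented:

```python
import math

def hands_touching(hand_pos, object_pos, radius=0.05):
    """Hypothetical interaction test: the hand model is considered to be
    touching a virtual object when their Euclidean distance is within
    a grab radius (in virtual-space units)."""
    return math.dist(hand_pos, object_pos) <= radius

# A pinch gesture 3 cm from the object would register as an interaction;
# a hand a metre away would not.
near = hands_touching((0.0, 0.0, 0.0), (0.03, 0.0, 0.0))
far = hands_touching((0.0, 0.0, 0.0), (1.0, 0.0, 0.0))
```

A fuller system would combine such a test with the gesture information (e.g. only a "pinch" pose grabs the object) to produce the human body interaction data.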
7. The stereoscopic display method according to claim 5, wherein the method further comprises:
converting the stereoscopic display data into a format matched with the stereoscopic display unit according to parameters of the stereoscopic display unit.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210431861.XA CN114866757B (en) | 2022-04-22 | 2022-04-22 | Stereoscopic display system and method |
Publications (2)
Publication Number | Publication Date |
---|---|
CN114866757A (en) | 2022-08-05
CN114866757B (en) | 2024-03-05
Family
ID=82632600
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202210431861.XA Active CN114866757B (en) | 2022-04-22 | 2022-04-22 | Stereoscopic display system and method |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN114866757B (en) |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102622081A (en) * | 2011-01-30 | 2012-08-01 | 北京新岸线网络技术有限公司 | Method and system for realizing somatic sensory interaction |
CN104820497A (en) * | 2015-05-08 | 2015-08-05 | 东华大学 | A 3D interaction display system based on augmented reality |
CN106131536A (en) * | 2016-08-15 | 2016-11-16 | 万象三维视觉科技(北京)有限公司 | A kind of bore hole 3D augmented reality interactive exhibition system and methods of exhibiting thereof |
CN109460150A (en) * | 2018-11-12 | 2019-03-12 | 北京特种机械研究所 | A kind of virtual reality human-computer interaction system and method |
CN110850977A (en) * | 2019-11-06 | 2020-02-28 | 成都威爱新经济技术研究院有限公司 | Stereoscopic image interaction method based on 6DOF head-mounted display |
CN111984114A (en) * | 2020-07-20 | 2020-11-24 | 深圳盈天下视觉科技有限公司 | Multi-person interaction system based on virtual space and multi-person interaction method thereof |
CN113866987A (en) * | 2021-09-29 | 2021-12-31 | 北京理工大学 | Method for interactively adjusting interpupillary distance and image surface of augmented reality helmet display by utilizing gestures |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN113168007A (en) | System and method for augmented reality | |
CA2694095C (en) | Virtual interactive presence systems and methods | |
CN107517370B (en) | The system and method for the image registration of multi-path video stream | |
CN102959616B (en) | Interactive reality augmentation for natural interaction | |
US20150042640A1 (en) | Floating 3d image in midair | |
CN103810353A (en) | Real scene mapping system and method in virtual reality | |
US20060126925A1 (en) | Horizontal perspective representation | |
JPH07311857A (en) | Picture compositing and display device and simulation system | |
CN104536579A (en) | Interactive three-dimensional scenery and digital image high-speed fusing processing system and method | |
KR20110009002A (en) | Image system | |
CN203746012U (en) | Three-dimensional virtual scene human-computer interaction stereo display system | |
US20120293549A1 (en) | Computer-readable storage medium having information processing program stored therein, information processing apparatus, information processing system, and information processing method | |
CN105068649A (en) | Binocular gesture recognition device and method based on virtual reality helmet | |
CN105639818A (en) | Intelligent safety helmet based on augmented reality, space scanning and gesture recognition technologies | |
CN102647606A (en) | Stereoscopic image processor, stereoscopic image interaction system and stereoscopic image display method | |
CN104599317A (en) | Mobile terminal and method for achieving 3D (three-dimensional) scanning modeling function | |
US10652525B2 (en) | Quad view display system | |
CN102508548A (en) | Operation method and system for electronic information equipment | |
US20150326847A1 (en) | Method and system for capturing a 3d image using single camera | |
CN113382224B (en) | Interactive handle display method and device based on holographic sand table | |
CN102799378B (en) | A kind of three-dimensional collision detection object pickup method and device | |
JP2013168120A (en) | Stereoscopic image processing apparatus, stereoscopic image processing method, and program | |
CN114866757B (en) | Stereoscopic display system and method | |
EP2244170A1 (en) | Stereo imaging touch device | |
CN109426336A (en) | A kind of virtual reality auxiliary type selecting equipment |
Legal Events
Date | Code | Title | Description
---|---|---|---
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||