CN108052205A - Virtual reality system based on 3D interactions - Google Patents
Virtual reality system based on 3D interactions
- Publication number: CN108052205A
- Application number: CN201711452614.3A
- Authority
- CN
- China
- Prior art keywords
- point touch
- touch screen
- display screen
- projection display
- curtain
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/0416—Control or interface arrangements specially adapted for digitisers
- G06F3/042—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
- G06F3/0421—Digitisers characterised by opto-electronic means by interrupting or reflecting a light beam, e.g. optical touch-screen
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/01—Indexing scheme relating to G06F3/01
- G06F2203/012—Walk-in-place systems for allowing a user to walk in a virtual environment while constraining him to a given position in the physical environment
- G06F2203/041—Indexing scheme relating to G06F3/041 - G06F3/045
- G06F2203/04104—Multi-touch detection in digitiser, i.e. details about the simultaneous detection of a plurality of touching locations, e.g. multiple fingers or pen and finger
- G06F2203/04109—FTIR in optical digitiser, i.e. touch detection by frustrating the total internal reflection within an optical waveguide due to changes of optical properties or deformation at the touch location
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Controls And Circuits For Display Device (AREA)
- Processing Or Creating Images (AREA)
Abstract
The present invention provides a virtual reality system based on 3D interactions. It comprises a data helmet, a ray trace camera group, a multi-point touch screen, a rear-projection display screen, a rear projector, a front projector, a master control unit, a microcomputer group, and HDMI cables. The ray trace camera group is mounted at the top of the room, with equal spacing between adjacent cameras. The rear-projection display screen stands perpendicular to the ground above the multi-point touch screen, which lies horizontally; the two screens are of equal size, joined along their long edges, and mutually perpendicular. The data helmet is worn on the observer's head, facing the rear-projection display screen. The rear projector faces the rear-projection display screen at a suitable distance behind it, and the front projector faces the multi-point touch screen at a suitable distance from it. The microcomputer group connects to the rear-projection display screen and the multi-point touch screen through HDMI cables, and the master control unit communicates wirelessly with the ray trace camera group. The system is simple to wire, and the resulting images offer high resolution and strong interactivity.
Description
Technical field
The present invention relates to a virtual reality system based on 3D interactions, and belongs to the fields of stereoscopic display technology and virtual reality technology.
Background technology
A stereoscopic display system provides more than a flat image: it can bring viewers three-dimensional imagery carrying depth information, deepening their sense of presence. The longest-developed and most mature stereoscopic display systems today rely on auxiliary viewing equipment. Virtual reality systems, on the other hand, use a variety of sensors to immerse viewers in computer-created scenes and enable interaction; the ultimate purpose of virtual reality is for the system built from computers and sensors to satisfy the viewer's needs as far as possible.
The main components of a traditional stereoscopic display system include a stereoscopic display together with red-cyan anaglyph glasses, polarized glasses, or a data helmet. The main components of a traditional virtual reality interactive system include recording devices, microcomputers, a control unit, display devices, and sensors of various types. The recording devices (keyboard, mouse, data gloves, etc.) record the user's actions in the real environment and transmit them through the sensors to the control unit; the control unit commands the scene or objects in the display device to produce the corresponding response, and the observer watches the result of the interaction through the display device.
Although the above schemes achieve interactive display, they still have shortcomings. Most current interaction techniques are confined to two-dimensional images; interaction with three-dimensional environments of more complex spatial relationships remains comparatively simple, and conspicuous interaction markers degrade the final image to a considerable degree.
Summary of the invention
In view of the problems and deficiencies in the prior art, the present invention provides a virtual reality system based on 3D interactions. The system lets the observer obtain high-definition three-dimensional images through the data helmet, and changes the scene angle in real time according to the observer's viewing angle and gestures, thereby achieving three-dimensional interaction and display consistent with the observer's point of view.
To achieve the above objectives, the present invention adopts the following technical scheme:
A virtual reality system based on 3D interactions, comprising a data helmet, a ray trace camera group, a multi-point touch screen, a rear-projection display screen, a rear projector, a front projector, a master control unit, a microcomputer group, and HDMI cables, characterized in that:
(1) the ray trace camera group is mounted at the top of the room, with equal spacing between adjacent cameras;
(2) the rear-projection display screen stands perpendicular to the ground above the multi-point touch screen, which is placed horizontally; the two screens are of equal size, joined along their long edges, and mutually perpendicular;
(3) the data helmet is worn on the observer's head, facing the rear-projection display screen;
(4) the rear projector faces the rear-projection display screen and is placed a suitable distance behind it; the front projector faces the multi-point touch screen and is placed a suitable distance from it;
(5) the microcomputer group connects to the rear-projection display screen and the multi-point touch screen through HDMI cables, and the master control unit communicates wirelessly with the ray trace camera group.
The master control unit consists of an 89C52 microcontroller, a CC2530F64 chip, a power supply, and other modules. The ray trace camera group transmits the captured images wirelessly to the master control unit, and the multi-point touch screen transmits touch information to the master control unit; the master control unit receives and processes the signals and then sends them wirelessly to the microcomputer group. After receiving the instructions wirelessly, the microcomputer group adjusts the pre-stored scene and subject accordingly, and transmits the information through HDMI cables to the projection units and the data helmet.
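The signal flow just described, camera images and touch data in, processed instructions out to the microcomputer group, can be sketched as a simple message router. This is an illustrative model only: the message formats and method names are assumptions, since the patent specifies just the 89C52 microcontroller and the CC2530F64 radio chip.

```python
import queue

# Hypothetical sketch of the master control unit's routing role; the
# payload shapes and command names are assumptions for illustration.

class MasterControlUnit:
    def __init__(self):
        self.inbox = queue.Queue()  # wireless input: camera frames, touch events
        self.to_micro = []          # instructions forwarded to the microcomputer group

    def receive(self, source, payload):
        self.inbox.put((source, payload))

    def process(self):
        # Drain the inbox and translate each signal into an instruction.
        while not self.inbox.empty():
            source, payload = self.inbox.get()
            if source == "camera":
                self.to_micro.append(("update_view", payload))    # head position
            elif source == "touch":
                self.to_micro.append(("apply_gesture", payload))  # touch points

mcu = MasterControlUnit()
mcu.receive("camera", {"head_xyz": (0.1, 1.7, 2.0)})
mcu.receive("touch", [(640, 360)])
mcu.process()
print([cmd for cmd, _ in mcu.to_micro])  # ['update_view', 'apply_gesture']
```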
Compared with the prior art, the present invention has the following obvious substantive features and technical progress:
Unlike traditional interactive display systems, the invention records the movements of the viewer's head with the ray trace cameras, so that the virtual scene responds by transforming the viewing angle according to head orientation; viewers observe three-dimensional virtual reality images through the data helmet and can interact with the system.
The system is simple to wire, and the resulting images offer high resolution and strong interactivity.
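The viewpoint response described above, where the scene's viewing angle follows the tracked head, can be illustrated with a small calculation. The coordinate convention and function name are assumptions for illustration, not taken from the patent.

```python
import math

# Illustrative calculation: derive the scene's yaw angle from the tracked
# head position relative to the screen centre. Axes are assumed: x runs
# along the screen, y is up, z points out of the screen toward the viewer.

def view_yaw_deg(head_xyz, screen_center_xyz):
    """Yaw (degrees) from the screen centre toward the viewer's head."""
    dx = head_xyz[0] - screen_center_xyz[0]
    dz = head_xyz[2] - screen_center_xyz[2]
    return math.degrees(math.atan2(dx, dz))

# A viewer standing directly in front of the screen sees an unrotated scene;
# stepping to the side rotates the virtual scene toward them.
print(view_yaw_deg((0.0, 1.7, 2.0), (0.0, 1.5, 0.0)))            # 0.0
print(round(view_yaw_deg((2.0, 1.7, 2.0), (0.0, 1.5, 0.0)), 1))  # 45.0
```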
Description of the drawings
Fig. 1 is a structural diagram of a virtual reality system based on 3D interactions.
Fig. 2 is a block diagram of the master control unit.
Specific embodiment
Preferred embodiments of the present invention are described in detail below with reference to the accompanying drawings.
Embodiment one:
Referring to Fig. 1 and Fig. 2, the virtual reality system based on 3D interactions comprises a data helmet 1, a ray trace camera group 2, a multi-point touch screen 3, a rear-projection display screen 4, a rear projector 5, a front projector 6, a master control unit 7, a microcomputer group 8, and HDMI cables 9, characterized in that:
(1) the ray trace camera group 2 is mounted at the top of the room, with equal spacing between adjacent cameras;
(2) the rear-projection display screen 4 stands perpendicular to the ground above the multi-point touch screen 3, which is placed horizontally; the two screens (4, 3) are of equal size, joined along their long edges, and mutually perpendicular;
(3) the data helmet 1 is worn on the observer's head, facing the rear-projection display screen 4;
(4) the rear projector 5 faces the rear-projection display screen 4 and is placed behind it; the front projector 6 faces the multi-point touch screen 3 and is placed above it;
(5) the microcomputer group 8 connects to the rear-projection display screen 4, the multi-point touch screen 3, the rear projector 5, the front projector 6, and the data helmet 1 through HDMI cables 9, and the master control unit 7 communicates wirelessly with the ray trace camera group 2.
Embodiment two:
This embodiment is essentially the same as embodiment one, with the following special features: the multi-point touch screen 3 and the rear-projection display screen 4 each reach a resolution of 1920×1080 pixels, and the multi-point touch screen 3 is a multi-touch screen based on FTIR (frustrated total internal reflection) infrared sensing; the microcomputer group 8 consists of three-dimensional microcomputers, and the ray trace camera group 2 consists of three-dimensional cameras; the scene and subject image information are pre-stored in the microcomputers.
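In an FTIR touch screen of the kind this embodiment names, a fingertip pressed against the surface frustrates total internal reflection and appears as a bright blob in the infrared camera's image. A minimal sketch of extracting touch points from such an image, with an assumed brightness threshold and a simple flood fill, might look like this:

```python
# Illustrative FTIR touch extraction: threshold the infrared image, group
# bright pixels into connected blobs, and report each blob's centroid.
# The threshold value and flood-fill approach are assumptions; the patent
# only states that the screen uses FTIR-based multi-touch sensing.

def find_touches(ir_image, threshold=128):
    """Return (x, y) centroids of bright blobs in a 2D grayscale IR image."""
    h, w = len(ir_image), len(ir_image[0])
    seen = [[False] * w for _ in range(h)]
    touches = []
    for y in range(h):
        for x in range(w):
            if ir_image[y][x] >= threshold and not seen[y][x]:
                stack, pixels = [(y, x)], []
                seen[y][x] = True
                while stack:  # flood-fill one blob of bright pixels
                    cy, cx = stack.pop()
                    pixels.append((cy, cx))
                    for ny, nx in ((cy-1, cx), (cy+1, cx), (cy, cx-1), (cy, cx+1)):
                        if 0 <= ny < h and 0 <= nx < w and not seen[ny][nx] \
                                and ir_image[ny][nx] >= threshold:
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                cy = sum(p[0] for p in pixels) / len(pixels)
                cx = sum(p[1] for p in pixels) / len(pixels)
                touches.append((cx, cy))
    return touches

frame = [[0] * 6 for _ in range(4)]
frame[1][1] = frame[1][2] = 255  # one fingertip blob spanning two pixels
frame[2][4] = 255                # a second, single-pixel touch
print(find_touches(frame))  # [(1.5, 1.0), (4.0, 2.0)]
```

On the real screen the frame would come from the infrared camera at the stated 1920×1080 resolution, and the centroids would be forwarded to the master control unit as touch information.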
Embodiment three:
This embodiment is described with reference to the drawings as follows. Referring to Fig. 1, the virtual reality system based on 3D interactions comprises a data helmet 1, a ray trace camera group 2, a multi-point touch screen 3, a rear-projection display screen 4, a rear projector 5, a front projector 6, a master control unit 7, a microcomputer group 8, and HDMI cables 9. The ray trace camera group 2 is mounted at the top of the room, with equal spacing between adjacent cameras. The rear-projection display screen 4 stands in a plane perpendicular to the ground, and the multi-point touch screen 3 lies horizontally in a plane; the two screens are of equal size, joined along their long edges, and mutually perpendicular. The data helmet 1 is worn on the observer's head, facing the rear-projection display screen 4. The rear projector 5 faces the rear-projection display screen 4 at a suitable distance behind it, and the front projector 6 faces the multi-point touch screen 3 at a suitable distance from it. The microcomputer group 8 connects to the rear-projection display screen 4 and the multi-point touch screen 3 through HDMI cables 9, and the master control unit 7 communicates wirelessly with the ray trace camera group 2.
The multi-point touch screen and the rear-projection display screen each reach a resolution of 1920×1080 pixels, and the multi-point touch screen is a multi-touch screen based on FTIR (frustrated total internal reflection) infrared sensing. The microcomputer group consists of three-dimensional microcomputers, and the ray trace camera group consists of three-dimensional cameras. The scene and subject image information are pre-stored in the microcomputers, and the pre-stored content is adjusted and rendered in real time according to the viewer's head and hand movements. This virtual reality system relies on the data helmet and on the WIM (world-in-miniature) thumbnail view technique.
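The WIM technique the embodiment relies on shows a scaled-down copy of the virtual scene (here it would be on the horizontal touch screen), and manipulating a miniature object moves its full-scale counterpart. A minimal sketch under an assumed scale factor and scene representation, not drawn from the patent text:

```python
# Illustrative world-in-miniature mapping: positions are converted between
# the full-scale scene and its miniature on the touch screen. The scale
# factor and the dict-based scene are assumptions for demonstration.

SCALE = 0.1  # miniature is 1/10 of full scale (assumed)

def to_miniature(world_xy):
    """Map a full-scale position (metres) onto the miniature view."""
    return (world_xy[0] * SCALE, world_xy[1] * SCALE)

def to_world(mini_xy):
    """Map a position in the miniature view back to full scale."""
    return (mini_xy[0] / SCALE, mini_xy[1] / SCALE)

scene = {"chair": (3.0, 2.0)}           # full-scale object positions (metres)
drag_target = to_miniature((1.0, 4.0))  # user drags the miniature chair here
scene["chair"] = to_world(drag_target)  # the full-scale scene follows
print(tuple(round(v, 6) for v in scene["chair"]))  # (1.0, 4.0)
```

In the described system, the drag coordinates would come from the FTIR touch screen and the updated scene would be re-rendered to the data helmet and projectors.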
Fig. 2 shows the structure of the master control unit 7. The master control unit consists of an 89C52 microcontroller, a CC2530F64 chip, a power supply, and other modules. The ray trace camera group transmits the captured images wirelessly to the master control unit, and the multi-point touch screen transmits touch information to the master control unit; the master control unit receives and processes all signals, then sends them wirelessly to the microcomputer group and controls it in real time. After receiving the instructions wirelessly, the microcomputer group adjusts the pre-stored scene and subject accordingly, and transmits the information through HDMI cables to the projection units and the data helmet.
Claims (2)
1. A virtual reality system based on 3D interactions, comprising a data helmet (1), a ray trace camera group (2), a multi-point touch screen (3), a rear-projection display screen (4), a rear projector (5), a front projector (6), a master control unit (7), a microcomputer group (8), and HDMI cables (9), characterized in that:
(1) the ray trace camera group (2) is mounted at the top of the room, with equal spacing between adjacent cameras;
(2) the rear-projection display screen (4) stands perpendicular to the ground above the multi-point touch screen (3), which is placed horizontally; the two screens (4, 3) are of equal size, joined along their long edges, and mutually perpendicular;
(3) the data helmet (1) is worn on the observer's head, facing the rear-projection display screen (4);
(4) the rear projector (5) faces the rear-projection display screen (4) and is placed behind it; the front projector (6) faces the multi-point touch screen (3) and is placed above it;
(5) the microcomputer group (8) connects to the rear-projection display screen (4), the multi-point touch screen (3), the rear projector (5), the front projector (6), and the data helmet (1) through the HDMI cables (9), and the master control unit (7) communicates wirelessly with the ray trace camera group (2).
2. The virtual reality system based on 3D interactions according to claim 1, characterized in that: the multi-point touch screen (3) and the rear-projection display screen (4) each reach a resolution of 1920×1080 pixels, and the multi-point touch screen (3) is a multi-touch screen based on FTIR (frustrated total internal reflection) infrared sensing; the microcomputer group (8) consists of three-dimensional microcomputers, and the ray trace camera group (2) consists of three-dimensional cameras; the scene and subject image information are pre-stored in the microcomputers.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201711452614.3A CN108052205A (en) | 2017-12-28 | 2017-12-28 | Virtual reality system based on 3D interactions |
Publications (1)
Publication Number | Publication Date |
---|---|
CN108052205A (en) | 2018-05-18 |
Family
ID=62128371
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201711452614.3A Pending CN108052205A (en) | 2017-12-28 | 2017-12-28 | Virtual reality system based on 3D interactions |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN108052205A (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112306305A (en) * | 2020-10-28 | 2021-02-02 | 黄奎云 | Three-dimensional touch device |
CN112306305B (en) * | 2020-10-28 | 2021-08-31 | 黄奎云 | Three-dimensional touch device |
Similar Documents
Publication | Title
---|---
US7843470B2 | System, image processing apparatus, and information processing method
US9699438B2 | 3D graphic insertion for live action stereoscopic video
CN106454311B | LED 3D imaging system and method
CN203773476U | Virtual reality system based on 3D interaction
KR101126110B1 | 3D image displaying system and 3D displaying method using the same
CN205987184U | Practical training system based on virtual reality
CN103440677A | Multi-view free stereoscopic interactive system based on a Kinect somatosensory device
CN102510508B | Detection-type stereo picture adjusting device and method
WO2023071574A1 | 3D image reconstruction method and apparatus, electronic device, and storage medium
CN104516492A | Man-machine interaction technology based on 3D holographic projection
CN106210856B | Method and system for watching 3D panoramic video on an internet live video platform
JP6512575B2 | Method of distributing or broadcasting three-dimensional shape information
WO2017173890A1 | Helmet having dual cameras
CN107005689B | Digital video rendering
CN106980369A | Synthesis and output system and method for third-view video of a virtual reality project
US11727645B2 | Device and method for sharing an immersion in a virtual environment
CN105468154B | Interactive panoramic display system for electric power system operation
CN204559792U | Device and display device for displaying stereoscopic images or video on a display screen
CN108052205A | Virtual reality system based on 3D interactions
KR101752691B1 | Apparatus and method for providing virtual 3D content animation with selectable views
WO2023056803A1 | Holographic presentation method and apparatus
CN108012140A | Virtual reality system based on 3D interactions
CN203773477U | 3D interaction real-time display system
Lee et al. | A study on virtual studio application using Microsoft HoloLens
JP2010267192A | Touch control device for three-dimensional imaging
Legal Events
Date | Code | Title | Description
---|---|---|---
| PB01 | Publication | |
| WD01 | Invention patent application deemed withdrawn after publication | Application publication date: 20180518 |