CN106444023A - Super-large field angle binocular stereoscopic display transmission type augmented reality system - Google Patents
- Publication number
- CN106444023A CN106444023A CN201610755512.8A CN201610755512A CN106444023A CN 106444023 A CN106444023 A CN 106444023A CN 201610755512 A CN201610755512 A CN 201610755512A CN 106444023 A CN106444023 A CN 106444023A
- Authority
- CN
- China
- Prior art keywords
- lens
- augmented reality
- module
- display screen
- depth
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
- G02B27/0172—Head mounted characterised by optical features
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B30/00—Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images
- G02B30/20—Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes
- G02B30/34—Stereoscopes providing a stereoscopic pair of separated images corresponding to parallactically displaced views of the same object, e.g. 3D slide viewers
- G02B30/36—Stereoscopes providing a stereoscopic pair of separated images corresponding to parallactically displaced views of the same object, e.g. 3D slide viewers using refractive optical elements, e.g. prisms, in the optical path between the images and the observer
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/015—Input arrangements based on nervous system activity detection, e.g. brain waves [EEG] detection, electromyograms [EMG] detection, electrodermal response detection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/16—Sound input; Sound output
- G06F3/167—Audio in a user interface, e.g. using voice commands for navigating, audio feedback
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/006—Mixed reality
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/0123—Head-up displays characterised by optical features comprising devices increasing the field of view
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/0132—Head-up displays characterised by optical features comprising binocular systems
- G02B2027/0134—Head-up displays characterised by optical features comprising binocular systems of stereoscopic type
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10004—Still image; Photographic image
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- General Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- Human Computer Interaction (AREA)
- General Health & Medical Sciences (AREA)
- Health & Medical Sciences (AREA)
- Optics & Photonics (AREA)
- Audiology, Speech & Language Pathology (AREA)
- Multimedia (AREA)
- Biomedical Technology (AREA)
- Dermatology (AREA)
- Neurology (AREA)
- Neurosurgery (AREA)
- Computer Graphics (AREA)
- Computer Hardware Design (AREA)
- Software Systems (AREA)
Abstract
The invention provides a transmission-type augmented reality system with binocular stereoscopic display and a super-large field of view. The augmented reality system includes a head-mounted device comprising a circuit system, a display screen for showing images, two lenses whose exit pupil positions coincide with the positions of the human eye pupils, a beam splitter, and a spatial depth sensing device arranged at the front end of the circuit system; the display screen is arranged above the two lenses and the beam splitter below them. Using augmented reality technology, the system superimposes virtual information on the real scene. Through the combination of display screen, lenses, and beam splitter it forms a super-large field of view and realizes three-dimensional stereoscopic display, meeting the actual viewing needs of the human visual angle and enhancing the user's sensory experience. It can be widely applied in military, medical, construction, education, engineering, film and television, entertainment, and other fields.
Description
Technical field
The invention belongs to the technical field of multimedia applications, and in particular relates to a transmission-type augmented reality system with binocular stereoscopic display and a super-large field of view.
Background art
Augmented reality is a technology that "seamlessly" integrates real-world information with virtual-world information: information that is difficult to experience within a certain time and space of the real world (visual, auditory, gustatory, tactile, etc.) is simulated by computer technology and then superimposed back onto the real world to be perceived by the human senses, achieving a sensory experience beyond reality. Augmented reality can be widely applied in fields such as military, medicine, construction, education, engineering, film and television, and entertainment.
An augmented reality head-mounted system is an application of augmented reality in the field of wearable computing. It has a see-through near-eye display system, an independent operating system, and powerful image-processing capability, so the user can see both the real scene and the virtual information generated by the computer. When using such a system, the user's hand acts like a computer cursor: the system realizes human-computer interaction by recognizing the user's gestures, and the user can operate the displayed interface by changing gestures to achieve effects such as clicking, double-clicking, zooming, and sliding.
Existing augmented reality systems have a relatively small field of view. For example, patent publication CN105425395A discloses large-field-of-view augmented reality glasses; the glasses provided in that technology can realize augmented reality display, but their field of view still cannot meet user demands. In addition, patent publication CN103996322A discloses a welding operation training simulation method based on augmented reality; the glasses provided in that method are monocular augmented reality glasses, which cannot let the user receive images with stereoscopic parallax through the left and right eyes in real time, so the user cannot experience a real stereoscopic effect and the actual viewing needs of the human visual angle cannot be met. There is therefore an urgent need to develop an augmented reality system with a super-large field of view that realizes three-dimensional stereoscopic display.
Summary of the invention
To solve the problems that the field of view of existing augmented reality systems is small, and that existing monocular-display augmented reality glasses cannot let the user receive images with stereoscopic parallax through both eyes in real time, so that the user cannot experience a real stereoscopic effect and the actual viewing needs of the human visual angle cannot be met, the invention provides a transmission-type augmented reality system with binocular stereoscopic display and a super-large field of view.
The specific technical scheme of the invention is as follows:
The invention provides a transmission-type augmented reality system with binocular stereoscopic display and a super-large field of view, including a head-mounted device. The head-mounted device includes a circuit system, a display screen for showing images, two lenses whose exit pupil positions coincide with the positions of the human eye pupils, a beam splitter, and a spatial depth sensing device arranged at the front end of the circuit system; the display screen is located above the two lenses, and the beam splitter below them.
The spatial depth sensing device captures, in real time, spatial pictures of the real scene in front of the head-mounted device and sends them to the circuit system.
The circuit system analyzes each received spatial picture to obtain its spatial information, computes from that information the depth of every point in the real scene, and, according to the depth information, superimposes the three-dimensional imagery stored in the circuit system onto the real scene to form an augmented reality image. The spatial information is the three-dimensional coordinates of the real scene, and the depth information is the Z-axis coordinate corresponding to every point in the real scene.
Further, the angle between the display screen and the optical axis of the lenses is 0-180°, and the angle between the beam splitter and the lens optical axis is 1-90°; the spacing between the lenses and the display screen is less than one focal length of the lenses.
Preferably, the angle between the display screen and the lens optical axis is 90°.
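The condition that the lens-to-screen spacing be less than one focal length is what makes the display appear as a magnified, distant virtual image rather than a real one. A minimal thin-lens sketch of that effect, with illustrative numbers not taken from the patent:

```python
def virtual_image_distance(f_mm: float, d_obj_mm: float) -> float:
    """Thin-lens equation 1/v - 1/u = 1/f, with the object at u = -d
    in front of the lens. A negative v means a virtual image on the
    same side as the object, which is what the eye observes here."""
    u = -d_obj_mm
    return 1.0 / (1.0 / f_mm + 1.0 / u)

# Display placed inside one focal length: f = 50 mm, spacing = 42 mm.
v = virtual_image_distance(50.0, 42.0)
m = v / (-42.0)  # lateral magnification v/u
print(v)  # ~ -262.5 -> virtual image about 262.5 mm away
print(m)  # ~ 6.25   -> magnified about 6.25x
```

Note that with these example values the virtual image sits beyond the 25 cm distance of distinct vision, consistent with the comfortable-viewing claim made later in the description.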
Further, each lens is formed by the superposition of at least two lens elements.
Further, the circuit system includes a storage module and, communicating with it, a processing module, a computing module, a split-screen display module, and an image output module; the processing module communicates with the computing module, and both the computing module and the image output module communicate with the split-screen display module.
The storage module stores three-dimensional imagery.
The processing module receives the spatial pictures sent by the spatial depth sensing device, performs image analysis on them to obtain the spatial information in each picture, extracts from that information the depth value of every pixel of every picture to obtain the depth information, and sends it to the computing module.
The computing module calculates from the depth information the position of the three-dimensional imagery in the real scene and superimposes the imagery from the storage module onto the real scene according to that position information; the position information is the coordinates at which the three-dimensional imagery is superimposed in the real scene.
The split-screen display module splits the imagery processed by the computing module into a left-eye image and a right-eye image and sends them to the image output module, which outputs them for display on the display screen. The light emitted by the display screen passes through the two lenses and is then reflected by the beam splitter into the human eyes, so that the user observes the augmented reality image through this transmission-type augmented reality system.
Further, the circuit system also includes a speech recognition module and a brain-wave recognition module communicating with the processing module. The speech recognition module reads voice information and converts it into speech control instructions to realize interaction; the brain-wave recognition module collects the user's EEG signals and forms control instructions after recognizing them, thereby realizing interaction.
The head-mounted device is also provided with sensing equipment communicating with the circuit system; the sensing equipment is a combination of one or more of an accelerometer, a compass, a gyroscope, and GPS. The sensing equipment collects the user's posture information and sends it to the circuit system, which recognizes it and forms control instructions, realizing body-sensing interaction. The posture information includes the position of the person and the motion tracks in space of the head, legs, feet, torso, hands, or arms.
Further, the spatial depth sensing device is a combination of one or more of a first depth sensor, a second depth sensor, and a third depth sensor. The first depth sensor consists of an infrared LED lamp and an infrared camera; the second depth sensor consists of a structured-light device, an infrared camera, and an RGB camera; the third depth sensor is an RGB binocular camera.
The spatial depth sensing device also collects the user's gesture-motion information and sends it to the circuit system, which recognizes it and forms control instructions, realizing interaction. The gesture-motion information includes finger position, palm center position, finger rotation angle, finger motion angle, and finger motion direction.
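For the third depth sensor, the RGB binocular camera, depth is conventionally recovered from the disparity between the two rectified views via Z = f·B/d. A hedged sketch of that relation; the calibration numbers are invented for illustration, since the patent gives none:

```python
def depth_from_disparity(focal_px: float, baseline_m: float,
                         disparity_px: float) -> float:
    """Classic rectified-stereo relation Z = f * B / d."""
    if disparity_px <= 0.0:
        return float("inf")  # unmatched point or point at infinity
    return focal_px * baseline_m / disparity_px

# Illustrative: 700 px focal length, 6 cm baseline, 35 px disparity.
print(depth_from_disparity(700.0, 0.06, 35.0))  # 1.2 (metres)
```

Larger disparity means a nearer point, which is why nearby hand gestures are comparatively easy for such a sensor to range.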
Further, the head-mounted device also includes an interpupillary-distance adjuster for adjusting the horizontal distance between the two lenses and a spacing adjuster for adjusting the distance between the lenses and the display screen.
Further, the interpupillary-distance adjuster includes a crossbar slidably connected to one end of each of the two lenses; a first slide hole is arranged transversely in the crossbar, and two first sliders that can slide along the first slide hole are arranged in it, the ends of the two lenses being connected to the two first sliders respectively.
The spacing adjuster includes two vertical bars slidably connected to the left and right ends of the crossbar; a second slide hole is arranged longitudinally in each vertical bar, and a second slider that can slide along the second slide hole is arranged in each, the left and right ends of the crossbar being connected to the two second sliders respectively.
Preferably, the number of display screens is one or two, and the number of beam splitters is one or two; the reflectance of the beam splitter is 1-100% and its transmittance is 0-99%. The lenses are a superposition of one or more of spherical lenses, aspheric lenses, Fresnel lenses, free-form-surface lenses, or achromatic doublets.
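The reflectance and transmittance ranges above set the brightness budget of a see-through display: the eye receives the screen attenuated by the splitter's reflectance R plus the real world attenuated by its transmittance T, with R + T ≤ 1 for a lossless element. An illustrative sketch with hypothetical luminance values:

```python
def eye_luminance(display_nits: float, world_nits: float,
                  reflectance: float, transmittance: float) -> float:
    """Luminance reaching the eye: display reflected off the beam
    splitter (factor R) plus the real scene seen through it (factor T)."""
    assert 0.0 < reflectance <= 1.0 and 0.0 <= transmittance < 1.0
    assert reflectance + transmittance <= 1.0
    return display_nits * reflectance + world_nits * transmittance

# 50/50 splitter, 400-nit display over a 200-nit real scene.
print(eye_luminance(400.0, 200.0, 0.5, 0.5))  # 300.0
```

The trade-off is direct: raising R makes the virtual image brighter but dims the real scene, which is presumably why the patent leaves both values as wide ranges.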
Preferably, the head-mounted device also includes a frame; the circuit system, the display screen, the two lenses, and the beam splitter are all mounted on the frame, and a microphone and a loudspeaker connected to the circuit system are arranged on a side of the frame.
The beneficial effects of the invention are as follows. The transmission-type augmented reality system with binocular stereoscopic display and a super-large field of view provided by the invention can superimpose virtual information on the real scene using augmented reality technology. Through the specific combination of display screen, lenses, and beam splitter and the optimized design of the lenses, it forms a super-large field of view; in addition, binocular stereoscopic display is realized through the split screen of the left and right viewing angles, which satisfies the actual viewing needs of the human visual angle, enhances the user's sensory experience, and increases interaction with the user, letting the user participate in the presentation and achieving a good display effect.
Brief description
Fig. 1 is a structural diagram of the head-mounted device in the transmission-type augmented reality system with binocular stereoscopic display and super-large field of view described in embodiment 1;
Fig. 2 is a side view of the head-mounted device described in embodiment 2;
Fig. 3 is a structural block diagram of the circuit system of the head-mounted device described in embodiment 3;
Fig. 4 is a structural block diagram of the circuit system of the head-mounted device described in embodiment 4;
Fig. 5 is a structural diagram of the head-mounted device described in embodiment 6;
Fig. 6 is an enlarged view of part A in Fig. 5;
Fig. 7 is structural diagram one of the head-mounted device described in embodiment 8;
Fig. 8 is structural diagram two of the head-mounted device described in embodiment 8;
Fig. 9 is structural diagram three of the head-mounted device described in embodiment 8;
Fig. 10 is structural diagram four of the head-mounted device described in embodiment 8;
Fig. 11 is a perspective view of the head-mounted device described in embodiment 8.
Reference numerals: 1, circuit system; 101, storage module; 102, processing module; 103, computing module; 104, split-screen display module; 105, image output module; 106, speech recognition module; 107, brain-wave recognition module; 2, display screen; 3, lens; 4, beam splitter; 5, spatial depth sensing device; 6, sensing equipment; 7, crossbar; 8, first slide hole; 9, first slider; 10, vertical bar; 11, second slide hole; 12, second slider; 13, frame; 14, microphone.
Specific embodiment
The invention is described in further detail below with reference to the accompanying drawings and the following embodiments.
Embodiment 1
As shown in Fig. 1, embodiment 1 of the invention provides a transmission-type augmented reality system with binocular stereoscopic display and a super-large field of view, including a head-mounted device that can be designed as a helmet or as glasses. The head-mounted device includes a circuit system 1, a display screen 2 for showing images, two lenses 3 whose exit pupil positions coincide with the positions of the human eye pupils, a beam splitter 4, and a spatial depth sensing device 5 arranged at the front end of the circuit system 1. The display screen 2 is located above the two lenses 3, and the beam splitter 4 below them, at an angle to the lenses 3. The circuit system 1 is located above the display screen 2; of course, it can also be placed at any other position that does not block the light path, as long as data processing and transmission can be realized. It should be noted that the display screen 2 may be one screen or a combination of two; each lens 3 may consist of one lens element or a superposition of several; and the beam splitter 4 may be one element or a combination of two.
The spatial depth sensing device 5 captures, in real time, spatial pictures of the real scene in front of the head-mounted device and sends them to the circuit system 1; each spatial picture contains the spatial information of the surroundings. Preferably, the spatial depth sensing device 5 is located at the very front of the system; it can also be located elsewhere, but it must face the front of the head-mounted device at all times during use, so that it can capture the real scene in front of the device in real time.
The circuit system 1 analyzes each received spatial picture to obtain its spatial information, computes from that information the depth of every point in the real scene, and, according to the depth information, superimposes the three-dimensional imagery stored in the circuit system 1 onto the real scene to form the augmented reality image. The spatial information is the three-dimensional coordinates of the real scene, and the depth information is the Z-axis coordinate corresponding to every point in the real scene.
The circuit system 1 may carry an independent processor with an independent operating system, able to complete signal processing and display on its own; it may also be a signal conversion circuit that transfers the picture of a computer, mobile phone, tablet, or the like onto the display screen 2 of this system.
In actual use, the spatial depth sensing device 5 captures in real time the spatial information in front of the head-mounted device; the circuit system 1 receives this information and, after computation, obtains the depth of every point in the space. According to this depth information, the three-dimensional virtual information is accurately superimposed onto the real scene, realizing augmented reality display.
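"Accurately superimposed according to the depth information" implies, at minimum, a per-pixel depth comparison so that nearer real objects can occlude the virtual one. A toy sketch of that idea, with values invented for illustration rather than taken from the patent:

```python
import numpy as np

scene_depth = np.array([[1.0, 1.0],
                        [0.5, 3.0]])     # real-scene Z per pixel, metres
virtual_depth = 0.8                      # plane of the virtual object
virtual_mask = np.array([[True, False],
                         [True, True]])  # object's pixel footprint

# Draw a virtual pixel only where it is nearer than the real scene.
visible = virtual_mask & (virtual_depth < scene_depth)
print(visible.tolist())  # [[True, False], [False, True]]
```

At pixel (1, 0) the real scene is at 0.5 m, in front of the 0.8 m virtual plane, so the virtual pixel is suppressed there and the real object correctly occludes it.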
It should be noted that the display screen 2, the two lenses 3, and the beam splitter 4 are distributed in sequence along the directions of the optical axes of the left and right lenses 3. The light emitted by the left and right halves of the display screen 2 propagates through the left and right lenses 3 respectively and is then reflected by the beam splitter 4 into the human eyes, forming a magnified virtual image in front of the eyes; the observation position of the eyes is behind the beam splitter 4. Because the virtual information seen by the left and right eyes carries stereoscopic parallax, what the eyes experience is three-dimensional virtual information.
In the system provided by the invention, the structural composition of the head-mounted device realizes a super-large field of view, and the circuit system 1 can accurately superimpose three-dimensional virtual information onto the real scene according to the depth information of the scene, thereby realizing augmented reality display. The system is simple in design and relatively low in production cost.
The image plane of the system lies beyond the 25 cm distance of distinct vision of the human eye, so the user can comfortably watch the content displayed by the system without damage to the eyes.
Embodiment 2
As shown in Fig. 2 the embodiment of the present invention 2 defines described display screen 2 and described lens 3 on the basis of embodiment 1
Optical axis have a certain degree it is preferred that described display screen 2 is 0-180 ° with the angle α of described lens 3 optical axis, described beam splitting
Mirror 4 is angled with the optical axis of described lens 3 it is preferred that described beam splitter 4 is 1- with the angle β of described lens 3 optical axis
90°;Described lens 3 and one times of focal length being smaller than described lens 3 of described display screen 2.
Preferably, described display screen 2 and the angle of described lens 3 optical axis are 90 °.
The observation position of the eyes is behind the beam splitter 4. The left- and right-eye images with stereoscopic parallax shown by the display screen 2 are refracted by the left and right lenses 3 respectively and then reflected by the beam splitter 4 into the eyes, becoming a magnified virtual image in front of them. Because the virtual images seen by the two eyes carry stereoscopic parallax, stereoscopic vision is formed: the eyes can see both the real objects of the outside world and the three-dimensional virtual objects displayed by the system.
It should further be noted that each lens 3 is formed by the superposition of at least two lens elements. In this technical scheme, through the particular design of the lenses 3, such as a combination of aspheric and spherical surfaces, or a lens 3 designed with a free-form surface, the maximum achievable horizontal field of view can exceed 100°; the invention thus achieves three-dimensional imaging with a very large field of view.
Preferably, the lenses 3 are biconvex singlets with one aspheric surface rather than two; such a design reduces processing cost, while precise optimized design of the surface profile ensures that the lenses 3 have outstanding image quality.
Embodiment 3
Embodiment 3 of the invention, on the basis of embodiment 1, further defines the structure of the circuit system 1, thereby realizing augmented reality.
As shown in Fig. 3, the circuit system 1 includes a storage module 101 and, communicating with it, a processing module 102, a computing module 103, a split-screen display module 104, and an image output module 105; the processing module 102 communicates with the computing module 103, and both the computing module 103 and the image output module 105 communicate with the split-screen display module 104.
The storage module 101 stores three-dimensional imagery.
The processing module 102 receives the spatial pictures sent by the spatial depth sensing device 5, performs image analysis on them to obtain the spatial information in each picture, extracts from that information the depth value of every pixel of every picture to obtain the depth information, and sends it to the computing module 103.
The computing module 103 calculates from the depth information the position of the three-dimensional imagery in the real scene and places the imagery from the storage module 101 into the real scene according to that position information; the position information is the coordinates at which the imagery is placed in the real scene.
The split-screen display module 104 splits the imagery processed by the computing module 103 into a left-eye image and a right-eye image and sends them to the image output module 105, which outputs them for display on the display screen 2. The light emitted by the display screen 2 passes through the two lenses 3 and is then reflected by the beam splitter 4 into the human eyes, and the user observes the augmented reality image through this transmission-type system.
The split-screen display module 104 is mainly used for split-screen processing of the augmented reality image. If there are two display screens 2, the left one shows the left-eye image and the right one the right-eye image; if there is one display screen 2, its left side shows the left-eye image and its right side the right-eye image. When one beam splitter 4 is used, its left region reflects the left-eye image and its right region the right-eye image; when two beam splitters 4 are used, the left one reflects the left-eye image and the right one the right-eye image. Because the left- and right-eye images shown on the screen carry stereoscopic parallax, the cooperation of the two eyes achieves stereoscopic vision.
Embodiment 4
As shown in Figure 4, embodiment 4 of the present invention further defines, on the basis of embodiment 3, that the circuit system 1 also includes a speech recognition module 106 and a brain wave recognition module 107, both communicating with the processing module 102. The speech recognition module 106 is used to read voice information and convert it into speech control instructions, so that in use the user can interact with the system simply by issuing voice commands. The brain wave recognition module 107 is used to collect the user's EEG signals and form control instructions after identifying those signals, so that interaction with the system can also be realized through brain waves.
It should further be emphasized that the head-mounted device is additionally provided with several sensing devices 6 communicating with the circuit system 1. Each sensing device 6 is one or a combination of an accelerometer, a compass, a gyroscope, and a GPS. The sensing devices 6 are used to collect the user's posture information and send it to the circuit system 1; the circuit system 1 forms control instructions after identifying the posture information, thereby realizing motion-sensing interaction. The posture information includes the position of the person and the motion track in space of the head, legs, feet, trunk, hands, or arms of the human body.
The motion track includes the displacement changes and angle changes in space of the head, legs, feet, trunk, hands, or arms. The displacement change includes at least one of the following: forward/backward movement, up/down displacement, or left/right displacement; the angle change includes at least one of the following: left/right horizontal rotation, up/down rotation, or lateral rotation.
Sensing devices 6 such as accelerometers, compasses, gyroscopes, and GPS can effectively collect the user's action information and transmit it to the circuit system 1 for processing, which determines the user's action and posture, so that the user can interact with the system through the sensing devices 6 of the system.
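The classification of posture information into the displacement and angle changes enumerated above can be sketched as follows. The threshold values, axis conventions, and function names are illustrative assumptions, not specified by the patent:

```python
def classify_motion(dx, dy, dz, yaw, pitch, roll, thresh=0.1):
    """Map raw displacement deltas (meters) and rotation deltas (radians)
    from the sensing devices to the motion categories listed above:
    forward/backward, up/down, left/right displacement, and horizontal,
    vertical, or lateral rotation.
    """
    events = []
    if abs(dz) > thresh:
        events.append("forward/backward movement")
    if abs(dy) > thresh:
        events.append("up/down displacement")
    if abs(dx) > thresh:
        events.append("left/right displacement")
    if abs(yaw) > thresh:
        events.append("horizontal rotation")
    if abs(pitch) > thresh:
        events.append("up/down rotation")
    if abs(roll) > thresh:
        events.append("lateral rotation")
    return events

# A step forward with a slight head turn yields two events:
detected = classify_motion(0.0, 0.0, 0.5, 0.2, 0.0, 0.0)
```

Each detected event would then be translated by the circuit system into a concrete control instruction for the running application.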
Embodiment 5
Embodiment 5 further defines, on the basis of embodiment 1, that the spatial depth perception device 5 is one or a combination of a first depth sensor, a second depth sensor, and a third depth sensor. The first depth sensor consists of an infrared LED lamp and an infrared camera; the second depth sensor consists of a structured-light device, an infrared camera, and an RGB camera; the third depth sensor is an RGB binocular camera.
The spatial depth perception device 5 in the present invention may therefore be a depth perception system based on the infrared-LED-and-infrared-camera principle, a depth perception system based on a structured-light device, an infrared camera, and an RGB camera, a depth perception system based on an RGB binocular camera, or a depth perception system based on time of flight. Of course, to improve measurement precision, several of these perception systems may also be combined to form the spatial depth perception device 5.
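For the RGB binocular (stereo) camera variant, depth is classically recovered by triangulation from pixel disparity between the two views. A minimal sketch, with an assumed calibration (focal length in pixels and camera baseline) that the patent does not specify:

```python
def depth_from_disparity(disparity_px, focal_px, baseline_m):
    """Classic stereo triangulation: Z = f * B / d, where d is the pixel
    disparity between the left and right views, f the focal length in
    pixels, and B the distance between the two cameras in meters.
    """
    if disparity_px <= 0:
        # No measurable parallax: the point is effectively at infinity.
        return float("inf")
    return focal_px * baseline_m / disparity_px

# A scene point with 20 px disparity, seen by a rig with a 700 px focal
# length and a 6 cm baseline, lies about 2.1 m away:
z = depth_from_disparity(20, 700.0, 0.06)
```

Applying this per pixel over a rectified image pair yields the per-point depth information (Z-axis coordinate) that the circuit system uses to place the three-dimensional image in the reality scene.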
The spatial depth perception device 5 is also used to collect the user's gesture-action information and send it to the circuit system 1; the circuit system 1 forms control instructions after identifying the gesture-action information, thereby realizing interaction. The gesture-action information includes finger positions, palm center position, finger rotation angle, finger motion angle, and finger motion direction. By collecting gesture-action information, the head-mounted device can treat the user's hand like a computer cursor: it realizes human-computer interaction by recognizing the user's gestures, and by changing gestures the user can operate the interface displayed by the device, achieving effects such as click, double-click, zoom, and slide. This effectively realizes natural interaction between the user and the system.
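The hand-as-cursor mapping described above can be sketched as a small dispatcher from tracked hand state to UI actions. The pinch-to-click gesture, the thresholds, and the coordinate conventions are illustrative assumptions, not mandated by the patent:

```python
def gesture_to_action(fingertip, pinch_dist_m, click_thresh=0.03):
    """Map one frame of tracked hand state to a UI action.

    fingertip: (x, y) index-finger position in normalized screen
    coordinates; pinch_dist_m: thumb-to-index distance in meters.
    A pinch below the threshold is interpreted as a click at the
    fingertip position; otherwise the cursor simply follows the finger.
    """
    if pinch_dist_m < click_thresh:
        return ("click", fingertip)
    return ("move", fingertip)

# Pinched fingers at screen center register as a click there:
action = gesture_to_action((0.5, 0.5), 0.01)
```

Double-click, zoom, and slide would be recognized analogously from short sequences of such per-frame states (two clicks in quick succession, two-finger separation change, and fingertip drag, respectively).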
Embodiment 6
As shown in Figure 5, so that users with different interpupillary distances can use the system, this technical scheme defines that the head-mounted device also includes an interpupillary-distance adjusting part for adjusting the horizontal distance between the two lenses 3; and, in order to adjust the distance and size of the system's image, it defines that the head-mounted device also includes a spacing adjusting part for adjusting the distance between the lenses 3 and the display screen 2.
As shown in Figure 5, to adapt the head-mounted device to the exit pupil distance of different users, it should be noted that the interpupillary-distance adjusting part includes a cross bar 7 slidably connected to one end of each of the two lenses 3. The cross bar 7 is transversely provided with a first slide hole 8, in which two first sliders 9 can slide along the direction of the hole; the ends of the two lenses 3 are connected to the two first sliders 9 respectively. The cross bar 7 primarily supports the two lenses 3; because each first slider 9 slides in the first slide hole 8 opened in the cross bar 7, the two lenses 3 can be slid toward or away from each other, thereby adjusting the interpupillary distance. Of course, besides sliding the lenses 3, the distance between the two lenses 3 may also be adjusted by pushing them in other ways; the present invention does not limit these one by one, as long as the distance between the two lenses 3 can be adjusted.
As shown in Figure 6, in order to adjust the distance between the lenses 3 and the display screen 2, this technical scheme defines that the spacing adjusting part includes two vertical poles 10 slidably connected to the left and right ends of the cross bar 7. Each vertical pole 10 is longitudinally provided with a second slide hole 11, in which a second slider 12 can slide along the direction of the hole; the left and right ends of the cross bar 7 are connected to the two second sliders 12 respectively. The two ends of the display screen 2 are connected to the cross bar 7 through the vertical poles 10, and the cross bar 7 can slide longitudinally in the second slide holes 11 through the second sliders 12, thereby realizing adjustment of the distance between the lenses 3 and the display screen 2. This structure is simple in design. Of course, the adjustment of the distance between the lenses 3 and the display screen 2 can also be accomplished in other ways, which are not limited here one by one; any structure that can adjust the distance between the lenses 3 and the display screen 2 falls within the protection scope of the present invention.
The image plane of this head-mounted device is located beyond the human eye's distance of distinct vision of 25 cm, so the user can view comfortably without harm to the eyes.
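With the display placed within one focal length of the lens (as claim 2 specifies), the lens forms a magnified virtual image; the thin-lens equation gives its distance, which the design above requires to exceed 25 cm. A sketch with illustrative numbers (the patent quotes no specific focal length or spacing):

```python
def virtual_image_distance(object_dist_mm, focal_mm):
    """Thin-lens equation 1/v = 1/f - 1/u (real-is-positive convention).
    With the display at u < f, v comes out negative: a virtual image at
    |v| in front of the lens, which is where the eye perceives the picture.
    """
    v = 1.0 / (1.0 / focal_mm - 1.0 / object_dist_mm)
    return abs(v)

# Display 42 mm behind a 50 mm focal-length lens: virtual image at 262.5 mm,
# safely beyond the 25 cm (250 mm) near point of distinct vision.
d = virtual_image_distance(42.0, 50.0)
comfortable = d > 250.0
```

Moving the display closer to the focal plane via the spacing adjusting part pushes the virtual image farther away, which is how the image distance and apparent size are tuned.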
Embodiment 7
In the transmission-type augmented reality system with super-large field-of-view binocular stereoscopic display provided by embodiment 7 of the present invention, the light emitted by the display screen 2 passes through the lenses 3 and is then reflected by the beam splitter 4 into the human eyes; the exit pupil position of each lens 3 coincides with the pupil position of the corresponding eye. Because the eye is at a certain distance from the beam splitter 4 and the lens 3, the exit pupil distance should be designed relatively large: preferably more than 10 mm, and most preferably between 30 and 60 mm. So that the image on the display screen 2 remains visible when the eye rotates or moves slightly, the exit pupil diameter is preferably designed to be more than 6 mm.
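For a simple magnifier-type eyepiece like the one described here, the apparent field of view follows directly from the display width and the lens focal length. A sketch of that estimate, with assumed dimensions (the patent gives no specific values behind its "super-large field angle" claim):

```python
import math

def apparent_fov_deg(display_width_mm, focal_mm):
    """Approximate monocular field of view of a magnifier eyepiece: a
    display of width w viewed through a lens of focal length f subtends
    roughly 2 * atan(w / (2 * f)) at the eye.
    """
    return math.degrees(2.0 * math.atan(display_width_mm / (2.0 * focal_mm)))

# A 60 mm-wide display panel behind a 35 mm focal-length lens yields a
# field of view of roughly 81 degrees per eye.
fov = apparent_fov_deg(60.0, 35.0)
```

This shows the basic design lever: a shorter focal length or wider panel enlarges the field of view, at the cost of tighter constraints on eye relief and exit pupil size.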
Embodiment 8
Preferably, the quantity of display screens 2 is one or two.
The quantity of beam splitters 4 is one or two; the reflectance of the beam splitter 4 is 1-100%, and the transmittance of the beam splitter 4 is 0-99%.
In this technical scheme, each lens 3 is preferably a superposition of one or more of a spherical lens, an aspherical lens, a Fresnel lens, a free-form-surface lens, or an achromatic doublet.
As shown in Figure 7, this head-mounted device includes one display screen 2, two lenses 3, and two beam splitters 4; as shown in Figure 8, it includes two display screens 2, two lenses 3, and two beam splitters 4; as shown in Figure 9, it includes one display screen 2, two lenses 3, and one beam splitter 4; as shown in Figure 10, it includes two display screens 2, two lenses 3, and one beam splitter 4.
As shown in Figure 11, preferably, the head-mounted device also includes a frame 13; the circuit system 1, the display screen 2, the two lenses 3, and the beam splitter 4 are all mounted on the frame 13, and one side of the frame 13 is provided with a microphone 14 and a speaker connected to the circuit system 1.
The present invention is not limited to the above preferred embodiments; anyone may derive products of other various forms under the enlightenment of the present invention. However, regardless of any change in shape or structure, every technical scheme identical or similar to that of the present application falls within the protection scope of the present invention.
Claims (10)
1. A transmission-type augmented reality system with super-large field-of-view binocular stereoscopic display, characterized in that it comprises a head-mounted device, the head-mounted device comprising a circuit system (1), a display screen (2) for displaying pictures, two lenses (3) whose exit pupil positions coincide with the pupil positions of the human eyes, a beam splitter (4), and a spatial depth perception device (5) arranged at the front end of the circuit system (1); the display screen (2) is located above the two lenses (3), and the beam splitter (4) is located below the two lenses (3);
the spatial depth perception device (5) is used to capture in real time a space picture of the reality scene in front of the head-mounted device, and to send the space picture to the circuit system (1);
the circuit system (1) analyzes the received space picture to obtain spatial information in the space picture, calculates the depth information of every point in the reality scene according to the spatial information, and superimposes the three-dimensional image stored in the circuit system (1) onto the reality scene according to the depth information to form an augmented reality image; the spatial information is the three-dimensional coordinates of the reality scene, and the depth information is the Z-axis coordinate corresponding to the position of every point in the reality scene.
2. The transmission-type augmented reality system with super-large field-of-view binocular stereoscopic display as claimed in claim 1, characterized in that the angle between the display screen (2) and the optical axis of the lenses (3) is 0-180°, and the angle between the beam splitter (4) and the optical axis of the lenses (3) is 1-90°; the spacing between the lenses (3) and the display screen (2) is smaller than one focal length of the lenses (3); preferably, the angle between the display screen (2) and the optical axis of the lenses (3) is 90°.
3. The transmission-type augmented reality system with super-large field-of-view binocular stereoscopic display as claimed in claim 1, characterized in that each lens (3) is composed of a superposition of at least two lens elements.
4. The transmission-type augmented reality system with super-large field-of-view binocular stereoscopic display as claimed in claim 1, characterized in that the circuit system (1) includes a memory module (101) and, communicating with the memory module (101), a processing module (102), a computing module (103), a split-screen display module (104), and an image output module (105); the processing module (102) communicates with the computing module (103), and the computing module (103) and the image output module (105) both communicate with the split-screen display module (104);
the memory module (101) stores a three-dimensional image;
the processing module (102) is used to receive the space picture sent by the spatial depth perception device (5), perform image analysis on the space picture to obtain the spatial information in the space picture, extract the depth value of each pixel in every picture according to the spatial information to obtain the depth information, and send the depth information to the computing module (103);
the computing module (103) calculates, according to the depth information, positional information of the three-dimensional image in the reality scene, and superimposes the three-dimensional image in the memory module (101) onto the reality scene according to the positional information; the positional information is the position coordinates at which the three-dimensional image is superimposed in the reality scene;
the split-screen display module (104) performs split-screen processing on the three-dimensional image processed by the computing module (103) to obtain a left-eye image and a right-eye image, and sends both images to the image output module (105); the image output module (105) outputs the left-eye image and the right-eye image to the display screen (2) for display, and the light emitted by the display screen (2) passes through the two lenses (3) and enters the human eyes after reflection by the beam splitter (4).
5. The transmission-type augmented reality system with super-large field-of-view binocular stereoscopic display as claimed in claim 4, characterized in that the circuit system (1) also includes a speech recognition module (106) and a brain wave recognition module (107), both communicating with the processing module (102); the speech recognition module (106) is used to read voice information and convert it into speech control instructions to realize interaction; the brain wave recognition module (107) is used to collect the user's EEG signals and form control instructions after identifying those signals, thereby realizing interaction;
the head-mounted device is additionally provided with several sensing devices (6) communicating with the circuit system (1); each sensing device (6) is one or a combination of an accelerometer, a compass, a gyroscope, and a GPS; the sensing devices (6) are used to collect the user's posture information and send it to the circuit system (1), and the circuit system (1) forms control instructions after identifying the posture information, thereby realizing motion-sensing interaction; the posture information includes the position of the person and the motion track in space of the head, legs, feet, trunk, hands, or arms of the human body.
6. The transmission-type augmented reality system with super-large field-of-view binocular stereoscopic display as claimed in claim 1, characterized in that the spatial depth perception device (5) is one or a combination of a first depth sensor, a second depth sensor, and a third depth sensor; the first depth sensor consists of an infrared LED lamp and an infrared camera, the second depth sensor consists of a structured-light device, an infrared camera, and an RGB camera, and the third depth sensor is an RGB binocular camera;
the spatial depth perception device (5) is also used to collect the user's gesture-action information and send it to the circuit system (1), and the circuit system (1) forms control instructions after identifying the gesture-action information, thereby realizing interaction; the gesture-action information includes finger positions, palm center position, finger rotation angle, finger motion angle, and finger motion direction.
7. The transmission-type augmented reality system with super-large field-of-view binocular stereoscopic display as claimed in claim 1, characterized in that the head-mounted device also includes an interpupillary-distance adjusting part for adjusting the horizontal distance between the two lenses (3), and a spacing adjusting part for adjusting the distance between the lenses (3) and the display screen (2).
8. The transmission-type augmented reality system with super-large field-of-view binocular stereoscopic display as claimed in claim 7, characterized in that the interpupillary-distance adjusting part includes a cross bar (7) slidably connected to one end of each of the two lenses (3); the cross bar (7) is transversely provided with a first slide hole (8), in which two first sliders (9) can slide along the direction of the hole, and the ends of the two lenses (3) are connected to the two first sliders (9) respectively;
the spacing adjusting part includes two vertical poles (10) slidably connected to the left and right ends of the cross bar (7); each vertical pole (10) is longitudinally provided with a second slide hole (11), in which a second slider (12) can slide along the direction of the hole, and the left and right ends of the cross bar (7) are connected to the two second sliders (12) respectively.
9. The transmission-type augmented reality system with super-large field-of-view binocular stereoscopic display as claimed in claim 1, characterized in that the quantity of display screens (2) is one or two, the quantity of beam splitters (4) is one or two, the reflectance of the beam splitter (4) is 1-100%, and the transmittance of the beam splitter (4) is 0-99%; each lens (3) is a superposition of one or more of a spherical lens, an aspherical lens, a Fresnel lens, a free-form-surface lens, or an achromatic doublet.
10. The transmission-type augmented reality system with super-large field-of-view binocular stereoscopic display as claimed in claim 5, characterized in that the head-mounted device also includes a frame (13); the circuit system (1), the display screen (2), the two lenses (3), and the beam splitter (4) are mounted on the frame (13), and one side of the frame (13) is provided with a microphone (14) and a speaker connected to the circuit system (1).
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201610755512.8A CN106444023A (en) | 2016-08-29 | 2016-08-29 | Super-large field angle binocular stereoscopic display transmission type augmented reality system |
Publications (1)
Publication Number | Publication Date |
---|---|
CN106444023A true CN106444023A (en) | 2017-02-22 |
Family
ID=58090065
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201610755512.8A Pending CN106444023A (en) | 2016-08-29 | 2016-08-29 | Super-large field angle binocular stereoscopic display transmission type augmented reality system |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN106444023A (en) |
Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101661163A (en) * | 2009-09-27 | 2010-03-03 | 合肥工业大学 | Three-dimensional helmet display of augmented reality system |
CN102096194A (en) * | 2010-12-24 | 2011-06-15 | 北京理工大学 | Optical transmission projection type three-dimensional helmet display |
CN102566027A (en) * | 2011-12-14 | 2012-07-11 | 广州博冠企业有限公司 | Three-dimensional imaging optical assembly and digital three-dimensional microscope system based on single objective lens |
CN103348278A (en) * | 2010-12-16 | 2013-10-09 | 洛克希德马丁公司 | Collimating display with pixel lenses |
CN103472909A (en) * | 2012-04-10 | 2013-12-25 | 微软公司 | Realistic occlusion for a head mounted augmented reality display |
CN104898276A (en) * | 2014-12-26 | 2015-09-09 | 成都理想境界科技有限公司 | Head-mounted display device |
CN105068659A (en) * | 2015-09-01 | 2015-11-18 | 陈科枫 | Reality augmenting system |
CN105068253A (en) * | 2015-09-15 | 2015-11-18 | 厦门灵境信息科技有限公司 | Head-mounted display device and optical lens system thereof |
CN105786163A (en) * | 2014-12-19 | 2016-07-20 | 联想(北京)有限公司 | Display processing method and display processing device |
Cited By (21)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108572450A (en) * | 2017-03-09 | 2018-09-25 | 宏碁股份有限公司 | Head-mounted display, its visual field bearing calibration and mixed reality display system |
CN108572450B (en) * | 2017-03-09 | 2021-01-29 | 宏碁股份有限公司 | Head-mounted display, visual field correction method thereof and mixed reality display system |
CN106773068A (en) * | 2017-03-24 | 2017-05-31 | 深圳增强现实技术有限公司 | A kind of optics module of the wearable intelligent glasses of augmented reality |
CN106890441A (en) * | 2017-04-10 | 2017-06-27 | 江苏农林职业技术学院 | Shuttlecock interaction training aids and shuttlecock interaction training method |
CN106918915A (en) * | 2017-05-16 | 2017-07-04 | 核桃智能科技(常州)有限公司 | Near-eye display system |
CN107274891A (en) * | 2017-05-23 | 2017-10-20 | 武汉秀宝软件有限公司 | A kind of AR interface alternation method and system based on speech recognition engine |
WO2019041614A1 (en) * | 2017-09-04 | 2019-03-07 | 浙江大学 | Head-mounted immersive virtual reality display device and immersive virtual reality display method |
CN108109478A (en) * | 2018-01-22 | 2018-06-01 | 北京费米赛因教育科技有限公司 | A kind of deployable Portable experimental box for experimental bench |
CN108830944A (en) * | 2018-07-12 | 2018-11-16 | 北京理工大学 | Optical perspective formula three-dimensional near-eye display system and display methods |
WO2020048461A1 (en) * | 2018-09-03 | 2020-03-12 | 广东虚拟现实科技有限公司 | Three-dimensional stereoscopic display method, terminal device and storage medium |
CN110874135A (en) * | 2018-09-03 | 2020-03-10 | 广东虚拟现实科技有限公司 | Optical distortion correction method and device, terminal equipment and storage medium |
CN110874135B (en) * | 2018-09-03 | 2021-12-21 | 广东虚拟现实科技有限公司 | Optical distortion correction method and device, terminal equipment and storage medium |
US11380063B2 (en) * | 2018-09-03 | 2022-07-05 | Guangdong Virtual Reality Technology Co., Ltd. | Three-dimensional distortion display method, terminal device, and storage medium |
CN109254406A (en) * | 2018-11-07 | 2019-01-22 | 深圳市传智科技有限公司 | A kind of multi-functional augmented reality glasses |
WO2020143546A1 (en) * | 2019-01-07 | 2020-07-16 | 京东方科技集团股份有限公司 | Augmented reality system and control method |
US11402900B2 (en) | 2019-01-07 | 2022-08-02 | Beijing Boe Optoelectronics Technology Co., Ltd. | Augmented reality system comprising an aircraft and control method therefor |
CN112710608A (en) * | 2020-12-16 | 2021-04-27 | 深圳晶泰科技有限公司 | Experiment observation method and system |
CN112710608B (en) * | 2020-12-16 | 2023-06-23 | 深圳晶泰科技有限公司 | Experimental observation method and system |
CN113542719A (en) * | 2021-06-07 | 2021-10-22 | 支付宝(杭州)信息技术有限公司 | Image acquisition device |
CN113542719B (en) * | 2021-06-07 | 2023-10-03 | 支付宝(杭州)信息技术有限公司 | Image acquisition device |
CN115774335A (en) * | 2022-11-11 | 2023-03-10 | Oppo广东移动通信有限公司 | Virtual image display device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication |
Application publication date: 20170222 |