CN116203733A - Mixed reality wearable device - Google Patents
Mixed reality wearable device
- Publication number
- CN116203733A (application CN202310473059.1A)
- Authority
- CN
- China
- Prior art keywords
- mixed reality
- binocular
- head harness
- frame
- wearable device
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
- G02B27/0172—Head mounted characterised by optical features
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
- G02B27/0176—Head mounted characterised by mechanical features
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
- G02B27/0172—Head mounted characterised by optical features
- G02B2027/0174—Head mounted characterised by optical features holographic
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02E—REDUCTION OF GREENHOUSE GAS [GHG] EMISSIONS, RELATED TO ENERGY GENERATION, TRANSMISSION OR DISTRIBUTION
- Y02E60/00—Enabling technologies; Technologies with a potential or indirect contribution to GHG emissions mitigation
- Y02E60/10—Energy storage using batteries
Landscapes
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Optics & Photonics (AREA)
Abstract
The invention discloses a mixed reality wearable device that follows a head-worn design approach, integrating goggles, video acquisition, display and other functions into a single device worn on the head. Lightweight structural design, efficient power management and other measures reduce the weight of the device and extend its continuous operating time. Structure-matching simulation design combined with ergonomic optimization improves the weight balance of the device, its wearing comfort, and its convenience of use and operation.
Description
Technical Field
The invention relates to the technical field of mixed reality, and in particular to a mixed reality wearable device offering high wearing comfort and ease of use.
Background
Analysis of current global development trends indicates that the future world will become increasingly intelligent, networked, unmanned and diversified. Mixed reality (MR) technology introduces real-scene information into the virtual environment, building a bridge of interactive feedback among the virtual world, the real world and the user, thereby enhancing the realism of the user experience. It is a further deepening of virtual reality and augmented reality.
Mixed reality technology can superimpose the real environment and virtual information in the same picture and the same space, and realize fusion and interaction between the two, giving the user a sensory experience that goes beyond reality. When a user wears a head-mounted mixed reality enhanced display device, the head position must be tracked in real time so that the rendered three-dimensional model can be viewed from multiple perspectives. The user may issue simple commands to the system through gestures, such as selection, movement and deletion, and may also express more complex intents, such as switching the current interaction scenario, controlling virtual objects, or performing virtual actions.
Virtual-real combined three-dimensional scene matching is a new display-matching technology whose application requirements have emerged with the development of computer software and hardware. Virtual-real combination blends the virtual environment into the real scene around the user, providing an intuitive and enhanced experience, while three-dimensional matching offers a high degree of operational freedom in three-dimensional space, producing a more intuitive and realistic feeling.
The low-level interaction of mixed reality depends on the hardware: the performance and sensor types of the device determine the basic interaction modes it supports. As mixed reality enhanced display technology develops, its application is no longer limited to a specific object or environment; the device should be able to handle a variety of complex and diverse environments. A head-worn design approach is adopted, integrating goggles, video acquisition, display and other functions into an information device worn on the head, so that the stable integration of the technologies and the performance of the system components can be fully exploited and the mixed reality enhanced display and interaction system reaches an ideal working state. However, because such a head-worn information device integrates many components, it places extremely high demands on wearing comfort and convenience of use.
Therefore, how to provide a head-worn information device that is comfortable to wear and convenient to use and operate is a technical problem that needs to be solved by those skilled in the art.
Disclosure of Invention
In view of the above, the present invention provides a mixed reality wearable device for overcoming or at least partially solving the above problems.
The invention provides the following scheme:
a mixed reality wearable device, comprising:
a head harness and a fixing strap, wherein a first end of the fixing strap is fixedly connected to the head harness at the position facing the back of the head, and a second end of the fixing strap can pass over the top of the head harness and form a free end at the forehead position of the head harness;
an integrated information processing unit comprising goggles and an information component; the information component is connected to the goggles and comprises a binocular optical waveguide display module for displaying a picture in which the real environment and virtual information are superimposed; the binocular optical waveguide display module comprises a light-wave transmission medium with a free-form-surface holographic waveguide display structure;
a power supply unit connected to the rear outer side of the head harness and used to supply power to the information component;
wherein the goggles comprise a lens frame and lenses, the lens frame being connected to the second end of the fixing strap so that the lens frame contacts the cheeks and nose bridge of the user; the lens frame is made of TPU plastic, and a natural rubber cushion pad is arranged where the lens frame contacts the user's cheeks and nose bridge; the lenses are made of polycarbonate.
Preferably: the outer side of the head harness is provided with a fixing frame, and the fixing belt is connected with the fixing frame; the front end of mount with the surface formation draw-in groove of head harness, the inboard of informatization subassembly is provided with the silica gel fixture block, the picture frame with after the fixed band links to each other the silica gel fixture block with draw-in groove joint links to each other.
Preferably: the outer side of the mirror frame is provided with a buckle, and the second end of the fixing belt is connected with the mirror frame through the buckle; and a silica gel anti-skid sleeve is arranged at the position where the second end of the fixing belt is connected with the head harness.
Preferably: the inner layer of the lens is processed by adopting a glass hydrophobic coating, and the outer layer of the lens is processed by adopting a multilayer anti-back light reflection coating mode.
Preferably: the power supply unit comprises a battery compartment, and the battery compartment and the inside of each of the glasses frames are respectively provided with a metal shielding paint layer.
Preferably: and a vision correction lens mounting mechanism is arranged in an entrance pupil area of the binocular light waveguide display module on the mirror frame.
Preferably: the binocular light waveguide display module comprises the light wave transmission medium, the sight line and the screen foot drop are taken as centers, the screen range foot drop is taken as a base point, and the left and right expansion is 18.5-19.5 mm.
Preferably: the optical wave transmission medium comprises at least one layer of optical waveguide and a volume holographic grating attached to the surface of at least one layer of optical waveguide.
Preferably: the volume holographic grating comprises a multi-region grating structure, the multi-region grating structure comprises a plurality of groups of volume holographic gratings, and the plurality of groups of volume holographic gratings are respectively used for coupling and outputting a plurality of input images with different visual angles in a one-to-one correspondence manner.
Preferably: the informatization component further comprises a binocular camera, wherein the binocular camera is used for realizing video acquisition of a real environment; the calibration method of the binocular camera comprises the following steps:
taking a checkerboard as a calibration plate, and adopting a left camera and a right camera of the binocular camera to acquire a series of static calibration pictures from different angles respectively;
monocular calibrating the left and right cameras respectively through a series of static calibration pictures;
performing double-target calibration on the double-target camera based on the calibration result of the single-target calibration;
and correcting the binocular camera according to the internal and external parameter data result of the binocular camera, so that the visual field of the picture is consistent with the display visual field of the visual optical system and the height of the binocular image is consistent.
According to the specific embodiments provided herein, the invention discloses the following technical effects:
The embodiments of the present application provide a mixed reality wearable device that follows a head-worn design approach, integrating goggles, video acquisition, display and other functions into a device worn on the head. The integrated information processing unit and the power supply unit, the two heaviest components, are arranged at the front and rear of the head harness respectively, effectively balancing the weight distribution. The integrated information processing unit is secured by the fixing strap, and the system is supported at three points, the cheeks, the nose bridge and the back of the head, which meets the comfort requirement while preventing the goggles from slipping off.
Meanwhile, in a preferred embodiment, a display scheme based on an optical waveguide and volume holographic gratings is adopted: the volume holographic grating attached to the waveguide surface breaks the total-internal-reflection condition at the interface, changing the propagation direction and energy of the optical information and guiding it out of the waveguide to the human eye. Combined with the proposed waveguide layering scheme, image display with a large field of view can be realized.
In addition, in another preferred embodiment, the acquired images are used to calibrate and register the binocular camera, so that the field of view of the images is consistent with the display field of view of the visual optical system, the two images are at the same height, the images fuse completely and distortion is eliminated. This prevents motion-sickness-type discomfort such as dizziness, dry eyes, eye distension and eye fatigue during use.
Of course, it is not necessary for any one product to practice the invention to achieve all of the advantages set forth above at the same time.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings that are required to be used in the embodiments will be briefly described below. It is evident that the drawings in the following description are only some embodiments of the present invention and that other drawings may be obtained from these drawings by those of ordinary skill in the art without inventive effort.
Fig. 1 is a schematic structural diagram of a mixed reality wearable device provided by an embodiment of the present invention;
fig. 2 is a partial cross-sectional view of a mixed reality wearable device provided by an embodiment of the invention;
FIG. 3 is a normal distribution diagram of the interpupillary-distance range of adult males according to an embodiment of the present invention;
FIG. 4 is a schematic diagram of an interpupillary distance design according to an embodiment of the present invention;
FIG. 5 is a schematic diagram of a multi-region grating provided by an embodiment of the present invention;
fig. 6 is a schematic diagram of stereoscopic parallax according to an embodiment of the present invention.
In the figures: head harness 1, fixing frame 11, fixing strap 2, integrated information processing unit 3, goggles 31, lens frame 311, lenses 312, information component 32, silica gel clamping block 33, buckle 34, power supply unit 4, user head 5; 101 and 201 are the first group of volume holographic gratings, 102 and 202 are the second group of volume holographic gratings, and 103 and 203 are the third group of volume holographic gratings.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present invention. It will be apparent that the described embodiments are only some, but not all, embodiments of the invention. All other embodiments, which are derived by a person skilled in the art based on the embodiments of the invention, fall within the scope of protection of the invention.
Referring to fig. 1, the mixed reality wearable device provided by an embodiment of the present invention may include:
a head harness 1 and a fixing strap 2, wherein a first end of the fixing strap 2 is fixedly connected to the head harness 1 at the position of the back of the user's head, and a second end of the fixing strap 2 can pass over the top of the head harness 1 and form a free end at the forehead position of the head harness 1;
an integrated information processing unit 3, the integrated information processing unit 3 including goggles 31 and an information component 32; the information component 32 is connected to the goggles 31 and comprises a binocular optical waveguide display module for displaying a picture in which the real environment and virtual information are superimposed; the binocular optical waveguide display module comprises a light-wave transmission medium with a free-form-surface holographic waveguide display structure;
a power supply unit 4, the power supply unit 4 being connected to the rear outer side of the head harness 1 and used to supply power to the information component 32;
wherein the goggles 31 comprise a lens frame 311 and lenses 312, the lens frame 311 being connected to the second end of the fixing strap 2 so that the lens frame 311 contacts the cheeks and nose bridge of the user; the lens frame 311 is made of TPU plastic, and a natural rubber cushion pad is arranged where the lens frame 311 contacts the user's cheeks and nose bridge; the lenses 312 are made of polycarbonate.
The mixed reality wearable device provided by the embodiments of the present application adopts a head-mounted design, integrating goggles, video acquisition, display and other functions into an information wearable device worn on the head, with effectively improved wearing comfort and convenience of use.
The integrated information processing unit and the power supply unit, the two heaviest components, are arranged at the front and rear of the head harness respectively, effectively balancing the weight distribution. The integrated information processing unit is secured by the fixing strap, and the system is supported at three points, the cheeks, the nose bridge and the back of the head, which meets the comfort requirement while preventing the goggles from slipping off.
The lens frame is made of TPU, a very soft elastic plastic that retains good elasticity and flexibility over a temperature range of -40 °C to 65 °C, so a frame processed from this material has good toughness. A cushion pad shaped to the contour of the human face is designed into the contact area between the frame and the face. The pad is made of natural rubber, which is elastic and soft, fits the face well, prevents light leakage, and effectively reduces injury to the face if the goggles are struck.
The lens material is ultraviolet-resistant, weather-resistant polycarbonate (PC), a colorless transparent thermoplastic that is lightweight, resistant to heat aging, impact-resistant, UV-resistant and easy to process, and is widely used in products such as spectacle lenses and bulletproof glass. A spectacle lens made of PC not only blocks 100% of ultraviolet light (UV400) but is also 57% lighter than a glass lens and 37% lighter than a resin lens. Its impact resistance is 60 times that of glass, with a simply-supported-beam impact strength of 50-70 kJ/m², and a PC lens cracks without shattering; a PC lens only 7 mm thick can withstand the impact of a 1.1 g fragment travelling at 168 m/s.
To further improve the stability of the integrated information processing unit once connected to the head harness, as shown in fig. 2, the embodiments of the present application may further provide a fixing frame 11 on the outer side of the head harness 1, with the fixing strap 2 connected to the fixing frame 11. The front end of the fixing frame 11 and the surface of the head harness 1 form a clamping groove, a silica gel clamping block 33 is arranged on the inner side of the information component 32, and after the lens frame 311 is connected to the fixing strap 2 the silica gel clamping block 33 is clamped into the clamping groove. A buckle 34 is provided on the outer side of the lens frame 311, and the second end of the fixing strap 2 is connected to the lens frame 311 through the buckle 34; a silica gel anti-slip sleeve is arranged where the second end of the fixing strap 2 meets the head harness.
The elastic strap can be adjusted to fit around the user's head 5, and a silica gel anti-slip sleeve is designed into the strap at the back of the head. Force support is provided to the system at three points, the cheeks, the nose bridge and the back of the head. The strap uses a double-layer design, which ensures wearing security, and a buckle is reserved at the top of the head for attaching the fixing strap so that it can be worn even more firmly. The silica gel anti-slip design of the strap, combined with the multi-point support, effectively prevents the goggles from sliding.
In terms of processing technology, the multifunctional lens frame, the molded rubber cushion pad and the lenses are fixed together with a non-toxic epoxy resin adhesive. Further, the inner surface of the lens 312 is treated with a hydrophobic glass coating, and the outer surface of the lens 312 is treated with a multilayer anti-reflection (AR) coating. The PC lens is produced by injection molding; the hydrophobic coating on the inner surface gives the lens a hydrophobic function, while the multilayer AR coating on the outer surface effectively prevents light generated by the light source of the binocular optical waveguide display module from leaking out through the lens and reduces the device's visible-light signature at night.
The goggles provided by the embodiments of the application are made of soft materials that provide a cushioning effect, improving wearing comfort and accommodating the facial differences of different users. The adjustable fixing strap is elastic and stretchable, ensuring a snug fit while reducing the feeling of pressure on the user's head. The nose bridge shield is ergonomically shaped to fit the face, improving comfort.
To further improve wearing comfort, the design of the embodiments of the present application gives full consideration to the structural counterweight: the heavier functional modules are distributed on the left and right sides of the head, avoiding front-back and left-right weight imbalance. The structure adopts a lightweight design with extensive use of polymer materials, which ensures device strength while reducing weight, effectively relieving the load on the neck and improving wearing comfort. The head harness is worn on the head and carries the whole load. Front and rear counterweights prevent the center of gravity from shifting too far forward when worn, and the flexible material contacting the face forms a dark imaging chamber. For overall balance, the battery is placed low at the rear and the wiring is hidden in the fixing strap. Through this design, the overall weight is transferred from the nose bridge to the center of the cervical vertebrae and supported through the head harness, improving the user's wearing comfort.
To meet the user's wearability requirements, the structure must be matched to a lightweight full-coverage protective head harness; in practical applications, a lightweight protective head harness can be selected for structural adaptation. The mating parts use semi-flexible material to form a tight fit. Meanwhile, the silica gel clamping block engages the clamping groove formed by the front end of the fixing frame and the surface of the head harness to achieve a precise fit.
To prevent water from seeping in between the integrated information processing unit and the head, left and right drainage channels can be designed on the basis of the tight fit.
It can be appreciated that the integrated information processing unit provided in the embodiments of the present application may further include components necessary to implement mixed reality image acquisition and display. For example, a computing platform may also be included.
To further meet the wearable device's requirements for light weight and low power consumption, the embodiments of the present application may further provide an embedded edge computing platform. Because the computing resources of an edge computing platform are usually limited, running the algorithms in a lightweight form, reducing the resources consumed at run time and increasing the running speed, is very important for optimizing the performance of the whole system.
Lightweighting means reducing the parameter count and computational cost of the algorithm model and increasing its running speed while keeping the accuracy within requirements. The neural-network-based weak/small target detection and fusion enhancement algorithm uses deeply stacked convolutional network modules as the basis for feature extraction and mapping learning. Each subtask module of the system is designed with real-time operation of the model on the edge device in mind: the model is processed to some extent at the design stage, every module of the whole system is considered together, and each subtask is further lightweighted from the perspective of multi-task cooperation, greatly improving the real-time performance of the whole device.
The system's edge computing module adopts a high-performance embedded ARM platform and a heterogeneous chip platform. Starting from current deep-learning approaches to computer-vision target detection, the structural design of lightweight models, lightweight model conversion methods and computation-reducing model compression are studied, so that a deep target detection model oriented toward edge computing can be provided.
The network model is divided into three main parts: multi-task cooperative optimization, network-structure lightweighting, and model compression. The target detection and recognition task and the fusion enhancement display task both involve a large number of parameters and complex computation; by sharing sub-modules between the two tasks, task cooperation is achieved and the resource consumption of the whole running system can be greatly reduced, as sketched below. The basic operations of the network structure are then redesigned in a lightweight manner, the model size is further reduced through model compression, and the final lightweight network model is obtained.
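The sketch below is a toy illustration of this kind of sub-module sharing: one lightweight convolutional backbone feeds both a detection head and an enhancement head, so the shared features are computed only once. It is not code from the patent; every module name, layer size and channel count is a hypothetical example.

```python
import torch
import torch.nn as nn

class SharedBackboneMultiTask(nn.Module):
    """Toy sketch: a shared lightweight backbone serving two task heads,
    so detection and fusion enhancement reuse the same features."""
    def __init__(self):
        super().__init__()
        # Shared feature extractor (a depthwise-separable block keeps it light).
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(16, 16, 3, padding=1, groups=16), nn.Conv2d(16, 32, 1),
            nn.ReLU(inplace=True),
        )
        # Head 1: small-target detection maps (channel count hypothetical).
        self.detect_head = nn.Conv2d(32, 6, 1)
        # Head 2: fusion-enhancement display output (3-channel image).
        self.enhance_head = nn.Sequential(
            nn.Conv2d(32, 16, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(16, 3, 1),
        )

    def forward(self, x):
        feats = self.backbone(x)  # computed once, shared by both tasks
        return self.detect_head(feats), self.enhance_head(feats)

if __name__ == "__main__":
    model = SharedBackboneMultiTask().eval()
    det, enh = model(torch.randn(1, 3, 480, 640))
    print(det.shape, enh.shape)
```

Because the backbone runs once per frame instead of once per task, the duplicated computation that would otherwise dominate the edge device's budget is avoided; the lightweighting and compression steps described above then operate on this already-shared structure.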
Porting the algorithm to the embedded platform is a key step in bringing it into practical use and an important part of the overall lightweighting scheme. The porting and optimization procedure may include three steps: conversion to a standard computational-graph model, algorithm porting, and runtime acceleration.
An algorithm model trained under a mainstream AI framework cannot be ported directly to the Rockchip platform; it must first be converted to the standard computational-graph format ONNX (Open Neural Network Exchange). This is because each deep-learning framework has its own format for describing and storing network models, the frameworks emphasize different things, and some specially designed middleware and operators do not apply to all AI computing platforms (many of them typically support only NVIDIA devices). The system provided by the embodiments of the present application may adopt the domestic Rockchip RK3588 chip; its AI computing ecosystem is not yet complete and the mainstream AI frameworks have not been adapted to this chip, so conversion to the ONNX format is required.
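A minimal export sketch for this first conversion step is shown below, using the standard torch.onnx exporter. The stand-in model, file name, input resolution and opset version are illustrative assumptions; the subsequent conversion to the RK3588's native format is done with Rockchip's own toolchain and is not shown here.

```python
import torch
import torch.nn as nn

# Stand-in for the trained network (any torch.nn.Module is exported the same way);
# in practice this would be the multi-task model sketched earlier.
model = nn.Sequential(nn.Conv2d(3, 8, 3, padding=1), nn.ReLU(), nn.Conv2d(8, 6, 1)).eval()
dummy = torch.randn(1, 3, 480, 640)   # fixed input resolution used for the export

torch.onnx.export(
    model, dummy, "mr_model.onnx",
    input_names=["image"], output_names=["out"],
    opset_version=12,                 # an opset version the target toolchain accepts
)
# The resulting ONNX computational graph is then fed to the vendor's conversion
# tool for the RK3588, where operator mapping, merging and quantization take place.
```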
Porting the standard ONNX computational-graph network model to the Rockchip RK3588 chip platform mainly involves re-implementing platform-specific operations, converting core operators, operator merging and optimization, model quantization, and accuracy verification. The RK3588 consists of a quad-core ARM Cortex-A76 plus quad-core ARM Cortex-A55 CPU and integrates an independent 6 TOPS NPU, with dedicated design and optimization in the core driver and related low-level modules, which provides good support for common edge computing tasks. Re-implementing the relevant operations for the RK3588 chip platform is an important step of the porting work. In addition, conversion and merging optimization of the core operators is the core of the porting: the finally trained deep-learning model can be viewed as a series of high-speed operator operations plus input/output auxiliary modules. Multiple operators are merged and optimized for the chip platform, and dynamic strategies are designed for memory allocation and task scheduling, so experiments and designs must be made during the actual research process to further accelerate operator execution.
Finally, in combination with the runtime environment of the RK3588 chip, the low-level functional modules are called through the inference engine and tools provided for the RK3588, and large-scale parallel computation on the NPU computing units is used to further accelerate the deep-learning model at run time.
Further, the power supply unit 4 includes a battery compartment, and a metal shielding paint layer is provided inside both the battery compartment and the lens frame. The lens frame and the battery compartment are made of ABS plastic, and metal shielding paint sprayed on their interiors provides a degree of electromagnetic protection; the satellite positioning antenna and the WiFi/Bluetooth antenna are designed as independent cavities and are sealed and fixed with non-shielding materials.
The design of the goggles fully considers Asian face shapes and facial contours. Natural rubber is used where the goggles meet the face, providing a cushioning effect that improves wearing comfort, and the adjustable fixing strap is elastic and stretchable, ensuring a secure fit while reducing the feeling of pressure on the user's head.
To allow vision correction for the user's eyesight and further improve comfort, the embodiments of the present application may further provide a vision-correction lens mounting mechanism on the lens frame in the entrance pupil area of the binocular optical waveguide display module. The mounting mechanism is reserved in the entrance pupil area so that corrective lenses can be fitted according to the user's eyesight, meeting the needs of users with different vision. The low-light camera module is adjustable, which reduces the eye fatigue caused by the fused display of virtual and real scenes on the see-through display and improves viewing comfort.
The design gives full consideration to the structural counterweight: the heavier functional modules are distributed on the left and right sides of the head, avoiding front-back and left-right weight imbalance. The structure adopts a lightweight design with extensive use of polymer materials, which ensures device strength while reducing weight, effectively relieving the load on the neck and improving wearing comfort.
The device design, based on the mixed reality enhanced display and interaction system, starts from the requirements of domestic users and combines an Asian facial model data set with differences in wearing habits to carry out ergonomic design. The structural layout is constrained in terms of overall weight, counterweight distribution, display exit pupil distance, binocular interpupillary distance and other aspects, so as to improve human-machine efficiency and applicability to different users.
To further improve wearing comfort and best satisfy the physiological parameters of the user's interpupillary distance, the embodiments of the present application may further provide that the light-wave transmission medium of the binocular optical waveguide display module is centered on the foot of the perpendicular from the line of sight to the screen and, taking that foot point as the base point, the screen range extends 18.5-19.5 mm to the left and right, preferably 19 mm to each side.
Regarding the display interpupillary-distance design: according to a survey and analysis of interpupillary-distance data, the median interpupillary distance of East Asian adult males is 65.2 mm. The interpupillary-distance distribution is approximately normal, and the 5th and 95th percentiles are selected as the lower and upper bounds of the optimal imaging interval, as shown in fig. 3.
Meanwhile, when laying out the optical waveguide, the foot of the perpendicular from the line of sight to the screen is taken as the center, and the screen range is extended 19 mm to the left and right of that base point, completely covering the spread of interpupillary-distance values. Different interpupillary distances reduce the effective optimal imaging range, and the optimal effect is matched through resolution, depth of field and so on, as shown in fig. 4.
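A rough numerical check of this coverage argument is sketched below. The 65.2 mm median comes from the text; the standard deviation is a hypothetical value assumed purely for illustration, since it is not stated here.

```python
from scipy.stats import norm

IPD_MEDIAN_MM = 65.2   # median interpupillary distance stated in the text
IPD_SIGMA_MM = 3.5     # hypothetical standard deviation, assumed for illustration only
EYEBOX_HALF_MM = 19.0  # screen range extended 19 mm to each side of the foot point

# 5th / 95th percentile of a normal interpupillary-distance distribution.
p05, p95 = norm.ppf([0.05, 0.95], loc=IPD_MEDIAN_MM, scale=IPD_SIGMA_MM)

# Each pupil sits at half the IPD from the facial midline, so a per-eye pupil
# position shifts by half of the IPD deviation from the median.
half_shift = max(abs(p95 - IPD_MEDIAN_MM), abs(IPD_MEDIAN_MM - p05)) / 2.0

print(f"5th-95th percentile IPD: {p05:.1f} - {p95:.1f} mm")
print(f"max per-eye pupil shift from nominal: {half_shift:.1f} mm")
print("covered by the 19 mm per-side extension:", half_shift <= EYEBOX_HALF_MM)
```

Under this assumed spread the per-eye pupil shift stays within a few millimeters, comfortably inside the 19 mm per-side extension; the actual margin of course depends on the real population statistics.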
In addition, a fine-adjustment device can be added that allows the position of the binocular optical waveguide display module to be finely adjusted, so as to adapt to the interpupillary distances and related physiological parameters of different users.
To further realize a large-field-of-view optical waveguide display, the embodiments of the present application may further provide that the light-wave transmission medium comprises at least one optical waveguide layer and a volume holographic grating attached to the surface of the at least one optical waveguide layer. Further, the volume holographic grating comprises a multi-region grating structure, which comprises a plurality of groups of volume holographic gratings used to couple out, in one-to-one correspondence, a plurality of input images with different viewing angles.
By adopting the technical scheme of an optical waveguide plus volume holographic gratings, the volume holographic grating attached to the waveguide surface breaks the total-internal-reflection condition at the interface, changing the propagation direction and energy of the optical information and guiding it out of the waveguide to the human eye.
The holographic waveguide display system is small and light, and the optical design must consider exit pupil, field of view, resolution and related issues. At the same time, because the display efficiency of a holographic waveguide is low, the influence of stray light and scattering on system contrast must be considered. A parallel-slab holographic waveguide head-mounted display system is an imaging system with no optical power, and the dominant sources of aberration are non-parallelism of the waveguide and uneven diffraction of the gratings; to reduce system aberrations, waveguide tolerances and the grating ruling process must be tightly controlled when fabricating holographic waveguides. The imaging principle of a free-form-surface holographic waveguide helmet-mounted display system is similar to that of an ordinary optical system, with both the free-form surface and the grating acting as reflective or refractive elements.
Holographic waveguide design is a key element of the system and mainly provides coupling, image transmission and imaging. It chiefly involves the design of the holographic grating's fringe shape and diffraction efficiency and the design of the overall grating structure. Achieving holographic gratings with uniform brightness and high diffraction efficiency requires electromagnetic-field analysis using rigorous coupled-wave analysis. The structure and layout of the holographic waveguide are important considerations in the optical-system design, especially for a free-form-surface holographic waveguide head-mounted display system. Existing holographic waveguide structures each have advantages and disadvantages; a reasonable structural type is chosen according to actual needs, the final goal being a free-form-surface holographic waveguide display.
As for grating fabrication, the preparation of the holographic grating is the key to guaranteeing the imaging quality of the system: a high-quality holographic grating ensures high diffraction efficiency and image quality. To perfect holographic waveguide imaging, many research institutions have studied various holographic gratings for helmet displays, including surface gratings (University of Wezmann), volume holographic gratings (Sony, BAE), slanted gratings (Nokia) and region-coded gratings (Blaste, Zeiss). Different holographic gratings differ in principle, shape and performance, and their manufacturing processes differ greatly. Judged by current commercialization, the volume holographic grating is the most promising type, and holographic optical waveguide diffraction gratings have been developed on this basis. During exposure, the laser emits a beam that is split by a beam splitter into beams with a certain intensity ratio; after beam expansion, collimation and reflection by mirrors, the beams are incident on a holographic plate pre-coated with photoresist, where exposure to the interference pattern forms the fringes.
As light propagates by total internal reflection in the glass substrate, diffraction occurs each time it reaches the holographic surface. The diffracted light no longer satisfies the total-internal-reflection condition and is therefore emitted from the glass plate. Meanwhile, the size of the entrance pupil can be adjusted to realize a continuous exit-pupil area. By stacking in-coupling and out-coupling volume holographic gratings, a color VHG-coupled grating waveguide is formed.
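The in- and out-coupling geometry described above can be illustrated numerically: the grating equation gives the diffraction angle inside the waveguide, and that angle must exceed the critical angle for total internal reflection if the ray is to stay guided. The wavelength, grating period and refractive index below are hypothetical example values, not parameters disclosed here.

```python
import numpy as np

# Hypothetical example values (not disclosed in the text).
wavelength_um = 0.532   # green light, micrometers
period_um = 0.45        # grating period Lambda
n_glass = 1.52          # refractive index of the waveguide substrate
order = 1               # diffraction order m

theta_c = np.degrees(np.arcsin(1.0 / n_glass))  # critical angle for TIR

for theta_in_deg in (-10.0, 0.0, 10.0):         # field angles incident from air
    # Grating equation: n*sin(theta_d) = sin(theta_in) + m*lambda/Lambda
    s = np.sin(np.radians(theta_in_deg)) + order * wavelength_um / period_um
    if abs(s / n_glass) > 1.0:
        print(f"{theta_in_deg:+.0f} deg: evanescent, not coupled into the waveguide")
        continue
    theta_d = np.degrees(np.arcsin(s / n_glass))
    guided = theta_d > theta_c                  # TIR keeps the ray inside the slab
    print(f"{theta_in_deg:+.0f} deg in air -> {theta_d:.1f} deg in glass "
          f"(critical {theta_c:.1f} deg, guided: {guided})")
```

Running this shows that only a limited range of input angles both diffracts into the glass and stays above the critical angle, which is exactly the field-of-view limit that the multi-region, angle-multiplexing and layering schemes below are designed to relax.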
By adopting a theoretical design method for large-field-of-view holographic waveguides, the holographic diffraction waveguide device is expanded so that rays over a wider range of angles can be transmitted internally, giving a larger field of view (FOV). The refractive-index modulation and grating thickness of the recording material can be changed; with silver-halide photosensitive material, the refractive-index modulation can be adjusted within 0.03-0.1 through the developing process.
Under magnification, the holographic diffraction device consists of a series of periodic fringes whose spacing is a few hundred nanometers; the thickness d is the thickness of the photosensitive film, generally greater than 3 µm, and the refractive-index modulation parameter describes the contrast between the fringes. The diffraction performance of a volume holographic grating is determined by the fringe spacing, the thickness d and the refractive-index modulation. The diffraction-efficiency curve gives the diffraction efficiency (the ratio of diffracted light to reference light) when the reference beam is incident at different angles; the angular range over which the diffraction efficiency falls from its maximum to the first minimum is called the angular bandwidth and characterizes the achievable field of view. The smaller the thickness and the larger the refractive-index modulation, the larger the bandwidth, and hence the larger the field of view.
The embodiments of the present application employ a multi-region grating scheme that divides both the in-coupling portion and the out-coupling portion equally into two or more regions. Taking three regions as an example, as shown in fig. 5: 101 and 201 are the first group of volume holographic gratings, corresponding to the image of the first viewing angle; 102 and 202 are the second group, corresponding to the image of the second viewing angle; and 103 and 203 are the third group, corresponding to the image of the third viewing angle. Superimposing the three viewing-angle images realizes image display with a large field of view; in theory, the three-region grating scheme can triple the field of view of a conventional holographic waveguide.
An angle-multiplexing scheme exposes the in-coupling and out-coupling volume holographic gratings two or more times (angle multiplexing), effectively increasing the field of view of the holographic waveguide system.
A waveguide layering scheme may employ two or more waveguide layers to expand the field of view. In theory, the more waveguide layers, the larger the field of view, but design and processing difficulty rise sharply as the number of layers increases, so two waveguide layers are the preferred choice for enlarging the field of view. Each waveguide layer carries two volume holographic gratings, one as the in-coupling optical element and one as the out-coupling optical element. Each layer corresponds to the image of one viewing angle, so two layers can superimpose two viewing angles; compared with a conventional holographic waveguide, the layered holographic waveguide can realize image display with a large field of view.
The purpose of wavefront transformation technology (WTT) is to offset the angular-bandwidth limitation: by generating a special wavefront and controlling the wavefront shape of the beam, a single exposure can expand the field of view to 50° at the designed exit pupil, without imposing special requirements on the holographic material.
It will be appreciated that binocular parallax refers to the fact that when the two eyes observe an object, two slightly different images are obtained because the eyes are in different positions, as shown in fig. 6. The human brain fuses these two slightly different images to produce a sense of distance, depth and stereoscopy. Fusion refers to the brain's ability to integrate the parallax images from both eyes into a complete impression at the perceptual level; based on simultaneous perception by both eyes, the object images falling on corresponding points of the two retinas are fused into one complete impression. Parallax is central to spatial perception. Because the vertical viewing angles of a person's two eyes are the same while the horizontal viewing angles differ, the parallax that produces a stereoscopic sensation is generally horizontal parallax. To address this, the embodiments of the present application may provide that the information component further includes a binocular camera for video acquisition of the real environment; the calibration method of the binocular camera comprises the following steps:
using a checkerboard as the calibration plate, acquiring a series of static calibration pictures from different angles with the left and right cameras of the binocular camera respectively;
performing monocular calibration of the left and right cameras respectively using the series of static calibration pictures;
performing binocular (stereo) calibration of the binocular camera based on the monocular calibration results;
and rectifying the binocular camera according to the resulting intrinsic and extrinsic parameters, so that the field of view of the picture is consistent with the display field of view of the visual optical system and the two images of the binocular pair are at the same height.
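A condensed sketch of this calibration pipeline using OpenCV is given below for illustration. The checkerboard geometry, image paths and flags are hypothetical placeholders rather than values specified here.

```python
import glob
import cv2
import numpy as np

# Hypothetical checkerboard: 9x6 inner corners, 25 mm squares.
PATTERN = (9, 6)
SQUARE_MM = 25.0
objp = np.zeros((PATTERN[0] * PATTERN[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:PATTERN[0], 0:PATTERN[1]].T.reshape(-1, 2) * SQUARE_MM

obj_pts, left_pts, right_pts = [], [], []
for lf, rf in zip(sorted(glob.glob("calib/left_*.png")), sorted(glob.glob("calib/right_*.png"))):
    gl = cv2.imread(lf, cv2.IMREAD_GRAYSCALE)
    gr = cv2.imread(rf, cv2.IMREAD_GRAYSCALE)
    okl, cl = cv2.findChessboardCorners(gl, PATTERN)
    okr, cr = cv2.findChessboardCorners(gr, PATTERN)
    if okl and okr:                      # keep only views seen by both cameras
        obj_pts.append(objp); left_pts.append(cl); right_pts.append(cr)

size = gl.shape[::-1]
# Step 1: monocular calibration of each camera (intrinsics + distortion).
_, Kl, Dl, _, _ = cv2.calibrateCamera(obj_pts, left_pts, size, None, None)
_, Kr, Dr, _, _ = cv2.calibrateCamera(obj_pts, right_pts, size, None, None)

# Step 2: stereo calibration (extrinsics R, T between the two cameras).
_, Kl, Dl, Kr, Dr, R, T, _, _ = cv2.stereoCalibrate(
    obj_pts, left_pts, right_pts, Kl, Dl, Kr, Dr, size, flags=cv2.CALIB_FIX_INTRINSIC)

# Step 3: rectification, so that corresponding points share the same image row.
Rl, Rr, Pl, Pr, Q, _, _ = cv2.stereoRectify(Kl, Dl, Kr, Dr, size, R, T)
mapLx, mapLy = cv2.initUndistortRectifyMap(Kl, Dl, Rl, Pl, size, cv2.CV_32FC1)
mapRx, mapRy = cv2.initUndistortRectifyMap(Kr, Dr, Rr, Pr, size, cv2.CV_32FC1)
# cv2.remap(frame, mapLx, mapLy, cv2.INTER_LINEAR) is then applied to each live frame.
```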
Poor visual effects manifest mainly in two ways: they affect the physiological health of the user, and they give the user a false perception.
Regarding physiological health, the user may experience a series of motion-sickness symptoms such as dizziness, dry eyes, eye distension and eye fatigue during use. Misalignment of corresponding pixels in the left and right pictures on the display is a major cause of visual discomfort: a person's left and right eyes can rotate by different angles horizontally, but their rotation angles in the vertical direction must be the same. If corresponding pixels in the left and right images have different vertical coordinates, the eyes are forced to rotate by different angles vertically, causing motion discomfort.
Meanwhile, in binocular stereoscopic display, differences between the displayed picture and the corresponding parameters of the human eye in terms of viewing angle, visual axis, eye-point position, distortion and so on make the picture inconsistent with what the eyes would see naturally. Calibration and registration of the acquired pictures and their acquisition devices are therefore also required.
The main reasons the corresponding pixels of the left and right images cannot be strictly aligned are that the binocular camera has a certain amount of distortion and limited assembly accuracy. The calibration process is divided into two main parts: the first is acquiring a series of static pictures of the checkerboard calibration plate from different angles; the second is calibrating the left and right cameras monocularly and then performing stereo calibration based on the two monocular calibration results.
The left and right cameras are then rectified according to the intrinsic and extrinsic parameters obtained from calibration, so that the field of view of the picture matches the display field of view of the visual optical system, the two images are at the same height, the images fuse completely and distortion is eliminated. The result is an ideal parallel binocular vision model.
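In such an ideal parallel model, depth follows directly from disparity via the triangulation relation Z = f·B/d. The short sketch below illustrates this; the focal length, baseline and disparity values are hypothetical illustration numbers, not parameters of the device.

```python
# In an ideal parallel (rectified) binocular model, a point imaged at column xL in
# the left view and xR in the right view has disparity d = xL - xR (pixels), and
# its depth is Z = f * B / d, where f is the focal length in pixels and B is the
# baseline between the two cameras.
def depth_from_disparity(disparity_px: float, focal_px: float, baseline_mm: float) -> float:
    """Return depth in millimeters; disparity must be positive in this model."""
    if disparity_px <= 0:
        raise ValueError("non-positive disparity: point at or beyond infinity")
    return focal_px * baseline_mm / disparity_px

# Hypothetical numbers: 1000 px focal length, 60 mm baseline.
for d in (5.0, 20.0, 60.0):
    print(f"disparity {d:5.1f} px -> depth {depth_from_disparity(d, 1000.0, 60.0):8.1f} mm")
```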
In a word, the mixed reality wearing equipment that this application provided adopts the design thinking of wearing formula, with function integration integrated design such as goggles, video acquisition, demonstration, forms wearing equipment that the head was worn and is used. The weight of the equipment is reduced by adopting means of light weight design, high-efficiency power management and the like, and the continuous working time is prolonged. The structural matching simulation design is adopted, and the human engineering is combined for optimization, so that the equipment counterweight is optimized, the wearing comfort is improved, and the use and operation convenience are improved.
Meanwhile, by adopting a display scheme based on an optical waveguide and a volume holographic grating, the volume holographic grating attached to the surface of the optical waveguide breaks the total-internal-reflection condition at the interface, changes the propagation direction and energy of the optical signal, and thereby guides the light out of the waveguide and into the human eye. This forms a low-cost holographic optical waveguide display module with a large field of view and a large exit pupil.
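As a rough numerical illustration of why a surface grating lets light escape the waveguide, the first-order grating equation can be evaluated for plausible values. None of the numbers below (refractive index, wavelength, internal ray angle, grating period) come from the patent; they are assumptions chosen only to show the effect.

```python
import math

# Assumed values, chosen only for illustration (not taken from the patent)
n_wg = 1.58                      # refractive index of the waveguide substrate
lam_um = 0.532                   # wavelength in micrometres (green)
theta_wg = math.radians(55.0)    # ray angle inside the waveguide
period_um = 0.45                 # grating period in micrometres
m = 1                            # diffraction order

# Without the grating the ray is trapped: it exceeds the critical angle
theta_c = math.degrees(math.asin(1.0 / n_wg))
print(f"critical angle: {theta_c:.1f} deg  (the 55.0 deg ray undergoes TIR)")

# First-order grating equation: sin(theta_out) = n_wg*sin(theta_wg) - m*lam/period
s = n_wg * math.sin(theta_wg) - m * lam_um / period_um
if abs(s) < 1.0:
    print(f"grating out-couples the ray at {math.degrees(math.asin(s)):.1f} deg in air")
else:
    print("order remains evanescent; a different grating period would be needed")
```

With these particular numbers the guided ray sits well beyond the critical angle, yet the grating's momentum shift brings it back inside the air light cone, so it exits toward the eye at roughly six degrees.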
In addition, the binocular camera is calibrated and registered using the acquired images, so that the field of view of the picture matches the display field of view of the visual optical system, the left and right images are aligned in height, the images fuse correctly, and distortion is eliminated. This prevents discomfort symptoms such as dizziness, dry eyes, eye distension and eye fatigue during use.
It is noted that relational terms such as first and second are used solely to distinguish one entity or action from another entity or action, without necessarily requiring or implying any actual such relationship or order between such entities or actions. Moreover, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
From the description of the embodiments above, it will be apparent to those skilled in the art that the present application may be implemented in software plus the necessary general hardware platform. Based on such understanding, the technical solutions of the present application may be embodied essentially or in a part contributing to the prior art in the form of a software product, which may be stored in a storage medium, such as a ROM/RAM, a magnetic disk, an optical disk, etc., including several instructions to cause a computer device (which may be a personal computer, a server, or a network device, etc.) to perform the methods described in the embodiments or some parts of the embodiments of the present application.
In this specification, the embodiments are described in a progressive manner; identical and similar parts of the embodiments may be referred to one another, and each embodiment focuses on its differences from the others. In particular, for the system embodiments, since they are substantially similar to the method embodiments, the description is relatively brief, and reference may be made to the description of the method embodiments for the relevant parts. The systems and system embodiments described above are merely illustrative; units described as separate components may or may not be physically separate, and components shown as units may or may not be physical units, i.e., they may be located in one place or distributed over a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of this embodiment. Those of ordinary skill in the art can understand and implement the invention without inventive effort.
The foregoing description is only of the preferred embodiments of the present invention and is not intended to limit the scope of the present invention. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present invention are included in the protection scope of the present invention.
Claims (10)
1. A mixed reality wearable device, comprising:
a head harness and a fixing strap, wherein a first end of the fixing strap is fixedly connected to the head harness at the position facing the back of the wearer's head, and a second end of the fixing strap can pass over the top of the head and form a free end at the forehead position of the head harness;
a comprehensive information processing unit comprising goggles and an informatization component, wherein the informatization component is connected with the goggles and comprises a binocular optical waveguide display module for displaying a picture in which virtual information is superimposed on the real environment; the binocular optical waveguide display module comprises an optical wave transmission medium with a free-form-surface holographic waveguide display structure;
a power supply unit connected to the outer side of the right rear portion of the head harness, the power supply unit being configured to supply power to the informatization component;
wherein the goggles comprise a frame and lenses, the frame being connected with the second end of the fixing strap so that the frame contacts the cheeks and nose bridge of a user; the frame is made of TPU plastic, and a natural rubber cushion pad is provided where the frame contacts the user's cheeks and nose bridge; the lenses are made of polycarbonate.
2. The mixed reality wearable device of claim 1, wherein a mount is provided on the outer side of the head harness and is connected to the fixing strap; a front end of the mount forms a clamping groove with the surface of the head harness, a silica gel clamping block is provided on the inner side of the informatization component, and after the frame is connected with the fixing strap, the silica gel clamping block is clamped into the clamping groove.
3. The mixed reality wearable device of claim 2, wherein a clasp is provided on the outer side of the frame, and the second end of the fixing strap is connected to the frame through the clasp; a silica gel anti-slip sleeve is provided at the position where the second end of the fixing strap is connected with the head harness.
4. The mixed reality wearable device of claim 1, wherein the inner surface of the lenses is treated with a glass hydrophobic coating, and the outer surface of the lenses is treated with a multilayer coating for preventing back-light reflection.
5. The mixed reality wearable device of claim 1, wherein the power supply unit comprises a battery compartment, and the inside of each of the battery compartment and the frame is provided with a metallic shielding paint layer.
6. The mixed reality wearable device of claim 1, wherein a vision correction lens mounting mechanism is provided in the entrance pupil region of the frame at the binocular optical waveguide display module.
7. The mixed reality wearable device of claim 1, wherein the binocular optical waveguide display module comprises the optical wave transmission medium centered on the foot point of the line of sight on the screen, the screen extending 18.5-19.5 mm to either side with that foot point as the base point.
8. The mixed reality wearable device of claim 1, wherein the optical wave transmission medium comprises at least one optical waveguide layer and a volume holographic grating attached to a surface of the at least one optical waveguide layer.
9. The mixed reality wearable device of claim 8, wherein the volume holographic grating comprises a multi-region grating structure comprising a plurality of sets of volume holographic gratings, the sets of volume holographic gratings being respectively configured to couple and output input images of a plurality of different viewing angles in one-to-one correspondence.
10. The mixed reality wearable device of claim 1, wherein the informatization component further comprises a binocular camera for video acquisition of the real environment; and the calibration method of the binocular camera comprises the following steps:
using a checkerboard as the calibration plate, acquiring a series of static calibration pictures from different angles with the left camera and the right camera of the binocular camera respectively;
performing monocular calibration on the left camera and the right camera respectively by using the series of static calibration pictures;
performing binocular calibration on the binocular camera based on the results of the monocular calibration;
and rectifying the binocular camera according to the calibrated intrinsic and extrinsic parameters, so that the field of view of the picture is consistent with the display field of view of the visual optical system and the left and right images are aligned in height.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202310473059.1A CN116203733A (en) | 2023-04-28 | 2023-04-28 | Mixed reality wearing equipment |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202310473059.1A CN116203733A (en) | 2023-04-28 | 2023-04-28 | Mixed reality wearing equipment |
Publications (1)
Publication Number | Publication Date |
---|---|
CN116203733A (en) | 2023-06-02 |
Family
ID=86514985
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202310473059.1A Pending CN116203733A (en) | 2023-04-28 | 2023-04-28 | Mixed reality wearing equipment |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN116203733A (en) |
Patent Citations (24)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5682172A (en) * | 1994-12-30 | 1997-10-28 | Forte Technologies, Inc. | Headset for presenting video and audio signals to a wearer |
US5584073A (en) * | 1995-04-12 | 1996-12-17 | Gentex Corporation | Integrated helmet system |
FR2833069A1 (en) * | 2001-08-08 | 2003-06-06 | Tsl Sport Equipment | Forehead lamp for sport or work has bulb-holder and reflector-holder that can be adjusted in two different focus positions and off position |
CN204613518U (en) * | 2015-05-27 | 2015-09-02 | 山东神戎电子股份有限公司 | A kind of binocular helmet formula night vision maintenance recording geometry |
WO2017133564A1 (en) * | 2016-02-03 | 2017-08-10 | 上海群英软件有限公司 | Head-mounted reality-augmented smart display device |
WO2017173943A1 (en) * | 2016-04-06 | 2017-10-12 | 成都虚拟世界科技有限公司 | Head-mounted structure and head-mounted device |
CN109061882A (en) * | 2018-09-21 | 2018-12-21 | 歌尔科技有限公司 | A kind of helmet |
US20210303025A1 (en) * | 2018-09-28 | 2021-09-30 | Goertek Inc. | Head-mounted display device |
US11300999B1 (en) * | 2019-05-23 | 2022-04-12 | Facebook Technologies, Llc | Artificial-reality headset assembly with back-of-the-head battery |
CN110543019A (en) * | 2019-07-18 | 2019-12-06 | 中国人民解放军军事科学院国防科技创新研究院 | modular augmented reality military helmet display |
WO2021061410A1 (en) * | 2019-09-23 | 2021-04-01 | Apple Inc. | Electronic devices with finger sensors |
WO2021103950A1 (en) * | 2019-11-29 | 2021-06-03 | Oppo广东移动通信有限公司 | Display module and augmented reality glasses |
CN115244361A (en) * | 2020-01-22 | 2022-10-25 | 光子医疗公司 | Open-field multi-modal calibration digital magnifier with depth sensing |
CN212814623U (en) * | 2020-06-27 | 2021-03-30 | 迅捷安消防及救援科技(深圳)有限公司 | Intelligent fire fighting helmet |
CN111640340A (en) * | 2020-07-02 | 2020-09-08 | 广东海洋大学 | Physical ocean virtual simulation experiment teaching device |
CN212994760U (en) * | 2020-09-21 | 2021-04-20 | 华北理工大学 | Intelligent helmet |
CN214372944U (en) * | 2021-01-26 | 2021-10-08 | 上海巨哥科技股份有限公司 | Head-wearing thermal imager |
WO2022160014A1 (en) * | 2021-01-29 | 2022-08-04 | ResMed Pty Ltd | A head mounted display system with positioning, stabilising and interfacing structures |
CN114879362A (en) * | 2021-01-29 | 2022-08-09 | 瑞思迈私人有限公司 | Positioning, stabilizing and interfacing structure and system incorporating same |
CN215813551U (en) * | 2021-07-13 | 2022-02-11 | 国网浙江宁海县供电有限公司 | Wearable mixed reality infrared intelligent glasses |
CN114545630A (en) * | 2022-01-24 | 2022-05-27 | 李湘裔 | Laser scanning type reflection spectrum formation of image AR glasses optical system |
CN114647087A (en) * | 2022-03-14 | 2022-06-21 | Oppo广东移动通信有限公司 | Forehead supporting structure and head display equipment |
CN115205914A (en) * | 2022-07-18 | 2022-10-18 | 中国兵器装备集团自动化研究所有限公司 | Identity recognition system and wearable equipment based on vein formation of image |
CN115891645A (en) * | 2022-12-13 | 2023-04-04 | 中国兵器装备集团自动化研究所有限公司 | Target observing and aiming system and motor vehicle |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN116520997A (en) * | 2023-07-05 | 2023-08-01 | 中国兵器装备集团自动化研究所有限公司 | Mixed reality enhanced display and interaction system |
CN116520997B (en) * | 2023-07-05 | 2023-09-26 | 中国兵器装备集团自动化研究所有限公司 | Mixed reality enhanced display and interaction system |
CN118091946A (en) * | 2024-03-09 | 2024-05-28 | 山东泰视智能技术有限公司 | Universal intelligent glasses with detection and protection functions |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11733456B2 (en) | Eyepiece for virtual, augmented, or mixed reality systems | |
US11754840B2 (en) | Eye-imaging apparatus using diffractive optical elements | |
CN109073901B (en) | Binocular wide-field-of-view (WFOV) wearable optical display system | |
CN102402005B (en) | Bifocal-surface monocular stereo helmet-mounted display device with free-form surfaces | |
CN116203733A (en) | Mixed reality wearing equipment | |
US11947121B2 (en) | Waveguides with integrated optical elements and methods of making the same | |
CN208818941U (en) | Ray machine mould group and AR head-wearing display device | |
WO2023244271A1 (en) | Optical layers to improve performance of eyepieces for use with virtual and augmented reality display systems |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | |
SE01 | Entry into force of request for substantive examination | |
RJ01 | Rejection of invention patent application after publication | Application publication date: 20230602 |