CN112099616A - Method and system for realizing 360-degree full-view angle of body-control AR glasses and AR glasses - Google Patents
Info
- Publication number
- CN112099616A (application CN201910523461.XA)
- Authority
- CN
- China
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/013—Eye tracking input arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/01—Indexing scheme relating to G06F3/01
- G06F2203/012—Walk-in-place systems for allowing a user to walk in a virtual environment while constraining him to a given position in the physical environment
Abstract
The invention relates to a method for realizing a 360-degree full view angle for body-controlled AR glasses, comprising the following steps: dividing the 240-degree rear view angle area into one or more partitions; collecting real-scene data from each partition and imaging it in real time according to the collected data; and transmitting the image of the corresponding partition to the human eye through a waveguide optical module according to a control instruction input by the user. The invention supplements the 240-degree rear view angle, allowing a rear-view image to be added to the front view of the human eye and making more application experiences possible; meanwhile, the user can respond interactively to the current scene, obtaining the image of the partition corresponding to an instruction by inputting a control instruction.
Description
Technical Field
The invention relates to the technical field of AR (augmented reality) glasses, and in particular to a method and system for realizing a 360-degree full view angle for body-controlled AR glasses, and to the AR glasses themselves.
Background
AR glasses currently on the market mainly target a visible region of about 180 degrees; strictly speaking, the region shared by both eyes is about 120 degrees. The most comfortable visual zone spans roughly 30 degrees to each side, beyond which lies the visual fatigue zone. Under normal conditions the eyes prefer to stay in the comfort zone, so when an angle adjustment is needed that cannot be made within the comfort zone of the eyes, the head must be turned instead; this gives a poor experience and easily causes neck fatigue.
Disclosure of Invention
The technical problem to be solved by the present invention is to provide, in view of the above defects of the prior art, a method for realizing a 360-degree full view angle for body-controlled AR glasses;
a 360-degree full-view-angle system for body-controlled AR glasses and the AR glasses themselves are further provided.
The technical scheme adopted by the invention for solving the technical problems is as follows:
a method for realizing 360-degree full view angle of body control AR glasses is constructed, and the method comprises the following steps:
the method comprises the following steps: dividing the rear side view 240 degree angle area into one or more zones;
step two: collecting the live-action data of each subarea, and imaging in real time according to the collected live-action data;
step three: and transmitting the imaging of the corresponding subarea to human eyes through the waveguide optical module according to a control command input by a user.
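The three steps can be sketched as a minimal Python pipeline. This is only an illustration of the idea, not the patented implementation: all names (`Partition`, `divide_rear_view`, `capture`, `select_and_display`) and the equal-width split are assumptions.

```python
# Illustrative sketch of the three-step method; all names are assumptions.
from dataclasses import dataclass

@dataclass
class Partition:
    index: int
    start_deg: float   # start of the arc covered by this partition
    end_deg: float     # end of the arc

def divide_rear_view(num_partitions: int, rear_arc_deg: float = 240.0) -> list:
    """Step one: split the 240-degree rear arc into equal partitions."""
    width = rear_arc_deg / num_partitions
    return [Partition(i, i * width, (i + 1) * width) for i in range(num_partitions)]

def capture(partition: Partition) -> bytes:
    """Step two (stub): the camera sensor assigned to the partition returns a frame."""
    return f"frame[{partition.start_deg:.0f}-{partition.end_deg:.0f} deg]".encode()

def select_and_display(partitions: list, instruction: int) -> bytes:
    """Step three: route the selected partition's image toward the waveguide module."""
    frame = capture(partitions[instruction])
    return frame  # in the real system this would drive the micro display

zones = divide_rear_view(2)
print(select_and_display(zones, 1))  # → b'frame[120-240 deg]'
```

With two partitions each zone covers 120 degrees, so a user instruction simply selects which rear arc is imaged at any moment.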
In the method for realizing the 360-degree full view angle of body-controlled AR glasses, in step one, the number of partitions is 1 to 5.
In the method, the number of partitions may be two, with equal angles; miniature cameras are mounted at the 60-degree positions of the two partitions to collect data.
In the method, in step two, the collected real-scene data is sent to a processor, and the processor controls the imaging output of a micro display.
In the method, in step three, the control instruction is input through voice control, somatosensory control, touch control, key operation control or remote control.
A 360-degree full-view-angle system for body-controlled AR glasses is also disclosed, comprising camera sensors, a processor unit, an imaging unit, a waveguide optical module and a user instruction encoder;
the camera sensors correspond one-to-one to the partitions and are used for collecting real-scene data;
the user instruction encoder is used for recognizing a control instruction input by the user and sending it to the processor unit;
and the processor unit is used for receiving the real-scene data and sending it to the imaging unit, and for receiving and decoding the control instruction recognized by the user instruction encoder and controlling the output of the imaging unit according to the decoded instruction.
In the 360-degree full-view-angle system of body-controlled AR glasses, the number of partitions is 1 to 5.
In the system, the number of partitions may be two, with equal angles; camera sensors are mounted at the 60-degree positions of the two partitions for data acquisition.
In the system, the user control instruction recognized by the user instruction encoder is voice control, somatosensory control, touch control, key operation control or remote control.
AR glasses provided with the above 360-degree full-view-angle system of body-controlled AR glasses are further disclosed.
The beneficial effects of the invention are as follows: the invention supplements the 240-degree rear view angle. The rear arc is divided into one or more partitions, real-scene data is collected from each partition and imaged in real time, and the image is transmitted to the human eye through the waveguide optical module. The eye thus receives a real image from the front and a virtual image from the rear, completing the AR imaging, so that a rear-view image is added to the front view and more application experiences become possible. The user can respond interactively to the current scene, obtaining the image of the partition corresponding to an instruction by inputting a control instruction.
Drawings
In order to more clearly illustrate the embodiments of the present invention and the technical solutions of the prior art, the invention will be further described with reference to the accompanying drawings and embodiments. The drawings in the following description show only some embodiments of the invention; for those skilled in the art, other drawings can be obtained from them without inventive effort:
FIG. 1 is a flowchart of a method for realizing a 360-degree full view angle for body-controlled AR glasses according to a preferred embodiment of the present invention;
FIG. 2 is a schematic block diagram of the 360-degree full-view-angle system of body-controlled AR glasses according to a preferred embodiment of the present invention;
FIG. 3 is a schematic diagram of the 360-degree full-view-angle system of body-controlled AR glasses divided into two partitions according to the preferred embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the following will clearly and completely describe the technical solutions in the embodiments of the present invention, and it is obvious that the described embodiments are some embodiments of the present invention, but not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments of the present invention without inventive step, are within the scope of the present invention.
The method for realizing a 360-degree full view angle of body-controlled AR glasses in a preferred embodiment of the invention is shown in FIG. 1 and proceeds as follows:
S01: dividing the 240-degree rear view angle area into one or more partitions;
S02: collecting real-scene data from each partition and imaging it in real time according to the collected data;
S03: transmitting the image of the corresponding partition to the human eye through a waveguide optical module according to a control instruction input by the user.
The invention supplements the 240-degree rear view angle: the rear arc is divided into one or more partitions, real-scene data is collected from each partition and imaged in real time, and the image is transmitted to the human eye through the waveguide optical module. The eye thus receives a real image from the front and a virtual image from the rear, completing the AR imaging, so that a rear-view image is added to the front view and more application experiences become possible; the user can respond interactively to the current scene, obtaining the image of the partition corresponding to an instruction by inputting a control instruction.
Preferably, in step one, the number of partitions is 1 to 5; the number of partitions can be adjusted flexibly according to the actual acquisition effect.
A preferred embodiment is as follows: the number of partitions is two, with equal angles, and miniature cameras are mounted at the 60-degree position of each partition for data acquisition. This embodiment is only exemplary, and various changes may be made without departing from the scope of the present application; the acquisition angle can be adjusted flexibly according to the actual acquisition effect.
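The geometry behind this embodiment can be checked with a few lines of arithmetic: splitting the 240-degree rear arc into two equal partitions gives each camera a 120-degree arc, and the "60-degree position" is the midpoint of that arc, so each camera's optical axis bisects the region it must cover. The variable names below are illustrative assumptions.

```python
# Sketch of the two-partition camera placement (assumed equal split of the rear arc).
REAR_ARC = 240.0
NUM_PARTITIONS = 2
width = REAR_ARC / NUM_PARTITIONS   # each partition covers 120 degrees

# A camera sits at the 60-degree position of its partition, i.e. its midpoint,
# so its optical axis bisects the arc it covers.
camera_headings = [i * width + width / 2 for i in range(NUM_PARTITIONS)]
print(camera_headings)  # → [60.0, 180.0]
```

Measured within the rear arc, the two cameras thus point at 60 and 180 degrees, each needing roughly a 120-degree field of view to cover its partition without gaps.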
Preferably, in step two, the collected real-scene data is sent to a processor, and the processor controls the imaging output of the micro display; the processor may optionally optimize the real-scene data using existing image and video processing methods, making it convenient for the user to issue control instructions.
Preferably, in step three, the control instruction is input through voice control, somatosensory control, touch control, key operation control or remote control; other existing control modes may also be adopted, all of which fall within the protection scope of the invention.
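Whichever modality is used, the instruction ultimately has to name a partition. A hypothetical decoder might normalise every input mode to a partition index before it reaches the processor unit; the event schema and function name here are assumptions for illustration only.

```python
# Hypothetical user-instruction decoder: each input modality is normalised to a
# partition index before reaching the processor unit. The event schema is assumed.
SUPPORTED_MODES = ("voice", "somatosensory", "touch", "key", "remote")

def decode_instruction(event: dict) -> int:
    """Return the partition index requested by a user event."""
    mode = event["mode"]
    if mode not in SUPPORTED_MODES:
        raise ValueError(f"unsupported control mode: {mode}")
    return int(event["partition"])

print(decode_instruction({"mode": "voice", "partition": 1}))  # → 1
```

Keeping the decoder separate from the processor is what allows new control modes to be added without touching the imaging path, which matches the patent's claim that "other existing control modes can also be adopted".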
A 360-degree full-view-angle system of body-controlled AR glasses, implemented according to the above method and shown in FIG. 2 (see also FIG. 3), comprises a camera sensor 1, a processor unit 2, an imaging unit 3, a waveguide optical module 4 and a user instruction encoder 5.
The camera sensors 1 correspond one-to-one to the partitions and are used for collecting real-scene data.
The user instruction encoder 5 is used for recognizing a control instruction input by the user and sending it to the processor unit 2.
The processor unit 2 is used for receiving the real-scene data and sending it to the imaging unit 3, and for receiving and decoding the control instruction recognized by the user instruction encoder 5 and controlling the output of the imaging unit 3 according to the decoded instruction.
the invention supplements the rear side view angle of 240 degrees, divides the rear side view 240 degree angle area into one or more subareas, collects the live-action data of each subarea, and images in real time according to the collected live-action data, and the images are transmitted to human eyes 6 through a waveguide optical module, and the human eyes 6 can obtain real images from the front and virtual images from the rear, thereby completing AR imaging, so that the front visual angle image of the human eyes 6 is added with a rear visual angle image, and more application experiences become possible; the user can make an interactive response according to the current scene, and the imaging of the subarea corresponding to the instruction is obtained by inputting the control instruction.
Preferably, the number of partitions is 1 to 5; the number of partitions can be adjusted flexibly according to the actual acquisition effect.
Preferably, the number of partitions is two, with equal angles, and camera sensors are mounted at the 60-degree position of each partition for data acquisition. This embodiment is only exemplary, and various changes may be made without departing from the scope of the present application; the acquisition angle can be adjusted flexibly according to the actual acquisition effect.
Preferably, the user control instruction recognized by the user instruction encoder is voice control, somatosensory control, touch control, key operation control or remote control; other existing control modes may also be adopted, all of which fall within the protection scope of the invention.
The AR glasses are provided with the above 360-degree full-view-angle system of body-controlled AR glasses.
It will be understood that modifications and variations can be made by persons skilled in the art in light of the above teachings and all such modifications and variations are intended to be included within the scope of the invention as defined in the appended claims.
Claims (10)
1. A method for realizing a 360-degree full view angle of body-controlled AR glasses, characterized by comprising the following steps:
step one: dividing the 240-degree rear view angle area into one or more partitions;
step two: collecting real-scene data from each partition and imaging it in real time according to the collected data;
step three: transmitting the image of the corresponding partition to the human eye through the waveguide optical module according to a control instruction input by the user.
2. The method for realizing a 360-degree full view angle of body-controlled AR glasses according to claim 1, characterized in that in step one the number of partitions is 1 to 5.
3. The method for realizing a 360-degree full view angle of body-controlled AR glasses according to claim 2, characterized in that the number of partitions is two, with equal angles, and miniature cameras are mounted at the 60-degree positions of the two partitions to collect data.
4. The method for realizing a 360-degree full view angle of body-controlled AR glasses according to claim 1, characterized in that in step two the collected real-scene data is sent to a processor, and the processor controls the imaging output of a micro display.
5. The method for realizing a 360-degree full view angle of body-controlled AR glasses according to claim 1, characterized in that in step three the control instruction is voice control, somatosensory control, touch control, key operation control or remote control.
6. A 360-degree full-view-angle system of body-controlled AR glasses, implementing the method for realizing a 360-degree full view angle of body-controlled AR glasses according to any one of claims 1 to 5, characterized by comprising camera sensors, a processor unit, an imaging unit, a waveguide optical module and a user instruction encoder;
the camera sensors correspond one-to-one to the partitions and are used for collecting real-scene data;
the user instruction encoder is used for recognizing a control instruction input by the user and sending it to the processor unit;
and the processor unit is used for receiving the real-scene data and sending it to the imaging unit, and for receiving and decoding the control instruction recognized by the user instruction encoder and controlling the output of the imaging unit according to the decoded instruction.
7. The 360-degree full-view-angle system of body-controlled AR glasses according to claim 6, characterized in that the number of partitions is 1 to 5.
8. The 360-degree full-view-angle system of body-controlled AR glasses according to claim 7, characterized in that the number of partitions is two, with equal angles, and camera sensors are mounted at the 60-degree positions of the two partitions for data acquisition.
9. The 360-degree full-view-angle system of body-controlled AR glasses according to claim 6, characterized in that the user control instruction recognized by the user instruction encoder is voice control, somatosensory control, touch control, key operation control or remote control.
10. AR glasses, characterized in that the AR glasses are provided with the 360-degree full-view-angle system of body-controlled AR glasses according to any one of claims 6 to 9.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910523461.XA CN112099616A (en) | 2019-06-17 | 2019-06-17 | Method and system for realizing 360-degree full-view angle of body-control AR glasses and AR glasses |
Publications (1)
Publication Number | Publication Date |
---|---|
CN112099616A true CN112099616A (en) | 2020-12-18 |
Family
ID=73748558
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910523461.XA Pending CN112099616A (en) | 2019-06-17 | 2019-06-17 | Method and system for realizing 360-degree full-view angle of body-control AR glasses and AR glasses |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN112099616A (en) |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107960121A (en) * | 2016-01-03 | 2018-04-24 | 人眼科技有限公司 | Frame is spliced into panoramic frame |
CN108008873A (en) * | 2017-11-10 | 2018-05-08 | 亮风台(上海)信息科技有限公司 | A kind of operation method of user interface of head-mounted display apparatus |
CN108663812A (en) * | 2018-04-04 | 2018-10-16 | 上海理湃光晶技术有限公司 | A kind of amblyopia visual aid based on AR glasses |
Legal Events
Code | Title | Description
---|---|---
PB01 | Publication |
SE01 | Entry into force of request for substantive examination |
RJ01 | Rejection of invention patent application after publication | Application publication date: 20201218