CN107193127A - Imaging method and wearable device - Google Patents

Imaging method and wearable device Download PDF

Info

Publication number
CN107193127A
CN107193127A (application CN201710503049.2A)
Authority
CN
China
Prior art keywords
virtual image
user
optical path structure
wearable
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201710503049.2A
Other languages
Chinese (zh)
Inventor
郭宁 (Guo Ning)
史俊 (Shi Jun)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Asu Tech Co Ltd
Original Assignee
Beijing Asu Tech Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Asu Tech Co Ltd filed Critical Beijing Asu Tech Co Ltd
Priority to CN201710503049.2A
Publication of CN107193127A
Legal status: Pending

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • G02B27/0172Head mounted characterised by optical features
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0179Display position adjusting means not related to the information to be displayed
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0179Display position adjusting means not related to the information to be displayed
    • G02B2027/0187Display position adjusting means not related to the information to be displayed slaved to motion of at least a part of the body of the user, e.g. head, eye

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)

Abstract

Embodiments of the present invention provide an imaging method and a wearable device, relating to the field of augmented reality. The wearable device includes a housing, a micro-electro-mechanical (MEMS) light engine, and an optical path structure, the MEMS light engine and the optical path structure being arranged inside the housing. The method includes: the MEMS light engine obtains a virtual image to be presented and projects the virtual image onto the optical path structure; the optical path structure refracts and/or reflects the virtual image to a preset position, where the preset position is a specific position on the retina of the user wearing the wearable device. With the scheme provided by the embodiments of the present invention, the quality of the projected virtual image can be improved and the user experience enhanced.

Description

Imaging method and wearable device
Technical field
The present invention relates to the field of augmented reality, and in particular to an imaging method and a wearable device.
Background technology
Augmented reality (AR) is a technology that integrates real-world images with virtual images. Using computers and related technologies, physical information that would otherwise be difficult to experience in the real world is simulated to obtain virtual images; these virtual images are then applied to the real world, so that real-world images and virtual images are superimposed in real time onto the same picture or space and perceived by the human senses, achieving a sensory experience that goes beyond reality.
Existing AR technology uses digital light processing to digitally process the virtual image and then projects the processed virtual image onto the human retina, superimposing the real-world image and the virtual image at the eye. However, owing to inherent shortcomings of digital light processing, the projected virtual image may be insufficiently sharp, low in color saturation, and subject to color separation, resulting in low quality of the projected virtual image.
Summary of the invention
The purpose of the embodiments of the present invention is to provide an imaging method and a wearable device that improve the quality of the projected virtual image and enhance the user experience. The specific technical scheme is as follows:
In a first aspect, an imaging method is provided. The method is applied to a wearable device that includes a housing, a MEMS light engine, and an optical path structure, the MEMS light engine and the optical path structure being arranged inside the housing. The method includes:
The MEMS light engine obtains a virtual image to be presented and projects the virtual image onto the optical path structure;
The optical path structure refracts and/or reflects the virtual image to a preset position, where the preset position is a specific position on the retina of the user wearing the wearable device.
Optionally, projecting the virtual image onto the optical path structure includes:
Obtaining a first projection angle of the virtual image, and projecting the virtual image onto the optical path structure at the first projection angle, so that the position at which the virtual image is presented on the retina of the user matches a first position of a target real object within the user's current field of view.
Optionally, the wearable device further includes a data acquisition device and a drive device, both arranged inside the housing, the drive device being electrically connected to the data acquisition device and to the MEMS light engine respectively. After the virtual image has been projected onto the optical path structure at the first projection angle, the method further includes:
When the data acquisition device detects that the user's eyes have rotated and/or the user's head has moved, collecting eye-rotation data and head-movement data of the user;
The data acquisition device determines, from the eye-rotation data and the head-movement data, a second position of the target real object within the field of view after the movement;
The drive device, according to a second projection angle corresponding to the second position, drives the MEMS light engine to project the virtual image onto the optical path structure at the second projection angle, so that the position at which the virtual image is presented on the retina of the user matches the second position of the target real object.
Optionally, the data acquisition device includes a camera and a gyroscope, and collecting the eye-rotation data and head-movement data of the user includes:
The camera collects the eye-rotation data of the user;
The gyroscope collects the head-movement data of the user.
Optionally, the data acquisition device further includes a computing module, whose inputs are electrically connected to the camera and the gyroscope respectively and whose output is connected to the drive device.
Optionally, the MEMS light engine includes a laser light source and a lens, and projecting the virtual image onto the optical path structure includes:
The laser light source emits the combined laser light corresponding to the virtual image;
The lens reflects the combined laser light onto the optical path structure.
Optionally, the wearable device further includes a wearing component for the user to wear, the wearing component being fixedly connected to the housing.
In a second aspect, a wearable device is provided. The wearable device includes a housing, a MEMS light engine, and an optical path structure, the MEMS light engine and the optical path structure being arranged inside the housing, wherein:
The MEMS light engine is configured to obtain a virtual image to be presented and project the virtual image onto the optical path structure;
The optical path structure is configured to refract and/or reflect the virtual image to a preset position, where the preset position is a specific position on the retina of the user wearing the wearable device.
Optionally, the MEMS light engine is configured to obtain a first projection angle of the virtual image and project the virtual image onto the optical path structure at the first projection angle, so that the position at which the virtual image is presented on the retina of the user matches the first position of the target real object within the user's current field of view.
Optionally, the wearable device further includes a data acquisition device and a drive device, both arranged inside the housing, the drive device being electrically connected to the data acquisition device and to the MEMS light engine respectively, wherein:
The data acquisition device is configured to collect eye-rotation data and head-movement data of the user when it detects that the user's eyes have rotated and/or the user's head has moved, and to determine from the eye-rotation data and the head-movement data a second position of the target real object within the field of view after the movement;
The drive device is configured to drive the MEMS light engine, according to a second projection angle corresponding to the second position, to project the virtual image onto the optical path structure at the second projection angle, so that the position at which the virtual image is presented on the retina of the user matches the second position of the target real object.
With the imaging method and wearable device provided by embodiments of the present invention, the MEMS light engine can project the virtual image to be presented onto the optical path structure, and the optical path structure refracts and/or reflects the virtual image to a specific position on the retina of the user wearing the wearable device. Projecting with a MEMS light engine can improve the quality of the projected virtual image and enhance the user experience. Of course, implementing any product or method of the present invention does not necessarily require achieving all of the above advantages at the same time.
Brief description of the drawings
To describe the technical schemes of the embodiments of the present invention or of the prior art more clearly, the accompanying drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below show only some embodiments of the invention; those of ordinary skill in the art can obtain other drawings from them without creative effort.
Fig. 1 is a schematic flowchart of the imaging method provided by an embodiment of the present invention;
Fig. 2 is a schematic diagram of the working principle of a wearable device provided by an embodiment of the present invention;
Fig. 3 is a first scene schematic diagram of imaging provided by an embodiment of the present invention;
Fig. 4 is a second scene schematic diagram of imaging provided by an embodiment of the present invention;
Fig. 5 is a schematic structural diagram of a wearable device provided by an embodiment of the present invention.
Detailed description of the embodiments
The technical schemes in the embodiments of the present invention are described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by those of ordinary skill in the art on the basis of the embodiments of the present invention without creative effort fall within the scope of protection of the present invention.
Fig. 1 is a schematic flowchart of the imaging method provided by an embodiment of the present invention, including:
S101: The MEMS light engine obtains a virtual image to be presented and projects the virtual image onto the optical path structure.
In implementation, the wearable device may include a housing, a MEMS light engine, and an optical path structure. The MEMS light engine may be provided with a micro-electro-mechanical system (MEMS), a laser light source, and a lens. The laser light source can emit combined light modulated from the three primary colors (R, G, B) corresponding to the virtual image to be presented; the MEMS can control the lens to reflect the combined light emitted by the laser source onto the optical path structure, progressively scanning out a complete frame (i.e., the virtual image) point by point, thereby projecting the virtual image. Compared with a conventional projector, the MEMS light engine has many advantages: small size, sharp imaging, high color fidelity, high saturation, and no need for focusing. Apart from the wearing component, the other parts of the wearable device can be integrated into the housing. For example, the above MEMS light engine and optical path structure can be arranged inside the housing. In addition, parts responsible for computation can be integrated into the housing, such as a CPU (central processing unit), a GPU (graphics processing unit), and memory. When the computational load is large, the parts responsible for computation can also be realized by other equipment, for example by connecting a PC or a smart device.
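The point-by-point raster scan just described can be sketched as follows. This is a minimal illustrative model, not device firmware: `set_mirror_angles` and `set_laser_rgb` stand in for hypothetical driver calls, and the field-of-view values are assumptions.

```python
# Minimal sketch of the point-by-point raster scan: the MEMS mirror deflects
# the combined RGB laser beam across the field while the laser is modulated
# with the pixel's three primary colors.

def scan_frame(frame, set_mirror_angles, set_laser_rgb,
               h_fov_deg=40.0, v_fov_deg=22.5):
    """Scan one frame (a rows x cols grid of (R, G, B) values in 0..1)
    point by point; returns the mirror angles visited, for inspection."""
    rows = len(frame)
    cols = len(frame[0])
    commands = []
    for r in range(rows):
        for c in range(cols):
            # Map the pixel grid onto mirror deflection angles.
            h_angle = (c / max(cols - 1, 1) - 0.5) * h_fov_deg
            v_angle = (r / max(rows - 1, 1) - 0.5) * v_fov_deg
            set_mirror_angles(h_angle, v_angle)
            red, green, blue = frame[r][c]
            set_laser_rgb(red, green, blue)  # modulate the three primaries
            commands.append((h_angle, v_angle))
    return commands
```

Because each pixel is painted directly by a scanned laser beam rather than imaged through a lens system, no focusing step appears anywhere in the loop, which matches the "no need for focusing" property claimed above.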
Optionally, the wearable device can be a head-mounted wearable device. In that case, the wearable device can also include a wearing component, which allows the user to wear and secure the device; a head-mounted wearable device can often be designed in the shape of glasses, headphones, or a helmet to make it easy to wear.
After the user has put on the wearable device, it can be powered on through a physical or virtual button on the device. If the device has no signal input at this point, the user will see an interface indicating that there is currently no signal, or will only see the target real objects within the current field of view. The wearable device can establish a data connection with an input device, which may be a smart device such as a mobile phone or a tablet computer, used to transmit the virtual image to be presented to the wearable device for display. The input device can store in advance the virtual images corresponding to several scenes; the user can select the scene to experience on the input device, which then transmits the corresponding virtual image to the MEMS light engine. After the MEMS light engine obtains the virtual image, it can project it onto the optical path structure so that the virtual image is displayed.
For example, with the wearable device connected to a smartphone, the user can select a game scene to experience on the phone; the phone then displays the virtual image corresponding to that scene and sends it to the wearable device, which displays it. The user then sees the virtual image superimposed on real-world objects and experiences a more realistic game scene. During use, the user can change the presented virtual image at any time by changing the content played on the phone. While the user's body moves, the wearable device can adjust the imaging position of the virtual image according to the movement, giving the user an immersive feeling.
In the embodiments of the present invention, the virtual image to be presented can be a previously generated image, and the optical path structure can be a light-refracting and/or light-reflecting device that assists the MEMS light engine in changing the imaging position of the virtual image.
S102: The optical path structure refracts and/or reflects the virtual image to a preset position.
Here, the preset position is a specific position on the retina of the user wearing the wearable device.
In the embodiments of the invention, the optical path structure refracts and/or reflects the virtual image onto the retina of the user wearing the wearable device, achieving the effect of superimposing the virtual image on the target real object the user is watching.
As can be seen from the above, in the embodiments of the present invention the virtual image to be presented is obtained by the MEMS light engine and projected onto the optical path structure, which then refracts and/or reflects the virtual image to the preset position. Compared with the prior art, projecting the virtual image with a MEMS light engine yields a sharper image with higher fidelity and higher saturation, improving the quality of the projected virtual image; moreover, the MEMS light engine is small and needs no focusing during imaging, which can enhance the user experience.
In a particular embodiment of the present invention, the MEMS light engine can also obtain a first projection angle of the virtual image and project the virtual image onto the optical path structure at the first projection angle, so that the position at which the virtual image is presented on the retina of the user matches the first position of the target real object within the user's current field of view.
Here, the first projection angle can be an initial projection angle preset when the wearable device is first started, or a projection angle calculated from the user's eye-rotation data and head-movement data the last time the user's eyes rotated and/or head moved, so that the position at which the virtual image is presented on the retina of the user matches the first position of the target real object within the current field of view. In this way, the user visually perceives the virtual image as an object in the real scene, realizing augmented reality. The process of calculating the projection angle from the user's eye-rotation data and head-movement data is described in detail later; the first position can be the relative position of the target real object within the user's current field of view.
As can be seen from the above, in the embodiments of the present invention the MEMS light engine can obtain the first projection angle and project the virtual image onto the optical path structure at that angle, so that the position at which the virtual image is presented on the retina of the user matches the first position of the target real object within the user's current field of view. This matches the position of the virtual image to the target real object the user is watching, enhancing the user's experience.
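The matching of a position in the field of view to a projection angle can be illustrated with a simple linear mapping. Both the formula and the field-of-view values are assumptions for illustration; the patent does not specify how the angle is computed.

```python
# Illustrative sketch: map the target object's relative position in the
# field of view to the projection angle that places the virtual image there.

def projection_angle(rel_x, rel_y, h_fov_deg=40.0, v_fov_deg=22.5):
    """rel_x, rel_y: target position in the field of view, in [0, 1],
    with (0.5, 0.5) at the center. Returns (horizontal, vertical)
    projection angles in degrees."""
    return (rel_x - 0.5) * h_fov_deg, (rel_y - 0.5) * v_fov_deg
```

A preset first projection angle at startup corresponds to evaluating this mapping once for the initial first position; later recomputations for a moved field of view reuse the same mapping with an updated position.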
In a particular embodiment of the present invention, the wearable device can also include a data acquisition device and a drive device. The data acquisition device and the drive device can be arranged inside the housing, the drive device being electrically connected to the data acquisition device and the MEMS light engine respectively.
After the virtual image has been projected onto the optical path structure at the first projection angle, the wearable device can also adjust the imaging position of the virtual image in real time according to the rotation of the user's eyes and/or the movement of the user's head. The specific process can be as follows: when the data acquisition device detects that the user's eyes have rotated and/or the user's head has moved, it collects the user's eye-rotation data and head-movement data; from these data the data acquisition device determines the second position of the target real object within the field of view after the movement; the drive device, according to the second projection angle corresponding to the second position, drives the MEMS light engine to project the virtual image onto the optical path structure at the second projection angle, so that the position at which the virtual image is presented on the retina of the user matches the second position of the target real object.
In implementation, the data acquisition device and the drive device can be integrated into the housing of the wearable device; one end of the drive device can be electrically connected to the data acquisition device and the other end to the MEMS light engine, and the drive device can also be electrically connected to the optical path structure. The data acquisition device can collect the user's eye-rotation data and head-movement data in real time and store them for the subsequent calculation of the projection angle. The drive device can be a device independent of the MEMS light engine, used to drive the light engine to project at a preset or calculated projection angle; it can also drive the optical path structure to change its refraction and/or reflection angle. The data acquisition device can transmit data to the drive device through the electrical connection; the second projection angle can be the projection angle of the MEMS light engine calculated from the second position.
Optionally, the data acquisition device can include a camera and a gyroscope; accordingly, collecting the user's eye-rotation data and head-movement data can proceed as follows: the camera collects the eye-rotation data of the user, and the gyroscope collects the head-movement data of the user.
In implementation, when the data acquisition device detects that the user's eyes have rotated and/or head has moved, the camera can detect the rotation of the eyeballs and obtain eye-rotation data (such as the angle of eye rotation), while the gyroscope can detect the movement of the head and obtain head-movement data (such as the displacement of the head). From the eye-rotation data and head-movement data, the data acquisition device can determine the change of the user's field of view and obtain its displacement, and from that displacement determine the relative position of the target real object within the changed field of view (i.e., the second position). From the second position it can then calculate the projection angle required by the MEMS light engine (i.e., the second projection angle) and transfer it to the drive device, which drives the MEMS light engine to project at that angle. In addition, the data acquisition device can calculate the refraction and/or reflection angle of the optical path structure from the second position and transfer these angles to the drive device, which drives the optical path structure to refract and/or reflect accordingly, cooperating with the MEMS light engine to complete the presentation of the image, so that after the user's eyes rotate and/or head moves, the relative position of the virtual image and the target real object within the user's field of view remains unchanged.
Optionally, the data acquisition device can also include a computing module, whose inputs can be electrically connected to the camera and the gyroscope respectively and whose output is connected to the drive device. Fig. 2 is a schematic diagram of the working principle of a wearable device applying the above imaging method provided by an embodiment of the present invention.
In implementation, after the camera and the gyroscope have collected the user's eye-rotation data and head-movement data, these data can be sent to the computing module, which can be one of the above-mentioned parts responsible for computation, such as a CPU, a GPU, or memory. The computing module can determine the field-of-view change caused by the eye-rotation data, obtaining a first displacement of the field of view, and the field-of-view change caused by the head-movement data, obtaining a second displacement. From the first displacement and the second displacement it determines the overall change of the user's field of view, calculates from that change the relative position of the target real object within the field of view after the movement (i.e., the second position), and from the second position calculates the projection angle required by the MEMS light engine (i.e., the second projection angle). It then outputs the second projection angle to the drive device, so that the drive device drives the MEMS light engine to project at that angle. The computing module can also calculate the refraction and/or reflection angle of the optical path structure from the second position and output these angles to the drive device, so that the drive device drives the optical path structure to refract and/or reflect accordingly. In addition, the computing module can also be a module outside the data acquisition device; this embodiment does not limit this.
As can be seen from the above, in the embodiments of the present invention, collecting the user's eye-rotation data with a camera and the head-movement data with a gyroscope yields accurate data, from which the computing module calculates the precise change in the position of the target real object within the user's field of view and obtains the projection angle of the MEMS light engine and the refraction and/or reflection angle of the optical path structure. This changes the position at which the virtual image is presented on the retina of the user according to the change of the user's field of view, achieving a realistic augmentation effect and enhancing the user experience.
By way of example, Fig. 3 is the first scene schematic diagram of imaging provided by an embodiment of the present invention. C1 represents the user's current field of view, B represents the target real object, the figure shows their relative positions — the first position of target real object B within the user's current field of view C1 — and A is the projected virtual image. When virtual image A is projected onto the retina of the user at the first projection angle, it is superimposed on target real object B within the user's current field of view C1, visually achieving the effect of virtual image A being placed on target real object B. A virtual image A displayed on the basis of augmented reality can thus produce a visual effect similar to that of a real object.
The statement in the previous embodiment that the position at which the virtual image is presented on the retina of the user matches the first position of the target real object means, visually, the specific positional relation of virtual image A to target real object B at the first position here, i.e., that virtual image A is placed on target real object B.
Fig. 4 is the second scene schematic diagram of imaging provided by an embodiment of the present invention, where C2 represents the user's field of view after the user's eyes have rotated and/or head has moved. That is, when the user's eyes rotate and/or head moves, the user's field of view changes, while target real object B does not change its physical position; the position of target real object B in the field of view therefore necessarily changes as the user's view moves, appearing to "move" within the field of view as shown in Fig. 4. Taking the user's field of view as the frame of reference, target real object B can be considered to move from the first position to the second position in the field of view.
Because virtual image A is essentially a visual effect presented within the user's field of view, to make it appear closer to a real object it should conform as much as possible to the visual characteristics of a real object during this movement in the field of view, i.e., the relative positional relation of virtual image A and target real object B must remain unchanged. This means that while the user's field of view changes, the position at which virtual image A is presented within the field of view must change correspondingly, so that virtual image A can follow target real object B in its "movement" within the field of view. What must be realized is that after target real object B moves from the first position to the second position, virtual image A still matches the second position of target real object B.
It should be noted that the position at which virtual image A appears in the field of view depends on where the light forming virtual image A is projected on the retina of the user. During the change of view, the collected eye-rotation data and head-movement data of the user can be used to calculate the direction and angle of the change from field of view C1 to field of view C2, from which the relative position at which virtual image A should be presented in field of view C2 can be calculated inversely, and, given that relative position, the projection angle at which the light forming virtual image A is directed into field of view C2, i.e., the second projection angle. The drive device then drives the MEMS light engine to project at that angle, ensuring that the relative position of virtual image A and target real object B within field of view C2 remains unchanged. While adjusting the projection angle of the MEMS light engine, the drive device can also adjust the refraction and/or reflection angle of the optical path structure, cooperating with the MEMS light engine to complete the change of the position at which virtual image A is presented within the user's field of view and to match virtual image A to the second position of target real object B.
As can be seen from the above, in embodiments of the present invention, when the user's eyes rotate and/or head moves, the second projection angle is obtained from the eye rotation data and head movement data, and the drive device drives the micro-electromechanical optical engine to project the virtual image at the second projection angle, so that the relative position of the virtual image and the target real object presented within the user's field of view remains constant. Embodiments of the present invention thus change the position at which the virtual image is presented on the user's retina according to changes in the user's field of view, improving the imaging effect and the user experience.
Corresponding to the imaging method of Fig. 1, Fig. 5 shows a schematic structural diagram of a wearable device applying the above imaging method, including a housing 501, a micro-electromechanical optical engine 502, and an optical path structure 503, the optical engine 502 and the optical path structure 503 being arranged inside the housing 501, wherein:
the micro-electromechanical optical engine 502 is configured to obtain a virtual image to be presented and project the virtual image onto the optical path structure;
the optical path structure 503 is configured to refract and/or reflect the virtual image to a preset position, wherein the preset position is a specific location on the retina of the user wearing the wearable device.
As can be seen from the above, in embodiments of the present invention, the micro-electromechanical optical engine obtains the virtual image to be presented and projects it onto the optical path structure, and the optical path structure then refracts and/or reflects the virtual image to the preset position. Compared with the prior art, projecting the virtual image with a micro-electromechanical optical engine yields a clearer virtual image with higher fidelity and higher saturation, improving the quality of the projected virtual image; moreover, the optical engine is small and requires no focusing during imaging, which improves the user experience.
In a particular embodiment of the present invention, the micro-electromechanical optical engine may be configured to obtain a first projection angle for the virtual image and project the virtual image onto the optical path structure at the first projection angle, so that the position at which the virtual image is presented on the user's retina matches the first position of a target real object in the user's current field of view.
As can be seen from the above, in embodiments of the present invention, the micro-electromechanical optical engine obtains the first projection angle and projects the virtual image onto the optical path structure at that angle, so that the position at which the virtual image is presented on the user's retina matches the first position of the target real object in the user's current field of view. The virtual image is thereby matched to the position of the target real object the user is watching, improving the user experience.
In a particular embodiment of the present invention, the wearable device further includes a data acquisition device and a drive device, both arranged inside the housing, the drive device being electrically connected to the data acquisition device and to the micro-electromechanical optical engine respectively.
The data acquisition device is configured to collect the eye rotation data and head movement data of the user when it detects that the user's eyes rotate and/or head moves, and to determine, according to the eye rotation data and the head movement data, the second position of the target real object in the field of view after the movement;
the drive device is configured to drive, according to a second projection angle corresponding to the second position, the micro-electromechanical optical engine to project the virtual image onto the optical path structure at the second projection angle, so that the position at which the virtual image is presented on the user's retina matches the second position of the target real object.
As can be seen from the above, in embodiments of the present invention, when the user's eyes rotate and/or head moves, the second projection angle is obtained from the eye rotation data and head movement data, and the drive device drives the micro-electromechanical optical engine to project the virtual image at the second projection angle, so that the relative position of the virtual image and the target real object presented within the user's field of view remains constant. Embodiments of the present invention thus change the position at which the virtual image is presented on the user's retina according to changes in the user's field of view, improving the imaging effect and the user experience.
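The cooperation between the data acquisition device and the drive device described above can be sketched as follows. This is an illustrative model only, not the patent's implementation; the class and method names are hypothetical, and the one-axis geometry is deliberately simplified (the projection angle is taken equal to the apparent position).

```python
from dataclasses import dataclass


@dataclass
class DataAcquisitionDevice:
    """Holds eye rotation (camera) and head movement (gyroscope) readings."""
    eye_rotation_deg: float = 0.0   # from the camera
    head_rotation_deg: float = 0.0  # from the gyroscope

    def second_position(self, first_position_deg: float) -> float:
        # The object's apparent position in the field of view shifts
        # opposite to the combined eye + head rotation.
        return first_position_deg - (self.eye_rotation_deg +
                                     self.head_rotation_deg)


class DriveDevice:
    """Drives the MEMS optical engine to the angle for a given position."""

    def drive(self, position_deg: float) -> float:
        # Toy model: the second projection angle equals the apparent
        # position of the object after the field-of-view change.
        projection_angle_deg = position_deg
        return projection_angle_deg


# Eyes rotate 2 degrees and head 8 degrees in the same direction:
acq = DataAcquisitionDevice(eye_rotation_deg=2.0, head_rotation_deg=8.0)
pos2 = acq.second_position(first_position_deg=15.0)
angle2 = DriveDevice().drive(pos2)
```

Splitting sensing (data acquisition) from actuation (drive) mirrors the electrical connections the embodiment describes: the camera and gyroscope feed a computing step, whose output alone determines the angle the drive device applies.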
It should be noted that, herein, relational terms such as "first" and "second" are used merely to distinguish one entity or operation from another, and do not necessarily require or imply any such actual relation or order between these entities or operations. Moreover, the terms "comprise", "include", or any other variant thereof are intended to cover a non-exclusive inclusion, so that a process, method, article, or device comprising a series of elements includes not only those elements but also other elements not expressly listed, or elements inherent to such a process, method, article, or device. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of additional identical elements in the process, method, article, or device comprising that element.
The embodiments in this specification are described in a related manner; identical or similar parts between the embodiments may refer to one another, and each embodiment focuses on its differences from the other embodiments. In particular, since the system embodiment is substantially similar to the method embodiment, its description is relatively brief, and the relevant parts may refer to the description of the method embodiment.
The foregoing is merely a description of preferred embodiments of the present invention and is not intended to limit the scope of the present invention. Any modification, equivalent substitution, improvement, or the like made within the spirit and principles of the present invention shall fall within the protection scope of the present invention.

Claims (10)

1. An imaging method, characterized in that the method is applied to a wearable device, the wearable device comprising a housing, a micro-electromechanical optical engine, and an optical path structure, the optical engine and the optical path structure being arranged inside the housing, the method comprising:
the micro-electromechanical optical engine obtaining a virtual image to be presented, and projecting the virtual image onto the optical path structure;
the optical path structure refracting and/or reflecting the virtual image to a preset position, wherein the preset position is a specific location on the retina of a user wearing the wearable device.
2. The method according to claim 1, characterized in that projecting the virtual image onto the optical path structure comprises:
obtaining a first projection angle for the virtual image, and projecting the virtual image onto the optical path structure at the first projection angle, so that the position at which the virtual image is presented on the user's retina matches a first position of a target real object in the user's current field of view.
3. The method according to claim 2, characterized in that the wearable device further comprises a data acquisition device and a drive device, the data acquisition device and the drive device being arranged inside the housing, the drive device being electrically connected to the data acquisition device and to the micro-electromechanical optical engine respectively, and that, after projecting the virtual image onto the optical path structure at the first projection angle, the method further comprises:
when the data acquisition device detects that the user's eyes rotate and/or head moves, collecting the eye rotation data and head movement data of the user;
the data acquisition device determining, according to the eye rotation data and the head movement data, a second position of the target real object in the field of view after the movement;
the drive device driving, according to a second projection angle corresponding to the second position, the micro-electromechanical optical engine to project the virtual image onto the optical path structure at the second projection angle, so that the position at which the virtual image is presented on the user's retina matches the second position of the target real object.
4. The method according to claim 3, characterized in that the data acquisition device comprises a camera and a gyroscope, and collecting the eye rotation data and head movement data of the user comprises:
the camera collecting the eye rotation data of the user;
the gyroscope collecting the head movement data of the user.
5. The method according to claim 4, characterized in that the data acquisition device further comprises a computing module, an input of the computing module being electrically connected to the camera and to the gyroscope respectively, and an output of the computing module being connected to the drive device.
6. The method according to claim 1, characterized in that the micro-electromechanical optical engine comprises a laser light source and a mirror, and projecting the virtual image onto the optical path structure comprises:
the laser light source emitting composite light corresponding to the virtual image;
the mirror reflecting the composite light onto the optical path structure.
7. The method according to claim 1, characterized in that the wearable device further comprises a head-worn component for the user to wear, the head-worn component being fixedly connected to the housing.
8. A wearable device, characterized in that the wearable device comprises a housing, a micro-electromechanical optical engine, and an optical path structure, the optical engine and the optical path structure being arranged inside the housing, wherein:
the micro-electromechanical optical engine is configured to obtain a virtual image to be presented and project the virtual image onto the optical path structure;
the optical path structure is configured to refract and/or reflect the virtual image to a preset position, wherein the preset position is a specific location on the retina of a user wearing the wearable device.
9. The wearable device according to claim 8, characterized in that the micro-electromechanical optical engine is configured to obtain a first projection angle for the virtual image and project the virtual image onto the optical path structure at the first projection angle, so that the position at which the virtual image is presented on the user's retina matches a first position of a target real object in the user's current field of view.
10. The wearable device according to claim 8, characterized in that the wearable device further comprises a data acquisition device and a drive device, the data acquisition device and the drive device being arranged inside the housing, the drive device being electrically connected to the data acquisition device and to the micro-electromechanical optical engine respectively, wherein:
the data acquisition device is configured to collect the eye rotation data and head movement data of the user when it detects that the user's eyes rotate and/or head moves, and to determine, according to the eye rotation data and the head movement data, a second position of the target real object in the field of view after the movement;
the drive device is configured to drive, according to a second projection angle corresponding to the second position, the micro-electromechanical optical engine to project the virtual image onto the optical path structure at the second projection angle, so that the position at which the virtual image is presented on the user's retina matches the second position of the target real object.
CN201710503049.2A 2017-06-27 2017-06-27 A kind of imaging method and Wearable Pending CN107193127A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710503049.2A CN107193127A (en) 2017-06-27 2017-06-27 A kind of imaging method and Wearable


Publications (1)

Publication Number Publication Date
CN107193127A true CN107193127A (en) 2017-09-22

Family

ID=59880970

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710503049.2A Pending CN107193127A (en) 2017-06-27 2017-06-27 A kind of imaging method and Wearable

Country Status (1)

Country Link
CN (1) CN107193127A (en)

Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1980999A1 (en) * 2007-04-10 2008-10-15 Nederlandse Organisatie voor Toegepast-Natuuurwetenschappelijk Onderzoek TNO An augmented reality image system, a method and a computer program product
CN101589327A (en) * 2007-09-26 2009-11-25 松下电器产业株式会社 Beam scan type display device, its display method, program, and integrated circuit
CN103119512A (en) * 2008-11-02 2013-05-22 大卫·乔姆 Near to eye display system and appliance
CN104750230A (en) * 2013-12-27 2015-07-01 中芯国际集成电路制造(上海)有限公司 Wearable intelligent device, interactive method of wearable intelligent device and wearable intelligent device system
CN104749777A (en) * 2013-12-27 2015-07-01 中芯国际集成电路制造(上海)有限公司 Interaction method for wearable smart devices
US20150185828A1 (en) * 2013-12-27 2015-07-02 Semiconductor Manufacturing International (Beijing) Corporation Wearable intelligent systems and interaction methods thereof
CN204635199U (en) * 2015-06-02 2015-09-16 广东电网有限责任公司佛山供电局 The three-dimensional adjustable light horn of wear-type virtual image safe and intelligent mobile terminal
US20150288944A1 (en) * 2012-09-03 2015-10-08 SensoMotoric Instruments Gesellschaft für innovative Sensorik mbH Head mounted system and method to compute and render a stream of digital images using a head mounted display
CN105264423A (en) * 2013-05-31 2016-01-20 Qd激光公司 Image projection device and projection device
US20160033770A1 (en) * 2013-03-26 2016-02-04 Seiko Epson Corporation Head-mounted display device, control method of head-mounted display device, and display system
CN106802485A (en) * 2017-03-29 2017-06-06 核桃智能科技(常州)有限公司 A kind of eyepiece for head mounted display
CN106842570A (en) * 2017-01-18 2017-06-13 上海乐蜗信息科技有限公司 A kind of wear-type mixed reality device and control method
CN207216145U (en) * 2017-06-27 2018-04-10 北京一数科技有限公司 A kind of Wearable



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination