CN107005653A - Virtual focus feedback - Google Patents
Virtual focus feedback
- Publication number
- CN107005653A CN107005653A CN201580066112.XA CN201580066112A CN107005653A CN 107005653 A CN107005653 A CN 107005653A CN 201580066112 A CN201580066112 A CN 201580066112A CN 107005653 A CN107005653 A CN 107005653A
- Authority
- CN
- China
- Prior art keywords
- camera
- image
- degree
- focus
- display
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- H04N5/33 — Transforming infrared radiation (details of television systems)
- G02B27/017 — Head-up displays; head mounted
- G03B13/36 — Autofocus systems for cameras
- G06T1/20 — Processor architectures; processor configuration, e.g. pipelining
- H04N23/20 — Cameras or camera modules for generating image signals from infrared radiation only
- H04N23/61 — Control of cameras or camera modules based on recognised objects
- H04N23/63 — Control of cameras or camera modules by using electronic viewfinders
- H04N23/633 — Electronic viewfinders displaying additional information relating to control or operation of the camera
- H04N23/64 — Computer-aided capture of images, e.g. advice or proposal for image composition or decision on when to take image
- H04N23/67 — Focus control based on electronic image sensor signals
- G02B2027/0178 — Head mounted, eyeglass type
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Optics & Photonics (AREA)
- Controls And Circuits For Display Device (AREA)
- Studio Devices (AREA)
- Viewfinders (AREA)
- Details Of Cameras Including Film Mechanisms (AREA)
- Indication In Cameras, And Counting Of Exposures (AREA)
- Automatic Focus Adjustment (AREA)
Abstract
Systems and methods are disclosed for focusing a camera. The system can generate a proxy image that is blurred by a degree correlated with how far out of focus the camera is. The user may be asked to adjust the focus to try to bring the proxy image into focus. This allows the camera to be focused without the user ever seeing an image from the camera, which can be used, for example, to focus an infrared camera. The infrared camera may be a tracking camera in a device such as a head-mounted display device.
Description
Background
In an optical system that requires focusing, the position of the focusing element directly determines the focus of the delivered image. In a manually focused system, the user adjusts the focusing element until the focus of the image reaches the desired state. Autofocus systems reach a similar goal by using a rangefinder or by measuring the focus level of the image itself. Problems nevertheless remain in focusing optical systems.
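Autofocus systems that "measure the focus level of the image" typically apply a contrast metric to the captured frame. The patent does not specify a metric; as an illustrative sketch, the variance of a Laplacian filter response is one widely used focus measure:

```python
import numpy as np

def focus_measure(image):
    """Variance-of-Laplacian focus score: higher means sharper (better focus)."""
    # 5-point Laplacian applied to the image interior (no external dependencies).
    lap = (-4.0 * image[1:-1, 1:-1]
           + image[:-2, 1:-1] + image[2:, 1:-1]
           + image[1:-1, :-2] + image[1:-1, 2:])
    return float(lap.var())

# A sharp checkerboard pattern scores high; a uniform (fully defocused)
# field has no edges at all and scores zero.
sharp = (np.indices((64, 64)).sum(axis=0) % 2).astype(float)
flat = np.full((64, 64), 0.5)
```

An autofocus loop would sweep the focusing element and keep the position where this score peaks.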
Summary
Embodiments of this technology relate to systems and methods for focusing a camera. In one embodiment, the camera is part of a head-mounted display device that includes a display unit and an eye position and tracking assembly. The display unit displays images onto optical elements in front of the left and right eyes. The eye position and tracking assembly may include one or more light sources and one or more cameras. Techniques are disclosed for focusing one or more of the cameras in the HMD, although the technology is not limited to focusing cameras in an HMD.
In one embodiment, processing logic in communication with the display and the camera receives data associated with the camera and, based on that data, determines the degree to which the camera is in focus. The processing logic generates a proxy image having a degree of blur that is negatively correlated with the degree to which the camera is in focus, and displays the proxy image on the display. The user is instructed to adjust the camera focus so that the proxy image becomes better focused.
An alternative embodiment includes the following. Data associated with the camera is received. The degree to which the camera is in focus is determined based on the received data. A proxy image is generated having a degree of blur that is negatively correlated with the degree to which the camera is in focus, and the proxy image is displayed on a screen. The foregoing steps are then repeated: updated data associated with the camera is received, the new degree of camera focus is determined, and the blur of the proxy image is modified so that it remains negatively correlated with the camera's new degree of focus.
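The repeated steps above form a simple feedback loop. The sketch below is a minimal illustration under assumed details the patent leaves open: a variance-of-Laplacian score normalised into a defocus degree, and a repeated box blur standing in for whatever blur the processing logic actually applies to the proxy image:

```python
import numpy as np

def defocus_degree(frame):
    """Estimate how far out of focus a camera frame is, in [0, 1]."""
    lap = (-4.0 * frame[1:-1, 1:-1]
           + frame[:-2, 1:-1] + frame[2:, 1:-1]
           + frame[1:-1, :-2] + frame[1:-1, 2:])
    return 1.0 / (1.0 + lap.var())  # heuristic: sharp frame -> near 0

def make_proxy(reference, defocus, max_passes=8):
    """Blur a reference image by an amount that tracks the camera's defocus,
    so proxy blur is negatively correlated with camera focus."""
    proxy = reference.astype(float)
    for _ in range(int(round(defocus * max_passes))):
        # Repeated 5-point box blur approximates a Gaussian blur.
        proxy = (proxy + np.roll(proxy, 1, 0) + np.roll(proxy, -1, 0)
                 + np.roll(proxy, 1, 1) + np.roll(proxy, -1, 1)) / 5.0
    return proxy

# One iteration of the loop: receive frame -> estimate defocus -> update proxy.
reference = (np.indices((32, 32)).sum(axis=0) % 2).astype(float)
frame = reference                      # stand-in for the latest camera data
proxy = make_proxy(reference, defocus_degree(frame))
```

As the user turns the focus adjustment, each new frame yields a new defocus estimate, the proxy is re-blurred, and the loop repeats until the proxy looks sharp.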
Another example includes a head-mounted display (HMD) comprising a near-eye see-through display, an infrared (IR) camera, and a processor in communication with the IR camera and the near-eye see-through display. The processor receives an infrared image from the IR camera and determines the degree to which the infrared image is out of focus. The processor blurs a reference image to create a proxy image having a degree of blur correlated with the degree to which the infrared image is out of focus, and displays the proxy image on the near-eye see-through display. After displaying the proxy image, the processor receives a new infrared image from the IR camera and determines the degree to which it is out of focus. The processor modifies the blur of the proxy image to correlate with the new degree of defocus, and displays the updated proxy image based on the modified degree of blur.
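The correlation between proxy-image blur and camera defocus can be any monotone map; the patent does not fix its shape. A linear map is one simple hypothetical choice:

```python
def proxy_blur_level(focus_degree, max_blur=10.0):
    """Map camera focus in [0, 1] (1 = fully in focus) to a blur level for
    the proxy image: maximum blur when fully out of focus, none when in
    focus, i.e. blur negatively correlated with focus."""
    focus_degree = min(max(focus_degree, 0.0), 1.0)  # clamp to valid range
    return max_blur * (1.0 - focus_degree)
```

As the user improves the camera focus, `proxy_blur_level` falls toward zero and the displayed proxy image sharpens, giving direct visual feedback without ever showing the IR frame itself.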
This Summary is provided to introduce in simplified form a selection of concepts that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter.
Brief description of the drawings
Fig. 1 is a diagram of example components of one embodiment of a system for presenting a mixed reality environment to one or more users.
Fig. 2 is a perspective view of one embodiment of a head-mounted display unit.
Fig. 3A is a side view of a portion of one embodiment of a head-mounted display unit.
Figs. 3B, 3C and 3D show exemplary arrangements of positions of sets of corresponding gaze detection elements in an HMD embodied in a pair of glasses.
Fig. 4 is a block diagram of one embodiment of the components of a head-mounted display unit.
Fig. 5 is a block diagram of one embodiment of the components of a processing unit associated with a head-mounted display unit.
Fig. 6 is a diagram of one embodiment of a system for focusing a camera.
Fig. 7 is a flowchart of one embodiment of a process for focusing a camera.
Fig. 8 is a flowchart of one embodiment of a process for generating a proxy image.
Fig. 9A shows an example reference image that can be used in the process of Fig. 8.
Fig. 9B shows an example proxy image that can be generated in the process of Fig. 8.
Fig. 9C depicts a graph representing the negative correlation between the degree of blur of the proxy image and the focus level of the camera.
Fig. 10A is a diagram of one embodiment of a system for focusing a camera.
Fig. 10B is a diagram of another embodiment of a system for focusing a camera.
Fig. 11 is a block diagram of one embodiment of a computing system that can be used to implement the computing systems described herein.
Detailed description
Embodiments of this technology will now be described. They relate generally to systems and methods for focusing a camera. In one embodiment, the system generates a proxy image that is blurred by a degree correlated with how far out of focus the camera is. The user may be asked to adjust the focus to try to bring the proxy image into focus. This focuses the camera without the user needing to see an image from the camera, which can be used, for example, to focus an infrared camera. The infrared camera may be a tracking camera in a device such as a head-mounted display device.
The head-mounted display device may include a display element. The display element is transparent to a degree, so that the user can look through it and see real-world objects within the user's field of view (FOV). The display element also provides the ability to project virtual images into the user's FOV so that the virtual images may appear alongside the real-world objects. The system automatically tracks where the user is looking, so that the system can determine where to insert a virtual image into the user's FOV. Once the system knows where to project the virtual image, the image is projected using the display element.
The head-mounted display device can be used to realize a mixed reality environment including real and virtual objects. Mixed reality is a technology that allows holographic, or virtual, imagery to be mixed with a real-world physical environment. The user can wear a see-through, head-mounted, mixed reality display device to view a mixed image of real-world objects and virtual objects displayed in the user's field of view. To facilitate the illusion of three-dimensional depth, the head-mounted display device displays images of a virtual object independently to the left and right eyes, with a small binocular disparity between the images. The brain interprets this binocular disparity as indicating the depth of the virtual object in the mixed reality environment.
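The size of the binocular disparity needed for a virtual object at a given depth follows the standard stereo relation, disparity = focal length × baseline / depth. The numbers below are illustrative assumptions for a sketch (a 63 mm interpupillary distance and a 1000-pixel rendering focal length), not values from the patent:

```python
def binocular_disparity_px(depth_m, ipd_m=0.063, focal_px=1000.0):
    """Horizontal pixel offset between the left-eye and right-eye renderings
    of a virtual object placed depth_m metres away, using d = f * b / z."""
    return focal_px * ipd_m / depth_m
```

Nearer virtual objects are rendered with a larger offset between the two eye images, which the brain reads as smaller depth.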
Fig. 1 shows a system 10 for providing a mixed reality experience by fusing a virtual object 21 with real content within a user's FOV. Fig. 1 shows multiple users 18a, 18b, 18c, each wearing a head-mounted display device 2 for viewing virtual objects, such as virtual object 21, from their own perspective. In further examples, there may be more or fewer than three users. As seen in Figs. 2 and 3, a head-mounted display device 2 may include an integrated processing unit 4. In other embodiments, the processing unit 4 may be separate from the head-mounted display device 2 and may communicate with the head-mounted display device 2 via wired or wireless communication.
The head-mounted display device 2, which in one embodiment is in the shape of glasses, is worn on the head of a user so that the user can see through a display and thereby have an actual direct view of the space in front of the user. The use of the term "actual direct view" refers to the ability to see real-world objects directly with the human eye, rather than seeing a created image representation of the objects. For example, looking through the glass of a window at a room allows a user to have an actual direct view of the room, while viewing a video of the room on a television is not an actual direct view of the room. More details of the head-mounted display device 2 are provided below.
The processing unit 4 may include much of the computing power used to operate the head-mounted display device 2. In some embodiments, the processing unit 4 communicates wirelessly (e.g., WiFi, Bluetooth, infrared, or other wireless communication means) with one or more hub computing systems 12. As explained hereafter, the hub computing system 12 may be provided remotely from the processing unit 4, so that the hub computing system 12 and processing unit 4 communicate via a wireless network such as a LAN or WAN. In further embodiments, the hub computing system 12 may be omitted to provide a mobile mixed reality experience using the head-mounted display devices 2 and processing units 4.
The head-mounted display device 2 (by itself or together with the hub computing system 12) can provide a mixed reality environment in which one or more virtual images, such as virtual object 21 in Fig. 1, may be mixed together with real-world objects in a scene. Fig. 1 illustrates examples of a plant 23 or a hand 23 of the user as real-world objects appearing within the user's FOV.
Figs. 2 and 3A show perspective and side views of the head-mounted display device 2. Fig. 3A shows the right side of the head-mounted display device 2, including a portion of the device having a temple 102 and a nose bridge 104. A microphone 110 is built into the nose bridge 104 for recording sounds and transmitting the audio data to the processing unit 4, as described below. At the front of the head-mounted display device 2 is a room-facing video camera 112 that can capture video and still images. Those images are transmitted to the processing unit 4, as described below.
A portion of the frame of the head-mounted display device 2 surrounds a display (the display includes one or more lenses). In order to show the components of the head-mounted display device 2, the portion of the frame surrounding the display is not depicted. The display includes a light-guide optical element 115, an opacity filter 114, a see-through lens 116, and a see-through lens 118. In one embodiment, the opacity filter 114 is behind and aligned with see-through lens 116, the light-guide optical element 115 is behind and aligned with the opacity filter 114, and see-through lens 118 is behind and aligned with the light-guide optical element 115. See-through lenses 116 and 118 are standard lenses used in eyeglasses and can be made to any prescription (including no prescription). The light-guide optical element 115 channels artificial light to the eye.
Control circuits 136 provide various electronics that support the other components of the head-mounted display device 2. More details of the control circuits 136 are provided below with respect to Fig. 4. Inside, or mounted to, the temple 102 are earphones 130, an inertial measurement unit 132, and a temperature sensor 138. In one embodiment shown in Fig. 4, the inertial measurement unit 132 (or IMU 132) includes inertial sensors such as a three-axis magnetometer 132A, a three-axis gyroscope 132B, and a three-axis accelerometer 132C. The inertial measurement unit 132 senses the position, orientation, and sudden accelerations (pitch, roll, and yaw) of the head-mounted display device 2. The IMU 132 may include other inertial sensors in addition to, or instead of, magnetometer 132A, gyroscope 132B, and accelerometer 132C.
The micro-display 120 projects an image through lens 122. There are different image generation technologies that can be used to implement the micro-display 120. For example, the micro-display 120 can be implemented using a transmissive projection technology, where the light source is modulated by an optically active material and backlit with white light. These technologies are usually implemented using LCD-type displays with powerful backlights and high optical energy densities. The micro-display 120 can also be implemented using a reflective technology, where external light is reflected and modulated by an optically active material. Depending on the technology, the illumination is lit forward by either a white light source or an RGB source. Digital light processing (DLP), liquid crystal on silicon (LCOS), and display technology from Qualcomm, Inc. are all examples of reflective technologies that are efficient (because most energy is reflected away from the modulated structure) and may be used in the present system. Additionally, the micro-display 120 can be implemented using an emissive technology, where light is generated by the display. For example, the PicoP™ display engine from Microvision, Inc. emits a laser signal with a micro mirror steering either onto a tiny screen that acts as a transmissive element, or beamed directly (e.g., laser) into the eye.
The light-guide optical element 115 transmits light from the micro-display 120 to the eye 140 of the user wearing the head-mounted display device 2. The light-guide optical element 115 also allows light from in front of the head-mounted display device 2 to be transmitted through the light-guide optical element 115 to the eye 140, as depicted by arrow 142, thereby allowing the user to have an actual direct view of the space in front of the head-mounted display device 2, in addition to receiving a virtual image from the micro-display 120. Thus, the walls of the light-guide optical element 115 are see-through. The light-guide optical element 115 includes a first reflecting surface 124 (e.g., a mirror or other surface). Light from the micro-display 120 passes through lens 122 and becomes incident on the reflecting surface 124. The reflecting surface 124 reflects the incident light from the micro-display 120 such that the light is trapped by internal reflection inside a planar substrate comprising the light-guide optical element 115. After several reflections off the surfaces of the substrate, the trapped light waves reach an array of selectively reflecting surfaces 126. Note that only one of the five surfaces is labeled 126 to prevent over-crowding of the drawing. The reflecting surfaces 126 couple the light waves incident upon those reflecting surfaces out of the substrate and into the eye 140 of the user.
In accordance with aspects of the present technology, the head-mounted display device 2 may also include a system for locating and tracking the position of the user's eyes. This system includes an eye position and tracking assembly 134 (Fig. 3A), which has an eye tracking illumination device 134A and an eye tracking sensor 134B (Fig. 4). In one embodiment, the eye tracking illumination device 134A includes one or more infrared (IR) emitters, which emit IR light toward the eye. In one embodiment, the eye tracking sensor 134B includes one or more cameras that sense the reflected IR light. Alternatively, the eye tracking sensor 134B may be an RGB or depth sensor. There may be multiple sensors 134B in embodiments.
The position of the user's eyes, and the pupils within the eyes, can be identified by known imaging techniques that detect the reflection off the cornea. Such techniques can locate the position of the center of the eye relative to the tracking sensor 134B. In embodiments, there may be a separate eye position and tracking assembly 134 for each of the left eye and right eye, so that the user's IPD (interpupillary distance) may be determined. In further embodiments, there may be a single eye position and tracking assembly 134 identifying the center of each of the left and right eyes.
In one embodiment, the system will use four IR LEDs and four IR photodetectors in a rectangular arrangement, so that there is one IR LED and one IR photodetector at each corner of a lens of the head-mounted display device 2. Light from the LEDs reflects off the eyes. The amount of infrared light detected at each of the four IR photodetectors determines the position of the eye relative to the sensor 134B, as well as the pupil direction. In particular, the amount of white versus pupil in the eye will determine, for that particular photodetector, the amount of light reflected off the eye. Thus, each photodetector obtains a measure of the amount of white or pupil in the eye. From the four samples, the system can determine the direction of the eye.
Another alternative is to use four infrared LEDs as discussed above, but with one infrared CCD on the side of the lens of the head-mounted display device 2. The CCD would use a small mirror and/or lens (fish-eye), so that the CCD can image up to 75% of the visible eye from the glasses frame. The CCD would then sense an image, and computer vision would be used to find the eye, much as discussed above. Thus, although Fig. 3 shows one assembly with one IR emitter, the structure of Fig. 3 can be adjusted to have four IR emitters and/or four IR sensors. More or fewer than four IR emitters and/or IR sensors can also be used.
Another embodiment for tracking the direction of the eyes is based on charge tracking. This concept is based on the observation that the retina carries a measurable positive charge and the cornea carries a negative charge. Sensors are mounted by the user's ears (near the earphones 130) to detect the electrical potential while the eyes move around, and to effectively read out what the eyes are doing in real time. This provides both the position of the user's eyes relative to the head-mounted display device and the position of the user's pupils. Other embodiments for determining the position of the user's eyes relative to the head-mounted display device can also be used.
Using any of the above-described embodiments, the eye position and tracking assembly 134 can determine the positions of the left and right eyes relative to the position of the eye position and tracking assembly 134. Using the known position and geometry of the assembly 134 relative to the optical elements 115, the position of the optical elements 115 relative to the left and right eyes is also known. This position includes the relative position of the eyes and optical elements along the x-axis (e.g., horizontal positioning). This position includes the relative position of the eyes and optical elements along the y-axis (e.g., vertical positioning). And this position includes the relative position of the eyes and optical elements along the z-axis (e.g., the distance between the eyes and the optical elements).
In addition to position, it is also advantageous to determine the angular orientation (pitch, yaw, and roll) of the optical elements 115 relative to the left and right eyes. For this purpose, the eye position and tracking assembly 134 also determines the center of each eye and an eye vector projecting straight out from the center of the eye.
The eye center may be determined in a number of ways. Where the sensor 134B captures an image of the eye (either as a color image and/or as a depth image), the image may be analyzed to determine the eye center. For example, an image sensor may examine the corneal surface and from that determine major axes and the corneal center. In a further embodiment, the image sensor may examine other features of the eye, including the pupil, the sclera (the white portion of the eye), and/or the eyelashes. Other features of the face, such as the eyebrows, nose, and nose bridge, may further be imaged and used to determine the centers of the left and right eyes.
Examples including IR emitters/receivers may also determine the center of the eye and an eye vector projecting straight out from that center. For example, where there are multiple (such as four) IR emitters/receivers, each of these components may measure the amount of sclera in the eye that it detects. These four independent values may be determined and compared. When each component measures the same amount of sclera in the eye, the eye is centered (looking straight ahead), and the eye vector may be taken as projecting perpendicularly straight out from the pupil. This position may either be found when each IR emitter/receiver measures the same amount of sclera in the eye, or it may be extrapolated from a measurement where the four emitter/receiver pairs measure different amounts of sclera in the eye.
As noted above, each eye may have its own position and tracking assembly 134, and a separate eye vector may be determined for each eye. Alternatively, it may be assumed that the eyes are symmetric and move together, so that a single eye vector may be determined and used for both eyes.
Fig. 3A shows half of the head-mounted display device 2. A full head-mounted display device would include another set of see-through lenses, another opacity filter, another light-guide optical element, another micro-display 120, another lens 122, another room-facing camera 112, another eye position and tracking assembly 134, another earphone, and another temperature sensor.
In one embodiment, the display and the opacity filter are rendered simultaneously and are calibrated to the user's precise position in space to compensate for angular offset problems. Eye tracking (e.g., using the eye tracking camera 134) can be used to compute the correct image offset at the extremities of the field of view. Eye tracking can also be used to provide data for focusing the forward-facing camera 113 or another camera. In one embodiment, the eye tracking camera 134 and the other logic used to compute the eye vector are considered the eye tracking system.
Fig. 3B shows an exemplary arrangement of the positions of respective sets of gaze detection elements in an HMD 2 embodied as a pair of glasses. What appears as a lens for each eye is a display optical system 14 for that eye, e.g. 14r and 14l. A display optical system includes a see-through lens, as in an ordinary pair of glasses, but also contains optical elements (e.g., mirrors, filters) for seamlessly fusing virtual content with the actual, direct real-world view seen through the lenses 6. A display optical system 14 has an optical axis, generally at the center of the see-through lens, in which light is typically collimated to provide a distortion-free view. For example, when an eye care professional fits an ordinary pair of glasses to a user's face, the goal is for the glasses to sit on the user's nose at a position where each pupil is aligned with the center or optical axis of the corresponding lens, generally causing collimated light to reach the user's eye for a clear or distortion-free view.
In the example of Fig. 3B, the detection areas 139r, 139l of at least one sensor are aligned with the optical axes of their respective display optical systems 14r, 14l, so that the centers of the detection areas 139r, 139l capture light along the optical axes. If a display optical system 14 is aligned with the user's pupil, each detection area 139 of the respective sensor 134 is aligned with the user's pupil. Reflected light from a detection area 139 is transferred via one or more optical elements to the actual image sensor 134 of the camera, which in this example is shown by the dashed line as being inside the frame 115.
In one example, a visible light camera, also commonly referred to as an RGB camera, may be the sensor, and an example of an optical element or light directing element is a partially transmissive, partially reflective visible light mirror. The visible light camera provides image data of the pupil of the user's eye, while IR photodetectors 162 capture glints, which are reflections in the IR portion of the spectrum. If a visible light camera is used, reflections of virtual images may appear in the eye data captured by the camera. Image filtering techniques may be used to remove the virtual image reflections if desired. An IR camera is not sensitive to the virtual image reflections on the eye.
In one embodiment, the at least one sensor 134 is an IR camera or a position sensitive detector (PSD) to which IR radiation may be directed. For example, a hot reflecting surface may transmit visible light but reflect IR radiation. The IR radiation reflected from the eye may be from incident radiation of the illuminators 153 or other IR illuminators (not shown), or may be ambient IR radiation reflected off the eye. In some examples, sensor 134 may be a combination of an RGB and an IR camera, and the light directing elements may include a visible light reflecting or diverting element and an IR radiation reflecting or diverting element. In some examples, the camera may be small, e.g. 2 millimeters (mm) by 2 mm. An example of such a camera sensor is the Omnivision OV7727. In other examples, the camera may be small enough (e.g. the Omnivision OV7727) that the image sensor or camera 134 may be centered on the optical axis or other location of the display optical system 14. For example, the camera 134 may be embedded within a lens of the system 14. Additionally, image filtering techniques may be applied to blend the camera into the user's field of view to lessen any distraction to the user.
In the example of Fig. 3B, there are four sets of illuminators 163, each paired with a photodetector 162 and separated by a barrier 164 to avoid interference between the incident light generated by the illuminator 163 and the reflected light received at the photodetector 162. To avoid unnecessary clutter in the drawings, reference numerals are shown for a representative pair only. Each illuminator may be an infrared (IR) illuminator that generates a narrow beam of light at about a predetermined wavelength. Each of the photodetectors may be selected to capture light at about the predetermined wavelength. Infrared may also include near-infrared. Because an illuminator or photodetector may have a wavelength drift, or a small tolerance range about a wavelength may be acceptable, the illuminators and photodetectors may have a tolerance range for the wavelengths to be generated or detected. In embodiments where the sensor is an IR camera or an IR position sensitive detector (PSD), the photodetectors may include additional data capture devices and may also be used to monitor the operation of the illuminators, e.g. wavelength drift, beam width changes, etc. The photodetectors may also serve as visible light cameras for the sensor 134 to provide glint data.
As described above, in some embodiments which calculate the corneal center as part of determining a gaze vector, two glints, and therefore two illuminators, will suffice. However, other embodiments may use additional glints in determining the pupil position and hence the gaze vector. Because the eye data representing the glints is captured repeatedly, for example at a frame rate of 30 frames per second or greater, data for one glint may be blocked by an eyelid or even an eyelash, but data may be gathered from a glint generated by another illuminator.
Fig. 3C shows another exemplary arrangement of the positions of respective sets of gaze detection elements in a pair of glasses. In this embodiment, two sets of illuminators 163 and photodetectors 162 are positioned near the top of each frame portion 115 surrounding a display optical system 14, and another two sets of illuminators and photodetectors are positioned near the bottom of each frame portion 115, to show another example of the geometric relationship between the illuminators, and therefore between the glints they generate. This arrangement of glints may provide more information about the pupil position in the vertical direction.
Fig. 3D shows yet another exemplary arrangement of the positions of respective sets of gaze detection elements. In this example, the sensors 134r, 134l are in line with or aligned with the optical axes of their respective display optical systems 14r, 14l, but located on the frame 115 below the systems 14. Additionally, in some embodiments, the camera 134 may be a depth camera or include a depth sensor. A depth camera may be used to track the eye in 3D. In this example, there are two sets of illuminators 153 and photodetectors 152.
Fig. 4 is a block diagram depicting the various components of the head mounted display device 2. Fig. 5 is a block diagram describing the various components of the processing unit 4. The head mounted display device 2, the components of which are depicted in Fig. 4, is used to provide a mixed reality experience to the user by seamlessly fusing one or more virtual images with the user's view of the real world. Additionally, the head mounted display device components of Fig. 4 include many sensors that track various conditions. The head mounted display device 2 will receive instructions about the virtual image from the processing unit 4 and will provide sensor information back to the processing unit 4. The processing unit 4, the components of which are depicted in Fig. 5, will receive the sensory information from the head mounted display device 2.
Using that information, and possibly information from the hub computing system 12, the processing unit 4 can determine where and when to provide the virtual image to the user and send instructions accordingly to the head mounted display device of Fig. 4. As set forth below, using information from the eye position and tracking assembly 134, the processing unit 4 may additionally determine the degree to which a camera in the eye position and tracking assembly 134 is in focus. That information may be used to generate a proxy image which is presented on the microdisplay 120 (and hence in the display optical system 14 or its various elements 124, 115, 126, etc.). The user may be instructed to bring the proxy image into focus by adjusting a camera focus mechanism. In this manner, the camera may be focused.
Note that some of the components of Fig. 4 (e.g., the room-facing camera 112, eye tracking sensor 134B, microdisplay 120, opacity filter 114, eye tracking illumination 134A, earphones 130 and temperature sensor 138) are shown in shadow to indicate that there are two of each of those devices, one for the left side and one for the right side of the head mounted display device 2. Fig. 4 shows the control circuit 200 in communication with the power management circuit 202. The control circuit 200 includes a processor 210, a memory controller 212 in communication with memory 214 (e.g., D-RAM), a camera interface 216, a camera buffer 218, a display driver 220, a display formatter 222, a timing generator 226, a display out interface 228 and a display in interface 230.
In one embodiment, all components of the control circuit 200 are in communication with each other via dedicated lines or one or more buses. In another embodiment, each of the components of the control circuit 200 is in communication with the processor 210. The camera interface 216 provides an interface to the two room-facing cameras 112 and stores the images received from the room-facing cameras in the camera buffer 218. The display driver 220 drives the microdisplay 120. The display formatter 222 provides information about the virtual image being displayed on the microdisplay 120 to the opacity control circuit 224, which controls the opacity filter 114. The timing generator 226 is used to provide timing data for the system. The display out interface 228 is a buffer for providing images from the room-facing cameras 112 to the processing unit 4. The display in interface 230 is a buffer for receiving images, such as a virtual image to be displayed on the microdisplay 120. The display out interface 228 and display in interface 230 communicate with the band interface 232, which is an interface to the processing unit 4.
The power management circuit 202 includes a voltage regulator 234, an eye tracking illumination driver 236, an audio DAC and amplifier 238, a microphone preamplifier and audio ADC 240, a temperature sensor interface 242 and a clock generator 244. The voltage regulator 234 receives power from the processing unit 4 via the band interface 232 and provides that power to the other components of the head mounted display device 2. Each eye tracking illumination driver 236 provides the IR light source for the eye tracking illumination 134A, as described above. The audio DAC and amplifier 238 output audio information to the earphones 130. The microphone preamplifier and audio ADC 240 provide an interface for the microphone 110. The temperature sensor interface 242 is an interface for the temperature sensor 138. The power management circuit 202 also provides power to, and receives data back from, the three-axis magnetometer 132A, three-axis gyroscope 132B and three-axis accelerometer 132C.
Fig. 5 is a block diagram describing the various components of the processing unit 4. Fig. 5 shows the control circuit 304 in communication with the power management circuit 306. The control circuit 304 includes a central processing unit (CPU) 320, a graphics processing unit (GPU) 322, a cache 324, RAM 326, a memory controller 328 in communication with memory 330 (e.g., D-RAM), a flash memory controller 332 in communication with flash memory 334 (or another type of non-volatile storage), a display out buffer 336 in communication with the head mounted display device 2 via the band interface 302 and the band interface 232, a display in buffer 338 in communication with the head mounted display device 2 via the band interface 302 and the band interface 232, a microphone interface 340 in communication with an external microphone connector 342 for connecting to a microphone, a PCI express interface for connecting to a wireless communication device 346, and USB port(s) 348. In one embodiment, the wireless communication device 346 can include a Wi-Fi enabled communication device, a Bluetooth communication device, an infrared communication device, etc. The USB port can be used to dock the processing unit 4 to the hub computing system 12 in order to load data or software onto the processing unit 4, as well as to charge the processing unit 4. In one embodiment, the CPU 320 and GPU 322 are the main workhorses for determining where, when and how to insert virtual three-dimensional objects into the view of the user. More details are provided below.
The power management circuit 306 includes a clock generator 360, an analog-to-digital converter 362, a battery charger 364, a voltage regulator 366, a head mounted display power source 376, and a temperature sensor interface 372 in communication with a temperature sensor 374 (possibly located on the wrist band of the processing unit 4). The analog-to-digital converter 362 is used for monitoring the battery voltage and the temperature sensor, and for controlling the battery charging function. The voltage regulator 366 is in communication with the battery 368 for supplying power to the system. The battery charger 364 is used to charge the battery 368 (via the voltage regulator 366) upon receiving power from the charging jack 370. The HMD power source 376 provides power to the head mounted display device 2.
Fig. 6 is a diagram of one embodiment of a system 600 for focusing a camera 604. The system 600 includes processing logic 602, the camera 604 and a display 606. In one embodiment, the system 600 is part of an HMD 2. The camera 604 could be any of the cameras described herein. In one embodiment, the camera 604 is an IR camera. The camera 604 could be part of the eye position and tracking assembly 134. For example, the camera 604 could be the eye tracking camera 134B. The camera 604 could also be the room-facing camera 112. The camera to be focused is not limited to these examples.
In one embodiment, the display 606 includes one or more elements of the HMD 2 described herein. For example, the display 606 may include the microdisplay 120. Also, a combination of elements such as the reflecting surface 124, light-guide optical element 115 and selectively reflecting surfaces 126 could be considered the display 606. Referring to Figs. 3B-3D, the display 606 could include a display optical system 14. Note that the display 606 could include one display for the right eye and one for the left eye. However, a separate display for each eye is not required.
The processing logic 602 has focus detection 612 for detecting the degree to which the camera 604 is in focus. The focus detection 612 inputs data that is associated with the camera 604 and that can be used to determine the degree to which the camera 604 is in focus. This data could be image data from the camera 604. It could instead be data other than image data, such as data that indicates how far the camera 604 is from the object upon which the camera 604 is focusing. The focus detection 612 outputs a signal that indicates the degree to which the camera 604 is in focus (or, equivalently, the degree to which it is out of focus).
The render 614 inputs the signal from the focus detection 612 and generates an image (e.g., a "proxy image") having a degree of blur that is negatively correlated with the degree to which the camera 604 is in focus. Note that this can be a type of inverse relationship: the more the camera is in focus, the less blurry the proxy image may be. Stated another way, the more the camera is out of focus, the blurrier the proxy image may be. This "blurred" image is presented on the display 606. In one embodiment, the rendering is performed on a graphics processing unit (GPU).
A camera focus mechanism 605 allows the camera 604 to be focused. For example, the user could manually adjust the camera focus mechanism 605. As one example, the camera 604 could have a focusing lens (in addition to the camera's objective lens) that is moved to focus the camera 604. However, many other types of camera focus mechanisms 605 could be used. For an HMD 2, the user could adjust the distance between their eye and the camera 604 to focus the camera 604. In this example, the camera focus mechanism 605 could be a structure on the HMD that allows the position of the camera to be moved relative to the user's eye while the user is wearing the HMD 2.
Image processing 610 inputs image data from the camera 604 and performs some type of processing on the image data. Eye tracking is one example, but the processing could be any processing. The processing logic 602, focus detection 612, render 614 and image processing 610 could each be implemented in software, hardware, or some combination of software and hardware. The various elements of Figs. 4 and 5 could be used. For example, the render 614 could be performed by the GPU (Fig. 5, 322), while the image processing 610 and focus detection 612 could be performed by the CPU (Fig. 5, 320). The processor (Fig. 4, 210) could also be used for the image processing 610, render 614 and/or focus detection 612. Instructions for execution on the various processors 210, 320, 322 could be stored in memory (e.g., memory 244, 330, 334, cache 324, RAM 326). These are examples only and are not intended to be limiting. Also, it is not required that the processing logic 602, render 614 and/or focus detection 612 be implemented with instructions that execute on a processor. For example, an application specific integrated circuit (ASIC) could be used.
Fig. 7 is a flowchart of one embodiment of a process of focusing a camera. The process could be used in the system 600 of Fig. 6. Reference will be made to elements of Fig. 6, although the process is not limited to that system 600. The process could be used in an HMD 2, but that is not required. In one embodiment, the process is used to focus an infrared camera. The process could be initiated in many ways. One possibility is for the system (e.g., HMD) to determine that the camera needs to be focused, and to start the process in response. The process involves presenting a proxy image to the user. In step 702, the system 600 instructs the user that they will attempt to focus the proxy image during this process. Note that the system 600 may or may not be presenting the proxy image when this instruction is provided. The system 600 may also provide instructions for focusing the proxy image. The instruction may be for the user to adjust the camera focus mechanism to attempt to bring the proxy image more into focus.
In step 704, the system 600 receives data associated with the camera 604. In one embodiment, this data is image data from the camera 604. For example, the data could be an IR image. It is not required that this data be image data from the camera. In one embodiment, this data indicates a distance between the camera 604 and the object upon which the camera 604 is to be focused.
In step 706, the system 600 determines the degree to which the camera 604 is in focus. In one embodiment, the system 600 determines the degree to which an image from the camera 604 is in focus. However, the system 600 could base this determination on information other than an image. For example, the determination could be based on data that indicates the distance between the camera 604 and the object upon which the camera 604 is to be focused.
In step 708, the system 600 generates a proxy image having a degree of blur that is negatively correlated with the degree to which the camera 604 is in focus. As noted above, this can also be stated as a positive correlation: the degree of blur in the proxy image correlates with the degree to which the camera 604 is out of focus. Further details are discussed below.
In step 710, the system 600 presents the proxy image on the display 606. In one embodiment, the proxy image is displayed on the HMD. As noted above, the display 606 may include the display optical system 14, microdisplay 120, reflecting surface 124, light-guide optical element 115 and selectively reflecting surfaces 126.
In step 712, the user adjusts the camera focus mechanism, attempting to bring the proxy image into focus. Further details are described below.
In step 714, the system 600 determines whether the camera 604 is satisfactorily focused. If so, the process completes. Otherwise, the process returns to step 704 to receive more data associated with the camera 604. Assuming that the user has made an adjustment to the camera focus mechanism, the system 600 will determine (in step 706) that the degree to which the camera is in focus has changed. Hence, the proxy image generated in step 708 will have its degree of blur updated, such that the degree of blur is negatively correlated with the new degree of camera focus.
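The loop of steps 704-714 can be sketched in a few lines. This is a minimal illustration under the assumption that the focus measure is normalized to [0, 1] (1.0 = perfectly in focus); the function names and the acceptance threshold are made up for the sketch, not taken from the patent.

```python
# Sketch of the Fig. 7 feedback loop: sense focus degree, show a proxy
# image whose blur is negatively correlated with it, repeat until focused.

FOCUS_OK = 0.9  # assumed threshold for "satisfactorily focused"

def blur_for_focus(focus_degree):
    """Blur amount is negatively correlated with the camera's focus degree."""
    return 1.0 - focus_degree

def focus_loop(focus_readings):
    """Return the blur presented at each iteration of steps 704-714."""
    shown = []
    for degree in focus_readings:             # steps 704/706: sense focus
        shown.append(blur_for_focus(degree))  # step 708: generate proxy blur
        if degree >= FOCUS_OK:                # step 714: focused enough?
            break                             # done; otherwise loop again
    return shown

# As the user adjusts the focus mechanism, the displayed blur shrinks.
blurs = focus_loop([0.2, 0.5, 0.95])
assert blurs[0] > blurs[1] > blurs[2]
```

In the patent's system the readings would come from focus detection 612 and the blur would drive the render 614; here both are stand-ins.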
Fig. 8 is a flowchart of one embodiment of a process of generating the proxy image. The process is one embodiment of step 708 of Fig. 7. Again, reference will be made to elements of Fig. 6, with the understanding that the process is not limited to that system 600. In step 802, the system 600 accesses a reference image from processor readable storage. The reference image is typically an image in the visible spectrum. The content of the reference image is unimportant. Because the reference image will be displayed to the user, an image may be selected that has good properties for the user to focus on. Note that an image generated directly from the IR camera image might not be suitable for the user to focus on. For example, if the IR image were shifted in wavelength to make it visible, it might be difficult for the user to focus on that image. For purposes of discussion, Fig. 9A shows an example reference image 910 of a potted plant. Note that this reference image 910 is not an image from the camera 604.
In step 804, the system 600 blurs the reference image 910 to form the proxy image. For purposes of discussion, Fig. 9B shows an example proxy image 920 as a blurred version of the reference image 910. In one embodiment, the blurring is performed by using a mathematical function. For example, a point spread function (PSF) may be used. The PSF may be applied to each pixel in the reference image 910, where it serves to spread the light intensity of each respective pixel to neighboring pixels. The results are then summed to produce the proxy image 920. The width of the PSF can be set to achieve the desired degree of blur. In one embodiment, step 804 is performed on the GPU.
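The spread-and-accumulate operation described for step 804 can be sketched with the simplest possible PSF. This is an illustrative assumption: the patent only requires that each pixel's intensity be spread over its neighbors and the results summed; a normalized box kernel stands in for whatever PSF an implementation would use, and a GPU version would do the same arithmetic in a shader.

```python
# Sketch of step 804: apply a box-shaped point spread function (PSF)
# to a 2D grayscale image represented as a list of lists of numbers.

def blur_image(img, radius):
    """Spread each pixel over a (2*radius+1)^2 neighborhood and accumulate."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    k = (2 * radius + 1) ** 2            # kernel area; each tap weighs 1/k
    for y in range(h):
        for x in range(w):
            for dy in range(-radius, radius + 1):
                for dx in range(-radius, radius + 1):
                    yy = min(max(y + dy, 0), h - 1)   # clamp at the borders
                    xx = min(max(x + dx, 0), w - 1)
                    out[yy][xx] += img[y][x] / k      # accumulate the spread
    return out

# A radius of 0 leaves the reference image unchanged (an in-focus camera).
src = [[0, 0, 0], [0, 9, 0], [0, 0, 0]]
assert blur_image(src, 0) == [[0.0, 0.0, 0.0], [0.0, 9.0, 0.0], [0.0, 0.0, 0.0]]
```

Widening the radius spreads the single bright pixel across the whole neighborhood, which is exactly the "width of the PSF sets the degree of blur" behavior the passage describes.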
As discussed above, the degree of blur of the proxy image 920 is negatively correlated with the degree of camera focus. For example, the more the camera 604 is in focus, the less the proxy image 920 is blurred. Hence, this may be referred to as a negative correlation. Alternatively, it may be stated that the more the camera 604 is out of focus, the blurrier the proxy image 920 is.
Fig. 9C depicts a graph with a curve 950 representing that the degree of blur of the proxy image 920 is negatively correlated with the degree of focus of the camera 604. The system 600 can determine a value that represents the degree of camera focus, which may be represented on the x-axis. The system 600 can then determine a suitable degree of blur for the proxy image such that the desired negative correlation is produced.
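One possible realization of the curve of Fig. 9C is a linear map from the focus value to a PSF radius. Both the linear shape and the maximum radius below are assumptions for illustration; the patent requires only that the relationship be negatively correlated.

```python
# Hypothetical mapping from a normalized focus degree in [0, 1]
# to a PSF radius in pixels (the degree of blur for the proxy image).

MAX_RADIUS = 12  # assumed blur radius when the camera is fully out of focus

def psf_radius(focus_degree):
    focus_degree = min(max(focus_degree, 0.0), 1.0)  # clamp to valid range
    return round(MAX_RADIUS * (1.0 - focus_degree))  # negative correlation

assert psf_radius(1.0) == 0    # in focus  -> sharp proxy image
assert psf_radius(0.0) == 12   # out of focus -> maximum blur
```

The returned radius could feed directly into a PSF-based blur of the reference image to produce the proxy image.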
Fig. 10A is a diagram of one embodiment of a system for focusing a camera. This is a variation of the system 600 of Fig. 6 in which focus detection 612a inputs image data to determine the degree to which the camera 604 is in focus. In one embodiment, the image data is IR image data. For example, the camera 604 captures images at infrared wavelengths. The system of Fig. 10A could be an HMD 2. Not all elements are depicted, so as not to obscure the diagram.
In one embodiment, the focus detection 612a uses image processing techniques to determine the degree to which the camera image is in focus. Techniques for determining the degree to which an image from the camera 604 is in focus are known to those of ordinary skill in the art. Any convenient technique may be used. For example, contrast detection may be used. Contrast detection measures the contrast within the camera image. The intensity difference between adjacent pixels should increase with correct image focus. The optical system can thus be adjusted until a maximum contrast is detected. Other techniques that analyze the camera image may be used to determine the degree to which the camera image is in focus.
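The contrast-detection idea mentioned above can be sketched with a minimal focus measure: sum the absolute intensity differences between horizontally adjacent pixels, so a sharper image yields a larger value. This is a bare-bones sketch; a real implementation would also use vertical gradients and normalize by image size.

```python
# Minimal contrast-detection focus measure: larger value = sharper image.

def contrast_measure(img):
    """Sum of absolute intensity differences between adjacent pixels per row."""
    return sum(abs(row[x + 1] - row[x])
               for row in img
               for x in range(len(row) - 1))

sharp   = [[0, 255, 0, 255]]    # hard edges: high adjacent-pixel differences
blurred = [[64, 128, 128, 64]]  # the same scene defocused (illustrative values)
assert contrast_measure(sharp) > contrast_measure(blurred)
```

Focus detection 612a could report this measure (suitably normalized) as the degree to which the camera 604 is in focus, with the maximum reached at best focus.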
The reference image 910 is shown as an input to the render 614, which outputs the proxy image 920. The proxy image 920 is presented on the display 606. The user (represented by the eye 140) may manually adjust the camera focus mechanism 605. For example, the user may adjust the distance between their eye and the camera 604 by moving the camera 604 along the camera focus mechanism 605 (as represented by the double arrow). As one example, the camera focus mechanism 605 could be any structure that allows the distance from the camera 604 to the eye 140 to be adjusted. The camera focus mechanism 605 could be a structure in the frame of the HMD 2 that allows the camera position to be moved relative to the eye. Other techniques may be used to adjust the focus of the camera 604.
In the example of Fig. 10A, the processing logic 602 has eye tracking 1010, which inputs the camera image data. This is one example of the image processing 610 from Fig. 6.
It is not required that the focus detection 612 use image analysis. Fig. 10B is a diagram of one embodiment in which focus detection 612b uses distance data to determine the degree to which the camera 604 is in focus. In one embodiment, the distance data indicates how far the camera 604 is from the object being focused upon. In one embodiment, the distance data indicates how far the camera 604 is from the user's eye 140. The system of Fig. 10B could be an HMD 2. Not all elements are depicted, so as not to obscure the diagram.
The focus detection 612b may have a target distance at which the camera 604 should be located. As one example, the focus detection 612b could access a table having a focus degree value associated with each of various distances. As another example, the focus detection 612b could use a mathematical equation to determine the degree of focus based on the transmitted distance and known properties of the camera 604 (focal length, etc.).
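The table-lookup variant just described can be sketched as follows. The table values, the distances, and the linear interpolation between entries are all illustrative assumptions; a real system would use values calibrated for the specific camera.

```python
# Sketch of focus detection 612b: map a measured distance to a focus degree
# via a calibration table, interpolating between entries.

# (distance_mm, focus_degree) pairs; hypothetical calibration data with the
# camera's best focus assumed to be at 30 mm.
FOCUS_TABLE = [(10, 0.2), (20, 0.6), (30, 1.0), (40, 0.6), (50, 0.2)]

def focus_from_distance(d_mm):
    """Interpolate the focus degree for a measured camera-to-object distance."""
    pts = FOCUS_TABLE
    for d0, f0 in pts:                 # exact table hit
        if d_mm == d0:
            return f0
    if d_mm <= pts[0][0]:              # clamp below the table range
        return pts[0][1]
    if d_mm >= pts[-1][0]:             # clamp above the table range
        return pts[-1][1]
    for (d0, f0), (d1, f1) in zip(pts, pts[1:]):
        if d0 < d_mm < d1:             # linear interpolation between entries
            t = (d_mm - d0) / (d1 - d0)
            return f0 + t * (f1 - f0)

# Focus peaks at the assumed target distance of this made-up table.
assert focus_from_distance(30) == 1.0
```

The returned value plays the same role as the image-based focus measure: it drives the degree of blur in the proxy image.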
The distance data could be provided by the camera 604 itself. For example, the camera 604 could have a range finder that can determine the distance between itself and the object upon which it is to focus. The distance data could also be provided by an element other than the camera 604 being focused. For example, there could be another camera on the HMD that can be used to determine the distance. Alternatively, the distance data (for a camera on the HMD) could be determined by a device that is not on the HMD, such as another camera in the vicinity.
Fig. 11 illustrates an example embodiment of a computing system that may be used to implement the hub computing system 12 or other processors disclosed herein. As shown in Fig. 11, the computing system 500 has a central processing unit (CPU) 501 having a level 1 cache 502, a level 2 cache 504, and a flash ROM (read-only memory) 506. The level 1 cache 502 and level 2 cache 504 temporarily store data and hence reduce the number of memory access cycles, thereby improving processing speed and throughput. The CPU 501 may be provided with more than one core, and thus with additional level 1 and level 2 caches 502 and 504. The flash ROM 506 may store executable code that is loaded during an initial phase of the boot process when the computing device 500 is powered on.
A graphics processing unit (GPU) 508 and a video encoder/video codec (coder/decoder) 514 form a video processing pipeline for high speed and high resolution graphics processing. Data is carried from the graphics processing unit 508 to the video encoder/video codec 514 via a bus. The video processing pipeline outputs data to an A/V (audio/video) port 540 for transmission to a television or other display. A memory controller 510 is connected to the GPU 508 to facilitate processor access to various types of memory 512, such as, but not limited to, RAM (random access memory).
The computing device 500 includes an I/O controller 520, a system management controller 522, an audio processing unit 523, a network interface 524, a first USB host controller 526, a second USB controller 528 and a front panel I/O subassembly 530, preferably implemented on a module 518. The USB controllers 526 and 528 serve as hosts for peripheral controllers 542(1)-542(2), a wireless adapter 548, and an external memory device 546 (e.g., flash memory, external CD/DVD ROM drive, removable media, etc.). The network interface 524 and/or wireless adapter 548 provide access to a network (e.g., the Internet, home network, etc.) and may be any of a wide variety of wired or wireless adapter components, including an Ethernet card, a modem, a Bluetooth module, a cable modem, and the like.
System memory 543 is provided to store application data that is loaded during the boot process. A media drive 544 is provided and may comprise a DVD/CD drive, Blu-ray drive, hard disk drive, or other removable media drive, etc. The media drive 544 may be internal or external to the computing device 500. Application data may be accessed via the media drive 544 for execution, playback, etc. by the computing device 500. The media drive 544 is connected to the I/O controller 520 via a bus, such as a Serial ATA bus or other high speed connection (e.g., IEEE 1394).
The system management controller 522 provides a variety of service functions related to assuring the availability of the computing device 500. The audio processing unit 523 and an audio codec 532 form a corresponding audio processing pipeline with high fidelity and stereo processing. Audio data is carried between the audio processing unit 523 and the audio codec 532 via a communication link. The audio processing pipeline outputs data to the A/V port 540 for reproduction by an external audio user or a device having audio capabilities.
The front panel I/O subassembly 530 supports the functionality of a power button 550 and an eject button 552, as well as any LEDs (light emitting diodes) or other indicators exposed on the outer surface of the computing device 500. A system power supply module 536 provides power to the components of the computing device 500. A fan 538 cools the circuitry within the computing device 500.
The CPU 501, GPU 508, memory controller 510, and various other components within the computing device 500 are interconnected via one or more buses, including serial and parallel buses, a memory bus, a peripheral bus, and a processor or local bus using any of a variety of bus architectures. By way of example, such architectures can include a Peripheral Component Interconnect (PCI) bus, a PCI-Express bus, etc.
When the computing device 500 is powered on, application data may be loaded from the system memory 543 into memory 512 and/or the caches 502, 504 and executed on the CPU 501. The application may present a graphical user interface that provides a consistent user experience when navigating to the different media types available on the computing device 500. In operation, applications and/or other media contained within the media drive 544 may be launched or played from the media drive 544 to provide additional functionalities to the computing device 500.
Computing device 500 may be operated as a standalone system by simply connecting the system to a television or other display. In this standalone mode, computing device 500 allows one or more users to interact with the system, watch movies, or listen to music. However, with the integration of broadband connectivity made available through network interface 524 or wireless adapter 548, computing device 500 may further be operated as a participant in a larger network community. Additionally, computing device 500 can communicate with processing unit 4 via wireless adapter 548.
Optional input devices (e.g., controllers 542(1) and 542(2)) are shared by gaming applications and system applications. The input devices are not reserved resources, but are switched between system applications and the gaming application such that each has a focus of the device. The application manager preferably controls the switching of the input stream without requiring knowledge of the gaming application, and a driver maintains state information regarding focus switches. Capture device 20 may define additional input devices for computing device 500 via USB controller 526 or other interface. In other embodiments, hub computing system 12 can be implemented using other hardware architectures. No single hardware architecture is required.
Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as example forms of implementing the claims. The scope of the invention is defined by the claims appended hereto.
Claims (15)
1. An apparatus, comprising:
a display;
a camera; and
processing logic in communication with the display and the camera, the processing logic receiving data associated with the camera, determining a degree to which the camera is in focus based on the data, generating a proxy image having a degree of blur that is negatively correlated with the degree to which the camera is in focus, and instructing a user to adjust the camera focus to bring the proxy image better into focus.
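The negative correlation recited in claim 1 admits a very simple illustration. The sketch below is not part of the patent; the linear mapping and the `max_radius` value are assumptions made only to show how a focus degree in [0, 1] could drive the blur of the proxy image.

```python
def blur_radius(focus_degree, max_radius=16.0):
    """Map a camera focus degree in [0, 1] to a proxy-image blur radius.

    Blur is negatively correlated with focus: a fully focused camera
    (degree 1.0) yields zero blur, while a fully defocused one
    (degree 0.0) yields the maximum radius. The linear form is a
    hypothetical choice; the claim only requires negative correlation.
    """
    degree = min(max(focus_degree, 0.0), 1.0)  # clamp to [0, 1]
    return max_radius * (1.0 - degree)
```

For example, `blur_radius(0.25)` gives 12.0 while `blur_radius(1.0)` gives 0.0, so the proxy image sharpens as the camera comes into focus.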
2. The apparatus of claim 1, wherein the processing logic receives updated data associated with the camera after the proxy image is displayed, determines a new degree to which the camera is in focus, modifies the degree of blur of the proxy image to be negatively correlated with the new degree to which the camera is in focus, and displays an updated proxy image based on the modified degree of blur.
3. The apparatus of claim 1 or 2, wherein the data associated with the camera received by the processing logic is an image from the camera, the processing logic determining the degree to which the camera is in focus based on a degree to which the image is in focus.
4. The apparatus of claim 3, wherein the image from the camera is an infrared (IR) image.
5. The apparatus of claim 1 or 2, wherein the data associated with the camera received by the processing logic comprises a distance from the camera to an object, the processing logic determining the degree to which the camera is in focus based on the distance.
6. The apparatus of any one of claims 1 to 5, wherein the display is a near-eye, see-through display, the apparatus is a head-mounted display (HMD), and the camera is focused on a user wearing the HMD.
7. The apparatus of any one of claims 1 to 6, wherein the processing logic accesses a reference image and blurs the reference image to create the proxy image.
8. The apparatus of claim 7, further comprising a graphics processing unit (GPU), wherein the processing logic blurs the reference image on the GPU to create the proxy image.
9. A method, comprising:
a) receiving data associated with a camera;
b) determining a degree to which the camera is in focus based on the received data;
c) generating a proxy image having a degree of blur that is negatively correlated with the degree to which the camera is in focus;
d) displaying the proxy image on a display; and
e) repeating said a) through said d), including receiving updated data associated with the camera, determining a new degree to which the camera is in focus, modifying the degree of blur of the proxy image to be negatively correlated with the new degree to which the camera is in focus, and displaying the modified proxy image on the display.
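Steps a) through e) of claim 9 form a feedback loop. A minimal sketch follows; every hardware and rendering hook is passed in as a callable, and the names `read_camera_data`, `estimate_focus_degree`, `blur_reference`, and `show` are hypothetical stand-ins, not names from the patent.

```python
def focus_feedback_loop(read_camera_data, estimate_focus_degree,
                        blur_reference, show, iterations=3):
    """Repeatedly sample the camera and display a proxy image whose
    blur is negatively correlated with the current focus degree."""
    history = []
    for _ in range(iterations):
        data = read_camera_data()             # a) receive camera data
        degree = estimate_focus_degree(data)  # b) degree of focus
        proxy = blur_reference(1.0 - degree)  # c) blur negatively correlated
        show(proxy)                           # d) display the proxy image
        history.append(degree)                # e) then repeat a)-d)
    return history
```

As the user adjusts the focus and the estimated degree rises across iterations, the blur handed to `show` falls, which is the visual cue the claims describe.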
10. The method of claim 9, further comprising:
instructing a user to adjust focus associated with the camera in an attempt to bring the proxy image better into focus.
11. The method of claim 9 or 10, wherein receiving the data associated with the camera comprises receiving an image from the camera; and
determining the degree to which the camera is in focus based on the data comprises determining a degree to which the received image is in focus.
12. The method of claim 11, wherein the image from the camera is an infrared (IR) image.
13. The method of claim 9 or 10, wherein receiving the data associated with the camera comprises receiving data indicating a distance from the camera to an object; and
determining the degree to which the camera is in focus comprises determining the degree to which the camera is in focus based on the distance.
14. The method of any one of claims 9-13, wherein generating a proxy image having a degree of blur that is negatively correlated with the degree to which the camera is in focus comprises:
accessing a reference image; and
mathematically blurring the reference image to create the proxy image.
15. The method of claim 14, wherein mathematically blurring the reference image to create the proxy image comprises using graphics image processing on a graphics processing unit (GPU).
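Claims 14 and 15 blur a reference image mathematically, optionally on a GPU. A CPU-side stand-in might look like the following separable box blur in NumPy; the patent does not mandate any particular kernel, and this is an assumed illustration rather than the claimed GPU implementation.

```python
import numpy as np

def make_proxy(reference, radius):
    """Blur a 2-D reference image with a separable box filter to create
    a proxy image. radius=0 returns an unblurred copy; larger radii
    correspond to a less well-focused camera."""
    if radius <= 0:
        return reference.copy()
    kernel = np.ones(2 * radius + 1) / (2 * radius + 1)
    # Blur each row, then each column (separable convolution).
    rows = np.apply_along_axis(
        lambda r: np.convolve(r, kernel, mode="same"), 1, reference)
    return np.apply_along_axis(
        lambda c: np.convolve(c, kernel, mode="same"), 0, rows)
```

On a GPU, the same separable convolution maps naturally onto two shader passes, which is presumably the kind of graphics image processing claim 15 contemplates.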
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/562,292 US10498976B2 (en) | 2014-12-05 | 2014-12-05 | Virtual focus feedback |
US14/562,292 | 2014-12-05 | ||
PCT/US2015/062849 WO2016089712A1 (en) | 2014-12-05 | 2015-11-28 | Virtual focus feedback |
Publications (2)
Publication Number | Publication Date |
---|---|
CN107005653A true CN107005653A (en) | 2017-08-01 |
CN107005653B CN107005653B (en) | 2021-04-27 |
Family
ID=55022682
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201580066112.XA Active CN107005653B (en) | 2014-12-05 | 2015-11-28 | Device and method for performing virtual focus feedback and head-mounted display |
Country Status (6)
Country | Link |
---|---|
US (1) | US10498976B2 (en) |
EP (1) | EP3228072B1 (en) |
JP (1) | JP6718873B2 (en) |
KR (1) | KR102599889B1 (en) |
CN (1) | CN107005653B (en) |
WO (1) | WO2016089712A1 (en) |
Families Citing this family (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10552676B2 (en) | 2015-08-03 | 2020-02-04 | Facebook Technologies, Llc | Methods and devices for eye tracking based on depth sensing |
US10338451B2 (en) | 2015-08-03 | 2019-07-02 | Facebook Technologies, Llc | Devices and methods for removing zeroth order leakage in beam steering devices |
US10437061B2 (en) | 2015-08-03 | 2019-10-08 | Facebook Technologies, Llc | Near-ocular display based on hologram projection |
US10459305B2 (en) | 2015-08-03 | 2019-10-29 | Facebook Technologies, Llc | Time-domain adjustment of phase retardation in a liquid crystal grating for a color display |
US9544054B1 (en) * | 2015-08-10 | 2017-01-10 | Facebook, Inc. | Multidirectional communication system |
US10247858B2 (en) | 2015-10-25 | 2019-04-02 | Facebook Technologies, Llc | Liquid crystal half-wave plate lens |
US10416454B2 (en) | 2015-10-25 | 2019-09-17 | Facebook Technologies, Llc | Combination prism array for focusing light |
US10203566B2 (en) | 2015-12-21 | 2019-02-12 | Facebook Technologies, Llc | Enhanced spatial resolution using a segmented electrode array |
US11222519B2 (en) * | 2016-05-31 | 2022-01-11 | Ellcie-Healthy | Personal system for the detection of a risky situation, more particularly of a fall prone situation |
US10964190B2 (en) * | 2016-05-31 | 2021-03-30 | Ellcie-Healthy | Personal system for the detection of a risky situation and alert |
CN114928737B (en) | 2016-10-12 | 2023-10-27 | 弗劳恩霍夫应用研究促进协会 | Spatially unequal streaming |
US10877556B2 (en) | 2016-10-21 | 2020-12-29 | Apple Inc. | Eye tracking system |
KR102173778B1 (en) | 2017-07-25 | 2020-11-03 | 주식회사 엘지화학 | Battery management unit and a battery pack including the same |
KR102086779B1 (en) * | 2018-05-14 | 2020-03-09 | 단국대학교 산학협력단 | Focus setting modul and method, virtual video playback system including the same |
CN112261300B (en) * | 2020-10-22 | 2021-12-24 | 维沃移动通信(深圳)有限公司 | Focusing method and device and electronic equipment |
US20230283896A1 (en) * | 2022-03-03 | 2023-09-07 | Jsc Yukon Advanced Optics Worldwide | Focus indication for manual focus adjustments |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2000033569A1 (en) * | 1998-11-25 | 2000-06-08 | Iriscan, Inc. | Fast focus assessment system and method for imaging |
US20030117511A1 (en) * | 2001-12-21 | 2003-06-26 | Eastman Kodak Company | Method and camera system for blurring portions of a verification image to show out of focus areas in a captured archival image |
CN101067710A (en) * | 2006-01-20 | 2007-11-07 | 红外线解决方案公司 | Camera with visible light and infrared image blending |
CN101387734A (en) * | 2007-09-14 | 2009-03-18 | 三星电子株式会社 | Method and apparatus for auto focusing |
CN102591016A (en) * | 2010-12-17 | 2012-07-18 | 微软公司 | Optimized focal area for augmented reality displays |
CN104079908A (en) * | 2014-07-11 | 2014-10-01 | 上海富瀚微电子股份有限公司 | Infrared and visible light image signal processing method and implementation device thereof |
US20140354874A1 (en) * | 2013-05-30 | 2014-12-04 | Samsung Electronics Co., Ltd. | Method and apparatus for auto-focusing of an photographing device |
Family Cites Families (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6753919B1 (en) * | 1998-11-25 | 2004-06-22 | Iridian Technologies, Inc. | Fast focus assessment system and method for imaging |
CN101111748B (en) | 2004-12-03 | 2014-12-17 | 弗卢克公司 | Visible light and ir combined image camera with a laser pointer |
US7542210B2 (en) | 2006-06-29 | 2009-06-02 | Chirieleison Sr Anthony | Eye tracking head mounted display |
US8194995B2 (en) | 2008-09-30 | 2012-06-05 | Sony Corporation | Fast camera auto-focus |
JP2010213105A (en) * | 2009-03-11 | 2010-09-24 | Olympus Corp | Imaging apparatus |
JP5346266B2 (en) * | 2009-09-30 | 2013-11-20 | 富士フイルム株式会社 | Image processing apparatus, camera, and image processing method |
US8599264B2 (en) | 2009-11-20 | 2013-12-03 | Fluke Corporation | Comparison of infrared images |
DE102010019399B4 (en) | 2010-05-04 | 2019-02-28 | Testo Ag | Image acquisition method for IR images and thermal imaging camera |
US8654239B2 (en) | 2010-05-17 | 2014-02-18 | Flir Systems Ab | Focus ring-controlled focusing servo |
JP2012023468A (en) * | 2010-07-12 | 2012-02-02 | Ricoh Co Ltd | Imaging device |
US9304319B2 (en) * | 2010-11-18 | 2016-04-05 | Microsoft Technology Licensing, Llc | Automatic focus improvement for augmented reality displays |
US8508652B2 (en) | 2011-02-03 | 2013-08-13 | DigitalOptics Corporation Europe Limited | Autofocus method |
JP5779910B2 (en) | 2011-03-01 | 2015-09-16 | 日本電気株式会社 | Infrared camera and focus position correction method |
JP2013114123A (en) | 2011-11-30 | 2013-06-10 | Seiko Epson Corp | Transmission type display device, display method and display program |
US8736747B2 (en) | 2012-01-13 | 2014-05-27 | Sony Corporation | Camera autofocus adaptive blur matching model fitting |
JP2013165413A (en) * | 2012-02-13 | 2013-08-22 | Nikon Corp | Image display device |
US9001030B2 (en) | 2012-02-15 | 2015-04-07 | Google Inc. | Heads up display |
KR102516124B1 (en) * | 2013-03-11 | 2023-03-29 | 매직 립, 인코포레이티드 | System and method for augmented and virtual reality |
2014
- 2014-12-05 US US14/562,292 patent/US10498976B2/en active Active

2015
- 2015-11-28 JP JP2017527868A patent/JP6718873B2/en active Active
- 2015-11-28 KR KR1020177017581A patent/KR102599889B1/en active IP Right Grant
- 2015-11-28 WO PCT/US2015/062849 patent/WO2016089712A1/en active Application Filing
- 2015-11-28 EP EP15816295.8A patent/EP3228072B1/en active Active
- 2015-11-28 CN CN201580066112.XA patent/CN107005653B/en active Active
Patent Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2000033569A1 (en) * | 1998-11-25 | 2000-06-08 | Iriscan, Inc. | Fast focus assessment system and method for imaging |
US20030117511A1 (en) * | 2001-12-21 | 2003-06-26 | Eastman Kodak Company | Method and camera system for blurring portions of a verification image to show out of focus areas in a captured archival image |
CN101067710A (en) * | 2006-01-20 | 2007-11-07 | 红外线解决方案公司 | Camera with visible light and infrared image blending |
CN101387734A (en) * | 2007-09-14 | 2009-03-18 | 三星电子株式会社 | Method and apparatus for auto focusing |
CN102591016A (en) * | 2010-12-17 | 2012-07-18 | 微软公司 | Optimized focal area for augmented reality displays |
US20140354874A1 (en) * | 2013-05-30 | 2014-12-04 | Samsung Electronics Co., Ltd. | Method and apparatus for auto-focusing of an photographing device |
CN104079908A (en) * | 2014-07-11 | 2014-10-01 | 上海富瀚微电子股份有限公司 | Infrared and visible light image signal processing method and implementation device thereof |
Also Published As
Publication number | Publication date |
---|---|
JP6718873B2 (en) | 2020-07-08 |
CN107005653B (en) | 2021-04-27 |
WO2016089712A1 (en) | 2016-06-09 |
JP2018507570A (en) | 2018-03-15 |
KR102599889B1 (en) | 2023-11-07 |
US20160165151A1 (en) | 2016-06-09 |
US10498976B2 (en) | 2019-12-03 |
EP3228072B1 (en) | 2021-03-10 |
KR20170094255A (en) | 2017-08-17 |
EP3228072A1 (en) | 2017-10-11 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN107005653A (en) | Virtual focus feedback | |
JP6641361B2 (en) | Waveguide eye tracking using switched diffraction gratings | |
JP6498606B2 (en) | Wearable gaze measurement device and method of use | |
CN105359076B (en) | Multi-step virtual objects selection method and device | |
CN106662685B (en) | Waveguide eye tracking using volume Bragg gratings | |
US20140375540A1 (en) | System for optimal eye fit of headset display device | |
CN103091843B (en) | See-through display brightness control | |
CN105452994B (en) | Simultaneous preferred viewing of virtual objects | |
KR102341225B1 (en) | Eye tracking apparatus, method and system | |
CA2750287C (en) | Gaze detection in a see-through, near-eye, mixed reality display | |
US20150003819A1 (en) | Camera auto-focus based on eye gaze | |
TWI597623B (en) | Wearable behavior-based vision system | |
CN102566049B (en) | Automatic variable virtual focus for augmented reality displays | |
US20160131902A1 (en) | System for automatic eye tracking calibration of head mounted display device | |
JP2019159076A (en) | Head mounted display device, display control method and computer program | |
KR20170065631A (en) | See-through display optic structure | |
US20230209032A1 (en) | Detection, analysis and correction of disparities in a display system utilizing disparity sensing port | |
KR20230152724A (en) | Projector with field lens |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication ||
SE01 | Entry into force of request for substantive examination ||
GR01 | Patent grant ||