WO2015008654A1 - Detection device and method - Google Patents
Detection device and method
- Publication number: WO2015008654A1 (PCT/JP2014/068126)
- Authority: WO (WIPO, PCT)
- Prior art keywords: light, eyeball, detection device, light receiving, user
Classifications
- G06F3/013—Eye tracking input arrangements
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- A61B3/113—Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions, for determining or recording eye movement
- G06F1/163—Wearable computers, e.g. on a belt
- G06F3/0325—Detection arrangements using opto-electronic means, using a plurality of light emitters or reflectors or a plurality of detectors forming a reference frame from which to derive the orientation of the object
- G06V40/19—Sensors for eye characteristics, e.g. of the iris
- A61B5/02416—Detecting, measuring or recording pulse rate or heart rate using photoplethysmograph signals, e.g. generated by infrared radiation
- A61B5/6821—Detecting, measuring or recording means specially adapted to be attached to or worn on the body surface: the eye
- G02B2027/0187—Display position adjusting means not related to the information to be displayed, slaved to motion of at least a part of the body of the user, e.g. head, eye
Definitions
- the present technology relates to a detection device and method, and more particularly, to a detection device and method that can improve operability with a simple configuration.
- a user interface for moving a cursor or pointer on a screen requires some means of detecting the user's operation.
- known methods for detecting such an operation include photographing an operating part such as the user's limb or finger with a camera and detecting its movement from the position of the operating part in the image, and detecting the movement from the signal of a gyro sensor attached to the user's limb or finger.
- in either case, an external detector such as a camera or a gyro sensor is required.
- as a method of detecting the movement of the eyeball, a search coil method is known that utilizes the fact that a potential proportional to the angle between a magnetic field and a coil is generated in a coil placed in the magnetic field.
- in the search coil method, a detection coil is incorporated into a contact lens, which is attached to the eyeball. Externally, magnetic coils that apply horizontal and vertical magnetic fields are provided, and the movement of the eyeball is detected by detecting the electromotive force induced in the detection coil in the contact lens by the externally applied magnetic fields.
- an EOG (electrooculography) method is also known, in which electrodes are attached around the eye and the fact that the cornea has a positive potential of 10 to 30 μV relative to the retina is utilized.
- in addition, the scleral reflection method, the corneal reflection method, and the pupil corneal reflection method are known as methods for measuring the position and movement of the eyeball.
- the scleral reflection method detects eye movement by photographing infrared light reflected by the eyeball with an externally prepared camera, utilizing the fact that the reflectance of infrared light directed at the eye differs between the white of the eye and the dark part of the eye.
- the corneal reflection method utilizes the fact that the virtual image of an infrared LED (Light Emitting Diode) directed at the eye, formed on the cornea, moves in parallel with the eye movement because the center of rotation of the cornea differs from that of the eyeball; eye movement is detected by photographing this virtual image of the infrared LED light reflected by the eyeball with an externally prepared camera.
- the pupil corneal reflection method has the same basic principle as the corneal reflection method but differs in that it takes the center of the pupil as its reference. That is, in the pupil corneal reflection method the center of the pupil is detected with an externally prepared camera, and eye movement is detected from its difference from the position of the virtual image of the infrared LED light.
- meanwhile, a contact lens type display device has been proposed as a small image display device (see, for example, Patent Document 1). Since this display device is used while mounted on the user's eyeball, it can present images to the user wherever the user is.
- however, a user interface that moves a cursor or pointer by any of the above-described methods requires an external device for detecting the operation.
- a contact lens type display device is used wirelessly because it is attached to the user's eyeball; requiring an external detection device to operate a cursor or pointer would force the user to carry extra equipment, which would be a burden.
- in the method of detecting motion photographed by a camera, the user must stay within the camera's angle of view, which limits the user's range of action and makes it difficult to take the display device outdoors.
- furthermore, if the user's operation occupies only a small part of the camera image, relatively few pixels are available for detecting the user's movement, and the detection accuracy falls.
- since a gyro sensor detects relative movement, the method of detecting the user's movement with a gyro sensor requires the reference position to be specified for each operation.
- the search coil method requires external magnetic coils that apply horizontal and vertical magnetic fields.
- moreover, because it uses the electromotive force generated as the detection coil moves relative to the field emitted from the magnetic coils, the user's head must be fixed so that it does not move relative to the magnetic coils.
- the EOG method has a wide detection range and can detect the movement of the eyeball even when the user closes the eyes, but it is vulnerable to external electromagnetic noise and cannot achieve a detection accuracy of 1 degree or better.
- the scleral reflection method, the corneal reflection method, and the pupil corneal reflection method all place a small burden on the human body, but they require an external camera.
- since these methods are also vulnerable to the influence of ambient light, an environment with little disturbance light must be prepared to improve detection accuracy.
- thus, the above-described techniques cannot improve the operability of a contact lens type display device with a simple configuration that does not use an external detection device.
- the present technology has been made in view of such a situation, and is intended to improve operability with a simple configuration.
- a detection device according to one aspect of the present technology is a detection device that can be attached to an eyeball and includes a light receiving element that receives light incident from the eyeball.
- the detection device may further include a light emitting element that outputs light, and the light receiving element may be provided in the vicinity of the light emitting element.
- the light emitting element can be composed of a plurality of light emitting parts, and the light receiving element can be provided in the vicinity of the light emitting part.
- the detection device can further include a signal processing unit that detects the amount of light received by a plurality of the light receiving elements arranged in each region of the detection device, the light receiving elements receiving light that is output from the light emitting part and reflected by the eyeball.
- the light emitting unit can be a display pixel for displaying information.
- the detection device can be configured to cover the entire cornea when attached to the eyeball.
- at least one of the light emitting unit and the light receiving element can be provided in a region of the detection device that faces the range over which the pupil of the eyeball can move.
- the width in the horizontal direction over which the detection device covers the eyeball can be made greater than the width in the vertical direction.
- An element different from the light emitting element and the light receiving element can be disposed in the vicinity of the lateral end of the detection device.
- the detection device may have a structure for fixing the detection device to a head having the eyeball.
- the signal processing unit can determine the orientation of the eyeball based on the amount of light received by the plurality of light receiving elements.
- the signal processing unit can calculate the amount of convergence of the left and right eyes based on the orientation of the eyeball and the orientation of the eyeball paired with it, and can calculate the distance to the object being gazed at based on the amount of convergence.
- the signal processing unit can determine the diameter of the pupil of the eyeball based on the amount of light received by the plurality of light receiving elements.
- the signal processing unit can detect the state of the living body based on the amount of light received by the plurality of light receiving elements.
- the light emitting unit can irradiate the eyeball with light of a predetermined wavelength, or sequentially with light of a plurality of different wavelengths, and the signal processing unit can detect the state of the living body based on the amount of the light of the predetermined wavelength, or of the plurality of different wavelengths, received by the light receiving element.
- the light emitting unit may be a display pixel that displays information, and the light emitting unit may irradiate the eyeball with the light of the predetermined wavelength or of the plurality of different wavelengths after a period of displaying the information.
- a detection method according to one aspect of the present technology is a detection method for a detection device that can be attached to an eyeball and that includes a light receiving element receiving light incident from the eyeball and a signal processing unit detecting the amount of light received by the light receiving element; the method includes a light receiving step in which the light receiving element receives the light from the eyeball.
- the detection method may further include a light emitting step in which a light emitting element provided in the detection device outputs light, and in the light receiving step the light receiving element can receive light that is output from the light emitting element and reflected by the eyeball.
- the detection method may further include a calculation step in which the signal processing unit obtains the orientation of the eyeball based on the amount of light received by a plurality of the light receiving elements.
- in the calculation step, the signal processing unit can calculate the amount of convergence of the left and right eyes based on the orientation of the eyeball and the orientation of the eyeball paired with it, and can calculate the distance to the gazed object based on the amount of convergence.
- in one aspect of the present technology, a detection device that can be attached to an eyeball is provided with a light receiving element that receives light incident from the eyeball, and light reflected by the eyeball is received by the light receiving element.
- operability can be improved with a simple configuration.
- the present technology relates to a contact lens type display device.
- the contact lens type display device is mounted on the user's eyeball and used wirelessly, so the user can move around freely while wearing it and using its display function.
- however, performing selection and movement operations, such as moving a cursor or pointer over information in the displayed screen, with an external device such as a camera or detector places a burden on the user or imposes restrictions.
- in the present technology, therefore, light that is emitted from the display element and reflected at the eyeball surface is detected by the light receiving element.
- reflected light is detected at the white of the eye and the iris on the surface of the eyeball, whereas at the pupil the light passes into the eyeball and little is reflected. Therefore, the portion where the reflected light is weak is detected as the pupil, and the line of sight is detected from the movement of the detected pupil.
- the contact lens type display device is attached to the user's eyeball as shown in FIG. 1.
- a contact lens type display device 11 is mounted on the surface of the user's eyeball EY11.
- the display device 11 has a shape that can be attached to and detached from the user's eyeball EY11 like a so-called contact lens.
- Such a display device 11 is configured as shown in FIG. 2, for example.
- the display device 11 includes a display area 21, a power feeding antenna 22, a signal antenna 23, a power generation unit 24, a sensor 25, a signal processing unit 26, and a display element driving unit 27.
- FIG. 2 is a view of the display device 11 viewed from the left to the right in FIG. 1, that is, a view of the user wearing the display device 11 from the front.
- in this example, the display device 11 has a circular shape overall.
- the display area 21 includes a display element, made up of a plurality of display pixels that display information such as images and characters to be presented to the user, and light receiving elements that are arranged adjacent to the display pixels and receive light reflected from the surface of the user's eyeball.
- the feeding antenna 22 is provided so as to surround the display area 21 and receives an induced electromotive force due to a magnetic field or an electric field supplied from the outside.
- the signal antenna 23 transmits information supplied from the signal processing unit 26, such as the result of a user interface operation performed based on the user's line of sight, to the outside, and receives information transmitted from the outside, such as information to be displayed on the display pixels, and supplies it to the signal processing unit 26.
- the power generation unit 24 obtains and stores electric power by rectifying the induced current generated in the power supply antenna 22 by electromagnetic induction caused by an external magnetic field or the like, and supplies the electric power to each unit of the display device 11.
- note that the display device 11 does not have to be provided with the power feeding antenna 22.
- the sensor 25 is composed of a gyro sensor, a gravity sensor, or the like, detects the posture or movement of the user wearing the display device 11, and supplies the detection result to the signal processing unit 26. For example, the sensor 25 detects the movement of the user's head.
- the signal processing unit 26 controls the entire display device 11. For example, based on the signals supplied from the light receiving elements in the display area 21, the signal processing unit 26 detects the differences in the amount of light received by the light receiving elements arranged in each area of the display device 11, and thereby detects the user's line of sight. In addition, the signal processing unit 26 controls the display element driving unit 27 to display images in the display area 21 based on the detection result supplied from the sensor 25, the line-of-sight detection result, the information received by the signal antenna 23, and the like.
- specifically, for example, when the display device 11 rotates with respect to the eyeball, the signal processing unit 26 controls the display element driving unit 27 to rotate the image displayed in the display area 21 by the amount of rotation supplied from the sensor 25, in the direction opposite to the rotation of the display device 11. As a result, even if the display device 11 rotates on the user's eyeball, the resulting rotation of the image can be corrected and the image can be presented to the user in an easy-to-view manner.
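- as a rough illustration of this counter-rotation (a sketch only, not the patent's implementation), the frame destined for the display pixels can simply be rotated by the negative of the sensor-reported angle; the degree convention and the use of scipy.ndimage here are assumptions.

```python
import numpy as np
from scipy import ndimage

def compensate_rotation(frame: np.ndarray, device_rotation_deg: float) -> np.ndarray:
    """Counter-rotate the displayed frame against the lens rotation on the eye.

    frame: H x W (or H x W x C) pixel array destined for the display pixels.
    device_rotation_deg: rotation of the display device 11 relative to the
        eyeball, as estimated from sensor 25 (assumption: in degrees).
    """
    # Rotate the image by the same amount in the opposite direction so it
    # stays upright for the user.
    return ndimage.rotate(frame, -device_rotation_deg, reshape=False, order=1)
```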
- the display element driving unit 27, under the control of the signal processing unit 26, drives the display element in the display area 21 to display images, and supplies the signals output by the light receiving elements in the display area 21 to the signal processing unit 26. Hereinafter, the signal output by a light receiving element in the display area 21 according to the amount of light it has received is referred to as a light reception signal.
- the display area 21 of the display device 11 is configured as shown in FIG. 3, for example. FIG. 3 shows a part of the cross section of the display device 11 as seen in the depth direction in FIG. 1.
- the display area 21 of the display device 11 includes display pixels 51-1 to 51-7 that display information such as images, and light receiving elements 52-1 to 52-7 that receive reflected light incident from the surface of the user's eyeball.
- the single display device composed of the display pixels 51-1 to 51-7 is referred to as the display element 53.
- the display pixels 51-1 to 51-7 are also simply referred to as display pixels 51 when it is not necessary to distinguish them.
- the light receiving elements 52-1 to 52-7 are also simply referred to as the light receiving elements 52 when it is not necessary to distinguish them.
- the display element 53 includes, for example, a liquid crystal display element or an organic electroluminescence (OLED (Organic Light Emitting Diode)) display element.
- the display pixels 51 and the light receiving elements 52 are arranged alternately in the vertical direction on the right side of the display device 11 in the drawing, that is, on the user's eyeball side. Accordingly, in FIG. 2 for example, the display pixels 51 and the light receiving elements 52 are arranged alternately in both the vertical and horizontal directions of the figure in the display area 21.
- a lubricating layer 54 is provided on the left side of the display pixels 51 and the light receiving elements 52 in the drawing, that is, on the outer side of the display device 11.
- the lubricating layer 54 is made of, for example, a transparent synthetic resin, and allows the user's eyelid to move smoothly when the user wears the display device 11 on the eye.
- in the example of FIG. 3 the display pixels 51 and the light receiving elements 52 are in close contact with each other, but they do not necessarily have to be, and a gap may be provided between them.
- one light receiving element 52 is provided for one display pixel 51, but one light receiving element 52 may be provided for a plurality of display pixels 51.
- the display device 11 attached to the user's eyeball EY11 is provided with display pixels 51-1 to 51-11 and light receiving elements 52-1 to 52-12.
- here, the part of the display area 21 that faces a portion of the user's eyeball EY11 other than the pupil BE11, for example the white of the eye or the iris, is referred to as region A, and the part of the display area 21 that faces the pupil BE11 is referred to as region B.
- when each display pixel 51 provided in regions A and B emits light, the emitted light travels toward the eyeball EY11, as shown by the solid arrows in the figure, and reaches the eyeball EY11.
- for example, since the pupil BE11 is transparent, light entering the pupil BE11 out of the light output from the display pixels 51 is hardly reflected there; it reaches the retina inside the eyeball EY11 and is absorbed. In other words, in region B the light output from the display pixels 51 is absorbed by the retina with almost no reflection at the surface of the eyeball EY11, so in region B the light output from the display pixels 51 is hardly detected by the light receiving elements 52.
- in this way, from the difference in the amount of light detected by the light receiving elements 52, the orientation of the eyeball EY11, which indicates the direction in which the eyeball EY11 (pupil BE11) is directed, that is, the direction of the user's line of sight, can be specified.
- if the direction of the user's line of sight at each time can be specified, the movement of the eyeball, that is, the movement of the line of sight, can be detected, and the user's psychological state and emotions can be estimated from that movement.
- when the user's eyes are closed, only light that is output from the display pixels 51 and reflected by the eyeball EY11 enters the light receiving elements 52.
- when the user's eyes are open, in addition to the light output from the display pixels 51, ambient light that has entered the eyeball EY11 from the outside through the display area 21 and been reflected by the eyeball EY11 also enters the light receiving elements 52.
- even then, ambient light incident on an opaque part of the eyeball EY11, such as the white of the eye or the iris, is reflected by the eyeball EY11 and enters the light receiving elements 52, whereas most of the ambient light incident on the pupil BE11 passes through the pupil and reaches the retina. That is, ambient light incident on the pupil BE11 is hardly reflected, so the amount of ambient light received there by the light receiving elements 52 is small. Region A and region B can therefore be distinguished regardless of whether the user's eyes are open or closed.
- when the user's eyeball is in the state indicated by arrow Q11, for example, the signal processing unit 26 obtains the light reception signal map indicated by arrow Q12.
- the light reception signal map is image data indicating the amount of light received by each light receiving element 52.
- the shading of each circle on the light reception signal map indicates the value of the light reception signal output by the light receiving element 52 at the corresponding position on the display area 21.
- the brighter a circle on the light reception signal map, the greater the amount of light received by the corresponding light receiving element 52 and the greater the value of its light reception signal.
- the circles corresponding to the light receiving elements 52 are arranged on the light reception signal map in the vertical and horizontal directions, matching the arrangement of the light receiving elements 52 in the display area 21.
- in the state indicated by arrow Q11, the pupil BE11 is directed leftward in the figure, so the user's line of sight points substantially straight ahead. In the light reception signal map indicated by arrow Q12, the roughly central region corresponding to the pupil BE11 is therefore darker than its surroundings, and the darkened region has the same circular shape as the pupil BE11; this is because, as described above, almost no reflection occurs at the pupil BE11.
- the brighter area around the central region corresponding to the pupil BE11 indicates the parts of the eyeball EY11, such as the white of the eye and the iris, where the amount of reflected light is large.
- therefore, the center position of the pupil BE11, that is, the direction of the line of sight, can easily be calculated from the light reception signal map.
- the detection result of the user's line-of-sight direction is expressed, for example, as the position, in the region of the display device 11 in contact with the eyeball EY11, toward which the center of the pupil BE11 is directed, that is, the position in contact with the center of the pupil BE11 (hereinafter also referred to as the line-of-sight position). In the example indicated by arrow Q11, for instance, the position at the approximate center of the display device 11 is the line-of-sight position.
- the display device 11 can also calculate the diameter of the pupil BE11 from the light reception signal map. For example, the region where the value of the light reception signal is equal to or less than a predetermined value is taken as the region of the pupil BE11, and the diameter of that region as the diameter of the pupil BE11.
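- a minimal sketch of this pupil extraction, assuming the light reception signal map is held as a 2-D array with one normalized value per light receiving element 52; the threshold and the function names are illustrative, not taken from the patent.

```python
import numpy as np

def pupil_from_map(signal_map: np.ndarray, threshold: float = 0.3):
    """Estimate the pupil center and diameter from a light reception signal map.

    signal_map: 2-D array of light reception signal values (assumption:
        normalized to [0, 1], so the weakly reflecting pupil region is low).
    threshold: level at or below which an element is treated as facing the
        pupil (illustrative value).
    """
    pupil = signal_map <= threshold              # weak reflection = pupil
    ys, xs = np.nonzero(pupil)
    if xs.size == 0:
        return None                              # no pupil region found
    center = (xs.mean(), ys.mean())              # centroid = line-of-sight position
    # Treat the dark region as a circle and recover its diameter from its area.
    diameter = 2.0 * np.sqrt(pupil.sum() / np.pi)
    return center, diameter
```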
- similarly, when the user's eyeball is in the state indicated by arrow Q13, the signal processing unit 26 obtains the light reception signal map indicated by arrow Q14.
- in the state indicated by arrow Q13, the pupil BE11 faces slightly upward in the figure, so the user's line of sight points slightly upward. In the light reception signal map indicated by arrow Q14, the region corresponding to the pupil BE11 is accordingly located slightly higher than in the map indicated by arrow Q12.
- in this way, the signal processing unit 26 can determine the line-of-sight position, that is, the orientation of the eyeball EY11, by detecting the position of the pupil BE11 from the light reception signal map obtained from the light reception signals output by the light receiving elements 52. Furthermore, from the orientation of the eyeball EY11, the signal processing unit 26 can calculate the vertical and horizontal position of the gazed point on the image displayed in the display area 21 or in real space.
- with the display device 11, the distance to the object the user is gazing at can also be calculated. Suppose, for example, that the user gazes at a predetermined object OB11, as shown in the figure.
- the display device DP11L and the display device DP11R are devices corresponding to the display device 11, respectively.
- the object OB11 is at the gaze position AT11, and the object OB12 is at the gaze position AT12.
- the distance from the user to the gaze position AT11 is longer than the distance from the user to the gaze position AT12. That is, the gaze position AT11 is farther from the user than the gaze position AT12.
- suppose first that the user is gazing at the object OB11, which is roughly straight ahead at the gaze position AT11.
- in this case, the angle formed by the straight line connecting the center of the pupil BE21L of the user's left eyeball EY21L with the object OB11 and the straight line connecting the center of the pupil BE21R of the right eyeball EY21R with the object OB11 is the convergence angle of the user viewing the object OB11. The convergence angle indicates the amount of convergence of the user's left and right eyes.
- when the user gazes at the object OB11, the left-eye light reception signal map RM11L is obtained in the display device DP11L mounted on the left eyeball EY21L, and the right-eye light reception signal map RM11R is obtained in the display device DP11R mounted on the right eyeball EY21R.
- next, suppose the user is gazing at the object OB12, which is also roughly straight ahead, at the gaze position AT12.
- in this case, the angle formed by the straight line connecting the center of the pupil BE21L of the left eyeball EY21L with the object OB12 and the straight line connecting the center of the pupil BE21R of the right eyeball EY21R with the object OB12 is the convergence angle of the user viewing the object OB12.
- when the user gazes at the object OB12, the left-eye light reception signal map RM12L is obtained in the display device DP11L mounted on the left eyeball EY21L, and the right-eye light reception signal map RM12R is obtained in the display device DP11R mounted on the right eyeball EY21R.
- the shorter the distance from the user to the gazed object, the greater the user's convergence angle when viewing that object.
- for example, the convergence angle when viewing the object OB12 is larger than the convergence angle when viewing the object OB11.
- as the convergence angle changes, the user's pupil position (line-of-sight position) in the light reception signal map also changes.
- in the light reception signal maps, it can be seen that the pupil position when looking at the object OB12 is located more toward the inside (the center side of the user) than the pupil position when looking at the object OB11.
- in the display device 11, the convergence angle of the user's left and right eyes can thus be calculated from the orientations of the paired left and right eyeballs obtained from the light detection results of the light receiving elements 52, that is, from the pupil positions on the light reception signal maps, and from the obtained convergence angle the distance to the gazed object and its vertical and horizontal position can be determined.
- if the distance to the gazed object can be obtained in this way, positions can be distinguished in the depth direction as well as in the left-right direction, so that, for example, when images or buttons with parallax are displayed to the left and right eyes, an operation with depth can be realized.
- the convergence angle may be calculated, for example, using the light reception signal maps obtained from the light reception signals of the light receiving elements 52 of both the display device DP11L and the display device DP11R, or it may be calculated from the light reception signal map of one eye alone.
- in the former case, for example, the display device DP11L communicates with the display device DP11R and receives the light reception signal map obtained by the display device DP11R. The display device DP11L then calculates the convergence angle from the light reception signal map it obtained itself and the one received from the display device DP11R, and transmits the obtained convergence angle to the display device DP11R. In calculating the convergence angle, the eyeball orientation indicated by the pupil position in the left-eye map and the eyeball orientation indicated by the pupil position in the right-eye map are used.
- in the display device DP11L, the signal processing unit 26 calculates the convergence angle, and the signal antenna 23 transmits and receives it. The signal processing unit 26 also obtains, as necessary, the distance to the gazed object and its vertical and horizontal position from the convergence angle; these may likewise be transmitted from the display device DP11L to the display device DP11R.
- when the convergence angle is calculated based on the pupil position in the left-eye or right-eye light reception signal map alone, the convergence angles obtained by the display device DP11L and the display device DP11R are different angles if the object is not directly in front of the user; that is, the convergence angle is asymmetric between left and right. Even so, the distance to the object and its left-right and up-down position can be determined from the left and right convergence angles.
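- the patent states no formulas for this triangulation, but for a target near the user's midline it can be sketched as follows; the interpupillary distance, the angle convention, and the names are illustrative assumptions.

```python
import math

IPD_MM = 63.0  # assumed interpupillary distance; it varies per user

def convergence_angle(theta_left: float, theta_right: float) -> float:
    """Convergence angle from the horizontal gaze angles of the two eyes.

    theta_left, theta_right: horizontal gaze direction of each eye in
    radians; 0 means straight ahead, positive means rotated toward the nose.
    """
    return theta_left + theta_right  # angle between the two lines of sight

def distance_to_target(theta_left: float, theta_right: float) -> float:
    """Distance (mm) to the gazed object, assuming it lies near the midline."""
    angle = convergence_angle(theta_left, theta_right)
    # Isosceles triangle over the interocular baseline: half the baseline
    # divided by the tangent of half the convergence angle.
    return (IPD_MM / 2.0) / math.tan(angle / 2.0)
```

- for example, with each eye rotated about 1.8 degrees inward, the sketch returns roughly one metre; for an off-midline target the left and right angles differ, as noted above, but the same triangle can still be solved from the two angles and the baseline.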
- another device may calculate the convergence angle based on the light reception signal map obtained by the display device DP11L and the light reception signal map obtained by the display device DP11R.
- in that case, for example, a control device 81 communicates with the display device DP11L and the display device DP11R and receives their light reception signal maps.
- the control device 81 calculates the convergence angle based on the light reception signal maps received from the display device DP11L and the display device DP11R, and transmits the obtained convergence angle to the display device DP11L and the display device DP11R.
- the control device 81 may also calculate the distance to the gazed object, the position of the object, and the like based on the convergence angle, and transmit them to the display device DP11L and the display device DP11R.
- as described above, the display device 11 can detect the movement of the eyeball, that is, the orientation of the eyeball (line-of-sight position) at each time, by directly receiving the reflected light from the eyeball.
- in particular, by providing a plurality of light receiving elements 52 in the display area 21, minute movements of the eyeball can be detected accurately.
- <Detection of microsaccades>
- the direction of these fine movements of the eyeball is not random: even when the line of sight is pointing elsewhere, it is biased toward the object to which the person is covertly paying attention, so microsaccades express a person's hidden thoughts and desires.
- if microsaccades are detected by the display device 11, it is therefore possible to identify not only the object the user is gazing at but also objects in which the user is potentially interested.
- specifically, the signal processing unit 26 detects the orientation of the eyeball, that is, the line-of-sight position, at each time based on the light reception signal map at each time. If essentially the same line-of-sight position is detected at almost all times within a predetermined period, that line-of-sight position is the position the user is gazing at, that is, the position of the object the user is fixating.
- the signal processing unit 26 then takes, from among the line-of-sight positions within the predetermined period that differ from the fixation position, the position at the time when the movement of the eyeball orientation was largest, and regards it as the endpoint of a microsaccade, that is, as the position of the user's potential attention.
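- a rough sketch of this selection, under the assumption that line-of-sight positions are available as uniformly sampled 2-D coordinates; the median-based fixation estimate and the tolerance value are illustrative choices, not the patent's.

```python
import numpy as np

def detect_microsaccade(gaze_xy: np.ndarray, fixation_tol: float = 1.0):
    """Pick a candidate microsaccade endpoint from a window of gaze samples.

    gaze_xy: (N, 2) array of line-of-sight positions over a predetermined
        period (assumption: uniformly sampled).
    fixation_tol: radius within which samples count as the fixation position
        (illustrative units: one light receiving element pitch).
    Returns the potential-attention position, or None if none is seen.
    """
    fixation = np.median(gaze_xy, axis=0)               # dominant gaze position
    dist = np.linalg.norm(gaze_xy - fixation, axis=1)
    away = dist > fixation_tol                          # samples off fixation
    if not away.any():
        return None
    # Movement of the eyeball orientation into each sample (frame-to-frame step).
    step = np.append(0.0, np.linalg.norm(np.diff(gaze_xy, axis=0), axis=1))
    # Among the off-fixation samples, take the one reached with the largest
    # movement as the endpoint of the microsaccade.
    idx = np.flatnonzero(away)[np.argmax(step[away])]
    return gaze_xy[idx]
```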
- the display device 11 can also detect the state of the living body.
- the display device 11 can detect the heartbeat of the user as the state of the living body.
- the principle of detecting pulsation will be described.
- the display pixel 51 outputs light having a predetermined wavelength, and the light receiving element 52 receives reflected light generated by reflection of the light on the eyeball surface. Then, the signal processing unit 26 detects the pulsation of the heartbeat of the user wearing the display device 11 based on the value of the light reception signal supplied from the light receiving element 52 via the display element driving unit 27.
- as shown in FIG. 8, the pulsation of the heartbeat occurs periodically, the duration of each pulse is short compared with the period, and blood flow arises at the timing of the pulsation.
- in FIG. 8, the horizontal axis indicates time, and the vertical axis indicates the value of the light reception signal, that is, the blood flow.
- deoxyhemoglobin has the higher extinction coefficient at wavelengths shorter than 805 nm, while oxyhemoglobin has the higher extinction coefficient at wavelengths longer than 805 nm.
- the signal processing unit 26 therefore controls the display element driving unit 27 so that light of a predetermined wavelength shorter than 805 nm and light of a predetermined wavelength longer than 805 nm are output sequentially (alternately) from the display pixels 51, and causes the light receiving elements 52 to receive the light output from the display pixels 51 and reflected at the eyeball surface.
- here, the light having a wavelength shorter than 805 nm may be visible light.
- the signal processing unit 26 then takes the difference between the light reception signal value obtained when the short-wavelength light is output and the value obtained when the long-wavelength light is output, and from this difference determines whether the blood contains more oxyhemoglobin or more deoxyhemoglobin. Furthermore, based on this determination and on the change of the light reception signal value at each time over a predetermined period, that is, the temporal variation of the intensity of the reflected light received by the light receiving elements 52, the signal processing unit 26 detects the blood flow (the change in blood flow) and obtains the pulsation from the blood flow detection result.
- hemoglobin in blood has a strong absorption spectrum for light in a specific wavelength band, and the light reflected from blood (blood vessels) irradiated with light in that band varies with the amount of hemoglobin, which changes as the volume of the blood vessels changes. Therefore, the blood flow can be detected from the intensity of the light reflected from the surface of the eyeball (its capillaries).
- the blood flow volume itself may be detected by the signal processing unit 26 as the state of the living body.
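- as an illustration of turning this reflected-light time series into a pulse rate, the sketch below detrends the per-frame signal and finds the dominant period by autocorrelation; the sampling rate and the plausible-rate bounds are assumptions, since the patent only states that the pulsation is obtained from the blood flow variation.

```python
import numpy as np

def pulse_rate_bpm(reflected: np.ndarray, fs: float = 60.0) -> float:
    """Estimate the pulse rate from reflected-light intensity over time.

    reflected: 1-D array, mean light reception signal per frame (a
        photoplethysmographic signal picked up at the eyeball surface);
        assumption: several seconds of data.
    fs: sampling rate in Hz (assumption: one map per display frame).
    """
    x = reflected - reflected.mean()                    # remove the DC level
    ac = np.correlate(x, x, mode="full")[len(x) - 1:]   # autocorrelation, lag >= 0
    # Search the physiologically plausible pulse range of 40-180 beats/minute.
    lo, hi = int(fs * 60 / 180), int(fs * 60 / 40)
    lag = lo + int(np.argmax(ac[lo:hi]))                # dominant period in frames
    return 60.0 * fs / lag
```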
- the display device 11 can detect the degree of redness of the eyeball.
- Eye redness is a phenomenon in which the blood vessels on the surface of the white eye dilate and float up due to some influence, causing the eyes to appear red, and the amount of blood flowing at the time of redness is greater than normal.
- for example, when the surface of the eye develops conjunctivitis due to infection by bacteria or viruses, the eye becomes red.
- also, when the eyes are overworked, for example by personal computers, video games, or excessive reading, or by the effects of contact lenses, and when the eyes are not adequately rested, the blood vessels dilate as blood is sent to the eyes to supply oxygen and nutrients, and the eyes become congested.
- the signal processing unit 26 of the display device 11 performs the same processing as described above for heartbeat detection to detect the blood flow in the user's eyeball, and obtains the degree of hyperemia by comparing the detected blood flow with the normal blood flow. By comparing the blood flows, for example by taking their difference, it can be detected that more blood than normal is flowing.
- the normal blood flow volume may be a predetermined value or may be a value obtained from a past blood flow volume.
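- a minimal sketch of that comparison; the ratio-based score and its interpretation are illustrative assumptions.

```python
def redness_degree(current_flow: float, baseline_flow: float) -> float:
    """Degree of eye redness as the relative excess of blood flow.

    baseline_flow: normal blood flow, either a predetermined value or one
        derived from past measurements (assumption: same units as current_flow).
    Returns 0.0 when the current flow is at or below the baseline.
    """
    return max(0.0, (current_flow - baseline_flow) / baseline_flow)
```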
- if the display device 11 detects the degree of redness of the user's eyes, it can determine that the eyeball is in an abnormal state due to fatigue or illness.
- if the display device 11 detects the state of the living body (the state of the human body), such as the heart rate and the degree of hyperemia, the detection results can be used by various application programs.
- for example, when the display device 11 detects from an increase in heart rate that the user is agitated or excited, it becomes possible to identify the object causing the agitation or excitement from the detection results for the user's line of sight (eyeball movement) and microsaccades.
- as shown in the figure, the display device 11 is configured to cover a range wider than the cornea of the eyeball.
- the hatched portion in the eyeball EY31 represents the cornea CO11.
- the display device 11 is sized and shaped so that the entire cornea CO11 is covered by the display device 11 when the display device 11 is attached to the eyeball EY31.
- the pupil BE11 of the eyeball EY11 is also facing the front.
- the same reference numerals are given to the portions corresponding to those in FIG. 4, and description thereof will be omitted as appropriate.
- the pupil BE11 faces the front, and the light output from each display pixel 51 of the display device 11 travels toward the eyeball EY11.
- solid arrows indicate light that has reached the retina through the pupil BE11 out of the light output from the display pixel 51.
- the dotted arrows represent light output from the display pixels 51 that does not pass through the pupil BE11 but instead reaches the surface of the eyeball EY11, where it is absorbed or reflected.
- it can thus be seen that light output from the display pixels 51 in the region of the display device 11 facing the pupil BE11 reaches the retina, while light output from the display pixels 51 in regions facing portions other than the pupil BE11 does not reach the retina.
- the lower part of the figure shows the state of the eyeball EY11 when the user looks upward.
- in this case, compared with the upper example in the figure, the pupil BE11 has moved toward the edge (periphery) of the display device 11, and it can be seen that light output from the display pixels 51 near the upper end of the display device 11 passes through the pupil BE11 and reaches the retina. It can also be seen that light output from the display pixels 51 near the center of the display device 11 does not pass through the pupil BE11 but is reflected or absorbed at the surface of the eyeball EY11.
- in the display device 11, therefore, the display pixels 51 and the light receiving elements 52 are provided over the entire region facing the range within which the pupil BE11 can move. In other words, display pixels 51 and light receiving elements 52 are arranged in the vicinity of the region of the display device 11 facing the pupil BE11 no matter which direction the eyeball EY11 faces.
- as a result, whichever direction the eyeball faces, the display pixels 51 can present information to the user and the light receiving elements 52 can detect the user's line-of-sight position. In this case, light for displaying the image is emitted over the user's entire field of view, and the image is displayed across the entire field of view.
- note that it suffices for the display pixels 51 and the light receiving elements 52 to be provided in an area around the center of the display device 11 that is wider than the pupil movement range; they do not necessarily have to be provided over the entire range in which the eyeball moves.
- the display device 11 performs a calibration process so that the positional relationship between the user's line-of-sight position and the position of the information displayed in the display area 21 can be corrected correctly.
- This calibration process is started, for example, when the user wears the contact lens type display device 11 on the eyeball.
- in step S11, the signal processing unit 26 controls the display element driving unit 27 to cause the display pixels 51 to emit light.
- the display pixel 51 emits light according to the control of the display element driving unit 27 and outputs light for displaying a predetermined image.
- in step S12, the light receiving elements 52 start detecting light incident from the eyeball. That is, each light receiving element 52 receives light that has entered from outside the display device 11 or from the display pixels 51 and been reflected at the eyeball surface, photoelectrically converts it, and supplies the resulting light reception signal, corresponding to the amount of received light, to the signal processing unit 26 via the display element driving unit 27.
- in step S13, the signal processing unit 26 controls the display element driving unit 27 to display the calibration positioning image on the display pixels 51.
- the display pixels 51 display the calibration positioning image by emitting light according to the control of the display element driving unit 27.
- the calibration positioning image is, for example, an image of a calibration mark, and the calibration positioning images are displayed in order at a total of five positions: the center of the display area 21 and above, below, left, and right of it.
- specifically, in step S13 the signal processing unit 26 selects, from among the center and the upper, lower, left, and right positions, one position at which the calibration positioning image has not yet been displayed, and displays the calibration positioning image at the selected position.
- as necessary, a message prompting the user to look at the calibration positioning image and perform the position fixing operation may be displayed together with the calibration positioning image.
- when the calibration positioning image is displayed, light for displaying it is output from the display pixels 51, and part of that light is reflected at the eyeball surface and received by the light receiving elements 52. The light receiving elements 52 then supply light reception signals corresponding to the amount of received light to the signal processing unit 26 via the display element driving unit 27.
- the user performs a predetermined position fixing operation, such as turning the line of sight toward the calibration positioning image and gazing at it for a predetermined time, or blinking.
- in step S14, the signal processing unit 26 obtains the user's line-of-sight position based on the light reception signals supplied from the light receiving elements 52.
- for example, when the position fixing operation is gazing at the same position for a predetermined time or longer, the signal processing unit 26 generates light reception signal maps from the light reception signals and obtains the user's line-of-sight position at each time from the obtained maps.
- the signal processing unit 26 then takes, among the line-of-sight positions obtained at each time, the position detected continuously for the predetermined time or longer as the line-of-sight position for the calibration positioning image, that is, the line-of-sight position while the user was looking at the calibration positioning image.
- when the position fixing operation is blinking, the signal processing unit 26 generates light reception signal maps from the light reception signals, detects the user's blinks based on the maps at each time, and also obtains the user's line-of-sight position at each time.
- the signal processing unit 26 then takes the user's line-of-sight position at the time a blink was detected as the line-of-sight position for the calibration positioning image.
- blinks are detected based on, for example, the intensity of light detected by the light receiving elements 52, that is, the value at each position of the light reception signal map (the value of the light reception signal).
- the light received by the light receiving elements 52 includes ambient light in addition to the light from the display pixels 51.
- the intensity of the light received by the light receiving elements 52 therefore differs between when the user's eyes are open and when they are closed, so the user's blink can be detected from the change in the light level detected by the light receiving elements 52, that is, the change in the value of the light reception signal. Blink detection accuracy can be improved further by also considering the temporal pattern of the variation in addition to the change in light level.
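- a rough sketch of such level-based blink detection, assuming a per-frame total of the light reception signal map is available; the drop ratio and the duration bound are illustrative thresholds.

```python
import numpy as np

def detect_blinks(level: np.ndarray, fs: float = 60.0,
                  drop: float = 0.6, max_dur_s: float = 0.5):
    """Detect blinks from the per-frame total received-light level.

    level: 1-D array, sum of the light reception signal map per frame;
        closing the eyelid cuts off ambient light and lowers the level.
    drop: fraction of the typical eyes-open level below which the eye is
        considered closed (illustrative threshold).
    max_dur_s: closures longer than this count as eyes closed, not blinks.
    Returns the frame indices at which blinks began.
    """
    baseline = np.median(level)                 # typical eyes-open level
    closed = level < drop * baseline
    blinks, start = [], None
    for i, c in enumerate(closed):
        if c and start is None:
            start = i                           # eyelid just closed
        elif not c and start is not None:
            if (i - start) / fs <= max_dur_s:   # short closure = blink
                blinks.append(start)
            start = None
    return blinks
```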
- in step S15, the signal processing unit 26 determines whether processing has been performed for all positions. For example, when the calibration positioning images have been displayed at the center and at the upper, lower, left, and right positions and the line-of-sight position has been obtained for each of them, it is determined that processing has been performed for all positions.
- if it is determined in step S15 that processing has not yet been performed for all positions, the processing returns to step S13 and the above-described processing is repeated; that is, the calibration positioning image is displayed at the next position and the line-of-sight position is obtained.
- on the other hand, if it is determined in step S15 that processing has been performed for all positions, the signal processing unit 26 performs calibration in step S16, and the calibration process ends.
- for example, the signal processing unit 26 obtains the amount of deviation between the display position of the calibration positioning image at each position and the line-of-sight position obtained while the image was displayed there, and performs calibration using it; that is, it obtains correction values for matching the display position of an image in the display area 21 with the line-of-sight position when the user actually gazes at that image.
- the display device 11 displays the calibration positioning image, and performs calibration based on the display position and the line-of-sight position of the user.
- By performing calibration in this way, the deviation between the display position and the line-of-sight position can be corrected, and the operability of the display device 11 can be improved.
- the display device 11 when executing the application program, the user can perform various operations by moving the line of sight.
- the display device 11 performs a line-of-sight detection process to detect the user's line-of-sight position, and performs a process according to the detection result.
- In step S41, the signal processing unit 26 controls the display element driving unit 27 to cause the display pixels 51 to emit light.
- the display pixel 51 emits light according to the control of the display element driving unit 27 and outputs light for displaying a predetermined image. Thereby, for example, buttons and pointers for selecting information are displayed in the display area 21 as necessary.
- In step S42, the light receiving element 52 starts detecting light incident from the eyeball. That is, the light receiving element 52 receives light that has entered the eyeball from outside the display device 11 or from the display pixels 51 and been reflected at the surface of the eyeball, performs photoelectric conversion, and supplies the light reception signal corresponding to the amount of received light to the signal processing unit 26 via the display element driving unit 27.
- In step S43, the signal processing unit 26 obtains the user's line-of-sight position based on the light reception signal supplied from the light receiving element 52. That is, the signal processing unit 26 generates a light reception signal map based on the light reception signal, and obtains the user's line-of-sight position by detecting the center of the user's pupil (the orientation of the eyeball) from the obtained map.
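A minimal sketch of how a pupil center might be extracted from a light reception signal map is shown below. It assumes the map is a 2-D array and that the pupil appears as the darkest region (see the discussion of retinal reflectance later in this document); the function name and percentile threshold are illustrative, not part of the patent.

```python
import numpy as np

def pupil_center(signal_map, pupil_is_dark=True, percentile=10):
    """Estimate the pupil center from a 2-D light reception signal map.

    The pupil region is taken to be the darkest (or brightest, depending
    on wavelength and retinal reflectance) few percent of the map, and
    its centroid is returned as the line-of-sight position.
    """
    m = np.asarray(signal_map, dtype=float)
    if pupil_is_dark:
        mask = m <= np.percentile(m, percentile)
    else:
        mask = m >= np.percentile(m, 100 - percentile)
    ys, xs = np.nonzero(mask)
    return xs.mean(), ys.mean()   # (x, y) centroid of the pupil region
```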
- In step S44, the signal processing unit 26 performs a selection process based on the line-of-sight position.
- For example, the signal processing unit 26 controls the display element driving unit 27 to drive the display pixels 51 according to the movement of the line-of-sight position, so that the pointer or cursor displayed by the display pixels 51 moves; that is, the display is controlled so that the pointer or the like follows the line-of-sight position.
- When the pointer or cursor, that is, the user's line-of-sight position, is located in the area of a selection target such as a button or icon displayed in the display area 21, that selection target is regarded as selected.
- the selection target may be selected when the user's line-of-sight position is at the selection target position for a predetermined time or more.
- Alternatively, the selection target may be selected when the user blinks while the line-of-sight position is located in the area of the selection target such as a button. In that case, the signal processing unit 26 detects, based on the light reception signal map, a blink or a period during which the user closes the eyes together with the line-of-sight position, and performs the selection process for the selection target accordingly.
- In addition, the distance from the user to the gaze position may be used in the selection process.
- In this case, the signal antenna 23 receives the eyeball orientation or the light reception signal map from the display device 11 worn on the other eyeball of the pair and supplies it to the signal processing unit 26.
- The signal processing unit 26 then calculates a convergence angle from the eyeball orientation (line-of-sight position) obtained in step S43 and the received eyeball orientation or light reception signal map, and calculates the distance to the gaze position from the obtained convergence angle.
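The distance computation from the convergence angle can be illustrated as follows, assuming symmetric convergence and a typical interpupillary distance; these assumptions and all names are for illustration only and are not specified in the patent.

```python
import math

def distance_from_vergence(left_angle_deg, right_angle_deg, ipd_m=0.063):
    """Estimate the distance to the gaze point from the two eyeball
    orientations (inward rotation of each eye from straight ahead).

    ipd_m: interpupillary distance in meters (0.063 m is a typical
    adult value, assumed here).
    """
    vergence = math.radians(left_angle_deg + right_angle_deg)
    if vergence <= 0:
        return float('inf')        # parallel gaze: effectively at infinity
    # Symmetric-convergence approximation: the two lines of sight and the
    # interpupillary baseline form an isosceles triangle.
    return (ipd_m / 2) / math.tan(vergence / 2)
```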
- Note that the signal processing unit 26 may control the display pixels 51 via the display element driving unit 27 so that a selected target such as a button is displayed in a color, shape, or other display format different from that of the selection targets that are not selected. This allows the user to easily see which selection target is currently selected.
- the selection target is not limited to a button or the like, and may be any object that can be a selection target, such as an image or text information.
- In step S45, the signal processing unit 26 executes processing according to the selection made by the selection process in step S44, and the line-of-sight detection process ends.
- For example, the signal processing unit 26 executes software or calculation processing associated with the selected target, or controls the display pixels 51 to enlarge the selected image or character information. Further, depending on the selection, the diameter of the pupil may be obtained from the light reception signal map as information to be used in the application program.
- In step S45, processing corresponding both to the selection made by the selection process and to the detection result of the living body state or of the microsaccades described above may also be executed.
- As described above, the display device 11 receives light from the display pixels 51 and the like with the light receiving elements 52, detects the line-of-sight position based on the obtained light reception signals, performs the selection process based on the line-of-sight position, and executes processing according to the selection result.
- the operability of the display device 11 can be improved with a simple configuration.
- the display device 11 can detect the orientation of the eyeball, that is, the line-of-sight position with high accuracy even when the user closes his eyes. At this time, the detection accuracy of the line-of-sight position can be improved as the distance (pitch) between the light receiving elements 52 adjacent to each other in the display region 21 is shortened.
- the display device 11 can detect the state of the living body.
- This biological state detection process is performed alternately with, for example, the line-of-sight detection process described with reference to FIG. That is, the line-of-sight detection process is performed during a period in which information such as an image is displayed in the display area 21; after that period, the biological state detection process is performed to detect the state of the living body; then an image or the like is again displayed in the display area 21 and the line-of-sight detection process is performed. Thereafter, the line-of-sight detection process and the biological state detection process are performed alternately.
- In step S71, the signal processing unit 26 controls the display element driving unit 27 to cause the display pixels 51 to emit light.
- the display pixel 51 emits light according to the control of the display element driving unit 27 and outputs light in a predetermined wavelength band set in advance.
- In step S72, the light receiving element 52 detects light incident from the eyeball. That is, the light receiving element 52 receives light that has entered the eyeball from outside the display device 11 or from the display pixels 51 and been reflected at the surface of the eyeball, performs photoelectric conversion, and supplies the light reception signal corresponding to the amount of received light to the signal processing unit 26 via the display element driving unit 27.
- The processing of steps S71 and S72 is performed alternately a predetermined number of times for each wavelength of light output from the display pixels 51.
- In step S73, the signal processing unit 26 obtains the difference between the light reception signals at each time. For example, by calculating the difference between the light reception signal obtained when short-wavelength light is output and the light reception signal obtained when long-wavelength light is output, it can be determined whether the blood contains more oxyhemoglobin or more deoxyhemoglobin.
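A sketch of this two-wavelength comparison might look like the following. The normalization and its interpretation are assumptions for illustration (the patent only specifies taking a difference), and this is not calibrated oximetry.

```python
import numpy as np

def hemoglobin_contrast(sig_short, sig_long):
    """Compare light reception signals taken under short- and
    long-wavelength illumination (alternated as in steps S71/S72).

    Oxyhemoglobin and deoxyhemoglobin absorb these bands differently,
    so the sign and size of the normalized difference indicates which
    component dominates.
    """
    s = np.mean(sig_short)    # average received level, short wavelength
    l = np.mean(sig_long)     # average received level, long wavelength
    return (l - s) / (l + s)  # sign depends on the dominant component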
- step S74 the signal processing unit 26 obtains the state of the living body based on the light reception signal at each time obtained in step S72.
- For example, the signal processing unit 26 detects the blood flow (the change in blood flow) based on the difference obtained in step S73 and on the change in the light reception signal at each time over the predetermined period obtained in step S72, and obtains the pulsation from the blood flow detection result.
- The signal processing unit 26 also detects the blood flow volume in the user's eyeball based on the change in the light reception signal at each time over a predetermined period, and obtains the degree of hyperemia by comparing the detected blood flow volume with a normal blood flow volume held in advance. Note that the blood flow itself may be detected as the state of the living body.
- When the signal processing unit 26 has obtained the state of the living body, such as the pulsation or the degree of hyperemia, it outputs information indicating the obtained state to an application program that uses the state of the living body, and the biological state detection process ends.
- For example, the information indicating the state of the living body is recorded in the signal processing unit 26 for use by an application program, or is transmitted from the signal processing unit 26 to the outside via the signal antenna 23.
- information indicating the state of the living body may be used in the selection processing of the selection target in the signal processing unit 26.
- As described above, the display device 11 outputs light of specific wavelengths from the display pixels 51, receives the light with the light receiving elements 52, and detects the state of the living body from the resulting light reception signals.
- In this way, the state of the living body can be detected easily, and more information can be obtained from the detection result. For example, if the user's pulsation is detected as the state of the living body, the user's emotions and psychological state, such as excitement, can be estimated.
- As shown in FIG. 14, the display device 101 includes a display area 21, a power feeding antenna 22, a signal antenna 23, a power generation unit 24, a sensor 25, a signal processing unit 26, and a display element driving unit 27.
- The display area 21 to display element driving unit 27 of the display device 101 are identical in configuration and operation to the display area 21 to display element driving unit 27 of the display device 11; the display device 101 and the display device 11 differ only in the overall appearance of the device and the shape of the display area 21.
- FIG. 14 is a view of the display device 101 viewed from the same direction as when the user wearing the contact lens type display device 101 is viewed from the front.
- The display device 101 has an elliptical shape that is long in the horizontal direction. Therefore, when the user wears the display device 101 on the eyeball, the display device 101 is less likely to rotate with respect to the eyeball than the circular display device 11, so rotational displacement of the display device 101 with respect to the eyeball can be suppressed.
- the display area 21 of the display device 101 has an elliptical shape that is long in the horizontal direction.
- Since the display device 101 is longer in the horizontal direction than in the vertical direction in the figure, the regions adjacent to the left and right of the display area 21 are larger than the regions adjacent to its top and bottom.
- the area on the display device 101 that is not the display area 21 is an area outside the moving range of the user's pupil.
- the power generation unit 24 to the display element driving unit 27 are arranged in a region near the left and right (lateral direction) ends of the display device 101 adjacent to the left and right in the drawing of the display region 21.
- This prevents these elements from hindering the display of images.
- FIG. 15 shows a mounting structure when the user wears the contact lens type display device 101 as viewed from the front of the user.
- The display device 101 is wider in the horizontal direction than in the vertical direction in the figure. Therefore, when the display device 101 is attached to the user's eyeball, the width over which it covers the eyeball is greater horizontally than vertically. Further, the vertical ends of the display device 101 are positioned just short of the base where the eyelid joins the eyeball, while the horizontal width extends to cover the range over which the eyeball moves left and right.
- the display device 101 has a structure that is fixed to the head so that the display device 101 does not move relative to the user's head.
- If the display device 101 were to move together with the eyeball, the absolute position, with respect to the user's head, of the information (image) displayed in the display area 21 would also move. Since movement of the absolute position of the information (image) with respect to the user's head is perceived as movement of the display position, it is desirable to keep the position of the contact lens type display device 101 fixed with respect to the user's head.
- a convex portion is provided in the vicinity of the outer periphery of the display device 101.
- the display device 101 is mounted so as to cover the entire cornea CO21 of the user's eyeball.
- That is, the upper and lower ends of the display device 101 are positioned just short of the base of the portion where the eyelid and the eyeball are connected, that is, near the ring portion.
- In this example, a convex portion 141 and a convex portion 142 projecting outward are provided on the surface of the display device 101.
- The convex portions 141 and 142 hold the display device 101 in place so that it does not move with respect to the user's head even when the eyeball moves or the user blinks.
- Alternatively, as shown in FIG. 17, a high friction portion 151 and a high friction portion 152 may be provided near the upper and lower outer periphery of the display device 101 so that the display device 101 is fixed with respect to the user's head.
- The high friction portions 151 and 152 are processed so that their friction against the eyelid is higher than that of the central portion of the display device 101. Therefore, when the display device 101 is attached to the user's eyeball, friction between the high friction portions 151 and 152 and the user's eyelid (conjunctiva) fixes the display device 101 so that it does not move relative to the user's head.
- Specifically, the convex portions or the high friction portions are provided in the regions SR11 and SR12 illustrated in FIG. 18.
- FIG. 18 shows the display device 101 viewed from the same direction as when the user wearing the display device 101 is viewed from the front. Accordingly, the upper side of the display device 101 corresponds to the upper side of the user's eyes, and the lower side of the display device 101 corresponds to the lower side of the user's eyes.
- the convex portion 141 shown in FIG. 16 or the high friction portion 151 shown in FIG. 17 is provided in the region SR11 provided along the upper end of the display device 101. Further, the convex portion 142 shown in FIG. 16 or the high friction portion 152 shown in FIG. 17 is provided in the region SR12 provided along the lower end of the display device 101.
- Note that the convex portions 141 and 142, or the high friction portions 151 and 152, are provided on the near side of the display device 101 in the drawing, that is, on the outward-facing side.
- Similarly, a convex portion or a high friction portion may be provided along the upper and lower ends of the display device 11 shown in FIG. 2.
- the display area 21 is configured as shown in FIG. 19, for example.
- the vertical direction and the horizontal direction in FIG. 19 correspond to, for example, the vertical direction and the horizontal direction in FIG. In FIG. 19, one rectangular area represents the display pixel 51, the light receiving element 52, or the transmissive area.
- In FIG. 19, each black square represents the area of one display pixel 51, each hatched square represents the area of one light receiving element 52, and each white square represents a transmissive area.
- the transmissive region is a region having a higher light transmittance (transparency) than the display pixel 51 and the light receiving element 52.
- For example, the square indicated by arrow Q31 represents the area of one display pixel 51, and the areas above, below, and to the left and right of that display pixel 51 in the figure are transmissive areas.
- Light receiving elements 52 are disposed diagonally adjacent to the display pixel 51 indicated by arrow Q31, so that each display pixel 51 is surrounded by four light receiving elements 52 and four transmissive areas.
- Alternatively, the display area 21 may be configured as shown in FIG. 20. In FIG. 20, each square area represents a display pixel 51, a light receiving element 52, or a transmissive area.
- Specifically, each black square represents the area of one display pixel 51, each hatched square represents the area of one light receiving element 52, and each white square represents a transmissive area.
- For example, the square indicated by arrow Q41 represents the area of one display pixel 51, and this display pixel 51 is surrounded by transmissive areas.
- The square indicated by arrow Q42 also represents the area of one display pixel 51; one light receiving element 52 is disposed diagonally to the right of that display pixel 51, and the other areas adjacent to the display pixel 51 indicated by arrow Q42 are transmissive areas.
- In the example of FIG. 20, the number of light receiving elements 52 provided in the display area 21 is smaller than the number of display pixels 51 provided in the display area 21, so a correspondingly larger number of transmissive areas are provided.
- By providing fewer light receiving elements 52 than display pixels 51 in this way, more light (ambient light) from outside the display device can pass through the display area 21, so that, compared with the example shown in FIG. 19, the user can see the surroundings more brightly.
- 〈Modification 3〉 〈Example of display area configuration〉 Furthermore, when the display pixels 51 provided in the display area 21 are themselves transmissive, the user can see the surroundings through the display pixels 51 without providing transmissive areas in the display area 21. In such a case, the display area 21 is configured as shown in FIG. 21, for example.
- In FIG. 21, the black areas represent the areas of the display pixels 51, and each hatched square represents the area of one light receiving element 52.
- A light receiving element 52 is provided adjacent to each display pixel 51, and the light transmittance of the display pixels 51 is higher than that of the light receiving elements 52, so the user can see the surroundings through the display pixels 51.
- Further, a pressure sensor may be provided in the display device 101 to detect opening and closing of the eyelid of the user wearing the display device 101, or to detect the pressure when the user closes the eyelid strongly.
- the display device 101 is configured as shown in FIG. 22, for example.
- parts corresponding to those in FIG. 14 are denoted by the same reference numerals, and description thereof will be omitted as appropriate.
- The display device 101 shown in FIG. 22 differs from the display device 101 in FIG. 14 in that a pressure sensor 181 is additionally provided, and is otherwise identical in configuration.
- The display device 101 in FIG. 22 includes a display area 21, a power feeding antenna 22, a signal antenna 23, a power generation unit 24, a sensor 25, a signal processing unit 26, a display element driving unit 27, and a pressure sensor 181.
- The pressure sensor 181 is located near the right end of the display device 101 in the figure and operates by receiving electric power from the power generation unit 24.
- the pressure sensor 181 detects the pressure applied in the depth direction in the drawing of the display device 101, and supplies the detection result to the signal processing unit 26 via the display element driving unit 27.
- The signal processing unit 26 detects the opening and closing of the user's eyelid based on the pressure detection result supplied from the pressure sensor 181. For example, the signal processing unit 26 confirms the selection of a selection target when the user closes the eyelid, or confirms it when the eyelid is closed strongly, that is, when a pressure equal to or higher than a predetermined value is detected.
- the operability of the display device 101 can be further improved.
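A minimal sketch of the eyelid-pressure interpretation described above is shown below. The thresholds are placeholders (the patent does not specify values); in practice they would be tuned per device and per user.

```python
def interpret_pressure(pressure, closed_threshold=1.0, strong_threshold=3.0):
    """Classify the pressure sensor reading into eyelid states.

    Returns a state label; "confirm_selection" corresponds to the
    strong eyelid closure used to confirm a selection target.
    """
    if pressure >= strong_threshold:
        return "confirm_selection"   # eyelid closed strongly
    if pressure >= closed_threshold:
        return "eyelid_closed"
    return "eyelid_open"
```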
- FIG. 23 is a diagram illustrating a configuration example of a line-of-sight detection device to which the present technology is applied.
- the same reference numerals are given to the portions corresponding to those in FIG. 2, and the description thereof will be omitted as appropriate.
- The contact lens type line-of-sight detection device 211 has a shape that can be worn on the user's eyeball, and when worn on the eyeball, it covers the entire cornea of the user's eyeball.
- the line-of-sight detection device 211 includes a detection area 221, a power feeding antenna 22, a signal antenna 23, a power generation unit 24, a sensor 25, a signal processing unit 26, and a light emitting element driving unit 222.
- FIG. 23 is a view of the line-of-sight detection device 211 viewed from the same direction as when the user wearing the line-of-sight detection device 211 is viewed from the front.
- the line-of-sight detection device 211 has an elliptical shape.
- The detection area 221 includes a light emitting element composed of a plurality of light emitting units that emit light for line-of-sight detection toward the surface of the user's eyeball, and light receiving elements that are arranged adjacent to the light emitting units and receive the light reflected from the surface of the user's eyeball.
- In the detection area 221, as in the display area 21 described above, at least one of a light emitting unit or a light receiving element is provided in the area facing the region within which the pupil of the user's eyeball can move.
- The light emitting element driving unit 222 drives the light emitting element in the detection area 221 under the control of the signal processing unit 26 to make each light emitting unit emit light, and supplies the light reception signals received from the light receiving elements in the detection area 221 to the signal processing unit 26.
- The arrangement positions of the power feeding antenna 22 to the signal processing unit 26 are the same as those in the display device 101 of FIG. 14.
- The detection area 221 of the line-of-sight detection device 211 is configured as shown in FIG. 24, for example. FIG. 24 shows part of a cross section of the line-of-sight detection device 211 viewed from the horizontal direction in FIG. 23. In FIG. 24, parts corresponding to those in FIG. 3 are denoted by the same reference numerals, and description thereof will be omitted as appropriate.
- In the detection area 221 of the line-of-sight detection device 211, light emitting units 251-1 to 251-7 that emit light for detecting the line-of-sight direction and light receiving elements 52-1 to 52-7 that receive the reflected light incident from the surface of the user's eyeball are provided. A single light emitting device made up of the light emitting units 251-1 to 251-7 serves as the light emitting element 252.
- the light emitting units 251-1 to 251-7 are also simply referred to as the light emitting units 251 when it is not necessary to distinguish them.
- The light emitting unit 251 has the function of emitting light for line-of-sight detection, but does not have the function of displaying information that the display pixel 51 shown in FIG. 3 has.
- In the line-of-sight detection device 211, the light emitting units 251 and the light receiving elements 52 are arranged alternately in the vertical direction on the right side in the drawing, that is, on the user's eyeball side. Thus, for example, in FIG. 23, the light emitting units 251 and the light receiving elements 52 are arranged alternately in the vertical and horizontal directions of the figure within the detection area 221.
- FIG. 24 illustrates an example in which the light emitting units 251 and the light receiving elements 52 are in close contact with each other, but they do not necessarily have to be in close contact; a gap may be provided between them.
- one light receiving element 52 is provided for one light emitting unit 251, but one light receiving element 52 may be provided for a plurality of light emitting units 251.
- Although an example has been described in which the light emitting element 252 made up of a plurality of light emitting units 251 is provided in the detection area 221, a light emitting element 252 consisting of a single light emitting unit that emits light over the entire detection area 221 may be provided instead. In that case, light receiving elements 52 for detecting the amount of light received in each region of that light emitting element 252 may be provided.
- the light-emitting element 252 is not necessarily provided in the line-of-sight detection device 211.
- Next, the calibration process performed by the line-of-sight detection device 211 will be described.
- In step S101, the signal processing unit 26 controls the light emitting element driving unit 222 to cause the light emitting units 251 to emit light.
- the light emitting unit 251 emits light according to the control of the light emitting element driving unit 222 and outputs light for detecting the user's line-of-sight direction.
- In step S102, the light receiving element 52 starts detecting light incident from the eyeball. That is, the light receiving element 52 receives light that has entered the eyeball from outside the line-of-sight detection device 211 or from the light emitting units 251 and been reflected at the surface of the eyeball, performs photoelectric conversion, and supplies the light reception signal corresponding to the amount of received light to the signal processing unit 26 via the light emitting element driving unit 222.
- In addition, the user turns his or her line of sight in predetermined directions. For example, the user moves the line of sight as far as possible up, down, left, and right in accordance with voice guidance or the like output from a control device wirelessly connected to the line-of-sight detection device 211.
- While the user moves the line of sight in this way, the light emitting units 251 output light for detecting the line-of-sight direction, and the light receiving elements 52 receive the light output from the light emitting units 251 and reflected by the eyeball, as well as other light arriving from the eyeball surface, and supply light reception signals corresponding to the amount of received light to the signal processing unit 26 via the light emitting element driving unit 222.
- In step S103, the signal processing unit 26 obtains the user's line-of-sight position based on the light reception signal supplied from the light receiving element 52.
- Specifically, the signal processing unit 26 obtains, as line-of-sight positions, the positions at each extreme to which the user moved the line of sight. From these line-of-sight positions, the range over which the user can move the line of sight can be obtained. Note that the line-of-sight positions are calculated by, for example, processing similar to step S14 in FIG. 11.
- In step S104, the signal processing unit 26 performs calibration based on the obtained line-of-sight positions, and the calibration process ends.
- For example, the line-of-sight detection result of the line-of-sight detection device 211 is used for processing such as moving a cursor on an external display, separate from the line-of-sight detection device 211, in accordance with the movement of the user's line of sight.
- Specifically, the signal processing unit 26 obtains the range over which the user's line of sight can move, based on the line-of-sight positions obtained in step S103 for the upper, lower, left, and right extremes. Then, the signal processing unit 26 performs calibration by associating each position in the region obtained by subtracting a margin from that movable range with a corresponding position on the display.
- In this way, the line-of-sight detection device 211 performs calibration based on several line-of-sight positions. By performing calibration in this way, a correspondence can be established between a specific area such as an external display and the area to which the user's line of sight moves, and the operability of interface operations by the user can be improved.
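The calibration mapping just described, from the measured movable gaze range (minus a margin) to display coordinates, might be sketched as follows; the margin fraction and all names are assumptions for illustration.

```python
import numpy as np

def make_gaze_to_display_map(gaze_min, gaze_max, display_w, display_h,
                             margin=0.1):
    """Build a mapping from the movable gaze range (taken from the
    up/down/left/right calibration extremes) to display coordinates.

    A margin (fraction of the range, assumed 10%) is subtracted so the
    cursor can reach the display edges without the user having to
    strain to the very limits of eye movement.
    """
    gaze_min = np.asarray(gaze_min, dtype=float)
    gaze_max = np.asarray(gaze_max, dtype=float)
    span = gaze_max - gaze_min
    lo = gaze_min + margin * span
    hi = gaze_max - margin * span

    def to_display(gaze_xy):
        t = (np.asarray(gaze_xy, dtype=float) - lo) / (hi - lo)
        t = np.clip(t, 0.0, 1.0)             # keep the cursor on screen
        return t * np.array([display_w, display_h])

    return to_display
```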
- 〈Description of line-of-sight detection processing〉 For example, when the line-of-sight detection device 211 and an external control device are connected wirelessly, once the calibration process has been performed, the user can activate an arbitrary application program and execute desired processing.
- the line-of-sight detection device 211 performs a line-of-sight detection process to detect the user's line-of-sight position and outputs the detection result to an external control device.
- In step S131, the signal processing unit 26 controls the light emitting element driving unit 222 to cause the light emitting units 251 to emit light.
- the light emitting unit 251 emits light according to the control of the light emitting element driving unit 222 and outputs light for detecting the user's line-of-sight direction.
- In step S132, the light receiving element 52 starts detecting light incident from the eyeball. That is, the light receiving element 52 receives light that has entered the eyeball from outside the line-of-sight detection device 211 or from the light emitting units 251 and been reflected at the surface of the eyeball, performs photoelectric conversion, and supplies the light reception signal corresponding to the amount of received light to the signal processing unit 26 via the light emitting element driving unit 222.
- In step S133, the signal processing unit 26 obtains the user's line-of-sight position based on the light reception signal supplied from the light receiving element 52. That is, the signal processing unit 26 generates a light reception signal map based on the light reception signal, and obtains the user's line-of-sight position by detecting the center of the user's pupil (the orientation of the eyeball) from the obtained map.
- In step S134, the signal processing unit 26 outputs the obtained line-of-sight position, and the line-of-sight detection process ends.
- the signal processing unit 26 supplies the obtained line-of-sight position to the signal antenna 23 and transmits it to the control device.
- the control device executes processing corresponding to the line-of-sight position, such as moving a cursor or the like according to the line-of-sight position received from the line-of-sight detection device 211, for example.
- the line-of-sight detection device 211 receives the light from the light emitting unit 251 and the like by the light receiving element 52, detects the line-of-sight position based on the obtained light reception signal, and outputs the detection result.
- Thereby, the user's operation can be identified easily without requiring any external device other than the line-of-sight detection device 211; in other words, operability can be improved with a simple configuration.
- Note that, similarly to the display device 11, the line-of-sight detection device 211 may output light of a specific wavelength from the light emitting units 251 to detect a biological state, may calculate the convergence amount of the left and right eyes and the distance to the gazed object, or may obtain the diameter of the pupil.
- In the above description, an example has been described in which the value of the light reception signal in the pupil region of the light reception signal map is smaller than the value in the white of the eye or the iris region, and the pupil region, that is, the line-of-sight position, is detected accordingly.
- However, depending on the wavelength of the light received by the light receiving elements 52, the reflectance of the retina may be higher than the reflectance of the white of the eye or the iris. In that case, the value of the light reception signal in the pupil region is larger than the value in the white of the eye or iris region.
- Even in such a case, the signal processing unit 26 can detect the pupil region using the light reception signal map; it simply detects a region where the value of the light reception signal in the map is large as the pupil region. In either case, the signal processing unit 26 can detect the pupil region (line-of-sight position) based on the values of the light reception signal in each region of the light reception signal map, and whether the region with large values or the region with small values is the pupil region can be determined from the wavelength of the light output from the display pixels 51 or the light emitting units 251 and from the spectral reflectance characteristics of the white of the eye, the iris, and the retina.
- In addition, the present technology can adopt a cloud computing configuration in which one function is shared among a plurality of devices via a network and processed jointly.
- each step described in the above flowchart can be executed by one device or can be shared by a plurality of devices.
- Furthermore, when a plurality of processes are included in one step, the plurality of processes can be executed by a single apparatus or shared among a plurality of apparatuses.
- the present technology can be configured as follows.
- [1] A detection device attachable to an eyeball, including a light receiving element that receives light incident from the eyeball.
- [2] The detection device according to [1], further including a light emitting element that outputs light, wherein the light receiving element is provided in the vicinity of the light emitting element.
- [3] The detection device according to [2], wherein the light emitting element includes a plurality of light emitting units, and the light receiving element is provided in the vicinity of the light emitting unit.
- [4] The detection device according to [3], wherein the light receiving element receives light output from the light emitting unit and reflected by the eyeball, and the detection device further includes a signal processing unit that detects the amounts of light received by the plurality of light receiving elements arranged in each region of the detection device.
- [5] The detection device according to [3] or [4], wherein the light emitting unit is a display pixel that displays information.
- [6] The detection device according to any one of [2] to [5], wherein the detection device is configured to cover the entire cornea when worn on the eyeball.
- [7] The detection device according to any one of [3] to [6], wherein, in a state where the detection device is worn on the eyeball, at least one of the light emitting unit or the light receiving element is provided in a region of the detection device facing the region within which the pupil of the eyeball can move.
- [8] The detection device according to any one of [2] to [7], wherein the width in the horizontal direction over which the detection device covers the eyeball is wider than the width in the vertical direction.
- [9] The detection device according to [8], wherein an element different from the light emitting element and the light receiving element is arranged near an end of the detection device in the horizontal direction.
- [10] The detection device according to any one of [2] to [9], wherein the detection device has a structure for fixing the detection device with respect to the head having the eyeball.
- [11] The detection device according to [4], wherein the signal processing unit obtains the orientation of the eyeball based on the amounts of light received by the plurality of light receiving elements.
- [12] The detection device according to [11], wherein the signal processing unit calculates the amount of convergence of the left and right eyes based on the orientation of the eyeball and the orientation of the eyeball paired with the eyeball, and calculates the distance to the object being gazed at based on the amount of convergence.
- [13] The detection device according to [4], wherein the signal processing unit obtains the diameter of the pupil of the eyeball based on the amounts of light received by the plurality of light receiving elements.
- [14] The detection device according to [4], wherein the signal processing unit detects the state of a living body based on the amounts of light received by the plurality of light receiving elements.
- [15] The detection device according to [14], wherein the light emitting unit irradiates the eyeball with light of a predetermined wavelength or sequentially irradiates the eyeball with light of a plurality of different wavelengths, and the signal processing unit detects the state of the living body based on the amounts of the light of the predetermined wavelength or the light of the plurality of different wavelengths irradiated onto the eyeball that are received.
- [16] The detection device according to [15], wherein the light emitting unit is a display pixel that displays information, and irradiates the eyeball with the light of the predetermined wavelength or the light of the plurality of different wavelengths after a period of displaying the information.
- [17] A detection method of a detection device attachable to an eyeball, the detection device including a light receiving element that receives light incident from the eyeball and a signal processing unit that detects the amount of light received by the light receiving element, the method including: a light receiving step in which the light receiving element receives light reflected by the eyeball; and a detection step in which the signal processing unit detects the amounts of light received by the plurality of light receiving elements arranged in each region of the detection device.
- [18] The detection method according to [17], further including a light emitting step in which a light emitting element provided in the detection device outputs light, wherein, in the light receiving step, the light receiving element receives the light output from the light emitting element and reflected by the eyeball.
- [19] The detection method according to [18], further including a calculation step in which the signal processing unit obtains the orientation of the eyeball based on the amounts of light received by the plurality of light receiving elements.
- [20] The detection method according to [19], wherein, in the calculation step, the signal processing unit calculates the amount of convergence of the left and right eyes based on the orientation of the eyeball and the orientation of the eyeball paired with the eyeball, and calculates the distance to the object being gazed at based on the amount of convergence.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- Health & Medical Sciences (AREA)
- General Physics & Mathematics (AREA)
- Life Sciences & Earth Sciences (AREA)
- Ophthalmology & Optometry (AREA)
- General Health & Medical Sciences (AREA)
- Computer Hardware Design (AREA)
- Biomedical Technology (AREA)
- Biophysics (AREA)
- Surgery (AREA)
- Public Health (AREA)
- Veterinary Medicine (AREA)
- Heart & Thoracic Surgery (AREA)
- Medical Informatics (AREA)
- Animal Behavior & Ethology (AREA)
- Molecular Biology (AREA)
- Multimedia (AREA)
- Position Input By Displaying (AREA)
- Eye Examination Apparatus (AREA)
- User Interface Of Digital Computer (AREA)
- Length Measuring Devices By Optical Means (AREA)
Abstract
Description
〈Configuration example of contact lens type display device〉
The present technology relates to a contact lens type display device. Since a contact lens type display device is worn on the user's eyeball and used wirelessly, the user can move around freely while wearing it when using it as a display device. However, performing selection and movement operations, such as moving a cursor or pointer over information on the displayed screen, with external equipment such as a camera or a detection device places a burden on the user and imposes constraints.
Next, detection of the user's line of sight by the display device 11 will be described.
The display device 11 can also obtain, based on the light reception signal map, the distance to the object the user is gazing at.
Incidentally, it is known that eye movement includes fine movements called saccades. In particular, among the involuntary eye movements that occur while the gaze stays on a point, the one with the largest amplitude per movement is called a microsaccade.
The display device 11 can also detect the state of a living body.
Incidentally, the display device 11 is structured to cover a range wider than the cornea of the eyeball shown in FIG. 9. In FIG. 9, the hatched portion of the eyeball EY31 represents the cornea CO11.
Next, the operation of the display device 11 will be described.
Once the calibration process has been performed, the user can activate an arbitrary application program and execute desired processing.
Furthermore, the display device 11 can detect the state of a living body.
〈Configuration example of appearance of contact lens type display device〉
In the above, the example in which the display device 11 is circular when viewed from the front as shown in FIG. 2 has been described; however, it may be elliptical as shown in FIG. 14, for example. In FIG. 14, portions corresponding to those in FIG. 2 are denoted by the same reference numerals, and description thereof will be omitted as appropriate.
〈Configuration example of display area〉
In the above, the example in which the display pixels 51 and the light receiving elements 52 are provided in close contact within the display area 21 as shown in FIG. 3 has been described; however, the display area 21 may be provided with transmissive areas that pass ambient light from the outside.
〈Configuration example of display area〉
The display area 21 may also be configured as shown in FIG. 20. In FIG. 20, each square area represents a display pixel 51, a light receiving element 52, or a transmissive area.
〈Configuration example of display area〉
Furthermore, when the display pixels 51 provided in the display area 21 are transmissive, the user can see the surroundings through the display pixels 51 without providing transmissive areas in the display area 21. In such a case, the display area 21 is configured as shown in FIG. 21, for example.
〈Configuration example of contact lens type display device〉
Furthermore, the display device 101 may be provided with a pressure sensor to detect opening and closing of the eyelid of the user wearing the display device 101, or to detect the pressure when the user closes the eyelid strongly. In such a case, the display device 101 is configured as shown in FIG. 22, for example. In FIG. 22, portions corresponding to those in FIG. 14 are denoted by the same reference numerals, and description thereof will be omitted as appropriate.
〈Configuration example of contact lens type line-of-sight detection device〉
In the above, the example in which the present technology for detecting the user's line-of-sight direction is applied to a display device has been described; however, the present technology is not limited to display devices and is applicable to devices in general that detect the user's line-of-sight direction (the orientation of the eyeball). Below, an embodiment in which the present technology is applied to a line-of-sight detection device that detects the user's line-of-sight direction will be described.
Next, the operation of the line-of-sight detection device 211 will be described.
For example, when the line-of-sight detection device 211 and an external control device are connected wirelessly, once the calibration process has been performed, the user can activate an arbitrary application program and execute desired processing.
Claims (20)
- 1. A detection device attachable to an eyeball, the detection device comprising a light receiving element that receives light incident from the eyeball.
- 2. The detection device according to claim 1, further comprising a light emitting element that outputs light, wherein the light receiving element is provided in the vicinity of the light emitting element.
- 3. The detection device according to claim 2, wherein the light emitting element includes a plurality of light emitting units, and the light receiving element is provided in the vicinity of the light emitting unit.
- 4. The detection device according to claim 3, wherein the light receiving element receives light output from the light emitting unit and reflected by the eyeball, and the detection device further comprises a signal processing unit that detects the amounts of light received by the plurality of light receiving elements arranged in each region of the detection device.
- 5. The detection device according to claim 3, wherein the light emitting unit is a display pixel that displays information.
- 6. The detection device according to claim 2, wherein the detection device is configured to cover the entire cornea when worn on the eyeball.
- 7. The detection device according to claim 3, wherein, in a state where the detection device is worn on the eyeball, at least one of the light emitting unit or the light receiving element is provided in a region of the detection device facing the region within which the pupil of the eyeball can move.
- 8. The detection device according to claim 2, wherein the width in the horizontal direction over which the detection device covers the eyeball is wider than the width in the vertical direction.
- 9. The detection device according to claim 8, wherein an element different from the light emitting element and the light receiving element is arranged near an end of the detection device in the horizontal direction.
- 10. The detection device according to claim 2, wherein the detection device has a structure for fixing the detection device with respect to the head having the eyeball.
- 11. The detection device according to claim 4, wherein the signal processing unit obtains the orientation of the eyeball based on the amounts of light received by the plurality of light receiving elements.
- 12. The detection device according to claim 11, wherein the signal processing unit calculates the amount of convergence of the left and right eyes based on the orientation of the eyeball and the orientation of the eyeball paired with the eyeball, and calculates the distance to the object being gazed at based on the amount of convergence.
- 13. The detection device according to claim 4, wherein the signal processing unit obtains the diameter of the pupil of the eyeball based on the amounts of light received by the plurality of light receiving elements.
- 14. The detection device according to claim 4, wherein the signal processing unit detects the state of a living body based on the amounts of light received by the plurality of light receiving elements.
- 15. The detection device according to claim 14, wherein the light emitting unit irradiates the eyeball with light of a predetermined wavelength or sequentially irradiates the eyeball with light of a plurality of different wavelengths, and the signal processing unit detects the state of the living body based on the amounts of the light of the predetermined wavelength or the light of the plurality of different wavelengths irradiated onto the eyeball that are received by the light receiving element.
- 16. The detection device according to claim 15, wherein the light emitting unit is a display pixel that displays information, and irradiates the eyeball with the light of the predetermined wavelength or the light of the plurality of different wavelengths after a period of displaying the information.
- 17. A detection method of a detection device attachable to an eyeball, the detection device comprising a light receiving element that receives light incident from the eyeball and a signal processing unit that detects the amount of light received by the light receiving element, the method comprising: a light receiving step in which the light receiving element receives light reflected by the eyeball; and a detection step in which the signal processing unit detects the amounts of light received by the plurality of light receiving elements arranged in each region of the detection device.
- 18. The detection method according to claim 17, further comprising a light emitting step in which a light emitting element provided in the detection device outputs light, wherein, in the light receiving step, the light receiving element receives the light output from the light emitting element and reflected by the eyeball.
- 19. The detection method according to claim 18, further comprising a calculation step in which the signal processing unit obtains the orientation of the eyeball based on the amounts of light received by the plurality of light receiving elements.
- 20. The detection method according to claim 19, wherein, in the calculation step, the signal processing unit calculates the amount of convergence of the left and right eyes based on the orientation of the eyeball and the orientation of the eyeball paired with the eyeball, and calculates the distance to the object being gazed at based on the amount of convergence.
Priority Applications (7)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2015527261A JP6380814B2 (ja) | 2013-07-19 | 2014-07-08 | 検出装置および方法 |
KR1020157036928A KR102376854B1 (ko) | 2013-07-19 | 2014-07-08 | 검출 장치 및 방법 |
BR112016000639-9A BR112016000639B1 (pt) | 2013-07-19 | 2014-07-08 | Aparelho e método de detecção |
US14/902,953 US9996151B2 (en) | 2013-07-19 | 2014-07-08 | Detecting eye movement based on a wearable detection apparatus |
CN201480039759.9A CN105378598B (zh) | 2013-07-19 | 2014-07-08 | 检测装置和方法 |
EP14826483.1A EP3023864B1 (en) | 2013-07-19 | 2014-07-08 | Detection device and method |
RU2016100860A RU2682798C2 (ru) | 2013-07-19 | 2014-07-08 | Устройство обнаружения и способ |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2013-150433 | 2013-07-19 | ||
JP2013150433 | 2013-07-19 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2015008654A1 true WO2015008654A1 (ja) | 2015-01-22 |
Family
ID=52346122
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2014/068126 WO2015008654A1 (ja) | 2013-07-19 | 2014-07-08 | 検出装置および方法 |
Country Status (8)
Country | Link |
---|---|
US (1) | US9996151B2 (ja) |
EP (1) | EP3023864B1 (ja) |
JP (1) | JP6380814B2 (ja) |
KR (1) | KR102376854B1 (ja) |
CN (1) | CN105378598B (ja) |
BR (1) | BR112016000639B1 (ja) |
RU (1) | RU2682798C2 (ja) |
WO (1) | WO2015008654A1 (ja) |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2019514092A (ja) * | 2016-02-26 | 2019-05-30 | ザ コカ・コーラ カンパニーThe Coca‐Cola Company | タッチレス制御グラフィカルユーザインターフェース |
WO2020016969A1 (ja) * | 2018-07-18 | 2020-01-23 | 株式会社ソニー・インタラクティブエンタテインメント | 情報処理装置、情報処理方法、及びプログラム |
WO2020016970A1 (ja) * | 2018-07-18 | 2020-01-23 | 株式会社ソニー・インタラクティブエンタテインメント | 情報処理装置、情報処理方法、及びプログラム |
JP2020514831A (ja) * | 2017-03-21 | 2020-05-21 | マジック リープ, インコーポレイテッドMagic Leap,Inc. | 光走査プロジェクタと連動した眼移動の追跡のための方法およびシステム |
JP2020141772A (ja) * | 2019-03-05 | 2020-09-10 | Kikura株式会社 | 瞳孔測定器具及び瞳孔測定装置 |
JP2022532806A (ja) * | 2019-07-23 | 2022-07-19 | インプランダータ オフタルミック プロドゥクツ ゲーエムベーハー | 視野を測定するための配置および方法ならびにインプラントの使用 |
Families Citing this family (28)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9699433B2 (en) * | 2013-01-24 | 2017-07-04 | Yuchen Zhou | Method and apparatus to produce re-focusable vision with detecting re-focusing event from human eye |
US9880401B2 (en) * | 2014-06-13 | 2018-01-30 | Verily Life Sciences Llc | Method, device and system for accessing an eye-mountable device with a user interface |
EP3163410A4 (en) * | 2014-06-30 | 2017-12-13 | Toppan Printing Co., Ltd. | Line-of-sight measurement system, line-of-sight measurement method, and program |
EP3234920A4 (en) | 2014-12-23 | 2017-10-25 | Meta Company | Apparatuses, methods and systems coupling visual accommodation and visual convergence to the same plane at any depth of an object of interest |
US10353463B2 (en) * | 2016-03-16 | 2019-07-16 | RaayonNova LLC | Smart contact lens with eye driven control system and method |
US11030975B2 (en) * | 2016-07-04 | 2021-06-08 | Sony Corporation | Information processing apparatus and information processing method |
WO2018098436A1 (en) | 2016-11-28 | 2018-05-31 | Spy Eye, Llc | Unobtrusive eye mounted display |
US20180239422A1 (en) * | 2017-02-17 | 2018-08-23 | International Business Machines Corporation | Tracking eye movements with a smart device |
US11157073B2 (en) * | 2017-10-04 | 2021-10-26 | Tectus Corporation | Gaze calibration for eye-mounted displays |
US11333902B2 (en) * | 2017-12-12 | 2022-05-17 | RaayonNova LLC | Smart contact lens with embedded display and image focusing system |
US11534065B2 (en) * | 2017-12-15 | 2022-12-27 | Sony Corporation | Contact lens and communication system |
US10673414B2 (en) | 2018-02-05 | 2020-06-02 | Tectus Corporation | Adaptive tuning of a contact lens |
CN108459720B (zh) * | 2018-04-19 | 2023-11-21 | 京东方科技集团股份有限公司 | 视控装置及用视控装置控制终端的方法 |
US10505394B2 (en) | 2018-04-21 | 2019-12-10 | Tectus Corporation | Power generation necklaces that mitigate energy absorption in the human body |
US10895762B2 (en) | 2018-04-30 | 2021-01-19 | Tectus Corporation | Multi-coil field generation in an electronic contact lens system |
US10838239B2 (en) | 2018-04-30 | 2020-11-17 | Tectus Corporation | Multi-coil field generation in an electronic contact lens system |
US10528131B2 (en) * | 2018-05-16 | 2020-01-07 | Tobii Ab | Method to reliably detect correlations between gaze and stimuli |
US10790700B2 (en) | 2018-05-18 | 2020-09-29 | Tectus Corporation | Power generation necklaces with field shaping systems |
US11137622B2 (en) | 2018-07-15 | 2021-10-05 | Tectus Corporation | Eye-mounted displays including embedded conductive coils |
WO2020032239A1 (ja) * | 2018-08-09 | 2020-02-13 | 株式会社ジオクリエイツ | 情報出力装置、設計支援システム、情報出力方法及び情報出力プログラム |
US10529107B1 (en) | 2018-09-11 | 2020-01-07 | Tectus Corporation | Projector alignment in a contact lens |
US10838232B2 (en) | 2018-11-26 | 2020-11-17 | Tectus Corporation | Eye-mounted displays including embedded solenoids |
US10644543B1 (en) | 2018-12-20 | 2020-05-05 | Tectus Corporation | Eye-mounted display system including a head wearable object |
EP3939837A1 (en) * | 2019-05-17 | 2022-01-19 | Kabushiki Kaisha Tokai Rika Denki Seisakusho | Control system and presentation system |
US11786694B2 (en) | 2019-05-24 | 2023-10-17 | NeuroLight, Inc. | Device, method, and app for facilitating sleep |
US10944290B2 (en) | 2019-08-02 | 2021-03-09 | Tectus Corporation | Headgear providing inductive coupling to a contact lens |
CN114371555A (zh) * | 2020-10-14 | 2022-04-19 | Oppo广东移动通信有限公司 | 穿戴式电子设备 |
CN115670368A (zh) * | 2021-07-23 | 2023-02-03 | 京东方科技集团股份有限公司 | 一种成像调整装置及方法、可穿戴设备、存储介质 |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH07121114A (ja) * | 1993-10-22 | 1995-05-12 | Hitachi Ltd | 表示装置 |
US20030020477A1 (en) * | 2001-07-30 | 2003-01-30 | Tim Goldstein | System and method for providing power to electrical devices |
JP2003177449A (ja) * | 2001-07-30 | 2003-06-27 | Hewlett Packard Co <Hp> | 電子装置を制御するシステム及び方法 |
JP2005535942A (ja) * | 2002-08-09 | 2005-11-24 | イー・ビジョン・エルエルシー | 電気駆動のコンタクトレンズ系 |
JP4752309B2 (ja) | 2005-04-07 | 2011-08-17 | ソニー株式会社 | 画像表示装置および方法 |
GB2497424A (en) * | 2011-12-06 | 2013-06-12 | E Vision Smart Optics Inc | Contact lens providing images to a wearer. |
Family Cites Families (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
TW275112B (en) * | 1995-03-15 | 1996-05-01 | Ciba Geigy Ag | Rotationally stabilized contact lens and methods of lens stabilization |
US6120460A (en) * | 1996-09-04 | 2000-09-19 | Abreu; Marcio Marc | Method and apparatus for signal acquisition, processing and transmission for evaluation of bodily functions |
US6544193B2 (en) * | 1996-09-04 | 2003-04-08 | Marcio Marc Abreu | Noninvasive measurement of chemical substances |
US20020024631A1 (en) * | 1999-08-31 | 2002-02-28 | Roffman Jeffrey H. | Rotationally stabilized contact lenses |
EP1755441B1 (en) * | 2004-04-01 | 2015-11-04 | Eyefluence, Inc. | Biosensors, communicators, and controllers monitoring eye movement and methods for using them |
US7350919B2 (en) * | 2004-12-03 | 2008-04-01 | Searete Llc | Vision modification with reflected image |
US8446341B2 (en) * | 2007-03-07 | 2013-05-21 | University Of Washington | Contact lens with integrated light-emitting component |
US8786675B2 (en) * | 2008-01-23 | 2014-07-22 | Michael F. Deering | Systems using eye mounted displays |
JP5622431B2 (ja) * | 2010-04-21 | 2014-11-12 | オリンパス株式会社 | 頭部装着型瞳孔検出装置 |
US8446676B2 (en) * | 2010-09-16 | 2013-05-21 | Olympus Corporation | Head-mounted display device |
US8184069B1 (en) * | 2011-06-20 | 2012-05-22 | Google Inc. | Systems and methods for adaptive transmission of data |
AU2011204946C1 (en) * | 2011-07-22 | 2012-07-26 | Microsoft Technology Licensing, Llc | Automatic text scrolling on a head-mounted display |
US8971978B2 (en) * | 2012-08-21 | 2015-03-03 | Google Inc. | Contact lens with integrated pulse oximeter |
US8888277B2 (en) * | 2012-10-26 | 2014-11-18 | Johnson & Johnson Vision Care, Inc. | Contact lens with improved fitting characteristics |
CN105122119B (zh) * | 2012-12-06 | 2017-06-09 | E-视觉有限公司 | 提供影像的系统、装置、和/或方法 |
-
2014
- 2014-07-08 JP JP2015527261A patent/JP6380814B2/ja active Active
- 2014-07-08 EP EP14826483.1A patent/EP3023864B1/en active Active
- 2014-07-08 CN CN201480039759.9A patent/CN105378598B/zh active Active
- 2014-07-08 BR BR112016000639-9A patent/BR112016000639B1/pt active IP Right Grant
- 2014-07-08 RU RU2016100860A patent/RU2682798C2/ru active
- 2014-07-08 US US14/902,953 patent/US9996151B2/en active Active
- 2014-07-08 WO PCT/JP2014/068126 patent/WO2015008654A1/ja active Application Filing
- 2014-07-08 KR KR1020157036928A patent/KR102376854B1/ko active IP Right Grant
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH07121114A (ja) * | 1993-10-22 | 1995-05-12 | Hitachi Ltd | 表示装置 |
US20030020477A1 (en) * | 2001-07-30 | 2003-01-30 | Tim Goldstein | System and method for providing power to electrical devices |
JP2003177449A (ja) * | 2001-07-30 | 2003-06-27 | Hewlett Packard Co <Hp> | 電子装置を制御するシステム及び方法 |
JP2005535942A (ja) * | 2002-08-09 | 2005-11-24 | イー・ビジョン・エルエルシー | 電気駆動のコンタクトレンズ系 |
JP4752309B2 (ja) | 2005-04-07 | 2011-08-17 | ソニー株式会社 | 画像表示装置および方法 |
GB2497424A (en) * | 2011-12-06 | 2013-06-12 | E Vision Smart Optics Inc | Contact lens providing images to a wearer. |
Non-Patent Citations (1)
Title |
---|
See also references of EP3023864A4 |
Cited By (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2019514092A (ja) * | 2016-02-26 | 2019-05-30 | ザ コカ・コーラ カンパニーThe Coca‐Cola Company | タッチレス制御グラフィカルユーザインターフェース |
JP7273720B2 (ja) | 2017-03-21 | 2023-05-15 | マジック リープ, インコーポレイテッド | 光走査プロジェクタと連動した眼移動の追跡のための方法およびシステム |
JP2020514831A (ja) * | 2017-03-21 | 2020-05-21 | マジック リープ, インコーポレイテッドMagic Leap,Inc. | 光走査プロジェクタと連動した眼移動の追跡のための方法およびシステム |
US11838496B2 (en) | 2017-03-21 | 2023-12-05 | Magic Leap, Inc. | Method and system for tracking eye movement in conjunction with a light scanning projector |
WO2020016970A1 (ja) * | 2018-07-18 | 2020-01-23 | 株式会社ソニー・インタラクティブエンタテインメント | 情報処理装置、情報処理方法、及びプログラム |
JPWO2020016970A1 (ja) * | 2018-07-18 | 2021-08-12 | 株式会社ソニー・インタラクティブエンタテインメント | 情報処理装置、情報処理方法、及びプログラム |
US11243609B2 (en) | 2018-07-18 | 2022-02-08 | Sony Interactive Entertainment Inc. | Information processing apparatus, information processing method, and program |
JP7181293B2 (ja) | 2018-07-18 | 2022-11-30 | 株式会社ソニー・インタラクティブエンタテインメント | 情報処理装置、情報処理方法、及びプログラム |
WO2020016969A1 (ja) * | 2018-07-18 | 2020-01-23 | 株式会社ソニー・インタラクティブエンタテインメント | 情報処理装置、情報処理方法、及びプログラム |
JP2020141772A (ja) * | 2019-03-05 | 2020-09-10 | Kikura株式会社 | 瞳孔測定器具及び瞳孔測定装置 |
JP7284498B2 (ja) | 2019-03-05 | 2023-05-31 | Kikura株式会社 | 瞳孔測定装置 |
JP2022532806A (ja) * | 2019-07-23 | 2022-07-19 | インプランダータ オフタルミック プロドゥクツ ゲーエムベーハー | 視野を測定するための配置および方法ならびにインプラントの使用 |
JP7354406B2 (ja) | 2019-07-23 | 2023-10-02 | インプランダータ オフタルミック プロドゥクツ ゲーエムベーハー | 視野を測定するための配置および方法ならびにインプラントの使用 |
Also Published As
Publication number | Publication date |
---|---|
RU2016100860A (ru) | 2017-07-17 |
US20160147301A1 (en) | 2016-05-26 |
EP3023864A1 (en) | 2016-05-25 |
KR102376854B1 (ko) | 2022-03-21 |
CN105378598B (zh) | 2018-12-25 |
BR112016000639B1 (pt) | 2022-08-09 |
KR20160032043A (ko) | 2016-03-23 |
BR112016000639A2 (pt) | 2017-07-25 |
RU2682798C2 (ru) | 2019-03-21 |
EP3023864B1 (en) | 2022-12-21 |
JPWO2015008654A1 (ja) | 2017-03-02 |
BR112016000639A8 (pt) | 2020-01-14 |
EP3023864A4 (en) | 2017-03-22 |
JP6380814B2 (ja) | 2018-08-29 |
CN105378598A (zh) | 2016-03-02 |
US9996151B2 (en) | 2018-06-12 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6380814B2 (ja) | 検出装置および方法 | |
US10231614B2 (en) | Systems and methods for using virtual reality, augmented reality, and/or a synthetic 3-dimensional information for the measurement of human ocular performance | |
CN106663183B (zh) | 眼睛跟踪及用户反应探测 | |
US9994228B2 (en) | Systems and methods for controlling a vehicle or device in response to a measured human response to a provocative environment | |
ES2957329T3 (es) | Sistemas y métodos para el seguimiento ocular en aplicaciones de realidad virtual y de realidad aumentada | |
CN109997174B (zh) | 可穿戴光谱检查系统 | |
CN106132284B (zh) | 光学眼动追踪 | |
CA3081251C (en) | Systems and methods for identifying gaze tracking scene reference locations | |
US20190235624A1 (en) | Systems and methods for predictive visual rendering | |
CN104755023B (zh) | 图像显示设备和信息输入设备 | |
US20170172476A1 (en) | Body worn measurement device | |
KR20190138852A (ko) | 다중 모드 눈 추적 | |
US10901505B1 (en) | Eye-based activation and tool selection systems and methods | |
US20190101984A1 (en) | Heartrate monitor for ar wearables | |
CN110621212A (zh) | 用于评估用户健康状况的系统 | |
WO2015172988A1 (en) | Display cap | |
JP2020077271A (ja) | 表示装置、学習装置、及び、表示装置の制御方法 | |
US20100134297A1 (en) | Activity monitoring eyewear | |
KR102315680B1 (ko) | 가상현실 기반 조리개 운동 유도를 통한 시력 회복 장치 | |
US20230368478A1 (en) | Head-Worn Wearable Device Providing Indications of Received and Monitored Sensor Data, and Methods and Systems of Use Thereof | |
Guo et al. | A wearable non-contact optical system based on muscle tracking for ultra-long-term and indirect eye-tracking | |
CN118402771A (zh) | 使用头戴式设备的面向内的眼动追踪摄像头来测量心率的技术、以及使用这些技术的系统和方法 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 14826483 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2015527261 Country of ref document: JP Kind code of ref document: A |
|
ENP | Entry into the national phase |
Ref document number: 20157036928 Country of ref document: KR Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 14902953 Country of ref document: US |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2014826483 Country of ref document: EP |
|
ENP | Entry into the national phase |
Ref document number: 2016100860 Country of ref document: RU Kind code of ref document: A |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
REG | Reference to national code |
Ref country code: BR Ref legal event code: B01A Ref document number: 112016000639 Country of ref document: BR |
|
ENP | Entry into the national phase |
Ref document number: 112016000639 Country of ref document: BR Kind code of ref document: A2 Effective date: 20160112 |