US9867532B2 - System for detecting optical parameter of eye, and method for detecting optical parameter of eye - Google Patents

Info

Publication number
US9867532B2
Authority
US
United States
Prior art keywords: image, eye, optical, parameter, clearest
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active, expires
Application number
US14/781,306
Other versions
US20160135675A1 (en)
Inventor
Lin Du
HongJiang Zhang
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Zhigu Ruituo Technology Services Co Ltd
Original Assignee
Beijing Zhigu Ruituo Technology Services Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Zhigu Ruituo Technology Services Co Ltd
Assigned to BEIJING ZHIGU RUI TUO TECH CO., LTD (assignment of assignors interest; see document for details). Assignors: DU, LIN; ZHANG, HONGJIANG
Publication of US20160135675A1
Application granted
Publication of US9867532B2
Legal status: Active
Adjusted expiration

Classifications

    • A: Human necessities
    • A61: Medical or veterinary science; hygiene
    • A61B: Diagnosis; surgery; identification
    • A61B 3/00: Apparatus for testing the eyes; instruments for examining the eyes
    • A61B 3/0016: Operational features thereof
    • A61B 3/0025: Operational features characterised by electronic signal processing, e.g. eye models
    • A61B 3/0075: Provided with adjusting devices, e.g. operated by control lever
    • A61B 3/10: Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B 3/113: For determining or recording eye movement
    • A61B 3/12: For looking at the eye fundus, e.g. ophthalmoscopes
    • A61B 3/14: Arrangements specially adapted for eye photography
    • A61B 2560/00: Constructional details of operational features of apparatus; accessories for medical measuring apparatus
    • A61B 2560/02: Operational features
    • A61B 2560/0223: Operational features of calibration, e.g. protocols for calibrating sensors

Definitions

  • For simplicity, the image processing apparatus is not shown in FIG. 5; it functions in the same way as the image processing apparatus shown in FIG. 2.
  • Generally, the fundus is not bright enough, and therefore it is better to illuminate it; in this implementation manner, the fundus is lighted by a light emitting source 440.
  • The light emitting source 440 herein emits light invisible to the eye, for example, it is a near-infrared light emitting source, which has little effect on the eye 300 and to which the camera 410 is sensitive.
  • In this implementation manner, a second beam splitter 450 transfers, together with the first beam splitter 420, the light emitted by the light emitting source 440 to the fundus. Because the second beam splitter 450 is located before the incident surface of the camera 410, it also needs to transmit the light from the fundus to the camera 410.
  • The first beam splitter 420 may have features such as a high infrared reflectance and a high visible light transmittance, which may be achieved, for example, by arranging an infrared reflective film on the side of the first beam splitter 420 facing the eye 300.
  • Because, in this implementation manner, the system 400 for detecting an eye focusing point is located on the side of a lens of the glasses 200 away from the eye 300, the lens may be considered as a part of the eye when the optical parameter of the eye is calculated; in this case, the optical feature of the lens does not need to be known.
  • In another implementation manner, the system 400 for detecting an eye focusing point may be located on the side of a lens of the glasses 200 close to the eye 300. In this case, the optical feature parameter of the lens needs to be obtained in advance, and the influence of the lens is considered when the distance from the eye to the focusing point is calculated.
  • The light emitted by the light emitting source 440 is reflected by the second beam splitter 450, cast through the focal length adjustable lens 430, reflected by the first beam splitter 420, enters the eye of the user through the lens of the glasses 200, and finally reaches the retina of the fundus.
  • The camera 410 captures an image of the fundus through the pupil of the eye 300, along the optical path formed by the first beam splitter 420, the focal length adjustable lens 430, and the second beam splitter 450.
  • FIG. 6 is a schematic structural diagram of a system 600 for detecting an eye focusing point in another implementation manner of the embodiment.
  • This implementation manner is similar to that shown in FIG. 5 in that the system includes a micro camera 610, a second beam splitter 620, and a focal length adjustable lens 630. It differs in that the casting apparatus 640 casts a light spot pattern, and the first beam splitter of FIG. 5 is replaced with a curved beam splitter 650.
  • The curved beam splitter 650 transfers, to the image collection apparatus, the images presented by the fundus that correspond to the positions of the pupil under different eye optical axis directions.
  • In this way, the camera captures a blended, superimposed image formed from views of the eyeball at different angles. However, only the fundus part seen through the pupil is imaged clearly on the camera, while other parts are out of focus and fail to be imaged clearly, so the imaging of the fundus part is not seriously affected and its features can still be detected. Therefore, compared with the implementation manner of FIG. 5, this implementation manner can obtain a good image of the fundus even when the eye watches objects in different directions, so that the system for detecting an eye focusing point of this implementation manner has a wider application range and higher detection precision.
  • FIG. 7 shows a method for detecting an optical parameter of an eye according to an embodiment, including the following operations.
  • S110: Collect in real time an image presented by a fundus.
  • S120: Adjust an imaging parameter of an optical path between the eye and an image collection apparatus, so as to collect a clearest image.
  • S130: Process the collected image, to obtain an optical parameter of the eye when the collected image is the clearest.
  • The method further includes operation S140: obtaining a position of a focusing point of the eye according to the optical parameter of the eye.
  • In a possible implementation manner, before operation S130 of processing the collected image, the method further includes calibrating the image of the fundus, to obtain at least one reference image corresponding to the image presented by the fundus.
  • Operation S130, processing the collected image to obtain an optical parameter of the eye when the collected image is the clearest, includes the following operations, shown in FIG. 8:
  • Operation S131: analyzing the collected images, to find the clearest image.
  • Operation S132: calculating the optical parameter of the eye according to the clearest image and the imaging parameter known when the clearest image is obtained.
  • The operation of obtaining an eye optical axis direction according to a feature of the eye when the clearest image is obtained includes: obtaining the eye optical axis direction according to a feature of the fundus when the clearest image is obtained, or obtaining the eye optical axis direction according to a feature of the pupil of the eye when the clearest image is obtained.
  • Before the operation of obtaining an eye optical axis direction according to a feature of the eye, the method may further include an operation of calibrating the eye optical axis direction.
  • The imaging parameter of the optical path between the eye and the image collection apparatus is adjusted by adjusting a focal length of a lens unit in the optical path and/or a position of the lens unit in the optical path.
  • In a possible implementation manner, the method further includes an operation of casting a light spot to the fundus.
  • The cast light spot may include a pattern rich in features, and may be an infrared light spot invisible to the eye.
  • The light spot cast to the fundus is filtered by an eye invisible light transmission filter, and the collected image is likewise filtered by an eye invisible light transmission filter. For example, a near-infrared transmission filter may be used, so that only near-infrared light can pass through the filter.
  • In a possible implementation manner, the method further includes an operation of controlling brightness of the cast light spot according to a result obtained by analyzing the collected image.
  • The method of the embodiment may be implemented through the apparatus embodiments shown in FIG. 1 to FIG. 6; for specific implementation means, refer to the description of those apparatus embodiments. Details are not described herein again.
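  • Taken together, operations S110 to S140 can be sketched as the following loop. This is a minimal Python sketch: all device functions and parameter names are illustrative assumptions, not an API defined by the patent, and the distance uses formula (3) from the detailed description below.

        def detect_focus_point(set_focal_length, grab_frame, focal_lengths,
                               sharpness, axis_direction, d_i, d_p):
            # S110 + S120: sweep the imaging parameter while collecting fundus images
            scored = []
            for f_p in focal_lengths:
                set_focal_length(f_p)      # adjust the optical path
                img = grab_frame()         # collect an image presented by the fundus
                scored.append((sharpness(img), f_p, img))
            # S130: keep the clearest image and the imaging parameter recorded with it
            _, f_clearest, clearest = max(scored, key=lambda t: t[0])
            # S140: distance to the focusing point via formula (3), plus the sight line
            d_o = d_i + d_p * f_clearest / (f_clearest - d_p)
            return d_o, axis_direction(clearest)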
  • Sequence numbers of the steps or operations do not imply an execution order; the execution order should be determined according to their functions and internal logic, and shall not be construed as any limitation on the implementation process of the specific implementation manner.
  • FIG. 9 is a schematic structural diagram of an image processing apparatus 800 in a system for detecting an optical parameter of an eye provided in an embodiment; the specific implementation of the image processing apparatus 800 is not limited. As shown in FIG. 9, the image processing apparatus 800 may include
  • a processor 810, a communications interface 820, a memory 830, and a communication bus 840.
  • the processor 810 , the communications interface 820 , and the memory 830 communicate with each other through the communication bus 840 .
  • the communications interface 820 is configured to communicate with a network element such as a client.
  • the processor 810 is configured to execute a program 832 , and may specifically execute related operations in the method embodiment shown in FIG. 8 .
  • The program 832 may include program code, where the program code includes computer operation instructions.
  • The processor 810 may be a central processing unit (CPU), an application-specific integrated circuit (ASIC), or one or more integrated circuits configured to implement the embodiments.
  • the memory 830 is configured to store the program 832 .
  • the memory 830 may include a high-speed RAM memory, and may further include a non-volatile memory, for example, at least one disk memory.
  • The program 832 may specifically enable the image processing apparatus 800 to execute the following operations: analyzing the image obtained by the image collection apparatus, to find the clearest image; and calculating the optical parameter of the eye according to the clearest image and the imaging parameter known by the system when the clearest image is obtained.
  • An embodiment further provides a wearable optical device.
  • the wearable optical device may be the frame glasses shown in FIG. 5 or FIG. 6 , or a contact lens.
  • the wearable optical device includes the system for detecting an optical parameter of an eye recorded in the foregoing embodiments.
  • The system for detecting an optical parameter of an eye may also be applied to another eye related device, for example, a non-wearable optical device such as a telescope; or it may be applied to an imaging and receiving apparatus other than the eye, such as a camera.
  • When the functions are implemented in a form of a software functional unit and sold or used as an independent product, the functions may be stored in a computer-readable storage medium. Based on such an understanding, the technical solutions of the embodiments essentially, or the part contributing to the prior art, or a part of the technical solutions, may be implemented in a form of a software product.
  • the computer software product can be stored in a storage medium, and includes several instructions for instructing a computer device (which may be a personal computer, a server, or a network device) to perform all or a part of the operations of the methods described in the embodiments.
  • the foregoing storage medium includes: any medium that can store program codes, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.

Abstract

An optical parameter of an eye is detected using an image collection apparatus that collects an image presented by a fundus. An imaging apparatus adjusts an imaging parameter of an optical path between an eye and the image collection apparatus, wherein the image collection apparatus obtains a clearest image that satisfies a defined clarity criterion or condition. An image processing apparatus processes the clearest image to obtain an optical parameter of the eye. The image of the fundus is captured, and an imaging parameter known in the optical path when the clearest image is captured is found, so that a direction currently being watched by the eye and a distance between the eye and a focusing point can be obtained through optical calculation. Consequently, a position of the focusing point of the eye can be determined precisely for a wide range of concurrent applications for eye control interaction.

Description

RELATED APPLICATIONS
This application is a U.S. National Stage filing under 35 U.S.C. §371 of international patent cooperation treaty (PCT) application No. PCT/CN2013/088544, filed Dec. 4, 2013, and entitled “SYSTEM FOR DETECTING OPTICAL PARAMETER OF EYE, AND METHOD FOR DETECTING OPTICAL PARAMETER OF EYE,” which claims priority to Chinese Patent Application No. 201310329797.5, filed with the Chinese Patent Office on Jul. 31, 2013 and entitled “SYSTEM AND METHOD FOR DETECTING OPTICAL PARAMETER OF EYE”, which applications are hereby incorporated herein by reference in their respective entireties.
TECHNICAL FIELD
The subject disclosure relates to an optical system, and in particular, to detecting an optical parameter of an eye.
BACKGROUND
With the development of technologies, there are more and more applications that control a device by detecting a natural action of a user; applications that interact with a device by tracking an eye include the following.
The Canon EOS 3 single-lens reflex camera has an exclusive eye-controlled focusing function with 45 focusing points, which can automatically detect the motion of the pupil of the user's eyeball watching the camera eyepiece. The eyeball is lighted by an infrared light emitting diode installed on the frame of the camera eyepiece, and the infrared light reflected back by the eyeball is cast onto an eye control BASIS (base stored image sensor). After the system detects the relative relationship between the position of the eyeball pupil and a calibration position, the camera identifies which of the focusing points the user is watching, so as to determine the sight line direction of the user and perform automatic focusing on an object in that direction.
Tobii Technology has developed an eye control interaction system, where an eye tracker shoots and locates in real time a micro projection pattern reflected on the user's eyeball (or directly shoots and locates the eyeball motion), so that the system can very precisely track the direction watched by the eye, to perform eye control interaction or analyze the user's reading behavior.
In U.S. Patent Publication No. 2012/0290401, Google discloses an apparatus and a method for determining the direction watched by an eye of a user through a wearable device, where the position of the user's pupil is detected in real time by installing a camera or a CCD on a pair of glasses, so that the sight line direction of the user's eye can be obtained.
As can be seen from the above, in the prior art the sight line direction of the user is mainly obtained from the image on the eyeball surface. If the position of the focusing point of the eye is desired, the distance between the object and the eye is generally preset, as in the foregoing eye control interaction technologies, and the position of the focusing point is obtained from the sight line direction of the user and the preset distance; if the distance between the object and the eye is not known, the position of the focusing point of the eye cannot be obtained. Alternatively, as disclosed in international patent publication No. WO2005077258A1, the position of the eye focusing point is obtained from the sight line directions of the user's two eyes and the intersection between the sight lines; in this case, the sight line directions of both eyes need to be detected simultaneously, and the detection precision is not very high.
Medically, a fundus camera is often used in ophthalmic diagnosis, where refined pictures of the fundus retina are captured to assist in the diagnosis of possible ophthalmic diseases, including determination of the eye diopter, as recorded in U.S. Pat. No. 7,001,020. However, that patent requires the user to stare at a specific target and undergo a test before an optical parameter of the eye can be determined; therefore, the position of the focusing point of the eye in daily use cannot be determined.
SUMMARY
Various embodiments provide a system and a method for detecting an optical parameter of an eye, which are used to determine an optical parameter of an eye and particularly a position of a focusing point of the eye.
In a first aspect, a system is provided that detects an optical parameter of an eye, including
an image collection apparatus configured to collect at least one image presented by a fundus of an eye,
an imaging apparatus configured to adjust at least one imaging parameter of an optical path between the eye and the image collection apparatus, so that the image collection apparatus obtains a clearest image, and
an image processing apparatus configured to process the image obtained by the image collection apparatus, to obtain at least one optical parameter of the eye when the image collection apparatus obtains the clearest image.
In a second aspect, a wearable optical device is provided, e.g., including the foregoing system for detecting an optical parameter of an eye.
In a third aspect, a method is provided for detecting an optical parameter of an eye, including
collecting in real time at least one image presented by a fundus of an eye,
adjusting at least one imaging parameter of an optical path between the eye and an image collection apparatus, so as to collect a clearest image, and
processing the collected image, to obtain at least one optical parameter of the eye when the collected image is the clearest.
In one or more embodiments of the present application, the image of the fundus is captured, and an imaging parameter known in the optical path when the clearest image is captured is found, so that a direction which is currently being watched by the eye and a distance between the eye and a focusing point can be obtained through optical calculation, and a position of the focusing point of the eye can further be determined precisely, providing a further application basis for an eye control interaction related technology at the same time.
BRIEF DESCRIPTION OF THE DRAWINGS
The various embodiments of the subject application will become more fully understood from the detailed description given herein below for illustration only, and thus are not limiting, and wherein:
FIG. 1 is a structural block diagram of a system for detecting an optical parameter of an eye according to an embodiment;
FIG. 2 is a structural block diagram of another system for detecting an optical parameter of an eye according to an embodiment;
FIG. 3a is a schematic diagram of a light spot pattern used in a system for detecting an optical parameter of an eye according to an embodiment;
FIG. 3b is a schematic diagram of a fundus image with a light spot pattern captured by a system for detecting an optical parameter of an eye according to an embodiment;
FIG. 4a is a schematic diagram of an optical path for eye imaging according to an embodiment;
FIG. 4b is a schematic diagram of a distance from an eye focusing point to an eye and obtained according to an imaging parameter known by a system and an optical parameter of the eye according to an embodiment;
FIG. 5 is a schematic diagram of an application of a system for detecting an optical parameter of an eye on a pair of glasses according to an embodiment;
FIG. 6 is a schematic diagram of an application of another system for detecting an optical parameter of an eye on a pair of glasses according to an embodiment;
FIG. 7 is a flowchart of a method for detecting an optical parameter of an eye according to an embodiment;
FIG. 8 is a flowchart of step S130 of a method for detecting an optical parameter of an eye according to an embodiment; and
FIG. 9 is a structural block diagram of an image processing apparatus of a system for detecting an optical parameter of an eye according to an embodiment.
DETAILED DESCRIPTION
The various embodiments are described in detail in the following with reference to accompanying drawings and embodiments.
As shown in FIG. 1, an embodiment provides a system 100 for detecting an optical parameter of an eye, including
an image collection apparatus 110 configured to collect an image presented by a fundus,
an imaging apparatus 120 configured to adjust an imaging parameter between an eye and the image collection apparatus 110, so that the image collection apparatus 110 obtains a clearest image, and
an image processing apparatus 130 configured to process the image obtained by the image collection apparatus 110, to obtain an optical parameter of the eye when the image collection apparatus obtains the clearest image.
In an embodiment, the optical parameter of the eye when the image collection apparatus obtains the clearest image is obtained by analyzing and processing the image of the eye fundus, so that a position of a current focusing point of the eye can be obtained through calculation, providing a basis for further implementing an eye self-adaptive operation.
Herein, the image presented by the “fundus” is mainly an image presented on a retina, which may be an image of the fundus, or an image of another object cast to the fundus. The eye herein may be a human eye or an eye of another animal.
As shown in FIG. 2, in a possible implementation manner of the embodiment, the image collection apparatus 110 is a micro camera. In another possible implementation manner of the embodiment, the image collection apparatus 110 may also use a photosensitive imaging device directly, such as a CCD or a CMOS.
In a possible implementation manner of the embodiment, the imaging apparatus 120 includes: an adjustable lens unit 121, located in the optical path between the eye and the image collection apparatus 110, where a focal length of the adjustable lens unit 121 is adjustable and/or a position of the adjustable lens unit 121 in the optical path is adjustable. Through the adjustable lens unit 121, a system equivalent focal length from the eye to the image collection apparatus 110 is adjustable. Through adjustment of the adjustable lens unit 121, the image collection apparatus 110 obtains the clearest image of the fundus when the adjustable lens unit 121 is located at a certain position or is in a certain state. In this implementation manner, the adjustable lens unit 121 is adjusted continuously and in real time during detection.
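By way of illustration, such a continuous adjustment can be sketched in Python as a sweep over candidate focal lengths that keeps the sharpest fundus image. Here set_focal_length and grab_frame are hypothetical stand-ins for the device drivers, and the variance-of-Laplacian focus measure is a common choice rather than one prescribed by the patent:

    import cv2
    import numpy as np

    def sharpness(img: np.ndarray) -> float:
        # Variance of the Laplacian: higher means better focused.
        gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
        return float(cv2.Laplacian(gray, cv2.CV_64F).var())

    def sweep_for_clearest(set_focal_length, grab_frame, focal_lengths):
        best = None
        for f in focal_lengths:
            set_focal_length(f)      # adjust the adjustable lens unit 121
            img = grab_frame()       # capture via the image collection apparatus 110
            score = sharpness(img)
            if best is None or score > best[0]:
                best = (score, f, img)   # record the real-time imaging parameter f
        _, f_clearest, clearest_img = best
        return f_clearest, clearest_img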
In a possible implementation manner of the embodiment, the adjustable lens unit 121 is a focal length adjustable lens, configured to adjust its focal length by adjusting its refractive index and/or shape. Specifically: 1) the focal length is adjusted by adjusting the curvature of at least one surface of the focal length adjustable lens, for example, by increasing or reducing the amount of a liquid medium in a cavity formed by two transparent layers; or 2) the focal length is adjusted by changing the refractive index of the focal length adjustable lens, for example, the lens is filled with a specific liquid crystal medium, and the voltage on an electrode corresponding to the liquid crystal medium is adjusted to change the arrangement of the liquid crystal medium, thereby changing the refractive index of the lens.
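Both manners follow from the thin-lens lensmaker's equation, which ties the focal length to the refractive index and the surface curvatures; the Python check below uses illustrative values, not values from the patent:

    def focal_length(n: float, r1: float, r2: float) -> float:
        # Thin-lens lensmaker's equation: 1/f = (n - 1) * (1/R1 - 1/R2)
        return 1.0 / ((n - 1.0) * (1.0 / r1 - 1.0 / r2))

    print(focal_length(1.5, 0.10, -0.10))   # symmetric biconvex lens: f = 0.100 m
    print(focal_length(1.6, 0.10, -0.10))   # higher index, same shape: f ~ 0.083 m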
In another possible implementation manner of the embodiment, the adjustable lens unit 121 includes a lens set, configured to adjust a relative position between lenses in the lens set, to adjust a focal length of the lens set.
Besides the foregoing two manners where the optical path parameter of the system is changed by adjusting the feature of the adjustable lens unit 121, the optical path parameter of the system may also be changed by adjusting the position of the adjustable lens unit 121 in the optical path.
In a possible implementation manner of the embodiment, in order not to affect the viewing experience of the user on a viewed object, and in order to enable the system to be portably applied to a wearable device, the imaging apparatus 120 further includes: a light splitting apparatus 122, configured to form a light transferring path between the eye and the viewed object, and a light transferring path between the eye and the image collection apparatus 110. In this way, the optical path can be folded, thereby reducing the volume of the system while not affecting other experience of the user as much as possible.
In this implementation manner, the light splitting apparatus includes a first light splitting unit, located between the eye and the viewed object, and configured to transmit light from the viewed object to the eye, and transfer light from the eye to the image collection apparatus.
The first light splitting unit may be a beam splitter, a light splitting optical waveguide (including optical fiber), or another proper light splitting device.
In a possible implementation manner of the embodiment, the image processing apparatus 130 includes an optical path calibration module, configured to calibrate the optical path of the system, for example, align and calibrate an optical axis of the optical path, so as to ensure the detection precision.
In a possible implementation manner of the embodiment, the image processing apparatus 130 includes
an image analyzing module 131 configured to analyze the image obtained by the image collection apparatus, to find the clearest image, and
a parameter calculation module 132 configured to calculate the optical parameter of the eye according to the clearest image and the imaging parameter known by the system when the clearest image is obtained.
In this implementation manner, through the imaging apparatus 120, the image collection apparatus 110 can obtain the clearest image, but the clearest image needs to be found through the image analyzing module 131. In this case, the optical parameter of the eye can be obtained through calculation according to the clearest image and the optical path parameter known by the system. Herein, the optical parameter of the eye may include an optical axis direction of the eye.
In a possible implementation manner of the embodiment, the system further includes a casting apparatus 140 configured to cast a light spot to the fundus. In a possible implementation manner, a function of the casting apparatus may be implemented by a micro projector.
The cast light spot herein may not have a specific pattern and be only used to lighten the fundus.
In a preferable implementation manner of the embodiment, the cast light spot includes a pattern rich in features, which facilitates detection and improves the detection precision. FIG. 3a is a schematic diagram of a light spot pattern 500. The pattern may be generated by a light spot pattern generator, such as frosted glass. FIG. 3b shows an image of the fundus captured when the light spot pattern 500 is cast.
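For intuition, a feature-rich pattern can be approximated digitally as a dense random dot field; this numpy sketch is only an analogy, since the pattern 500 in the patent is generated optically (e.g., by frosted glass):

    import numpy as np

    def random_spot_pattern(h=480, w=640, density=0.02, seed=0):
        rng = np.random.default_rng(seed)
        pattern = np.zeros((h, w), dtype=np.uint8)
        n = int(h * w * density)                  # number of bright dots
        pattern[rng.integers(0, h, n), rng.integers(0, w, n)] = 255
        return pattern                            # non-repeating, high-contrast texture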
In order not to affect the eye in viewing an object normally, the light spot is an infrared light spot invisible to the eye.
In this case, to reduce interference from light of other spectra,
an emergent surface of the casting apparatus may be provided with an eye invisible light transmission filter, for example, an infrared transmission filter, and
an incident surface of the image collection apparatus is provided with an eye invisible light transmission filter.
In a possible implementation manner of the embodiment, the image processing apparatus 130 further includes
a casting control module 134 configured to control, according to a result obtained by the image analyzing module, brightness of the light spot cast by the casting apparatus.
For example, the casting control module 134 may self-adaptively adjust the brightness according to features of the image obtained by the image collection apparatus 110. Herein, the features of the image include the contrast of the image, texture features, and so on.
Herein, a special case of controlling the brightness of the light spot cast by the casting apparatus 140 is: turning on or off the casting apparatus 140. For example, when the user keeps watching a certain point, the casting apparatus may be turned off periodically; when the fundus of the user is bright enough, the casting apparatus 140 may be turned off, and only fundus information is used to detect a distance from a focusing point of a current sight line of the eye to the eye.
Besides, the casting control module 134 may further control, according to ambient light, the brightness of the light spot cast by the casting apparatus.
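A minimal sketch of such a control loop, assuming the contrast of the captured image is the driving feature; the thresholds and step size are illustrative, not values from the patent:

    import numpy as np

    def control_brightness(img_gray: np.ndarray, level: float,
                           lo=10.0, hi=60.0, step=0.1) -> float:
        contrast = float(img_gray.std())    # simple image-contrast feature
        if contrast < lo:
            level = min(1.0, level + step)  # fundus too dark or flat: brighten the spot
        elif contrast > hi:
            level = max(0.0, level - step)  # bright enough: dim (0.0 turns it off)
        return level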
In a possible implementation manner of the embodiment, the image processing apparatus 130 further includes an image calibration module 133 configured to calibrate the image of the fundus to obtain at least one reference image corresponding to the image presented by the fundus.
The image analyzing module 131 performs a comparison calculation on the image obtained by the image collection apparatus 110 and the reference image, to obtain the clearest image. Herein, the clearest image may be the obtained image least different from the reference image. In this implementation manner, the difference between the currently obtained image and the reference image is calculated through an existing image processing algorithm, for example, a classic phase difference automatic focusing algorithm.
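A minimal sketch of this comparison, using mean squared error as a stand-in for the phase difference algorithm mentioned above; the frame and reference handling are assumptions for illustration:

    import numpy as np

    def clearest_index(frames, reference: np.ndarray) -> int:
        # The clearest image is taken as the one least different from the reference.
        def mse(a, b):
            return float(np.mean((a.astype(np.float64) - b.astype(np.float64)) ** 2))
        return int(np.argmin([mse(f, reference) for f in frames]))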
In a possible implementation manner of the embodiment, the parameter calculation module 132 includes
an eye optical axis direction determination unit 1321 configured to obtain an eye optical axis direction according to a feature of the eye when the clearest image is obtained.
Herein, the feature of the eye may be obtained from the clearest image or obtained elsewhere. The eye optical axis direction indicates a direction watched by the sight line of the eye.
In a possible implementation manner of the embodiment, the eye optical axis direction determination unit 1321 includes a first determination subunit, configured to obtain the eye optical axis direction according to a feature of the fundus when the clearest image is obtained. Compared with the manner of obtaining the eye optical axis direction through features of the pupil and eyeball surface, the manner of determining the eye optical axis direction through the feature of the fundus is higher in precision.
When the light spot pattern is cast to the fundus, the size of the light spot pattern may be greater or smaller than that of a visible region of the fundus, where,
when the area of the light spot pattern is smaller than or equal to that of the visible region of the fundus, the eye optical axis direction may be determined by using a classic feature point matching algorithm (for example, a scale invariant feature transform (SIFT) algorithm) to detect a position of the light spot pattern on the image relative to the fundus.
Further, when the area of the light spot pattern is greater than or equal to that of the visible region of the fundus, the eye optical axis direction may be determined through a position of the obtained light spot pattern on the image relative to an original light spot pattern (obtained by the image calibration module), to determine the sight line direction of the user.
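By way of illustration, the feature point matching step could look like the OpenCV sketch below, operating on 8-bit grayscale images. Converting the average pixel offset into an optical axis direction requires the per-user calibration described elsewhere in the text, so only the offset is computed here:

    import cv2
    import numpy as np

    def spot_offset(fundus_img, spot_pattern):
        # SIFT keypoints and descriptors for the reference pattern and the fundus image
        sift = cv2.SIFT_create()
        k1, d1 = sift.detectAndCompute(spot_pattern, None)
        k2, d2 = sift.detectAndCompute(fundus_img, None)
        matcher = cv2.BFMatcher(cv2.NORM_L2, crossCheck=True)
        matches = matcher.match(d1, d2)
        # Average displacement (dx, dy) of the pattern on the fundus image
        offsets = [np.subtract(k2[m.trainIdx].pt, k1[m.queryIdx].pt) for m in matches]
        return np.mean(offsets, axis=0)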
In another possible implementation manner of the embodiment, the eye optical axis direction determination unit 1321 includes a second determination subunit, configured to obtain the eye optical axis direction according to a feature of the pupil of the eye when the clearest image is obtained. Herein, the feature of the eye pupil may be obtained from the clearest image or obtained elsewhere. The manner of obtaining the eye optical axis direction through the feature of the eye pupil belongs to the prior art, which is not detailed herein again.
In a possible implementation manner of the embodiment, the image processing apparatus 130 further includes an eye optical axis direction calibration module 135, configured to calibrate the eye optical axis direction, so that the eye optical axis direction is determined more precisely.
In this implementation manner, the imaging parameter known by the system includes: a constant imaging parameter and a real-time imaging parameter, where the real-time imaging parameter is adjustable parameter information of the adjustable lens unit when the clearest image is obtained, and the parameter information may be obtained through real-time record when the clearest image is obtained.
After the current optical parameter of the eye is obtained, the distance from the eye focusing point to the eye can be obtained through a calculation, e.g., as follows:
FIG. 4a is a schematic diagram of eye imaging. With reference to the lens imaging formula in classic optical theory, the following formula (1) may be obtained according to FIG. 4a:
$$\frac{1}{d_o} + \frac{1}{d_e} = \frac{1}{f_e} \qquad (1)$$
where $d_o$ represents the distance from the current viewed object 1010 of the eye to the eye equivalent lens 1030, $d_e$ represents the distance from the real image 1020 on the retina to the eye equivalent lens 1030, $f_e$ represents the equivalent focal length of the eye equivalent lens 1030, and X represents the sight line direction of the eye (which may be obtained according to the optical axis direction of the eye).
FIG. 4b is a schematic diagram of the distance from the eye focusing point to the eye, obtained according to the optical parameter known by the system and the optical parameter of the eye. A light spot 1040 in FIG. 4b forms a virtual image (not shown in FIG. 4b) through the adjustable lens unit 121. Assuming that the distance from the virtual image to the lens is $x$ (not shown in FIG. 4b), the following equation set may be obtained with reference to formula (1):
$$\begin{cases} \dfrac{1}{d_p} - \dfrac{1}{x} = \dfrac{1}{f_p} \\[6pt] \dfrac{1}{d_i + x} + \dfrac{1}{d_e} = \dfrac{1}{f_e} \end{cases} \qquad (2)$$
where $d_p$ represents the optical equivalent distance from the light spot 1040 to the adjustable lens unit 121, $d_i$ represents the optical equivalent distance from the adjustable lens unit 121 to the eye equivalent lens 1030, and $f_p$ represents the focal length value of the adjustable lens unit 121.
The distance $d_o$ from the current viewed object 1010 (the eye focusing point) to the eye equivalent lens 1030 can be obtained from formulas (1) and (2), as shown in formula (3):
$$d_o = d_i + \frac{d_p \cdot f_p}{f_p - d_p} \qquad (3)$$
Because the distance from the viewed object 1010 to the eye is obtained through the foregoing calculation, and the eye optical axis direction is available from the previous record, the position of the eye focusing point can be obtained easily, which provides a basis for subsequent eye-related interaction.
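As a worked illustration of formula (3), the sketch below evaluates $d_o$ from the recorded imaging parameters; the variable names follow the formulas above, and the numeric values are illustrative assumptions only:

```python
def focus_distance(d_i, d_p, f_p):
    # Formula (3): d_o = d_i + d_p * f_p / (f_p - d_p), where all
    # quantities are optical equivalent distances in the same unit and
    # f_p is the focal length value of the adjustable lens unit recorded
    # when the clearest fundus image was obtained.
    return d_i + d_p * f_p / (f_p - d_p)

# Illustrative values (metres): d_i = 0.02, d_p = 0.03, f_p = 0.04
# gives d_o = 0.02 + 0.0012 / 0.01 = 0.14, i.e. a focusing point
# about 14 cm in front of the eye equivalent lens.
print(focus_distance(0.02, 0.03, 0.04))  # ~0.14
```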
FIG. 5 shows an embodiment where a system 400 for detecting an eye focusing point is applied to a pair of glasses 200 in a possible implementation manner, which includes the content recorded in the implementation manner shown in FIG. 2. Specifically, as can be seen from FIG. 5, in this implementation manner, the system 400 is integrated on the right side (not limited thereto) of the glasses 200, and includes
a micro camera 410, which functions in the same way as the image collection apparatus recorded in the implementation manner of FIG. 2, and is arranged on the right outer side of the glasses 200 so as not to affect the sight line of the user viewing the object normally.
The glasses 200 can include a first beam splitter 420, which functions in the same way as the first light splitting unit recorded in the implementation manner of FIG. 2, is arranged with a certain tilt angle at the intersection between the direction watched by an eye 300 and the incident direction of the camera 410, transmits the light of the viewed object entering the eye 300, and reflects the light from the eye to the camera 410.
Further, a focal length adjustable lens 430 can be included, which functions in the same way as the focal length adjustable lens recorded in the implementation manner of FIG. 2, is located between the first beam splitter 420 and the camera 410, and adjusts its focal length value in real time, so that the camera 410 can capture the clearest image of the fundus when the focal length is set to a certain value.
The image processing apparatus is not shown in FIG. 5; it functions in the same way as the image processing apparatus shown in FIG. 2.
Generally, the fundus is not bright enough, and therefore it is better to illuminate the fundus. In this implementation manner, the fundus is lighted by a light emitting source 440. In order not to affect the user experience, the light emitting source 440 emits light invisible to the eye, namely near-infrared light, which affects the eye 300 only slightly and to which the camera 410 is sensitive.
In this implementation manner, because the light emitting source 440 is located on the right outer side of the frame of the glasses, a second beam splitter 450 is required to transfer, to the fundus together with the first beam splitter 420, the light emitted by the light emitting source 440. Because the second beam splitter 450 is located in front of the incident surface of the camera 410, it is further required that the second beam splitter 450 transmit the light from the fundus to the camera 410.
As can be seen, in this implementation manner, to improve the user experience and the collection definition of the camera 410, the first beam splitter 420 may have features such as a high infrared reflectance and a high visible light transmittance. For example, the foregoing features may be achieved by arranging an infrared reflective film on a side of the first beam splitter 420 towards the eye 300.
As can be seen from FIG. 5, in this implementation manner, because the system 400 for detecting an eye focusing point is located on a side of a lens of the glasses 200 far away from the eye 300, when the optical parameter of the eye is calculated, the lens may also be considered as a part of the eye. In this case, it is not required to know the optical feature of the lens.
In another implementation manner of the embodiment, the system 400 for detecting an eye focusing point may be located on a side of a lens of the glasses 200 close to the eye 300. In this case, the optical feature parameter of the lens needs to be obtained in advance, and the influence factor of the lens is considered when the distance between the eye and the focusing point is calculated.
The light emitted by the light emitting source 440 is reflected by the second beam splitter 450, passes through the focal length adjustable lens 430, is reflected by the first beam splitter 420, then enters the eye of the user through the lens of the glasses 200, and finally reaches the retina of the fundus. The camera 410 captures the image of the fundus through the pupil of the eye 300, along the optical path formed by the first beam splitter 420, the focal length adjustable lens 430, and the second beam splitter 450.
FIG. 6 is a schematic structural diagram of a system 600 for detecting an eye focusing point in another implementation manner of the embodiment. As can be seen from FIG. 6, this implementation manner is similar to that shown in FIG. 5 in that the system includes a micro camera 610, a second beam splitter 620, and a focal length adjustable lens 630, and differs in that the casting apparatus 640 casts a light spot pattern, and the first beam splitter of the implementation manner in FIG. 5 is replaced with a curved beam splitter 650.
Herein, the curved beam splitter 650 transfers, to the image collection apparatus, the images presented by the fundus that respectively correspond to the positions of the pupil under different eye optical axis directions. In this way, the camera can capture images formed by blending and superimposing views of the eyeball from different angles. However, only the part of the fundus seen through the pupil is imaged clearly on the camera; other parts are out of focus and fail to be imaged clearly, which does not seriously affect the imaging of the fundus part, so the feature of the fundus part can still be detected. Therefore, compared with the implementation manner of FIG. 5, in this implementation manner, an image of the fundus can be obtained well even when the eye watches objects in different directions, so that the application range of the system for detecting an eye focusing point of this implementation manner is wider and its detection precision is higher.
FIG. 7 shows a method for detecting an optical parameter of an eye according to an embodiment, including the following operations.
S110: Collect in real time an image presented by a fundus.
S120: Adjust an imaging parameter of an optical path between an eye and an image collection apparatus, so as to collect the clearest image.
S130: Process the collected image, to obtain an optical parameter of the eye when the collected image is the clearest.
In a possible implementation manner of the embodiment, the method further includes operation S140: obtaining a position of a focusing point of the eye according to the optical parameter of the eye.
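The following sketch ties operations S110 through S132 together as a simple parameter sweep; the hardware hooks collect_image and set_focal_length, the sweep over candidate focal lengths, and the difference metric are all assumptions for illustration, as the description leaves the adjustment strategy open:

```python
import numpy as np

def detect_eye_optical_parameter(collect_image, set_focal_length,
                                 candidate_focal_lengths, reference):
    # S110/S120: sweep the adjustable imaging parameter and collect a
    # fundus image at each setting.
    def difference(a, b):
        return float(np.mean(np.abs(a.astype(np.float32) -
                                    b.astype(np.float32))))

    best_score, best_f, clearest = None, None, None
    for f in candidate_focal_lengths:
        set_focal_length(f)
        image = collect_image()
        score = difference(image, reference)   # S131: compare to reference
        if best_score is None or score < best_score:
            best_score, best_f, clearest = score, f, image

    # S132: the clearest image plus the recorded real-time imaging
    # parameter (best_f) are the inputs for calculating the optical
    # parameter of the eye, e.g. via formula (3) above.
    return clearest, best_f
```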
In a possible implementation manner of the embodiment, before the operation S130: processing the collected image, the method further includes
calibrating the image of the fundus, to obtain at least one reference image corresponding to the image presented by the fundus.
As shown in FIG. 8, in a possible implementation manner of the embodiment, the operation S130 of processing the collected image to obtain an optical parameter of the eye when the collected image is the clearest includes the following operations.
S131: Analyze the collected image, to find the clearest image.
S132: Calculate the optical parameter of the eye according to the clearest image and the imaging parameter known in the optical path when the clearest image is obtained.
In a possible implementation manner of the embodiment, the operation S131 includes
performing comparison calculation for the collected image and the reference image, to obtain the clearest image.
In a possible implementation manner of the embodiment, the operation S132 includes
obtaining an eye optical axis direction according to a feature of the eye when the clearest image is obtained.
In a possible implementation manner of the embodiment, the operation of obtaining an eye optical axis direction according to a feature of the eye when the clearest image is obtained includes
obtaining an eye optical axis direction according to a feature of the fundus when the clearest image is obtained.
In another possible implementation manner of the embodiment, the operation of obtaining an eye optical axis direction according to a feature of the eye when the clearest image is obtained includes
obtaining an eye optical axis direction according to a feature of a pupil of the eye when the clearest image is obtained.
In a possible implementation manner of the embodiment, before the operation of obtaining an eye optical axis direction according to a feature of the eye when the clearest image is obtained, the method further includes an operation of calibrating the eye optical axis direction.
In a possible implementation manner of the embodiment, in operation S120, the imaging parameter of the optical path between the eye and the image collection apparatus is adjusted in a manner of adjusting a focal length of a lens unit in the optical path between the eye and the image collection apparatus and/or a position of the lens unit in the optical path.
In a possible implementation manner of the embodiment, the operation of collecting in real time an image presented by a fundus includes
collecting images presented by the fundus and separately corresponding to positions of the pupil in the case of different eye optical axis directions.
In a possible implementation manner of the embodiment, the method further includes an operation of casting a light spot to the fundus.
The cast light spot includes a pattern rich in features and is an infrared light spot.
To reduce the influence of visible light on the detection precision, in this implementation manner, the light spot cast to the fundus is filtered by an eye invisible light transmission filter, and the collected image is filtered by the eye invisible light transmission filter. For example, a near-infrared transmission light filter may be used, so that only near-infrared light can pass through the light filter.
In a possible implementation manner of the embodiment, the method further includes an operation of: controlling brightness of the cast light spot according to a result obtained by analyzing the collected image.
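A hedged sketch of such brightness control follows; the threshold values, the step size, and the set_brightness hook are illustrative assumptions, not taken from the description:

```python
def control_spot_brightness(collected_image, current_brightness,
                            set_brightness, low=60, high=180, step=0.1):
    # Assumes an 8-bit grayscale fundus image as a NumPy array.
    # Raise the cast light spot's brightness when the collected image
    # is too dark on average, and lower it when too bright.
    mean_level = float(collected_image.mean())
    if mean_level < low:
        set_brightness(current_brightness * (1 + step))
    elif mean_level > high:
        set_brightness(current_brightness * (1 - step))
```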
The method of the embodiment may be implemented through the apparatus embodiment in FIG. 1 to FIG. 6. For the specific implementation means, refer to the description of the apparatus embodiment. Details are not described herein again.
A person skilled in the art may understand that, in the method of the specific implementation manner, the sequence numbers of the steps or operations do not mean an execution order, the execution order of the steps or operations should be determined according to their functions and internal logic, and shall not be construed as any limitation to the implementation process of the specific implementation manner.
FIG. 9 is a schematic structural diagram of an image processing apparatus 800 in a system for detecting an optical parameter of an eye provided in an embodiment, and the specific implementation of the image processing apparatus 800 is not limited in the specific embodiment. As shown in FIG. 9, the image processing apparatus 800 may include
a processor 810, a communications interface 820, a memory 830, and a communication bus 840.
The processor 810, the communications interface 820, and the memory 830 communicate with each other through the communication bus 840.
The communications interface 820 is configured to communicate with a network element such as a client.
The processor 810 is configured to execute a program 832, and may specifically execute related operations in the method embodiment shown in FIG. 8.
Specifically, the program 832 may include program code, where the program code includes computer operation instructions.
The processor 810 may be a central processing unit (CPU), an application specific integrated circuit (ASIC), or one or more integrated circuits configured to implement the embodiments.
The memory 830 is configured to store the program 832. The memory 830 may include a high-speed RAM, and may further include a non-volatile memory, for example, at least one disk memory. The program 832 may specifically enable the image processing apparatus 800 to execute the following operations: analyzing the image obtained by the image collection apparatus, to find the clearest image; and calculating the optical parameter of the eye according to the clearest image and the imaging parameter known by the system when the clearest image is obtained.
For the specific implementation of the operations in the program 832, reference may be made to corresponding descriptions about corresponding operations and units in the foregoing embodiments, and details are not described herein again. It may be clearly understood by a person skilled in the art that, for the purpose of convenient and brief description, for the specific working processes of the foregoing devices and modules, reference may be made to the description about the corresponding process in the foregoing method embodiment, and the details are not described herein again.
An embodiment further provides a wearable optical device. The wearable optical device may be the frame glasses shown in FIG. 5 or FIG. 6, or a contact lens. The wearable optical device includes the system for detecting an optical parameter of an eye recorded in the foregoing embodiments.
In another possible implementation manner of the embodiment, the system for detecting an optical parameter of an eye may also be applied to another eye-related device, for example, a non-wearable optical device such as a telescope; or the system may further be applied to another imaging receiving apparatus other than the eye, such as a camera.
A person of ordinary skill in the art may be aware that, in combination with the examples described in the embodiments disclosed in this specification, units and method operations may be implemented by electronic hardware, or a combination of computer software and electronic hardware. Whether the functions are performed by hardware or software depends on particular applications and design constraint conditions of the technical solutions. A person skilled in the art may use different methods to implement the described functions for each particular application, but it should not be considered that the implementation goes beyond the scope.
When the functions are implemented in the form of a software functional unit and sold or used as an independent product, the functions may be stored in a computer-readable storage medium. Based on such an understanding, the technical solutions of the various embodiments essentially, or the part contributing to the prior art, or a part of the technical solutions, may be implemented in the form of a software product. The computer software product may be stored in a storage medium, and includes several instructions for instructing a computer device (which may be a personal computer, a server, or a network device) to perform all or a part of the operations of the methods described in the embodiments. The foregoing storage medium includes any medium that can store program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.
The foregoing implementation manners are merely intended to describe the present invention rather than limit it. A person of ordinary skill in the art should understand that modifications and variations may still be made without departing from the spirit and scope. Therefore, all equivalent technical solutions shall fall within the scope, and the patent protection scope of the various embodiments shall be subject to the claims.

Claims (42)

What is claimed is:
1. A system, comprising:
an image collection apparatus configured to collect at least one image presented by a fundus of an eye;
an imaging apparatus configured to adjust at least one imaging parameter of an optical path between the eye and the image collection apparatus, wherein the image collection apparatus obtains an image of the at least one image that satisfies at least a defined clarity criterion; and
an image processing apparatus configured to process the image to obtain at least one optical parameter of the eye in response to the image collection apparatus obtaining the image that satisfies at least the defined clarity criterion, wherein the image processing apparatus comprises a parameter calculation module configured to determine the at least one optical parameter of the eye according to the image that satisfies the at least the defined clarity criterion and the at least one imaging parameter corresponding to a time when the image that satisfies the at least the defined clarity criterion was obtained.
2. The system according to claim 1, wherein the image processing apparatus further comprises:
an image analyzing module configured to analyze the at least one image collected by the image collection apparatus to find a clearest image of the at least one image relative to the defined clarity criterion; wherein the parameter calculation module is configured to determine the at least one optical parameter of the eye according to the clearest image and the at least one imaging parameter corresponding to a time when the clearest image was obtained.
3. The system according to claim 2, wherein the image processing apparatus further comprises:
an image calibration module configured to obtain at least one reference image corresponding to the at least one image presented by the fundus.
4. The system according to claim 3, wherein the image analyzing module is further configured to compare the at least one image collected by the image collection apparatus and the at least one reference image to obtain the clearest image.
5. The system according to claim 2, wherein the parameter calculation module is further configured to obtain a position of a focusing point of the eye according to the at least one optical parameter of the eye.
6. The system according to claim 2, wherein the parameter calculation module comprises:
an eye optical axis direction determination unit configured to obtain an eye optical axis direction according to a feature of the eye in response to the clearest image being obtained.
7. The system according to claim 6, wherein the eye optical axis direction determination unit comprises:
a first determination subunit configured to obtain the eye optical axis direction according to a feature of the fundus in response to the clearest image being obtained.
8. The system according to claim 6, wherein the eye optical axis direction determination unit comprises:
a second determination subunit configured to obtain the eye optical axis direction according to a feature of a pupil of the eye in response to the clearest image being obtained.
9. The system according to claim 6, wherein the image processing apparatus further comprises:
an eye optical axis direction calibration module configured to calibrate the eye optical axis direction.
10. The system according to claim 2, further comprising:
a casting apparatus configured to cast a light spot to the fundus.
11. The system according to claim 10, wherein the light spot cast by the casting apparatus comprises a pattern.
12. The system according to claim 10, wherein the casting apparatus is an infrared light spot casting apparatus.
13. The system according to claim 12, wherein an emergent surface of the casting apparatus is provided with an eye invisible light transmission filter.
14. The system according to claim 12, wherein an incident surface of the image collection apparatus is provided with an eye invisible light transmission filter.
15. The system according to claim 10, wherein the image processing apparatus further comprises:
a casting control module configured to control, according to a result obtained by the image analyzing module, a brightness of the light spot cast by the casting apparatus.
16. The system according to claim 1, wherein the imaging apparatus comprises:
an adjustable lens unit, located in the optical path between the eye and the image collection apparatus, wherein a focal length of the adjustable lens unit is adjustable or a position of the adjustable lens unit in the optical path is adjustable.
17. The system according to claim 16, wherein the adjustable lens unit comprises:
a focal length adjustable lens, having the focal length or alternatively having a different focal length, configured to adjust the focal length or alternatively the different focal length by adjusting a refractive index and/or shape of the focal length adjustable lens.
18. The system according to claim 16, wherein the adjustable lens unit comprises:
a lens set configured to adjust a relative position between at least two lenses in the lens set to adjust another focal length of the lens set.
19. The system according to claim 1, wherein the imaging apparatus further comprises:
a light splitting apparatus configured to form a first light transferring path between the eye and a viewed object, and a second light transferring path between the eye and the image collection apparatus.
20. The system according to claim 19, wherein the light splitting apparatus comprises:
a first light splitting unit, located between the eye and the viewed object, and configured to transmit light from the viewed object to the eye, and transfer light from the eye to the image collection apparatus.
21. The system according to claim 20, wherein the first light splitting unit is a beam splitter or a light splitting optical waveguide.
22. The system according to claim 21, wherein the first light splitting unit is a curved beam splitter configured to transfer, to the image collection apparatus, the at least one image presented by the fundus, and wherein the at least one image respectively correspond to at least one position of a pupil associated with different eye optical axis directions.
23. The system according to claim 1, wherein the system is a wearable optical device comprising the image collection apparatus, the imaging apparatus, and the image processing apparatus.
24. A method, comprising:
collecting at least one image presented by a fundus of an eye;
adjusting at least one imaging parameter of an optical path between the eye and an image collection apparatus to collect an image of the at least one image that at least satisfies a defined clarity condition; and
processing the image to obtain at least one optical parameter of the eye in response to the image being determined to satisfy the defined clarity condition, wherein the processing the image to obtain the at least one optical parameter of the eye comprises calculating the at least one optical parameter of the eye according to the image that satisfies the defined clarity condition and the at least one imaging parameter known from the optical path in response to the image that satisfies the defined clarity condition being obtained.
25. The method according to claim 24, wherein the processing the image to obtain the at least one optical parameter of the eye further comprises:
analyzing the at least one image to find a clearest image; and
calculating the at least one optical parameter of the eye according to the clearest image and the at least one imaging parameter known in the optical path in response to the clearest image being obtained.
26. The method according to claim 25, further comprising:
before the processing of the image, calibrating the at least one image of the fundus to obtain at least one reference image corresponding to the at least one image presented by the fundus.
27. The method according to claim 26, wherein the analyzing the at least one image to find the clearest image comprises:
comparing the at least one image and the at least one reference image to obtain the clearest image.
28. The method according to claim 25, further comprising:
obtaining a position of a focusing point of the eye according to the at least one optical parameter of the eye.
29. The method according to claim 25, wherein the calculating the at least one optical parameter of the eye according to the clearest image and the at least one imaging parameter known in the optical path in response to the clearest image being obtained comprises:
obtaining an eye optical axis direction according to a feature of the eye in response to the clearest image being obtained.
30. The method according to claim 29, further comprising:
calibrating the eye optical axis direction before the eye optical axis direction is obtained according to the feature of the eye.
31. The method according to claim 25, wherein the calculating the at least one optical parameter of the eye according to the clearest image and the at least one imaging parameter known in the optical path in response to the clearest image being obtained comprises:
obtaining the eye optical axis direction according to a feature of the fundus in response to the clearest image being obtained.
32. The method according to claim 25, wherein the calculating the at least one optical parameter of the eye according to the clearest image and the at least one imaging parameter known in the optical path in response to the clearest image being obtained comprises:
obtaining the eye optical axis direction according to a feature of a pupil of the eye in response to the clearest image being obtained.
33. The method according to claim 25, further comprising:
casting a light spot to the fundus.
34. The method according to claim 33, wherein the light spot comprises a defined pattern.
35. The method according to claim 33, wherein the light spot is an infrared light spot.
36. The method according to claim 35, wherein the light spot cast to the fundus is directed through a filter that blocks visible light transmission.
37. The method according to claim 35, wherein the at least one image is filtered by an eye invisible light transmission filter.
38. The method according to claim 33, further comprising:
controlling a brightness of the light spot according to a result obtained by analyzing the at least one image.
39. The method according to claim 24, wherein the at least one imaging parameter of the optical path between the eye and the image collection apparatus is adjusted in a manner of adjusting a focal length of a lens unit in the optical path between the eye and the image collection apparatus or a position of the lens unit in the optical path.
40. The method according to claim 24, wherein the collecting the at least one image presented by the fundus comprises:
collecting images presented by the fundus that respectively correspond to positions of a pupil associated with different eye optical axis directions.
41. A computer readable storage device comprising executable instructions that, in response to execution, cause a wearable optical apparatus comprising a processor to perform operations, comprising:
receiving at least one image of a fundus of an eye;
determining an image of the at least one image that at least satisfies a defined clarity characteristic, wherein the determining comprises modifying at least one imaging parameter of an optical path between the eye and an image collection apparatus; and
processing the image to obtain at least one optical parameter of the eye in response to the image being determined to satisfy the defined clarity characteristic, wherein the processing the image to obtain the at least one optical parameter of the eye comprises determining the at least one optical parameter of the eye based on information represented by the image that satisfies the defined clarity characteristic and the at least one imaging parameter of the optical path in response to the image that satisfies the defined clarity characteristic being determined.
42. The computer readable storage device according to claim 41, wherein the processing the image to obtain the at least one optical parameter of the eye further comprises:
analyzing the at least one image to determine a clearest image; and
determining the at least one optical parameter of the eye based on information represented by the clearest image and the at least one imaging parameter of the optical path in response to the clearest image being determined.
US14/781,306 2013-07-31 2013-12-04 System for detecting optical parameter of eye, and method for detecting optical parameter of eye Active 2033-12-29 US9867532B2 (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
CN201310329797 2013-07-31
CN201310329797.5 2013-07-31
CN201310329797.5A CN103431840B (en) 2013-07-31 2013-07-31 Eye optical parameter detecting system and method
PCT/CN2013/088544 WO2015014058A1 (en) 2013-07-31 2013-12-04 System for detecting optical parameter of eye, and method for detecting optical parameter of eye

Publications (2)

Publication Number Publication Date
US20160135675A1 US20160135675A1 (en) 2016-05-19
US9867532B2 true US9867532B2 (en) 2018-01-16

Family

ID=49685635

Family Applications (1)

Application Number Title Priority Date Filing Date
US14/781,306 Active 2033-12-29 US9867532B2 (en) 2013-07-31 2013-12-04 System for detecting optical parameter of eye, and method for detecting optical parameter of eye

Country Status (3)

Country Link
US (1) US9867532B2 (en)
CN (1) CN103431840B (en)
WO (1) WO2015014058A1 (en)

Families Citing this family (28)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2016036390A (en) * 2014-08-05 2016-03-22 富士通株式会社 Information processing unit, focal point detection method and focal point detection program
CN104469152B (en) * 2014-12-02 2017-11-24 广东欧珀移动通信有限公司 The automatic camera method and system of Wearable
CN104921697B (en) * 2015-05-18 2017-04-26 华南师范大学 Method for quickly measuring longitudinal distances of sight of human eyes
CN106296796B (en) 2015-06-04 2019-08-13 北京智谷睿拓技术服务有限公司 Information processing method, information processing unit and user equipment
CN106294911B (en) 2015-06-04 2019-09-10 北京智谷睿拓技术服务有限公司 Information processing method, information processing unit and user equipment
CN106293031B (en) * 2015-06-04 2019-05-21 北京智谷睿拓技术服务有限公司 Information processing method, information processing unit and user equipment
CN106254752B (en) * 2015-09-22 2019-05-31 北京智谷睿拓技术服务有限公司 Focusing method and device, image capture device
ES2653913B1 (en) * 2016-07-06 2019-01-04 Univ Murcia Optical instrument for measuring the density of macular pigment in the eye and associated method
KR102648770B1 (en) 2016-07-14 2024-03-15 매직 립, 인코포레이티드 Deep neural network for iris identification
EP3484343B1 (en) 2016-07-14 2024-01-10 Magic Leap, Inc. Iris boundary estimation using cornea curvature
EP3500911B1 (en) 2016-08-22 2023-09-27 Magic Leap, Inc. Augmented reality display device with deep learning sensors
RU2016138608A (en) * 2016-09-29 2018-03-30 Мэджик Лип, Инк. NEURAL NETWORK FOR SEGMENTING THE EYE IMAGE AND ASSESSING THE QUALITY OF THE IMAGE
CA3038967A1 (en) 2016-10-04 2018-04-12 Magic Leap, Inc. Efficient data layouts for convolutional neural networks
CA3043352A1 (en) 2016-11-15 2018-05-24 Magic Leap, Inc. Deep learning system for cuboid detection
KR20230070318A (en) 2016-12-05 2023-05-22 매직 립, 인코포레이티드 Virual user input controls in a mixed reality environment
US10729320B2 (en) * 2016-12-17 2020-08-04 Alcon Inc. Determining eye surface contour using multifocal keratometry
KR102302725B1 (en) 2017-03-17 2021-09-14 매직 립, 인코포레이티드 Room Layout Estimation Methods and Techniques
KR102368661B1 (en) 2017-07-26 2022-02-28 매직 립, 인코포레이티드 Training a neural network using representations of user interface devices
WO2019060283A1 (en) 2017-09-20 2019-03-28 Magic Leap, Inc. Personalized neural network for eye tracking
CN111373419A (en) 2017-10-26 2020-07-03 奇跃公司 Gradient normalization system and method for adaptive loss balancing in deep multitask networks
WO2019087209A1 (en) * 2017-11-03 2019-05-09 Imran Akthar Mohammed Wearable ophthalmoscope device and a method of capturing fundus image
CN110726532A (en) * 2018-07-17 2020-01-24 亨泰光学股份有限公司 Focusing point detection method of contact lens
CN109620137B (en) * 2018-12-17 2022-02-08 深圳盛达同泽科技有限公司 Retina exposure method, retina exposure device, electronic equipment and readable storage medium
CN110568627A (en) * 2019-09-04 2019-12-13 爱诺刻(深圳)高科有限公司 Control method of zoom glasses
CN111208905A (en) * 2020-01-08 2020-05-29 北京未动科技有限公司 Multi-module sight tracking method and system and sight tracking equipment
CN112075921B (en) * 2020-10-14 2022-03-18 上海鹰瞳医疗科技有限公司 Fundus camera and focal length adjusting method thereof
GB202019278D0 (en) * 2020-12-08 2021-01-20 Give Vision Ltd Vision aid device
EP4167199A1 (en) 2021-10-14 2023-04-19 Telefonica Digital España, S.L.U. Method and system for tracking and quantifying visual attention on a computing device

Citations (156)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4264154A (en) 1979-06-05 1981-04-28 Polaroid Corporation Apparatus for automatically controlling transmission of light through a lens system
US4572616A (en) 1982-08-10 1986-02-25 Syracuse University Adaptive liquid crystal lens
JPH0323431A (en) 1989-06-20 1991-01-31 Canon Inc Photography device with gaze point detecting means
US5182585A (en) 1991-09-26 1993-01-26 The Arizona Carbon Foil Company, Inc. Eyeglasses with controllable refracting power
US5537163A (en) 1994-06-17 1996-07-16 Nikon Corporation Ophthalmologic apparatus with automatic focusing using two reference marks and having an in-focus detecting system
CN1141602A (en) 1994-01-18 1997-01-29 Qqc公司 Using laser for fabricating coatings substrate
JPH09289973A (en) 1996-04-26 1997-11-11 Canon Inc Eye ground camera
JP2676870B2 (en) 1989-01-17 1997-11-17 キヤノン株式会社 Optical device having gazing point detection means
US6072443A (en) 1996-03-29 2000-06-06 Texas Instruments Incorporated Adaptive ocular projection display
DE19959379A1 (en) 1998-12-09 2000-07-06 Asahi Optical Co Ltd Glasses with variable refractive power
US6111597A (en) 1996-12-28 2000-08-29 Olympus Optical Co., Ltd. Stereo image forming apparatus
US6151061A (en) 1996-08-29 2000-11-21 Olympus Optical Co., Ltd. Biocular image display apparatus
US6152563A (en) 1998-02-20 2000-11-28 Hutchinson; Thomas E. Eye gaze direction tracker
US6325513B1 (en) 1997-02-05 2001-12-04 Carl Zeiss Jena Gmbh Arrangement for projecting a two-dimensional image onto an eye to be examined for use with a device that performs a subjective determination of refraction and/or a device that performs other vision functions
US20020101568A1 (en) 2001-01-30 2002-08-01 Eberl Heinrich A. Interactive data view and command system
US20020113943A1 (en) 2000-12-29 2002-08-22 Miroslav Trajkovic System and method for automatically adjusting a lens power through gaze tracking
CN1372650A (en) 1999-07-02 2002-10-02 E-视觉有限公司 System, apparatus and method for correcting vision using electro-active spectacles
JP3383228B2 (en) 1998-11-09 2003-03-04 シャープ株式会社 Head mounted display device
US20030043303A1 (en) 2001-06-12 2003-03-06 Bonaventure Karuta System and method for correcting multiple axis displacement distortion
US20030125638A1 (en) 2000-11-22 2003-07-03 Peter Husar Optical stimulation of the human eye
JP2003307466A (en) 2002-02-18 2003-10-31 Topcon Corp Apparatus, method and chart for calibration as well as result diagnostic device
CN1470227A (en) 2002-07-22 2004-01-28 林超群 Dynamic lens vision training apparatus and method thereof
WO2004023167A2 (en) 2002-09-04 2004-03-18 Josef Bekerman Apparatus and method for eyesight rehabilitation
CN1527126A (en) 2003-03-07 2004-09-08 精工爱普生株式会社 Image processing system, projector and image processing method
US20050003043A1 (en) 2003-07-03 2005-01-06 Vincent Sewalt Composition and method for reducing caking and proteinaceous products
US20050014092A1 (en) 2003-07-17 2005-01-20 Shin-Etsu Chemical Co., Ltd. Novel compound, polymer, resist composition, and patterning process
JP2005058399A (en) 2003-08-11 2005-03-10 Nikon Corp Display device
CN1604014A (en) 2003-09-30 2005-04-06 佳能株式会社 Image display apparatus and method
CN1645244A (en) 2004-01-20 2005-07-27 精工爱普生株式会社 Projector and focusing method thereof
CN1653374A (en) 2002-03-13 2005-08-10 E-视觉有限公司 Electro-optic lens with integrated components
WO2005077258A1 (en) 2004-02-17 2005-08-25 National University Corporation Shizuoka University Eyeshot detection device using distance image sensor
US20060016459A1 (en) 2004-05-12 2006-01-26 Mcfarlane Graham High rate etching using high pressure F2 plasma with argon dilution
US7001020B2 (en) 2001-08-02 2006-02-21 Daphne Instruments, Inc. Complete autorefractor system in an ultra-compact package
US20060103808A1 (en) 2003-01-16 2006-05-18 Hidenori Horie Eyesight improving device
US20060122531A1 (en) 2004-12-03 2006-06-08 Goodall Eleanor V Method and system for adaptive vision modification
US20060122530A1 (en) 2004-12-03 2006-06-08 Goodall Eleanor V Adjustable lens system with neural-based control
US20060146281A1 (en) 2004-12-03 2006-07-06 Goodall Eleanor V Method and system for vision enhancement
US20060164593A1 (en) 2005-01-21 2006-07-27 Nasser Peyghambarian Adaptive electro-active lens with variable focal length
CN1901833A (en) 2004-01-02 2007-01-24 视觉仪器控股有限公司 Devices to facilitate alignment and focussing of a fundus camera
US20070019157A1 (en) 2004-12-03 2007-01-25 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Vision modification with reflected image
CN2868183Y (en) 2006-01-19 2007-02-14 杨宏军 Intelligent eyesight protector
CN1912672A (en) 2005-06-07 2007-02-14 因·S·唐 Motionless lens systems and methods
CN1951314A (en) 2005-08-05 2007-04-25 株式会社拓普康 Fundus camera
JP2007129587A (en) 2005-11-04 2007-05-24 Canon Inc Imaging apparatus, interchangeable lens apparatus, imaging control method, data processing method for distortion aberration correction and computer program
US20070211207A1 (en) 2004-03-31 2007-09-13 Yuhwa Lo Fluidic Adaptive Lens
CN101072534A (en) 2004-11-08 2007-11-14 光视有限公司 Optical apparatus and method for comprehensive eye diagnosis
US7298414B2 (en) 2003-01-29 2007-11-20 Hewlett-Packard Development Company, L.P. Digital camera autofocus using eye focus measurement
CN101097293A (en) 2006-06-29 2008-01-02 贾怀昌 Freely curved face total reflection type visual optical edge glass and emergent light axis depth of parallelism regulating mechanism thereof
US20080002262A1 (en) 2006-06-29 2008-01-03 Anthony Chirieleison Eye tracking head mounted display
CN101103902A (en) 2006-06-22 2008-01-16 株式会社拓普康 Ophthalmologic apparatus
CN201005945Y (en) 2007-01-15 2008-01-16 陈清祥 Light filter adjustable sight guiding auxiliary device
CN101116609A (en) 2007-08-30 2008-02-06 中国科学技术大学 Scanning type automatic zooming iris image gathering system and gathering method thereof
CN101155258A (en) 2006-09-27 2008-04-02 索尼株式会社 Imaging apparatus and imaging method
US20080106633A1 (en) 2002-03-13 2008-05-08 Blum Ronald D Electro-optic lens with integrated components for varying refractive properties
CN101194198A (en) 2005-01-21 2008-06-04 庄臣及庄臣视力保护公司 Adaptive electro-active lens with variable focal length
US20090066915A1 (en) 2004-06-30 2009-03-12 Lai Shui T Apparatus and method for determining sphere and cylinder components of subjective refraction using objective wavefront measurement
CN101430429A (en) 2008-06-17 2009-05-13 沈嘉琦 Myoporthosis spectacles
US20090279046A1 (en) 2008-05-12 2009-11-12 Dreher Andreas W Adjustable eye glasses with a magnetic attachment
CN201352278Y (en) 2008-12-23 2009-11-25 黄玲 Automatic zoom spectacles
CN201360319Y (en) 2008-12-17 2009-12-09 胡超 Eyeglass type multifunctional three-dimensional image device
US20090303212A1 (en) 2008-06-10 2009-12-10 Sony Corporation Optical device and virtual image display
CN101662696A (en) 2008-08-28 2010-03-03 联想(北京)有限公司 Method and device for adjusting camera system
TW201012448A (en) 2008-09-30 2010-04-01 Univ Ishou Method and apparatus for eyesight care
CN201464738U (en) 2009-04-13 2010-05-12 段亚东 Multifunctional health-care glasses
CN101782685A (en) 2009-09-04 2010-07-21 上海交通大学 System for realizing real-time multi-angle three-dimensional sight
US7764433B2 (en) 2006-05-18 2010-07-27 The Regents Of The University Of California Method and system for correcting optical aberrations, including widefield imaging applications
US7766479B2 (en) 2006-03-31 2010-08-03 National University Corporation Shizuoka University View point detecting device
CN101819334A (en) 2010-04-01 2010-09-01 夏翔 Multifunctional electronic glasses
CN101819331A (en) 2010-04-27 2010-09-01 中国计量学院 Remote-control variable-focal length glasses
CN201637953U (en) 2010-03-19 2010-11-17 熊火胜 Intelligent focus-variable lens
CN101917638A (en) 2010-07-07 2010-12-15 深圳超多维光电子有限公司 Stereo display device, mobile terminal and stereo display tracking method
US20110019258A1 (en) 2008-02-13 2011-01-27 Nokia Corporation Display device and a method for illuminating a light modulator array of a display device
US20110018903A1 (en) 2004-08-03 2011-01-27 Silverbrook Research Pty Ltd Augmented reality device for presenting virtual imagery registered to a viewed surface
CN201754203U (en) 2010-05-06 2011-03-02 徐晗 Liquid crystal glasses with dynamic visual angle
US20110051087A1 (en) * 2009-09-01 2011-03-03 Canon Kabushiki Kaisha Fundus camera
JP2011043876A (en) 2009-08-19 2011-03-03 Brother Industries Ltd Image display device
CN102008288A (en) 2010-12-17 2011-04-13 中国科学院光电技术研究所 System and method for line scan confocal ophthalmoscope
CN102083390A (en) 2008-03-18 2011-06-01 像素光学公司 Advanced electro-active optic device
US20110213462A1 (en) 2007-08-02 2011-09-01 Elenza, Inc. Multi-Focal Intraocular Lens System and Methods
CN102203850A (en) 2008-09-12 2011-09-28 格斯图尔泰克公司 Orienting displayed elements relative to a user
US20110242277A1 (en) 2010-03-30 2011-10-06 Do Minh N Systems and methods for embedding a foreground video into a background feed based on a control input
US20110279277A1 (en) 2008-11-17 2011-11-17 Roger Li-Chung Vision protection method and system thereof
CN102292017A (en) 2009-01-26 2011-12-21 托比技术股份公司 Detection of gaze point assisted by optical reference signals
US20120013389A1 (en) 2010-07-16 2012-01-19 Linear Technology Corporation Capacitively Coupled Switched Current Source
CN102419631A (en) 2010-10-15 2012-04-18 微软公司 Fusing virtual content into real content
US20120092618A1 (en) 2009-07-09 2012-04-19 Nike, Inc. Eye And Body Movement Tracking For Testing And/Or Training
US20120113235A1 (en) 2010-11-08 2012-05-10 Sony Corporation 3d glasses, systems, and methods for optimized viewing of 3d video content
US20120127422A1 (en) 2010-11-20 2012-05-24 Tian Yibin Automatic accommodative spectacles using a scene analyzer and focusing elements
CN102481097A (en) 2009-09-01 2012-05-30 佳能株式会社 Fundus camera
US20120133891A1 (en) 2010-05-29 2012-05-31 Wenyu Jiang Systems, methods and apparatus for making and using eyeglasses with adaptive lens driven by gaze distance and low power gaze tracking
CN102487393A (en) 2010-12-01 2012-06-06 深圳市同洲软件有限公司 Method and system for interaction between digital television receiving terminal and mobile terminal, and apparatuses
CN202267785U (en) 2011-08-30 2012-06-06 福州瑞芯微电子有限公司 Naked eye three-dimensional display structure for automatically tracking human eye position
WO2012075218A1 (en) 2010-12-01 2012-06-07 Urban Schnell Variable binocular loupe utilizing fluid filled lens technology
US20120140044A1 (en) 2010-12-06 2012-06-07 Lensvector, Inc. Motionless adaptive stereoscopic scene capture with tuneable liquid crystal lenses and stereoscopic auto-focusing methods
US20120154277A1 (en) 2010-12-17 2012-06-21 Avi Bar-Zeev Optimized focal area for augmented reality displays
CN101149254B (en) 2007-11-12 2012-06-27 北京航空航天大学 High accuracy vision detection system
WO2012083415A1 (en) 2010-11-15 2012-06-28 Tandemlaunch Technologies Inc. System and method for interacting with and analyzing media on a display using eye gaze tracking
US20120169730A1 (en) 2009-09-28 2012-07-05 Panasonic Corporation 3d image display device and 3d image display method
CN102572483A (en) 2011-12-02 2012-07-11 深圳超多维光电子有限公司 Tracking type autostereoscopic display control method, device and system, and display equipment
CN102576154A (en) 2009-10-30 2012-07-11 惠普发展公司,有限责任合伙企业 Stereo display systems
CN202383380U (en) 2011-12-31 2012-08-15 张欣 Multifunctional spectacles
US20120206485A1 (en) 2010-02-28 2012-08-16 Osterhout Group, Inc. Ar glasses with event and sensor triggered user movement control of ar eyepiece facilities
US20120212499A1 (en) 2010-02-28 2012-08-23 Osterhout Group, Inc. System and method for display content control during glasses movement
US20120212508A1 (en) 2011-02-22 2012-08-23 Qualcomm Incorporated Providing a corrected view based on the position of a user with respect to a mobile platform
US20120242698A1 (en) 2010-02-28 2012-09-27 Osterhout Group, Inc. See-through near-eye display glasses with a multi-segment processor-controlled optical layer
JP2012199621A (en) 2011-03-18 2012-10-18 Jvc Kenwood Corp Compound-eye imaging apparatus
US20120290401A1 (en) 2011-05-11 2012-11-15 Google Inc. Gaze tracking system
US20120307208A1 (en) 2011-06-01 2012-12-06 Rogue Technologies, Inc. Apparatus and method for eye tracking
JP2012247449A (en) 2011-05-25 2012-12-13 Canon Inc Projection type video display device
CN102918444A (en) 2011-03-25 2013-02-06 松下电器产业株式会社 Dispay device
US20130044042A1 (en) 2011-08-18 2013-02-21 Google Inc. Wearable device with input and output structures
US8384999B1 (en) 2012-01-09 2013-02-26 Cerr Limited Optical modules
US20130050646A1 (en) 2011-08-31 2013-02-28 Nidek Co., Ltd. Fundus photographing apparatus
CN102981270A (en) 2012-12-25 2013-03-20 中国科学院长春光学精密机械与物理研究所 Unblocked adaptive varifocal optical system and calibration method thereof
US20130072828A1 (en) 2011-09-15 2013-03-21 Jason Sweis Shutter glasses
CN103065605A (en) 2012-12-10 2013-04-24 惠州Tcl移动通信有限公司 Method and system of adjusting display effect according to eyesight condition
CN103054695A (en) 2013-01-09 2013-04-24 中山市目明视光视力科技有限公司 Lens combination method for adjusting intraocular optical focal positions
US20130107066A1 (en) 2011-10-27 2013-05-02 Qualcomm Incorporated Sensor aided video stabilization
US20130127980A1 (en) 2010-02-28 2013-05-23 Osterhout Group, Inc. Video display modification based on sensor input for a see-through near-to-eye display
WO2013074851A1 (en) 2011-11-18 2013-05-23 Optovue, Inc. Fundus camera
CN101900927B (en) 2009-05-29 2013-05-29 精工爱普生株式会社 Projector and method for controlling the same
US20130135203A1 (en) 2011-11-30 2013-05-30 Research In Motion Corporation Input gestures using device movement
CN103150013A (en) 2012-12-20 2013-06-12 天津三星光电子有限公司 Mobile terminal
US20130147836A1 (en) 2011-12-07 2013-06-13 Sheridan Martin Small Making static printed content dynamic with virtual data
CN103197757A (en) 2012-01-09 2013-07-10 癸水动力(北京)网络科技有限公司 Immersion type virtual reality system and implementation method thereof
CN103190883A (en) 2012-12-20 2013-07-10 乾行讯科(北京)科技有限公司 Head-mounted display device and image adjusting method
CN103280175A (en) 2013-06-17 2013-09-04 苏州旭宇升电子有限公司 Projecting apparatus
CN103297735A (en) 2013-07-16 2013-09-11 苏州旭宇升电子有限公司 Projection device
US20130241927A1 (en) 2011-07-03 2013-09-19 Neorai Vardi Computer device in form of wearable glasses and user interface thereof
US20130241805A1 (en) 2012-03-15 2013-09-19 Google Inc. Using Convergence Angle to Select Among Different UI Elements
CN103353667A (en) 2013-06-28 2013-10-16 北京智谷睿拓技术服务有限公司 Imaging adjustment device and method
CN103353663A (en) 2013-06-28 2013-10-16 北京智谷睿拓技术服务有限公司 Imaging adjustment apparatus and method
CN103353677A (en) 2013-06-28 2013-10-16 北京智谷睿拓技术服务有限公司 Imaging device and method thereof
US20130278631A1 (en) 2010-02-28 2013-10-24 Osterhout Group, Inc. 3d positioning of augmented reality information
US20130335301A1 (en) 2011-10-07 2013-12-19 Google Inc. Wearable Computer with Nearby Object Response
US20130335404A1 (en) 2012-06-15 2013-12-19 Jeff Westerinen Depth of field control for see-thru display
US20130342572A1 (en) 2012-06-26 2013-12-26 Adam G. Poulos Control of displayed content in virtual environments
CN103558909A (en) 2013-10-10 2014-02-05 北京智谷睿拓技术服务有限公司 Interactive projection display method and interactive projection display system
US20140078175A1 (en) 2012-09-18 2014-03-20 Qualcomm Incorporated Methods and systems for making the use of head-mounted displays less obvious to non-users
US20140160157A1 (en) 2012-12-11 2014-06-12 Adam G. Poulos People-triggered holographic reminders
US20140225915A1 (en) 2013-02-14 2014-08-14 Research In Motion Limited Wearable display system with detached projector
US20140225918A1 (en) 2013-02-14 2014-08-14 Qualcomm Incorporated Human-body-gesture-based region and volume selection for hmd
US20140232746A1 (en) 2013-02-21 2014-08-21 Hyundai Motor Company Three dimensional augmented reality display apparatus and method using eye tracking
US20140240351A1 (en) 2013-02-27 2014-08-28 Michael Scavezze Mixed reality augmentation
US20140267420A1 (en) 2013-03-15 2014-09-18 Magic Leap, Inc. Display system and method
US20140267400A1 (en) 2013-03-14 2014-09-18 Qualcomm Incorporated User Interface for a Head Mounted Display
US20140282224A1 (en) 2013-03-15 2014-09-18 Qualcomm Incorporated Detection of a scrolling gesture
US20140375680A1 (en) 2013-06-24 2014-12-25 Nathan Ackerman Tracking head movement when wearing mobile device
US20150002542A1 (en) 2013-06-28 2015-01-01 Calvin Chan Reprojection oled display for augmented reality experiences
US20150035861A1 (en) 2013-07-31 2015-02-05 Thomas George Salter Mixed reality graduated information delivery
US20150070391A1 (en) 2012-06-29 2015-03-12 Sony Computer Entertainment Inc. Image processing device, image processing method, and image processing system
US20150235632A1 (en) 2011-06-23 2015-08-20 Microsoft Technology Licensing, Llc Total field of view classification
US20160035139A1 (en) 2013-03-13 2016-02-04 The University Of North Carolina At Chapel Hill Low latency stabilization for head-worn displays
US20160171772A1 (en) 2013-07-08 2016-06-16 Ops Solutions Llc Eyewear operational guide system and method
US20160189432A1 (en) 2010-11-18 2016-06-30 Microsoft Technology Licensing, Llc Automatic focus improvement for augmented reality displays
US20160196603A1 (en) 2012-05-04 2016-07-07 Microsoft Technology Licensing, Llc Product augmentation and advertising in see through displays

Patent Citations (166)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4264154A (en) 1979-06-05 1981-04-28 Polaroid Corporation Apparatus for automatically controlling transmission of light through a lens system
US4572616A (en) 1982-08-10 1986-02-25 Syracuse University Adaptive liquid crystal lens
JP2676870B2 (en) 1989-01-17 1997-11-17 キヤノン株式会社 Optical device having gazing point detection means
JPH0323431A (en) 1989-06-20 1991-01-31 Canon Inc Photography device with gaze point detecting means
US5182585A (en) 1991-09-26 1993-01-26 The Arizona Carbon Foil Company, Inc. Eyeglasses with controllable refracting power
CN1141602A (en) 1994-01-18 1997-01-29 Qqc公司 Using laser for fabricating coatings substrate
US5537163A (en) 1994-06-17 1996-07-16 Nikon Corporation Ophthalmologic apparatus with automatic focusing using two reference marks and having an in-focus detecting system
US6072443A (en) 1996-03-29 2000-06-06 Texas Instruments Incorporated Adaptive ocular projection display
JPH09289973A (en) 1996-04-26 1997-11-11 Canon Inc Eye ground camera
US6151061A (en) 1996-08-29 2000-11-21 Olympus Optical Co., Ltd. Biocular image display apparatus
US6111597A (en) 1996-12-28 2000-08-29 Olympus Optical Co., Ltd. Stereo image forming apparatus
US6325513B1 (en) 1997-02-05 2001-12-04 Carl Zeiss Jena Gmbh Arrangement for projecting a two-dimensional image onto an eye to be examined for use with a device that performs a subjective determination of refraction and/or a device that performs other vision functions
US6152563A (en) 1998-02-20 2000-11-28 Hutchinson; Thomas E. Eye gaze direction tracker
JP3383228B2 (en) 1998-11-09 2003-03-04 シャープ株式会社 Head mounted display device
DE19959379A1 (en) 1998-12-09 2000-07-06 Asahi Optical Co Ltd Glasses with variable refractive power
CN1372650A (en) 1999-07-02 2002-10-02 E-视觉有限公司 System, apparatus and method for correcting vision using electro-active spectacles
US20030125638A1 (en) 2000-11-22 2003-07-03 Peter Husar Optical stimulation of the human eye
US20020113943A1 (en) 2000-12-29 2002-08-22 Miroslav Trajkovic System and method for automatically adjusting a lens power through gaze tracking
US20020101568A1 (en) 2001-01-30 2002-08-01 Eberl Heinrich A. Interactive data view and command system
US20030043303A1 (en) 2001-06-12 2003-03-06 Bonaventure Karuta System and method for correcting multiple axis displacement distortion
US7001020B2 (en) 2001-08-02 2006-02-21 Daphne Instruments, Inc. Complete autorefractor system in an ultra-compact package
JP2003307466A (en) 2002-02-18 2003-10-31 Topcon Corp Apparatus, method and chart for calibration as well as result diagnostic device
CN1653374A (en) 2002-03-13 2005-08-10 E-视觉有限公司 Electro-optic lens with integrated components
US20080106633A1 (en) 2002-03-13 2008-05-08 Blum Ronald D Electro-optic lens with integrated components for varying refractive properties
CN1470227A (en) 2002-07-22 2004-01-28 林超群 Dynamic lens vision training apparatus and method thereof
WO2004023167A2 (en) 2002-09-04 2004-03-18 Josef Bekerman Apparatus and method for eyesight rehabilitation
US20060103808A1 (en) 2003-01-16 2006-05-18 Hidenori Horie Eyesight improving device
US7298414B2 (en) 2003-01-29 2007-11-20 Hewlett-Packard Development Company, L.P. Digital camera autofocus using eye focus measurement
CN1527126A (en) 2003-03-07 2004-09-08 精工爱普生株式会社 Image processing system, projector and image processing method
US20050003043A1 (en) 2003-07-03 2005-01-06 Vincent Sewalt Composition and method for reducing caking and proteinaceous products
US20050014092A1 (en) 2003-07-17 2005-01-20 Shin-Etsu Chemical Co., Ltd. Novel compound, polymer, resist composition, and patterning process
JP2005058399A (en) 2003-08-11 2005-03-10 Nikon Corp Display device
CN1604014A (en) 2003-09-30 2005-04-06 佳能株式会社 Image display apparatus and method
CN1901833A (en) 2004-01-02 2007-01-24 视觉仪器控股有限公司 Devices to facilitate alignment and focussing of a fundus camera
CN1645244A (en) 2004-01-20 2005-07-27 精工爱普生株式会社 Projector and focusing method thereof
WO2005077258A1 (en) 2004-02-17 2005-08-25 National University Corporation Shizuoka University Eyeshot detection device using distance image sensor
US20070211207A1 (en) 2004-03-31 2007-09-13 Yuhwa Lo Fluidic Adaptive Lens
CN101069106A (en) 2004-03-31 2007-11-07 The Regents of the University of California Fluidic adaptive lens
US20060016459A1 (en) 2004-05-12 2006-01-26 Mcfarlane Graham High rate etching using high pressure F2 plasma with argon dilution
US20090066915A1 (en) 2004-06-30 2009-03-12 Lai Shui T Apparatus and method for determining sphere and cylinder components of subjective refraction using objective wavefront measurement
US20110018903A1 (en) 2004-08-03 2011-01-27 Silverbrook Research Pty Ltd Augmented reality device for presenting virtual imagery registered to a viewed surface
CN101072534A (en) 2004-11-08 2007-11-14 Optovue, Inc. Optical apparatus and method for comprehensive eye diagnosis
US20060146281A1 (en) 2004-12-03 2006-07-06 Goodall Eleanor V Method and system for vision enhancement
US20060122530A1 (en) 2004-12-03 2006-06-08 Goodall Eleanor V Adjustable lens system with neural-based control
US8109632B2 (en) 2004-12-03 2012-02-07 The Invention Science Fund I, Llc Vision modification with reflected image
US7486988B2 (en) 2004-12-03 2009-02-03 Searete Llc Method and system for adaptive vision modification
US8282212B2 (en) 2004-12-03 2012-10-09 The Invention Science Fund I, Llc Vision modification with reflected image
US20070019157A1 (en) 2004-12-03 2007-01-25 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Vision modification with reflected image
US20060122531A1 (en) 2004-12-03 2006-06-08 Goodall Eleanor V Method and system for adaptive vision modification
US8104892B2 (en) 2004-12-03 2012-01-31 The Invention Science Fund I, Llc Vision modification with reflected image
US7334892B2 (en) 2004-12-03 2008-02-26 Searete Llc Method and system for vision enhancement
US20060164593A1 (en) 2005-01-21 2006-07-27 Nasser Peyghambarian Adaptive electro-active lens with variable focal length
CN101194198A (en) 2005-01-21 2008-06-04 Johnson & Johnson Vision Care, Inc. Adaptive electro-active lens with variable focal length
CN1912672A (en) 2005-06-07 2007-02-14 Yin S. Tang Motionless lens systems and methods
CN1951314A (en) 2005-08-05 2007-04-25 Topcon Corp Fundus camera
JP2007129587A (en) 2005-11-04 2007-05-24 Canon Inc Imaging apparatus, interchangeable lens apparatus, imaging control method, data processing method for distortion aberration correction and computer program
CN2868183Y (en) 2006-01-19 2007-02-14 Yang Hongjun Intelligent eyesight protector
US7766479B2 (en) 2006-03-31 2010-08-03 National University Corporation Shizuoka University View point detecting device
US7764433B2 (en) 2006-05-18 2010-07-27 The Regents Of The University Of California Method and system for correcting optical aberrations, including widefield imaging applications
CN101103902A (en) 2006-06-22 2008-01-16 Topcon Corp Ophthalmologic apparatus
US20080002262A1 (en) 2006-06-29 2008-01-03 Anthony Chirieleison Eye tracking head mounted display
CN101097293A (en) 2006-06-29 2008-01-02 Jia Huaichang Free-form-surface total-reflection visual optical prism and mechanism for adjusting the parallelism of its exit optical axis
CN101155258A (en) 2006-09-27 2008-04-02 Sony Corporation Imaging apparatus and imaging method
CN201005945Y (en) 2007-01-15 2008-01-16 Chen Qingxiang Sight-guiding auxiliary device with adjustable light filter
US20110213462A1 (en) 2007-08-02 2011-09-01 Elenza, Inc. Multi-Focal Intraocular Lens System and Methods
CN101116609A (en) 2007-08-30 2008-02-06 University of Science and Technology of China Scanning-type auto-zoom iris image acquisition system and acquisition method thereof
CN101149254B (en) 2007-11-12 2012-06-27 Beihang University High-accuracy vision detection system
US20110019258A1 (en) 2008-02-13 2011-01-27 Nokia Corporation Display device and a method for illuminating a light modulator array of a display device
CN102083390A (en) 2008-03-18 2011-06-01 PixelOptics, Inc. Advanced electro-active optic device
US20090279046A1 (en) 2008-05-12 2009-11-12 Dreher Andreas W Adjustable eye glasses with a magnetic attachment
US20090303212A1 (en) 2008-06-10 2009-12-10 Sony Corporation Optical device and virtual image display
CN101430429A (en) 2008-06-17 2009-05-13 Shen Jiaqi Myopia-correcting spectacles
CN101662696A (en) 2008-08-28 2010-03-03 Lenovo (Beijing) Co., Ltd. Method and device for adjusting camera system
CN102203850A (en) 2008-09-12 2011-09-28 GestureTek, Inc. Orienting displayed elements relative to a user
US8896632B2 (en) 2008-09-12 2014-11-25 Qualcomm Incorporated Orienting displayed elements relative to a user
TW201012448A (en) 2008-09-30 2010-04-01 I-Shou University Method and apparatus for eyesight care
US20110279277A1 (en) 2008-11-17 2011-11-17 Roger Li-Chung Vision protection method and system thereof
CN201360319Y (en) 2008-12-17 2009-12-09 Hu Chao Eyeglass-type multifunctional three-dimensional imaging device
CN201352278Y (en) 2008-12-23 2009-11-25 Huang Ling Automatic zoom spectacles
CN102292017A (en) 2009-01-26 2011-12-21 Tobii Technology AB Detection of gaze point assisted by optical reference signals
CN201464738U (en) 2009-04-13 2010-05-12 Duan Yadong Multifunctional health-care glasses
CN101900927B (en) 2009-05-29 2013-05-29 Seiko Epson Corporation Projector and method for controlling the same
US20120092618A1 (en) 2009-07-09 2012-04-19 Nike, Inc. Eye And Body Movement Tracking For Testing And/Or Training
JP2011043876A (en) 2009-08-19 2011-03-03 Brother Industries Ltd Image display device
US20110051087A1 (en) * 2009-09-01 2011-03-03 Canon Kabushiki Kaisha Fundus camera
CN102481097A (en) 2009-09-01 2012-05-30 Canon Inc Fundus camera
CN101782685A (en) 2009-09-04 2010-07-21 Shanghai Jiao Tong University System for realizing real-time multi-angle three-dimensional sight
US20120169730A1 (en) 2009-09-28 2012-07-05 Panasonic Corporation 3d image display device and 3d image display method
CN102576154A (en) 2009-10-30 2012-07-11 Hewlett-Packard Development Company, L.P. Stereo display systems
US20120242698A1 (en) 2010-02-28 2012-09-27 Osterhout Group, Inc. See-through near-eye display glasses with a multi-segment processor-controlled optical layer
US20130127980A1 (en) 2010-02-28 2013-05-23 Osterhout Group, Inc. Video display modification based on sensor input for a see-through near-to-eye display
US20130278631A1 (en) 2010-02-28 2013-10-24 Osterhout Group, Inc. 3d positioning of augmented reality information
US20120212499A1 (en) 2010-02-28 2012-08-23 Osterhout Group, Inc. System and method for display content control during glasses movement
US20120206485A1 (en) 2010-02-28 2012-08-16 Osterhout Group, Inc. Ar glasses with event and sensor triggered user movement control of ar eyepiece facilities
CN201637953U (en) 2010-03-19 2010-11-17 Xiong Huosheng Intelligent variable-focus lens
US20110242277A1 (en) 2010-03-30 2011-10-06 Do Minh N Systems and methods for embedding a foreground video into a background feed based on a control input
CN101819334A (en) 2010-04-01 2010-09-01 Xia Xiang Multifunctional electronic glasses
CN101819331A (en) 2010-04-27 2010-09-01 China Jiliang University Remote-controlled variable-focal-length glasses
CN201754203U (en) 2010-05-06 2011-03-02 Xu Han Liquid crystal glasses with dynamic visual angle
US20120133891A1 (en) 2010-05-29 2012-05-31 Wenyu Jiang Systems, methods and apparatus for making and using eyeglasses with adaptive lens driven by gaze distance and low power gaze tracking
CN102939557A (en) 2010-05-29 2013-02-20 Wenyu Jiang Systems, methods and apparatus for making and using eyeglasses with adaptive lens driven by gaze distance and low power gaze tracking
CN101917638A (en) 2010-07-07 2010-12-15 Shenzhen Super Perfect Optics Ltd. Stereo display device, mobile terminal and stereo display tracking method
US20120013389A1 (en) 2010-07-16 2012-01-19 Linear Technology Corporation Capacitively Coupled Switched Current Source
CN102419631A (en) 2010-10-15 2012-04-18 Microsoft Corporation Fusing virtual content into real content
US20120113235A1 (en) 2010-11-08 2012-05-10 Sony Corporation 3d glasses, systems, and methods for optimized viewing of 3d video content
WO2012083415A1 (en) 2010-11-15 2012-06-28 Tandemlaunch Technologies Inc. System and method for interacting with and analyzing media on a display using eye gaze tracking
US20160189432A1 (en) 2010-11-18 2016-06-30 Microsoft Technology Licensing, Llc Automatic focus improvement for augmented reality displays
US20120127422A1 (en) 2010-11-20 2012-05-24 Tian Yibin Automatic accommodative spectacles using a scene analyzer and focusing elements
EP2646859A1 (en) 2010-12-01 2013-10-09 Adlens Beacon, Inc. Variable binocular loupe utilizing fluid filled lens technology
WO2012075218A1 (en) 2010-12-01 2012-06-07 Urban Schnell Variable binocular loupe utilizing fluid filled lens technology
CN102487393A (en) 2010-12-01 2012-06-06 Shenzhen Coship Electronics Co., Ltd. Method and system for interaction between digital television receiving terminal and mobile terminal, and apparatuses
US20120140044A1 (en) 2010-12-06 2012-06-07 Lensvector, Inc. Motionless adaptive stereoscopic scene capture with tuneable liquid crystal lenses and stereoscopic auto-focusing methods
US20120154277A1 (en) 2010-12-17 2012-06-21 Avi Bar-Zeev Optimized focal area for augmented reality displays
CN102008288A (en) 2010-12-17 2011-04-13 Institute of Optics and Electronics, Chinese Academy of Sciences System and method for line-scan confocal ophthalmoscope
US20120212508A1 (en) 2011-02-22 2012-08-23 Qualcomm Incorporated Providing a corrected view based on the position of a user with respect to a mobile platform
JP2012199621A (en) 2011-03-18 2012-10-18 Jvc Kenwood Corp Compound-eye imaging apparatus
CN102918444A (en) 2011-03-25 2013-02-06 Panasonic Corporation Display device
US20120290401A1 (en) 2011-05-11 2012-11-15 Google Inc. Gaze tracking system
JP2012247449A (en) 2011-05-25 2012-12-13 Canon Inc Projection type video display device
US20120307208A1 (en) 2011-06-01 2012-12-06 Rogue Technologies, Inc. Apparatus and method for eye tracking
US20150235632A1 (en) 2011-06-23 2015-08-20 Microsoft Technology Licensing, Llc Total field of view classification
US20130241927A1 (en) 2011-07-03 2013-09-19 Neorai Vardi Computer device in form of wearable glasses and user interface thereof
US20130044042A1 (en) 2011-08-18 2013-02-21 Google Inc. Wearable device with input and output structures
CN202267785U (en) 2011-08-30 2012-06-06 Fuzhou Rockchip Electronics Co., Ltd. Naked-eye three-dimensional display structure for automatically tracking human eye position
US20130050646A1 (en) 2011-08-31 2013-02-28 Nidek Co., Ltd. Fundus photographing apparatus
US20130072828A1 (en) 2011-09-15 2013-03-21 Jason Sweis Shutter glasses
US20130335301A1 (en) 2011-10-07 2013-12-19 Google Inc. Wearable Computer with Nearby Object Response
US20130107066A1 (en) 2011-10-27 2013-05-02 Qualcomm Incorporated Sensor aided video stabilization
WO2013074851A1 (en) 2011-11-18 2013-05-23 Optovue, Inc. Fundus camera
US20130135203A1 (en) 2011-11-30 2013-05-30 Research In Motion Corporation Input gestures using device movement
CN102572483A (en) 2011-12-02 2012-07-11 Shenzhen Super Perfect Optics Ltd. Tracking-type autostereoscopic display control method, device and system, and display equipment
US20130147836A1 (en) 2011-12-07 2013-06-13 Sheridan Martin Small Making static printed content dynamic with virtual data
CN202383380U (en) 2011-12-31 2012-08-15 Zhang Xin Multifunctional spectacles
US8384999B1 (en) 2012-01-09 2013-02-26 Cerr Limited Optical modules
CN103197757A (en) 2012-01-09 2013-07-10 Guishui Power (Beijing) Network Technology Co., Ltd. Immersive virtual reality system and implementation method thereof
US20130241805A1 (en) 2012-03-15 2013-09-19 Google Inc. Using Convergence Angle to Select Among Different UI Elements
US20160196603A1 (en) 2012-05-04 2016-07-07 Microsoft Technology Licensing, Llc Product augmentation and advertising in see through displays
US20130335404A1 (en) 2012-06-15 2013-12-19 Jeff Westerinen Depth of field control for see-thru display
US20130342572A1 (en) 2012-06-26 2013-12-26 Adam G. Poulos Control of displayed content in virtual environments
US20150070391A1 (en) 2012-06-29 2015-03-12 Sony Computer Entertainment Inc. Image processing device, image processing method, and image processing system
US20140078175A1 (en) 2012-09-18 2014-03-20 Qualcomm Incorporated Methods and systems for making the use of head-mounted displays less obvious to non-users
CN103065605A (en) 2012-12-10 2013-04-24 Huizhou TCL Mobile Communication Co., Ltd. Method and system for adjusting display effect according to eyesight condition
US20140160157A1 (en) 2012-12-11 2014-06-12 Adam G. Poulos People-triggered holographic reminders
CN103150013A (en) 2012-12-20 2013-06-12 Tianjin Samsung Opto-Electronics Co., Ltd. Mobile terminal
CN103190883A (en) 2012-12-20 2013-07-10 Qianxing Xunke (Beijing) Technology Co., Ltd. Head-mounted display device and image adjusting method
CN102981270A (en) 2012-12-25 2013-03-20 Changchun Institute of Optics, Fine Mechanics and Physics, Chinese Academy of Sciences Unblocked adaptive varifocal optical system and calibration method thereof
CN103054695A (en) 2013-01-09 2013-04-24 Zhongshan Muming Shiguang Vision Technology Co., Ltd. Lens combination method for adjusting intraocular optical focal positions
US20140225915A1 (en) 2013-02-14 2014-08-14 Research In Motion Limited Wearable display system with detached projector
US20140225918A1 (en) 2013-02-14 2014-08-14 Qualcomm Incorporated Human-body-gesture-based region and volume selection for hmd
US20140232746A1 (en) 2013-02-21 2014-08-21 Hyundai Motor Company Three dimensional augmented reality display apparatus and method using eye tracking
US20140240351A1 (en) 2013-02-27 2014-08-28 Michael Scavezze Mixed reality augmentation
US20160035139A1 (en) 2013-03-13 2016-02-04 The University Of North Carolina At Chapel Hill Low latency stabilization for head-worn displays
US20140267400A1 (en) 2013-03-14 2014-09-18 Qualcomm Incorporated User Interface for a Head Mounted Display
US20150234184A1 (en) 2013-03-15 2015-08-20 Magic Leap, Inc. Using historical attributes of a user for virtual or augmented reality rendering
US20140282224A1 (en) 2013-03-15 2014-09-18 Qualcomm Incorporated Detection of a scrolling gesture
US20140267420A1 (en) 2013-03-15 2014-09-18 Magic Leap, Inc. Display system and method
CN103280175A (en) 2013-06-17 2013-09-04 Suzhou Xuyusheng Electronics Co., Ltd. Projecting apparatus
US20140375680A1 (en) 2013-06-24 2014-12-25 Nathan Ackerman Tracking head movement when wearing mobile device
CN103353667A (en) 2013-06-28 2013-10-16 Beijing Zhigu Ruituo Technology Services Co., Ltd. Imaging adjustment device and method
US20150002542A1 (en) 2013-06-28 2015-01-01 Calvin Chan Reprojection oled display for augmented reality experiences
CN103353663A (en) 2013-06-28 2013-10-16 Beijing Zhigu Ruituo Technology Services Co., Ltd. Imaging adjustment apparatus and method
CN103353677A (en) 2013-06-28 2013-10-16 Beijing Zhigu Ruituo Technology Services Co., Ltd. Imaging device and method thereof
US20160171772A1 (en) 2013-07-08 2016-06-16 Ops Solutions Llc Eyewear operational guide system and method
CN103297735A (en) 2013-07-16 2013-09-11 Suzhou Xuyusheng Electronics Co., Ltd. Projection device
US20150035861A1 (en) 2013-07-31 2015-02-05 Thomas George Salter Mixed reality graduated information delivery
CN103558909A (en) 2013-10-10 2014-02-05 Beijing Zhigu Ruituo Technology Services Co., Ltd. Interactive projection display method and interactive projection display system

Non-Patent Citations (29)

* Cited by examiner, † Cited by third party
Title
Gao et al., "Measuring Directionality of the Retinal Reflection with a Shack-Hartmann Wavefront Sensor", Optics Express, vol. 17, No. 25, Optical Society of America, Dec. 2009, 20 pages.
Hansen et al., "In the eye of the beholder: a survey of models for eyes and gaze", IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 32, No. 3, Mar. 2010, 23 pages.
International Search Report dated Apr. 3, 2014 for PCT Application No. PCT/CN2013/088531, 10 pages.
International Search Report dated Feb. 27, 2014 for PCT Application No. PCT/CN2013/088522, 6 pages.
International Search Report dated Jan. 8, 2015 for PCT Application No. PCT/CN2014/088242, 2 pages.
International Search Report dated Jun. 12, 2014 for PCT Application No. PCT/CN2013/088554, 4 pages.
International Search Report dated Jun. 5, 2014 for PCT Application No. PCT/CN2013/088549, 4 pages.
International Search Report dated Mar. 6, 2014 for PCT Application No. PCT/CN2013/088540, 8 pages.
International Search Report dated May 28, 2014 for PCT Application No. PCT/CN2013/088545, 4 pages.
International Search Report dated May 28, 2014 for PCT Application No. PCT/CN2013/088553, 6 pages.
International Search Report dated May 5, 2014 for PCT Application No. PCT/CN2013/088544, 4 pages.
International Search Report dated May 8, 2014 for PCT Application No. PCT/CN2013/088547, 4 pages.
Jeong et al., "Tunable microdoublet lens array", Optics Express, vol. 12, Issue 11, May 2004, pp. 2494-2500.
Ji et al., "Real-Time Eye, Gaze and Face Pose Tracking for Monitoring Driver Vigilance", Real-Time Imaging 8, 357-377 (2002), available online at http://www.idealibrary.com, 21 pages.
Kim et al., "A 200 μs Processing Time Smart Image Sensor for an Eye Tracker using pixel-level analog image processing", IEEE Journal of Solid-State Circuits, vol. 44, No. 9, Sep. 2009, 10 pages.
Office Action dated Apr. 20, 2017 for U.S. Appl. No. 14/781,578, 77 pages.
Office Action dated Apr. 21, 2017 for U.S. Appl. No. 14/781,581, 19 pages.
Office Action dated Dec. 29, 2016 for U.S. Appl. No. 14/780,519, 25 pages.
Office Action dated Feb. 27, 2017 for U.S. Appl. No. 14/783,495, 39 pages.
Office Action dated Jul. 12, 2017 for U.S. Appl. No. 14/780,519, 45 pages.
Office Action dated Jun. 29, 2017 for U.S. Appl. No. 14/783,495, 50 pages.
Office Action dated Jun. 29, 2017 for U.S. Appl. No. 14/783,503, 120 pages.
Office Action dated Jun. 8, 2017 for U.S. Appl. No. 14/779,968, 79 pages.
Office Action dated Mar. 30, 2017 for U.S. Appl. No. 15/028,019, 36 pages.
Office Action dated Nov. 9, 2017 for U.S. Appl. No. 14/780,519, 24 pages.
Office Action dated Nov. 9, 2017 for U.S. Appl. No. 14/781,578, 64 pages.
Office Action dated Oct. 4, 2017 for U.S. Appl. No. 14/781,584, 95 pages.
Singh et al., "Human Eye Tracking and Related Issues: A Review", International Journal of Scientific and Research Publications, vol. 2, Issue 9, Sep. 2012, ISSN 2250-3153, 9 pages.
Smith et al., "Determining Driver Visual Attention With One Camera", IEEE Transactions on Intelligent Transportation Systems, vol. 4, No. 4, Dec. 2003, 14 pages.

Also Published As

Publication number Publication date
CN103431840B (en) 2016-01-20
CN103431840A (en) 2013-12-11
WO2015014058A1 (en) 2015-02-05
US20160135675A1 (en) 2016-05-19

Similar Documents

Publication Title
US9867532B2 (en) System for detecting optical parameter of eye, and method for detecting optical parameter of eye
US10048750B2 (en) Content projection system and content projection method
US10395510B2 (en) Reminding method and reminding device
US10002293B2 (en) Image collection with increased accuracy
US10551638B2 (en) Imaging apparatus and imaging method
US9870050B2 (en) Interactive projection display
KR102000865B1 (en) A method for operating an eye tracking device and an eye tracking device
US9961257B2 (en) Imaging to facilitate object gaze
US10271722B2 (en) Imaging to facilitate object observation
US10247813B2 (en) Positioning method and positioning system
US9961335B2 (en) Pickup of objects in three-dimensional display
US10684680B2 (en) Information observation method and information observation device
US10360450B2 (en) Image capturing and positioning method, image capturing and positioning device
US20160150951A1 (en) Imaging for local scaling
KR101817436B1 (en) Apparatus and method for displaying contents using electrooculogram sensors
US20230393655A1 (en) Electronic apparatus

Legal Events

Date Code Title Description
AS Assignment

Owner name: BEIJING ZHIGU RUI TUO TECH CO., LTD, CHINA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:DU, LIN;ZHANG, HONGJIANG;REEL/FRAME:036686/0619

Effective date: 20141104

FEPP Fee payment procedure

Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

STCF Information on status: patent grant

Free format text: PATENTED CASE

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 4