WO2015027598A1 - Reminding method and reminding device - Google Patents

Reminding method and reminding device

Info

Publication number
WO2015027598A1
Authority
WO
WIPO (PCT)
Prior art keywords
user
image
eyes
eye
parameters
Prior art date
Application number
PCT/CN2013/088549
Other languages
English (en)
French (fr)
Inventor
杜琳
张宏江
Original Assignee
北京智谷睿拓技术服务有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 北京智谷睿拓技术服务有限公司
Priority to US14/783,495 (US10395510B2)
Publication of WO2015027598A1

Classifications

    • G08B21/24 Reminder alarms, e.g. anti-loss alarms
    • A61B3/0025 Apparatus for testing the eyes; operational features characterised by electronic signal processing, e.g. eye models
    • A61B3/12 Objective instruments for looking at the eye fundus, e.g. ophthalmoscopes
    • A61B3/14 Arrangements specially adapted for eye photography
    • A61B5/163 Evaluating the psychological state by tracking eye movement, gaze, or pupil change
    • A61B5/18 Devices for psychotechnics for vehicle drivers or machine operators
    • A61B5/6803 Sensors in head-worn items, e.g. helmets, masks, headphones or goggles
    • G02B27/0093 Optical systems with means for monitoring data relating to the user, e.g. head-tracking, eye-tracking
    • G02B27/017 Head-up displays, head mounted
    • G06F3/013 Eye tracking input arrangements
    • A61B3/113 Objective instruments for determining or recording eye movement
    • G02B2027/0127 Head-up displays comprising devices increasing the depth of field
    • G02B2027/0138 Head-up displays comprising image capture systems, e.g. camera
    • G02B2027/014 Head-up displays comprising information/image processing systems
    • G02B2027/0178 Head-mounted displays, eyeglass type

Definitions

  • The present invention relates to the field of smart reminding technologies and, in particular, to a reminding method and apparatus.

Background Art
  • In order to protect eyesight and encourage correct eye use, there is an existing method that uses infrared ranging to measure the distance between the user's eyes and the object in front of the user, and issues a reminder when that distance is inappropriate.
  • This method can be used to prompt the user in certain applications, for example, in reading, writing and other scenarios.
  • However, because this method measures the distance between the human eye and whatever object is in front of it, and that object is not necessarily the one the user is actually looking at, the method has a limited scope of application and may issue erroneous reminders in certain scenarios.
  • The technical problem to be solved by the present invention is to provide a reminding method and device that can issue reminders more accurately and have a wider range of applications.
  • A first aspect of the present invention provides a reminding method, where the method includes the following steps:
  • S110: Detect the focus position of a user's line of sight.
  • S120: Remind the user according to the focus position and a user state.
  • An embodiment of the present invention provides a reminding device, where the device includes: a detecting module, configured to detect the focus position of a user's line of sight; and
  • a reminder module, configured to remind the user according to the focus position and a user state.
  • an embodiment of the present invention provides a computer program product, the computer program product causing a reminder device to perform the method of the first aspect or any possible implementation of the first aspect.
  • An embodiment of the present invention provides a computer readable medium, where the computer readable medium includes computer operation instructions, and when a processor executes the computer operation instructions, the instructions cause the processor to perform the method of the first aspect or any possible implementation of the first aspect.
  • an embodiment of the present invention provides a reminding device, where the device includes a processor, a memory, and a communication interface.
  • the memory stores computer operating instructions, and the processor, the memory, and the communication interface are connected by a communication bus.
  • When the device is in operation, the processor executes the computer operation instructions stored in the memory, so that the device performs the method of the first aspect or any possible implementation of the first aspect.
  • The method and device of the embodiments of the invention can accurately detect the position of the focus of the user's line of sight and remind the user accordingly, and therefore have a wider application range.
  • Combined with the user state, the reminder can be made more appropriately targeted to the user.
  • FIG. 1 is a flowchart of a reminding method according to an embodiment of the present invention.
  • Figure 2 (a) is an exemplary diagram of a spot pattern
  • FIG. 2(b) shows a method of projecting a spot pattern as shown in FIG. 2(a) in accordance with an embodiment of the present invention.
  • FIG. 3 is a schematic structural diagram of a reminding device according to an embodiment of the present invention.
  • FIG. 4(a) is a structural block diagram of an eye focus detection system of a reminder device according to an embodiment of the present invention.
  • 4(b) is a block diagram showing another structure of an eye focus detection system of a reminder device according to an embodiment of the present invention.
  • FIG. 4(c) is a schematic diagram of the optical path for eye imaging of an eye focus detection system of a reminder device according to an embodiment of the present invention;
  • FIG. 4 (d) is a schematic diagram of the eye focus detection system of the reminding device according to the embodiment of the present invention obtaining the distance from the focus of the eye to the eye according to the imaging parameters of the system and the optical parameters of the eye;
  • FIG. 5 is a schematic diagram of an eye focus detection system of a reminder device applied to glasses according to an embodiment of the present invention;
  • FIG. 6 is a schematic diagram of an eye focus detection system of a reminder device applied to glasses according to an embodiment of the present invention
  • FIG. 7 is another schematic structural diagram of a reminding device according to an embodiment of the present invention.
  • FIG. 8 is a schematic view showing the use of the device in a driving state according to an embodiment of the present invention.
  • FIG. 9 is a schematic diagram of the use of the device in a reading state according to an embodiment of the present invention.
  • FIG. 10 is a schematic structural diagram of still another reminder device according to an embodiment of the present invention.
  • When an object is clearly imaged on the retina, the eye can be said to be focused on that object; accordingly, the point that is imaged most sharply on the retina corresponds to the focus of the line of sight when the human eye views the target object.
  • An embodiment of the present invention provides a reminding method, where the method includes the following steps:
  • S110: Detect the focus position of the user's line of sight.
  • S120: Remind the user according to the focus position and the user state.
  • By detecting the focus position of the user's line of sight, the method of the embodiment of the invention can issue reminders based on the object the human eye is actually gazing at, which is more accurate and has a wider scope of application.
  • One method for detecting the focus position of the user's line of sight is: a) using a pupil direction detector to detect the optical axis direction of the eye, and then obtaining the depth of the scene the eye is gazing at through a depth sensor (such as infrared ranging), thereby obtaining the focus position of the eye's line of sight.
  • In a possible implementation, step S110 may further include:
  • S111: Collect an image presented by the fundus of the user's eye.
  • S112: Adjust the imaging parameters of the optical path between the eye and the collection position until a clearest image can be collected.
  • S113: Process the collected images and, according to the imaging parameters of the optical path when the clearest image is obtained, calculate the optical parameters of the user's eye so as to obtain the focus position of the user's line of sight.
  • the optical parameters of the user's eye include the optical axis direction of the eye.
  • The above method analyzes and processes images of the fundus of the user's eye to obtain the optical parameters of the eye at the moment the clearest image is collected, thereby calculating the current focus position of the user's line of sight; this accurate focus position in turn provides the basis for reminding the user.
  • the "eye funda” presented here is mainly an image presented on the retina, which may be an image of the fundus itself, or may be an image of other objects projected to the fundus.
  • The clearest fundus image can be obtained at a certain position or state of an optical device by adjusting the focal length of the optical device on the optical path between the user's eye and the collection position and/or its position in the optical path.
  • The adjustment may be continuous and in real time.
  • The optical device may be a focal-length-adjustable lens, which adjusts its focal length by adjusting its own refractive index and/or shape. Specifically: 1) the focal length is adjusted by adjusting the curvature of at least one surface of the lens, for example by adding or removing a liquid medium in a cavity formed by two transparent layers; 2) the focal length is adjusted by changing the refractive index of the lens.
  • For example, the focal-length-adjustable lens is filled with a specific liquid crystal medium, and the arrangement of the liquid crystal medium is adjusted by adjusting the voltage on corresponding electrodes of the liquid crystal medium, thereby changing the refractive index of the adjustable lens.
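  • The dependence of focal length on surface curvature and refractive index that both adjustment schemes exploit can be illustrated with the thin-lens lensmaker's equation; the function and the numeric values below are illustrative only, not from the patent:

```python
def lens_focal_length(n, r1_mm, r2_mm):
    """Thin-lens lensmaker's equation: 1/f = (n - 1) * (1/R1 - 1/R2).
    Changing either a surface radius (curvature) or the refractive
    index n, as the adjustable lens does, changes the focal length."""
    inv_f = (n - 1.0) * (1.0 / r1_mm - 1.0 / r2_mm)
    return 1.0 / inv_f

# a symmetric biconvex lens: raising n shortens the focal length
f_a = lens_focal_length(1.5, 100.0, -100.0)   # f = 100 mm
f_b = lens_focal_length(1.6, 100.0, -100.0)   # shorter than f_a
```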
  • Alternatively, the optical device may be a lens group, the focal length of which is adjusted by adjusting the relative positions of the lenses in the group.
  • In addition to the two ways above of changing the optical path parameters through the characteristics of the optical device itself, the optical path parameters can also be changed by adjusting the position of the optical device on the optical path.
  • In a possible implementation, step S113 further includes:
  • S1131: Analyze the collected images to find the clearest image.
  • S1132: Calculate the optical parameters of the user's eye according to the clearest image and the imaging parameters of the optical path that are known when the clearest image is obtained.
  • The adjustment in step S112 makes it possible to collect a clearest image, but the clearest image still has to be found by step S113; the optical parameters of the user's eye can then be calculated from the clearest image and the known optical path parameters.
  • In a possible implementation, step S113 may further include: projecting a light spot to the fundus of the user's eye.
  • the projected spot may have no specific pattern and is only used to illuminate the user's fundus.
  • The projected spot may also include a feature-rich pattern; the rich features of the pattern facilitate detection and improve detection accuracy.
  • FIG. 2(a) is an exemplary diagram of a spot pattern 200, which may be formed by a spot pattern generator such as frosted glass; FIG. 2(b) shows an image of the user's fundus collected when the spot pattern 200 is projected.
  • the spot is an infrared spot that is invisible to the eye.
  • In this case, a step of filtering the projected spot through an eye-invisible light transmission filter, to remove light other than the eye-invisible light, may be performed.
  • The method of the embodiment of the invention may further comprise the step of: controlling the brightness of the projected spot according to an analysis result.
  • The analysis result includes, for example, characteristics of the images collected in step S111, such as the contrast of image features and texture features.
  • a special case of controlling the brightness of the projected spot is to start or stop the projection.
  • For example, the projection can be stopped periodically when the user keeps gazing at one point; or, when the user's fundus is bright enough, the projection can be stopped and the existing fundus information used to detect the distance from the focus of the user's current line of sight to the eye.
  • the brightness of the projected spot can be controlled according to the ambient light.
  • In a possible implementation, step S113 further includes:
  • performing calibration on fundus images to obtain at least one reference image corresponding to the image presented by the fundus; and
  • comparing the collected images with the reference image to obtain the clearest image.
  • the clearest image may be an image obtained with the smallest difference from the reference image.
  • the difference between the currently obtained image and the reference image is calculated by an existing image processing algorithm, for example, using a classical phase difference autofocus algorithm.
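  • The patent cites a classical phase difference autofocus algorithm for this comparison; as an illustrative stand-in, the sketch below scores candidate fundus images with a simple contrast-based sharpness measure (variance of a discrete Laplacian) and picks the sharpest one. The function names and the choice of metric are assumptions, not the patent's algorithm:

```python
def sharpness(img):
    """Variance of a discrete Laplacian over the image interior: a simple
    contrast-based sharpness score (higher = sharper).
    img is a 2-D list of grayscale values."""
    h, w = len(img), len(img[0])
    vals = []
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            lap = (img[y - 1][x] + img[y + 1][x] +
                   img[y][x - 1] + img[y][x + 1] - 4 * img[y][x])
            vals.append(lap)
    mean = sum(vals) / len(vals)
    return sum((v - mean) ** 2 for v in vals) / len(vals)

def clearest_image(candidates):
    """Return (index, image) of the candidate with the highest sharpness,
    standing in for 'the image with the smallest difference from the
    reference image'."""
    scored = [(sharpness(img), i) for i, img in enumerate(candidates)]
    _, best_i = max(scored)
    return best_i, candidates[best_i]
```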
  • The optical parameters of the user's eye obtained in step S1132 may include the optical axis direction of the user's eye, which is obtained from features of the user's eye when the clearest image is collected.
  • the characteristics of the user's eyes may be obtained from the clearest image, or may be acquired separately.
  • The optical axis direction of the user's eye indicates the gaze direction of the user's line of sight.
  • In a possible implementation, the optical axis direction of the user's eye can be obtained from the features of the fundus when the clearest image is obtained; determining the optical axis direction through the features of the fundus is more accurate.
  • The size of the spot pattern may be larger than or smaller than the fundus viewable area, wherein:
  • when the area of the spot pattern is smaller than or equal to the fundus viewable area, the optical axis direction of the eye can be determined by detecting the position of the spot pattern on the collected image relative to the fundus, using a classical feature point matching algorithm (for example, the Scale Invariant Feature Transform (SIFT) algorithm);
  • when the area of the spot pattern is larger than or equal to the fundus viewable area, the optical axis direction of the eye can be determined by the position of the spot pattern on the collected image relative to the original spot pattern (obtained through image calibration), thereby determining the direction of the user's line of sight.
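  • Estimating the position of the spot pattern on the collected image relative to the original pattern can be illustrated, in place of a full feature-matching algorithm such as SIFT, by a brute-force shift search; the search routine, the pixel scale, and the nominal axial eye length below are all hypothetical:

```python
import math

def best_shift(ref, img, max_shift=2):
    """Brute-force search for the (dx, dy) that best aligns the collected
    image with the reference pattern (minimum mean squared difference over
    the overlap). The search window should be small relative to the
    pattern so the overlap always contains pattern features."""
    h, w = len(ref), len(ref[0])
    best_shift_found, best_score = None, float("inf")
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            ssd = n = 0
            for y in range(h):
                for x in range(w):
                    yy, xx = y + dy, x + dx
                    if 0 <= yy < h and 0 <= xx < w:
                        ssd += (ref[y][x] - img[yy][xx]) ** 2
                        n += 1
            if ssd / n < best_score:
                best_shift_found, best_score = (dx, dy), ssd / n
    return best_shift_found

def gaze_angle(shift_px, mm_per_px, eye_length_mm=24.0):
    """Convert the retinal-image shift into an approximate optical-axis
    angle in degrees, assuming a nominal axial eye length
    (hypothetical constant)."""
    dx, dy = shift_px
    off_mm = math.hypot(dx * mm_per_px, dy * mm_per_px)
    return math.degrees(math.atan2(off_mm, eye_length_mm))
```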
  • the optical axis direction of the eye may also be obtained according to the feature of the pupil of the user's eye when the clearest image is obtained.
  • The features of the user's eye pupil may be obtained from the clearest image or acquired separately. Obtaining the optical axis direction of the eye through pupil features is prior art and is not described in detail here.
  • a calibration step for the optical axis direction of the user's eyes may be included to more accurately determine the direction of the optical axis of the eye.
  • The known imaging parameters include fixed imaging parameters and real-time imaging parameters, where the real-time imaging parameters are the parameter information of the optical device at the time the clearest image is collected; this parameter information can be obtained by recording in real time when the clearest image is collected.
  • With the obtained optical parameters of the eye and the known imaging parameters of the optical path, the distance from the focus of the user's line of sight to the user's eye can be calculated (the specific process is described in detail in the device section).
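  • As a rough illustration of such a calculation, the Gaussian thin-lens relation 1/f = 1/d_o + 1/d_i gives the object distance once the eye's equivalent focal length and the image distance to the retina are known; the function and all numeric values below are hypothetical, not the patent's exact procedure:

```python
def focus_distance(f_eye_mm, retina_dist_mm):
    """Thin-lens estimate of the distance from the eye's optical centre to
    the gazed object: 1/f = 1/d_o + 1/d_i  =>  d_o = 1 / (1/f - 1/d_i).
    f_eye_mm: equivalent focal length derived from the detected optical
    parameters; retina_dist_mm: image distance (lens to retina)."""
    inv = 1.0 / f_eye_mm - 1.0 / retina_dist_mm
    if inv <= 0:
        return float("inf")       # eye focused at (or beyond) infinity
    return 1.0 / inv
```

With a nominal 24 mm image distance, an equivalent focal length of about 22.2 mm corresponds to a focus point roughly 300 mm from the eye.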
  • In a possible implementation, step S120 further includes: obtaining a monitoring parameter according to the focus position, and reminding the user according to the monitoring parameter.
  • The monitoring parameter may include one or more of the following: the distance from the focus position to the user's eyes; the angle between the user's current line of sight and a specific direction; the angle between the user's current line of sight and the normal of the viewed object passing through the focus point; and the frequency of change of the focus position.
  • The user is reminded according to the monitoring parameter; the reminding manner may be, for example, sound, vibration, changing the color of a light source, flashing a light source, and the like.
  • the user can be alerted immediately when the monitoring parameter exceeds a preset range.
  • For example: the distance between the focus position and the user's eyes exceeds a preset distance range;
  • the angle between the user's current line of sight and a specific direction exceeds a preset range;
  • the angle between the user's current line of sight and the normal of the viewed object passing through the focus point exceeds a preset range; or the frequency of change of the focus position exceeds a preset range.
  • Alternatively, the user may be reminded only when the monitoring parameter has exceeded the preset range for more than a predetermined time. That is, when one or more of the above conditions exceed the preset range, the reminder is not issued immediately, but only after a reasonable period of time, to further improve the accuracy of the reminder.
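  • The "out of range for more than a predetermined time" logic above can be sketched as follows; the class name, the threshold values, and the hold time are all illustrative assumptions, not values from the patent:

```python
class ReminderMonitor:
    """Trigger a reminder only after a monitoring parameter has stayed
    outside its preset range [low, high] for longer than hold_s seconds."""
    def __init__(self, low, high, hold_s):
        self.low, self.high, self.hold_s = low, high, hold_s
        self.out_since = None          # timestamp when value left the range

    def update(self, value, now_s):
        """Feed one sample with its timestamp; return True if a reminder
        should fire."""
        if self.low <= value <= self.high:
            self.out_since = None      # back in range: reset the timer
            return False
        if self.out_since is None:
            self.out_since = now_s     # just left the range
        return now_s - self.out_since >= self.hold_s

# e.g. a reading-distance monitor (hypothetical values): remind if the
# focus point stays closer than 0.30 m or farther than 0.60 m for 5 s
monitor = ReminderMonitor(low=0.30, high=0.60, hold_s=5.0)
```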
  • In a possible implementation, one or more monitoring parameters may be selected according to the user state, and the corresponding preset range and preset time may be set accordingly.
  • The user state can be analyzed to determine the eye-use scenario, and a suitable preset range and preset time can then be set. For example, if analysis of the user state shows that the user is in a reading state, the monitoring parameters may be selected as: whether the distance from the focus position to the eyes exceeds a preset distance range (to monitor whether the user is too close to the reading material);
  • whether the angle between the user's current line of sight and the normal of the viewed object passing through the focus point exceeds a preset range (to monitor whether the user's posture is too tilted); and whether the frequency of change of the focus position exceeds a preset range (to monitor whether the user is in a bumpy situation unsuitable for reading); and so on.
  • The user state may include the user's exercise state, health state, previous eye-use history (for example, how long the user has already been in a reading state or a driving state; the preset range and preset time of subsequent reminders can be adaptively adjusted accordingly), and the like.
  • In a possible implementation, user data may also be taken into account; the user data may include one or more of the user's vision, age, gender, occupation, and the like. This information can be entered manually or automatically, by the user or by others. By comprehensively considering the user data, different preset ranges and preset times can be set for different users.
  • In a possible implementation, the method of the embodiment of the invention further includes the step of: acquiring the user state.
  • The method of the embodiment of the present invention can accurately detect the position of the focus of the user's line of sight and remind the user accordingly, and therefore has a wider application range.
  • Combined with the user state, the reminder can be made more appropriately targeted to the user.
  • The sequence numbers of the steps do not imply an execution order; the execution order of the steps should be determined by their functions and internal logic, and should not constitute any limitation on the implementation of the embodiments of the present invention.
  • an embodiment of the present invention further provides a reminder device 300.
  • the device 300 includes: a detecting module 310, configured to detect a user's line of sight focus position;
  • a reminder module 320 is configured to remind the user according to the focus position and the user status.
  • By detecting the focus position of the user's line of sight, the device of the embodiment of the present invention can issue reminders based on the object the human eye is actually gazing at, which is more accurate and has a wider application range.
  • the detecting module 310 can detect the focus position of the user's line of sight in various manners, for example:
  • a) Use a pupil direction detector to detect the optical axis direction of the eye, and then use a depth sensor (such as infrared ranging) to obtain the depth of the scene the eye is gazing at, thereby obtaining the focus position of the eye's line of sight.
  • the detection module 310 may be one of the focus detection systems shown in Figs. 4(a) - 4(d), 5, and 6.
  • the focus point detection system 400 includes:
  • An image collection device 410 configured to collect an image presented by a fundus of the eye
  • the tunable imaging device 420 is configured to perform adjustment of optical path imaging parameters between the eye and the image collection device 410 to cause the image collection device 410 to obtain a clearest image;
  • The image processing device 430 is configured to process the images obtained by the image collection device 410 and, according to the imaging parameters of the optical path between the image collection device 410 and the user's eye and the optical parameters of the eye when the clearest image is obtained, calculate the focus position of the user's line of sight.
  • The system 400 analyzes and processes images of the fundus of the user's eye to obtain the optical parameters of the eye at the moment the image collection device obtains the clearest image, and can thus calculate the current focus position of the user's line of sight; this accurate focus position in turn provides the basis for reminding the user.
  • In a possible implementation, the image collection device 410 is a micro camera; in another possible implementation of the embodiment of the present invention, the image collection device 410 can also directly use a photosensitive imaging device such as a CCD or CMOS device.
  • In a possible implementation, the adjustable imaging device 420 includes an adjustable lens unit 421 located on the optical path between the user's eye and the image collection device 410, whose own focal length is adjustable and/or whose position in the optical path is adjustable.
  • Through the adjustable lens unit 421, the equivalent focal length of the system between the user's eye and the image collection device 410 is adjustable, and by adjusting the adjustable lens unit 421, the image collection device 410 obtains the clearest fundus image at a certain position or state of the adjustable lens unit 421.
  • the tunable lens unit 421 is continuously adjusted in real time during the detection process.
  • In a possible implementation, the adjustable lens unit 421 is a focal-length-adjustable lens, which adjusts its own focal length by adjusting its own refractive index and/or shape. Specifically: 1) the focal length is adjusted by adjusting the curvature of at least one surface of the lens, for example by increasing or decreasing the liquid medium in a cavity formed by two transparent layers; 2) the focal length is adjusted by changing the refractive index of the lens.
  • For example, the focal-length-adjustable lens is filled with a specific liquid crystal medium, and the arrangement of the liquid crystal medium is adjusted by adjusting the voltage on corresponding electrodes of the liquid crystal medium, thereby changing the refractive index of the adjustable lens.
  • In another possible implementation, the adjustable lens unit 421 includes a lens group whose focal length is adjusted by adjusting the relative positions of the lenses in the group.
  • In addition to changing the optical path parameters of the system by adjusting the characteristics of the adjustable lens unit 421 itself, the optical path parameters can also be changed by adjusting the position of the adjustable lens unit 421 on the optical path.
  • In a possible implementation, the adjustable imaging device 420 further includes a beam splitting device 422 for forming optical transmission paths between the user's eye and the observed object, and between the user's eye and the image collection device 410. This allows the optical path to be folded, reducing the size of the system without affecting other aspects of the user experience.
  • The beam splitting device 422 can include a first beam splitting unit located between the user's eye and the observed object, which transmits the light from the observed object to the eye and relays the light from the user's eye to the image collection device 410.
  • the first beam splitting unit may be a beam splitter, a split optical waveguide (including an optical fiber) or other suitable light splitting device.
  • the image processing device 430 may include an optical path calibration unit for calibrating the optical path of the system, for example, alignment calibration of the optical axis of the optical path to ensure measurement accuracy.
  • An image analyzing unit 431 is configured to analyze the image obtained by the image collecting device 410 to find the clearest image
  • a parameter calculation unit 432 is configured to calculate optical parameters of the user's eyes according to the clearest image and imaging parameters known to the system when the clearest image is obtained.
  • With the adjustable imaging device 420, the image collection device 410 can obtain the clearest image, but that image still has to be found by the image analysis unit 431; the optical parameters of the eye can then be calculated from the clearest image and the known optical path parameters of the system.
  • the optical parameters of the eye may include the optical axis direction of the user's eye.
  • the system 400 further includes: a projection device 440 for projecting a light spot to the fundus.
  • the function of the projection device 440 can be implemented by a pico projector.
  • the projected spot can be used to illuminate the fundus without a specific pattern.
  • The projected spot may also include a feature-rich pattern; the rich features of the pattern facilitate detection and improve detection accuracy.
  • the images of the spot pattern and the fundus collected when there is a spot are shown in Fig. 2 (a) and Fig. 2 (b), respectively.
  • the spot is an infrared spot that is invisible to the eye.
  • In this case, in order to reduce interference from other spectra:
  • the exit surface of the projection device 440 may be provided with an ocular invisible light transmission filter.
  • the incident surface of the image collection device 410 is provided with an ocular invisible light transmission filter.
  • In a possible implementation, the image processing device 430 further includes: a projection control unit 434, configured to control the brightness of the projection spot of the projection device 440 according to the result obtained by the image analysis unit 431.
  • the projection control unit 434 can adaptively adjust the brightness according to the characteristics of the image obtained by the image collection device 410.
  • the characteristics of the image here include the contrast of the image features as well as the texture features and the like.
  • a special case of controlling the brightness of the projection spot of the projection device 440 is to open or close the projection device 440.
  • for example, the projection device 440 can be closed periodically when the user keeps gazing at one point; when the user's fundus is bright enough,
  • the illumination source can be turned off, and the fundus information alone used to detect the distance from the current focus of the eye's line of sight to the eye.
  • the projection control unit 434 can also control the brightness of the projection spot of the projection device 440 according to the ambient light.
  • the image processing device 430 further includes: an image calibration unit 433, configured to perform calibration of the fundus image to obtain at least one reference image corresponding to the image presented by the fundus.
  • the image analysis unit 431 compares the image obtained by the image collection device 410 with the reference image to obtain the clearest image.
  • the clearest image may be an image obtained with the smallest difference from the reference image.
  • the difference between the currently obtained image and the reference image is calculated by an existing image processing algorithm, for example, using a classical phase difference autofocus algorithm.
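As a hedged illustration of how the image analysis unit might select the clearest frame, the sketch below scores each captured frame by its pixel-wise difference from the calibrated reference image, as described above. The function names and the sum-of-squared-differences metric are assumptions for demonstration; the patent only requires "an existing image processing algorithm", such as a classical phase-difference autofocus method.

```python
import numpy as np

def difference_score(frame, reference):
    """Sum of squared differences between a captured frame and the
    calibrated reference image (both 2-D grayscale arrays)."""
    f = frame.astype(float)
    r = reference.astype(float)
    return float(np.sum((f - r) ** 2))

def clearest_image(frames, reference):
    """Return (index, frame) of the capture with the smallest
    difference from the reference image."""
    scores = [difference_score(f, reference) for f in frames]
    i = int(np.argmin(scores))
    return i, frames[i]
```

In use, `frames` would be the sequence captured while the adjustable lens unit sweeps its focal length; the index of the winning frame identifies the real-time imaging parameters recorded at that moment.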
  • the parameter calculation unit 432 includes: an eye optical axis direction determining subunit 4321, configured to obtain the optical axis direction of the user's eye according to features of the user's eye when the clearest image is obtained.
  • the characteristics of the user's eyes may be obtained from the clearest image or may be acquired separately.
  • the direction of the optical axis of the user's eye indicates the direction in which the user's eye is looking at the line of sight.
  • the eye optical axis direction determining subunit 4321 includes: a first determining portion, configured to obtain an optical axis direction of the user's eye according to a feature of the fundus when the clearest image is obtained.
  • determining the optical axis direction of the user's eye from features of the fundus is more accurate than obtaining it from features of the pupil and the surface of the eye.
  • the size of the spot pattern may be larger than the fundus viewable area or smaller than the fundus viewable area, wherein:
  • when the area of the spot pattern is less than or equal to the fundus viewable area, a classical feature point matching algorithm, for example the Scale Invariant Feature Transform (SIFT) algorithm, can be used to determine the optical axis direction of the eye by detecting the position of the spot pattern on the image relative to the fundus;
  • when the area of the spot pattern is greater than or equal to the fundus viewable area, the optical axis direction of the eye can be determined from the position of the spot pattern on the obtained image relative to the original spot pattern (obtained by the image calibration module), thereby determining the direction of the user's line of sight.
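To make the second case concrete, the shift of the captured spot pattern relative to the original pattern can be estimated and then mapped to a gaze direction. The sketch below uses FFT-based circular cross-correlation as a simple stand-in for the feature matching the text describes; it is an assumed illustration, not the patent's specified algorithm (which names SIFT as one option).

```python
import numpy as np

def pattern_offset(captured, original):
    """Estimate the (dy, dx) shift of the spot pattern in the captured
    fundus image relative to the original (calibration) pattern, using
    FFT-based circular cross-correlation."""
    c = captured - captured.mean()
    o = original - original.mean()
    corr = np.fft.ifft2(np.fft.fft2(c) * np.conj(np.fft.fft2(o))).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # Wrap shifts larger than half the image into negative offsets.
    h, w = corr.shape
    if dy > h // 2:
        dy -= h
    if dx > w // 2:
        dx -= w
    return int(dy), int(dx)
```

The resulting pixel offset would then be converted to an angular deviation of the optical axis using the calibrated geometry of the projection and collection optics.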
  • in another possible implementation, the eye optical axis direction determining subunit 4321 includes: a second determining portion, configured to obtain the optical axis direction of the user's eye according to features of the user's eye pupil when the clearest image is obtained.
  • the features of the user's eye pupil may be obtained from the clearest image or may be acquired separately.
  • obtaining the optical axis direction of the user's eye from pupil features is prior art and will not be described here.
  • the image processing device 430 further includes: an eye optical axis direction calibration unit 435, configured to calibrate the optical axis direction of the user's eyes, so as to determine it more accurately.
  • the imaging parameters known by the system include fixed imaging parameters and real-time imaging parameters, wherein the real-time imaging parameters are the parameter information of the adjustable lens unit when the clearest image is acquired; this parameter information may be recorded in real time at the moment the clearest image is obtained.
  • after the current optical parameters of the user's eyes are obtained, the distance from the eye's focus to the user's eyes can be calculated, specifically:
  • Fig. 4(c) is a schematic diagram of eye imaging. Combined with the lens imaging formula of classical optics, formula (1) can be obtained from Fig. 4(c):

        1/d_o + 1/d_e = 1/f_e                                      (1)

  • where d_o and d_e are, respectively, the distances from the eye's current observation object 4010 and from the real image 4020 on the retina to the eye equivalent lens 4030, f_e is the equivalent focal length of the eye equivalent lens 4030, and X is the optical axis direction of the eye (that is, the optical axis of the line of sight).
  • Fig. 4(d) is a schematic diagram of obtaining the distance from the eye's focus to the eye according to the optical parameters known to the system and the optical parameters of the eye.
  • in Fig. 4(d), the spot 4040 forms a virtual image (not shown) through the adjustable lens unit 421; assuming the distance from this virtual image to the adjustable lens unit 421 is x, the following system of equations can be obtained by combining formula (1):

        1/d_p - 1/x = 1/f_p
        1/(d_i + x) + 1/d_e = 1/f_e                                (2)

  • where d_p is the optical equivalent distance from the spot 4040 to the adjustable lens unit 421, d_i is the optical equivalent distance from the adjustable lens unit 421 to the eye equivalent lens 4030, and f_p is the focal length value of the adjustable lens unit 421.
  • from (1) and (2), the distance d_o from the current observation object 4010 (the eye's focus) to the eye equivalent lens 4030 is given by formula (3):

        d_o = d_i + d_p * f_p / (f_p - d_p)                        (3)
  • from the distance from the observation object 4010 to the eye calculated above, and the optical axis direction of the eye obtained as described earlier, the focus position of the eye can be easily obtained.
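The derivation above can be checked numerically. The sketch below is a minimal implementation of formula (3); the parameter values in the example are assumptions for demonstration only (real values would come from the calibrated optical path and the recorded real-time imaging parameters).

```python
def focus_distance(d_p, d_i, f_p):
    """Distance d_o from the eye's focus point to the eye equivalent lens,
    per formula (3): d_o = d_i + d_p * f_p / (f_p - d_p).

    d_p: optical equivalent distance from the spot to the adjustable lens unit
    d_i: optical equivalent distance from the adjustable lens unit to the
         eye equivalent lens
    f_p: focal length of the adjustable lens unit (all in metres)
    """
    if f_p == d_p:
        raise ValueError("spot at the focal plane: virtual image at infinity")
    return d_i + d_p * f_p / (f_p - d_p)

# Illustrative (assumed) values: spot 3 cm from the lens unit, lens focal
# length 4 cm, lens unit 2 cm from the eye equivalent lens.
d_o = focus_distance(d_p=0.03, d_i=0.02, f_p=0.04)
```

Note that with d_p < f_p the spot lies inside the focal length of the adjustable lens unit, so the virtual image distance x = d_p·f_p/(f_p − d_p) is positive, consistent with the first line of system (2).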
  • Fig. 5 shows an embodiment in which an eye focus detection system 500 of one possible implementation is applied to glasses A (the glasses A here may be the reminding device of an embodiment of the present invention); it includes
  • the content described for the embodiment illustrated in Fig. 4(b), specifically as follows:
  • the system 500 of the present embodiment is integrated on the right side of the glasses A (not limited thereto), and includes:
  • a micro camera 510, which has the same function as the image collection device described in the embodiment of Fig. 4(b); it is disposed on the right outer side of eye B so as not to affect the line of sight with which the user normally views objects;
  • a first beam splitter 520, which has the same function as the first beam splitting unit described in the embodiment of Fig. 4(b); it is disposed at a certain inclination at the intersection of the gaze direction of eye B and the incident direction of the camera 510, transmitting the light from the observation object into eye B and reflecting the light from the eye to the camera 510;
  • a focal length adjustable lens 530, which has the same function as the focal length adjustable lens described in the embodiment of Fig. 4(b); it is located between the first beam splitter 520 and the camera 510 and adjusts its focal length value in real time, so that at a certain focal length value, the camera 510 is capable of capturing the sharpest image of the fundus.
  • the image processing device is not shown in Fig. 5; its function is the same as that of the image processing device shown in Fig. 4(b). Since the brightness of the fundus is generally insufficient, it is preferable to illuminate the fundus;
  • in this embodiment, the fundus is illuminated by an illumination source 540.
  • to avoid affecting the user's experience, the preferred illumination source 540 here emits light invisible to the eye, preferably near-infrared light, which has little effect on eye B and to which the camera 510 is relatively sensitive.
  • the illumination source 540 is located on the outer side of the spectacle frame on the right; therefore, a second beam splitter 550 together with the first beam splitter 520 is needed to transfer the light emitted by the illumination source 540 to the fundus.
  • the second beam splitter 550 is also located before the incident surface of the camera 510; therefore, it also needs to transmit the light traveling from the fundus to the second beam splitter 550.
  • in order to improve the user experience and the capture sharpness of the camera 510, the first beam splitter 520 preferably has high infrared reflectance and high visible light transmittance.
  • an infrared reflective film may be provided on the side of the first beam splitter 520 facing the eye B to achieve the above characteristics.
  • since the eye focus detection system 500 is located on the side of the lens of glasses A away from eye B, the lens can also be regarded as part of the glasses when calculating the optical parameters of the eye; in this case, the optical characteristics of the lens need not be known.
  • the eye focus detection system 500 may instead be located on the side of the lens of glasses A close to eye B; in this case, the optical characteristic parameters of the lens need to be obtained in advance and taken into account when calculating the focus distance.
  • the light emitted by the illumination source is reflected by the second beam splitter 550, projected by the focal length adjustable lens 530, reflected by the first beam splitter 520, then passes through the lens of glasses A into the user's eyes, and finally reaches the retina of the fundus;
  • the camera 510 captures the image of the fundus through the pupil of eye B via the optical path formed by the first beam splitter 520, the focal length adjustable lens 530, and the second beam splitter 550.
  • FIG. 6 is a schematic diagram showing the structure of an eye focus detection system 600 according to another embodiment.
  • the present embodiment is similar to the embodiment shown in Fig. 5, and includes a micro camera 610, a second beam splitter 620, and a focal length adjustable lens 630; the differences are that
  • the projection device 640 in the present embodiment projects a spot pattern, and that the first beam splitter in the embodiment of Fig. 5 is replaced by a curved beam splitter 650 serving as a curved beam splitting unit.
  • the curved beam splitter 650 corresponds to the positions of the pupil at different optical axis directions of the eye, and transfers the image presented by the fundus to the image collection device.
  • the camera can thus capture mixed, superimposed images of the eyeball from various angles, but since only the fundus portion seen through the pupil can be imaged clearly on the camera, other parts are out of focus and cannot be imaged clearly, and therefore do not seriously interfere with the imaging of the fundus portion;
  • the features of the fundus portion can still be detected. Therefore, compared with the embodiment shown in Fig. 5, the present embodiment can obtain a good image of the fundus when the eyes gaze in different directions, so that the eye focus detection system of the present embodiment has a wider application range and higher detection accuracy.
  • the reminding module 320 further includes:
  • a monitoring parameter obtaining unit 321 is configured to acquire a monitoring parameter according to the focus position.
  • the monitoring parameters may include one or more of the following: the distance from the focus position to the user's eyes, the angle between the user's current line of sight and a specific direction, the angle between the user's current line of sight and the normal passing through the focus point on a viewing object, and the change frequency of the focus position.
  • a reminding unit 322 is configured to remind the user according to the monitoring parameters, for example by sound, vibration, a color change of a light source, flashing of a light source, and the like.
  • the user can be alerted immediately when the monitoring parameter exceeds a preset range.
  • for example, when the distance between the focus position and the user's eyes exceeds a preset distance range,
  • when the angle between the user's current line of sight and a specific direction exceeds a preset range,
  • when the angle between the user's current line of sight and the normal passing through the focus point on a viewing object exceeds a preset range,
  • or when the change frequency of the focus position exceeds a preset range, and the like.
  • the user may also be alerted when the monitoring parameter exceeds a preset range for more than a preset time. That is, when the above situation exceeds the preset range, the prompt is not immediately, but a reminder is given after a reasonable time range to further improve the accuracy of the reminder.
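The immediate-versus-delayed reminder logic described above can be sketched as follows. The class and parameter names are illustrative assumptions, not the patent's implementation; setting `preset_time=0` gives the immediate case, a positive value gives the "exceeds the range for longer than a preset time" case.

```python
import time

class DwellMonitor:
    """Trigger a reminder only after a monitored parameter has stayed
    outside its preset range [low, high] for longer than preset_time."""

    def __init__(self, low, high, preset_time, clock=time.monotonic):
        self.low, self.high = low, high
        self.preset_time = preset_time   # seconds out of range before alerting
        self.clock = clock
        self._out_since = None           # when the value left the range

    def update(self, value):
        """Feed one measurement; return True when a reminder is due."""
        now = self.clock()
        if self.low <= value <= self.high:
            self._out_since = None       # back in range resets the timer
            return False
        if self._out_since is None:
            self._out_since = now
        return (now - self._out_since) >= self.preset_time
```

For example, a reading-distance monitor might use `DwellMonitor(0.3, float("inf"), 60)`, alerting only after the focus has stayed closer than 30 cm for a full minute.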
  • the reminding unit 322 can include a reminding device and a timing device corresponding to the above functions, which are not described in detail here.
  • the apparatus 300 of the embodiment of the present invention may further include a setting module 330, configured to set the preset range and the preset time.
  • the setting module 330 can select one or more monitoring parameters according to the user state, and set the preset range and the preset time. In other words, the user state makes it possible to determine the user's eye-use scenario, and then to set a suitable preset range and preset time.
  • for example, if analysis of the user state shows that the user is reading, the following may be selected: the distance from the focus position to the eyes exceeding a preset distance range (to monitor whether the user is too close to the reading target),
  • the angle between the user's current line of sight and the normal passing through the focus point on the viewing object exceeding a preset range (to monitor whether the user's posture is too tilted), and the change frequency of the focus position exceeding a preset range (to monitor whether the user is being jolted too much to read), and the like.
  • the user state may include the user's motion state, health state, and previous eye-use history (e.g., the time already spent in a reading state, according to which the preset range and preset time are adaptively adjusted), and the like.
  • the user status can be detected by the detecting module 310.
  • the detecting module 310 can be further composed of different parts according to the detected state of the user.
  • for example, the detecting module 310 can further include a GPS positioning device and a head-mounted sensor, and detect the motion state of the user according to the positioning information and/or the head sensing information.
  • the setting module 330 may also take user data into account, which may include one or more items of eye-related information such as the user's eyesight, age, gender, and occupation. This information can be entered manually by the user or by others, or acquired automatically. Taking user data into account allows different preset ranges and preset times to be set for different users in a more targeted way.
  • the apparatus 300 of the embodiment of the present invention further includes: a data acquisition module 340, configured to acquire user data.
  • the driver wears a reminder device (glasses A) according to a possible implementation of the embodiment of the present invention, which is equipped with the focus detection system 600 described in FIG. 6.
  • a timing device is provided.
  • the timing device can separately record various time periods associated with the state of the user.
  • the process of using the reminder device to alert the user is as follows: first, the user status is detected.
  • when it is determined that the user is driving, preset ranges suitable for safe driving are set: 1) a safety distance threshold S1 to the target object ahead; 2) a safety range R1 for the angle between the user's line of sight and the forward direction, and a first time threshold T1 (e.g., 10 s); and 3) a second time threshold T2 (e.g., 1 hour) after which the thresholds and ranges set in 1) and 2) are reset.
  • for example, when the distance to the target object ahead falls below the safety distance threshold S1, the user is alerted in a vibrating manner so as not to affect driving.
  • the user is also alerted when the angle between the user's line of sight and the forward direction exceeds the safe range R1 and continues to do so for longer than the first time threshold T1.
  • when the user's accumulated driving time exceeds 1 hour, the reminder device automatically resets the thresholds and ranges set in 1) and 2). To ensure driving safety, S1, R1, and T2 can be correspondingly reduced and shortened.
  • when driving stops, the reminding function of the reminding device can be stopped, and the reminding function is restarted when driving resumes.
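The threshold-reset step in this driving example can be sketched as follows. The tightening factor is an assumed example value; the text only states that S1, R1, and T2 are correspondingly reduced and shortened after accumulated driving time exceeds T2.

```python
def tighten_driving_thresholds(s1, r1, t2, factor=0.8):
    """After accumulated driving time exceeds t2, return tightened values
    of the safety distance S1, the line-of-sight angle range R1, and the
    reset period T2 (the 0.8 factor is an assumed example)."""
    return s1 * factor, r1 * factor, t2 * factor

# Assumed initial values: 10 m, 30 degrees, 1 hour.
s1, r1, t2 = tighten_driving_thresholds(10.0, 30.0, 3600.0)
```

A real device would apply this each time the driving timer passes T2, so that reminders become progressively stricter on long drives.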
  • a reminder device (glasses A) according to a possible implementation of the embodiment of the present invention, which is equipped with the focus detection system 500 described in FIG. 5, and in the reminding device of the present example, a timing device is provided.
  • the timing device can separately record various time periods related to the state of the user.
  • when it is determined that the user is reading, preset ranges suitable for protecting vision are set: 1) a safe distance threshold S2 to the target object ahead; 2) a range R2 for the angle θ between the user's current line of sight and the normal passing through the focus point D on the viewing object, and a third time threshold T3 (e.g., 1 minute); 3) a change frequency F1 of the focus position and a fourth time threshold T4 (e.g., 5 minutes); and 4) a fifth time threshold
  • T5 (e.g., 1 hour) after which the thresholds and ranges are reset.
  • through focus position detection, when the distance between the focus D and the user's eyes is less than S2, the user is reminded by voice; the user is also reminded when the angle θ exceeds the range R2 for longer than T3.
  • when the change frequency of the focus position exceeds F1 for longer than T4, the user may be in a fast-moving state, and the user is reminded to stop reading.
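One way to obtain the change frequency of the focus position used in this example is to count significant focus jumps per unit time over a sliding window of measurements. The sketch below is an assumed illustration (the `min_jump` threshold and the counting scheme are not specified by the text):

```python
def focus_change_frequency(distances, timestamps, min_jump=0.05):
    """Estimate how often the focus distance changes significantly,
    in changes per second.

    distances:  focus-to-eye distances sampled over time (metres)
    timestamps: matching sample times (seconds), strictly increasing
    min_jump:   smallest distance change counted as a real change
    """
    if len(distances) < 2:
        return 0.0
    changes = sum(
        1 for a, b in zip(distances, distances[1:]) if abs(b - a) >= min_jump
    )
    span = timestamps[-1] - timestamps[0]
    return changes / span if span > 0 else 0.0
```

The device would compare the returned rate against F1 and feed the result into the dwell-time check with threshold T4.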
  • FIG. 10 is a schematic structural diagram of a reminder device 1000 according to an embodiment of the present invention.
  • the specific embodiment of the present invention does not limit the specific implementation of the alert device 1000.
  • the reminding device 1000 can include:
  • a processor 1100, a communication interface 1200, a memory 1300, and a communication bus 1400, wherein:
  • the processor 1100, the communication interface 1200, and the memory 1300 perform communication with each other via the communication bus 1400.
  • the communication interface 1200 is configured to communicate with a network element such as a client.
  • the processor 1100 is configured to execute the program 1320, and specifically, the related steps in the method embodiment shown in FIG. 1 above may be performed.
  • the program 1320 may include program code, the program code including computer operation instructions.
  • the processor 1100 may be a central processing unit (CPU) or an application-specific integrated circuit (ASIC).
  • the memory 1300 is used to store the program 1320.
  • Memory 1300 may include high speed RAM memory and may also include non-volatile memory, such as at least one disk memory.
  • the program 1320 specifically enables the apparatus 1000 to perform the following steps:
  • detecting the focus position of a user's line of sight; and reminding the user according to the focus position and the user status.
  • An embodiment of the present invention further provides a wearable optical device, which may be the frame glasses shown in Fig. 5 or Fig. 6, or may also be a contact lens; the wearable optical device includes the reminding device described in any of the above embodiments.
  • the optical parameter detecting system for the eye may also be applied to other eye-related devices, such as non-worn optical devices like telescopes; or the optical parameter detecting system of the present invention may also be applied to imaging receiving devices other than the eye, such as a camera.
  • the functions, if implemented in the form of software functional units and sold or used as separate products, may be stored in a computer readable storage medium.
  • the part of the technical solution of the present invention that is essential or that contributes to the prior art, or a part of the technical solution, may be embodied in the form of a software product stored in a storage medium, including
  • instructions used to cause a computer device (which may be a personal computer, a server, or a network device, etc.) to perform all or part of the steps of the methods described in the various embodiments of the present invention.
  • the foregoing storage medium includes various media that can store program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.

Abstract

The present invention discloses a reminder method and device, relating to the field of intelligent reminding technology. The method includes the steps of: detecting the focus position of a user's line of sight; and reminding the user according to the focus position and the user's state. The method and device of the embodiments of the present invention can accurately detect the focus position of the user's line of sight and remind the user accurately on that basis, giving them a wider range of application. In addition, selecting monitoring parameters and setting reminder thresholds according to the user's state, user data, and so on makes it possible to give the user more targeted, appropriate reminders.

Description

Reminder Method and Reminder Device. This application claims priority to Chinese patent application No. 201310390569.9, filed with the Chinese Patent Office on August 30, 2013 and entitled "Reminder Method and Device", the entire contents of which are incorporated herein by reference.

Technical Field

The present invention relates to the field of intelligent reminding technology, and in particular to a reminder method and device.

Background

Research shows that, apart from age and heredity, unhealthy and even incorrect eye-use habits are the main cause of declining eyesight, for example, using the eyes for long periods or at close range. Beyond declining eyesight, in some scenarios incorrect eye use can have serious consequences. For example, while driving, if the driver is inattentive for a long time or gazes in the wrong direction, a traffic accident may result, posing a great threat to personal safety.

To protect eyesight and encourage correct eye use, there exist methods that use infrared ranging to measure the distance between the eyes and the object in front of the user and issue a reminder when the distance is inappropriate. In certain applications, such as reading or writing, such a method can prompt the user effectively. However, because it measures the distance between the eye and the object in front of it, which is not necessarily what the user is actually gazing at, its range of application is limited, and it may issue false reminders in particular scenarios.

Summary

The technical problem to be solved by the present invention is to provide a reminder method and device that remind more accurately and have a wider range of application.

To this end, in a first aspect, an embodiment of the present invention provides a reminder method, including the steps of:

detecting the focus position of a user's line of sight;

reminding the user according to the focus position and the user's state.

In a second aspect, an embodiment of the present invention provides a reminder device, including: a detection module configured to detect the focus position of a user's line of sight;

a reminder module configured to remind the user according to the focus position and the user's state.

In a third aspect, an embodiment of the present invention provides a computer program product that causes a reminder device to perform the method of the first aspect or of any possible implementation of the first aspect.

In a fourth aspect, an embodiment of the present invention provides a computer-readable medium containing computer operation instructions that, when executed by a processor, cause the processor to perform the method of the first aspect or of any possible implementation of the first aspect.

In a fifth aspect, an embodiment of the present invention provides a reminder device, the device including a processor, a memory, and a communication interface,

wherein the memory stores computer operation instructions, and the processor, the memory, and the communication interface are connected via a communication bus,

and when the device runs, the processor executes the computer operation instructions stored in the memory, causing the device to perform the method of the first aspect or of any possible implementation of the first aspect.

The method and device of the embodiments of the present invention can accurately detect the focus position of the user's line of sight and remind the user accurately on that basis, giving them a wider range of application. In addition, selecting monitoring parameters and setting reminder thresholds according to the user's state, user data, and so on makes it possible to give the user more targeted, appropriate reminders.
Brief Description of the Drawings

Fig. 1 is a flowchart of a reminder method according to an embodiment of the present invention;

Fig. 2(a) is an example of a spot pattern;

Fig. 2(b) is an image of a user's fundus captured by a method according to an embodiment of the present invention while the spot pattern of Fig. 2(a) is projected;

Fig. 3 is a schematic structural diagram of a reminder device according to an embodiment of the present invention;

Fig. 4(a) is a block diagram of an eye focus detection system of a reminder device according to an embodiment of the present invention;

Fig. 4(b) is another block diagram of the eye focus detection system of a reminder device according to an embodiment of the present invention;

Fig. 4(c) is a schematic diagram of the optical path of eye imaging in the eye focus detection system of a reminder device according to an embodiment of the present invention;

Fig. 4(d) is a schematic diagram of the eye focus detection system of a reminder device according to an embodiment of the present invention obtaining the distance from the eye's focus point to the eye from the imaging parameters known to the system and the optical parameters of the eye;

Fig. 5 is a schematic diagram of the eye focus detection system of a reminder device according to an embodiment of the present invention applied to a pair of glasses;

Fig. 6 is a schematic diagram of another eye focus detection system of a reminder device according to an embodiment of the present invention applied to a pair of glasses;

Fig. 7 is another schematic structural diagram of a reminder device according to an embodiment of the present invention;

Fig. 8 is a schematic diagram of the use of a device according to an embodiment of the present invention while driving;

Fig. 9 is a schematic diagram of the use of a device according to an embodiment of the present invention while reading;

Fig. 10 is yet another schematic structural diagram of a reminder device according to an embodiment of the present invention.
Detailed Description

Specific embodiments of the present invention are described in further detail below with reference to the drawings and examples. The following examples illustrate the present invention but are not intended to limit its scope.

When the human eye views a target object, the process of bringing the object into sharp focus on the retina may be called the focusing of the eye; correspondingly, the point imaged most sharply on the retina is the focus point of the line of sight when viewing that target object.

As shown in Fig. 1, an embodiment of the present invention provides a reminder method, the method including the steps of:

S110. detecting the focus position of a user's line of sight;

S120. reminding the user according to the focus position and the user's state. By detecting the focus position of the user's line of sight, the method of the embodiments of the present invention can remind the user more accurately based on the object the eyes are actually gazing at, and has a wider range of application.
In the method of the embodiments of the present invention, the focus position of the user's line of sight may be detected as follows:

a) using a pupil direction detector to detect the optical axis direction of one eye and a depth sensor (e.g., infrared ranging) to obtain the depth of the gazed scene, thereby obtaining the focus position of the line of sight;

b) detecting the optical axis directions of the two eyes separately and finding the intersection of the two optical axes, thereby obtaining the focus position of the line of sight;

c) obtaining the focus position of the user's line of sight from the optical parameters of the optical path between the image collection position and the eye when the sharpest image presented by the imaging surface of the eye is captured. Specifically, step S110 may further include:

S111. capturing an image presented by the fundus of the user's eye;

S112. adjusting the imaging parameters of the optical path between the user's eye and the collection position until the sharpest image is captured;

S113. processing the captured images, and calculating the focus position from the imaging parameters of the optical path between the user's eye and the collection position when the sharpest image was captured, together with the optical parameters of the user's eye, the optical parameters of the user's eye including the optical axis direction of the eye.
By analyzing and processing images of the fundus of the user's eye, the above method obtains the optical parameters of the eye at the moment the sharpest image is captured, and thereby calculates the current focus position of the user's eye, providing a basis for further reminding the user based on that precise focus position.

The image presented by the "fundus" here is mainly the image presented on the retina, which may be an image of the fundus itself or an image of another object projected onto the fundus.

In step S112, the focal length of an optical device on the optical path between the user's eye and the collection position, and/or its position on the optical path, may be adjusted so that the sharpest fundus image is obtained at a certain position or state of the optical device. The adjustment may be continuous and in real time.

In one possible implementation of the method of the embodiments of the present invention, the optical device may be a focal-length-adjustable lens that adjusts its focal length by adjusting its own refractive index and/or shape. Specifically: 1) the focal length is adjusted by adjusting the curvature of at least one surface of the focal-length-adjustable lens, for example by adding or removing liquid medium in a cavity formed by a double transparent layer; 2) the focal length is adjusted by changing the refractive index of the focal-length-adjustable lens, for example the lens is filled with a particular liquid-crystal medium, and the arrangement of the liquid-crystal medium is adjusted by adjusting the voltage of its corresponding electrodes, thereby changing the refractive index of the lens.

In another possible implementation of the method of the embodiments of the present invention, the optical device may be: a lens group that adjusts its own focal length by adjusting the relative positions of the lenses in the group.

Besides the above two ways of changing the optical path parameters through the characteristics of the optical device itself, the optical path parameters may also be changed by adjusting the position of the optical device on the optical path.
In addition, in the method of the embodiments of the present invention, step S113 further includes:

S1131. analyzing the images captured in step S111 to find the sharpest image;

S1132. calculating the optical parameters of the user's eye from the sharpest image and the imaging parameters known when the sharpest image was obtained.

The adjustment in step S112 makes it possible to capture a sharpest image, but step S113 is needed to find it; the optical parameters of the user's eye can then be calculated from the sharpest image and the known optical path parameters.

In the method of the embodiments of the present invention, step S113 may further include:

S1133. projecting a light spot onto the fundus of the user's eye. The projected spot may have no particular pattern and serve only to illuminate the fundus, or it may include a feature-rich pattern; the rich features of the pattern ease detection and improve detection accuracy. Fig. 2(a) shows an example of a spot pattern 200, which may be formed by a spot pattern generator, for example frosted glass; Fig. 2(b) shows an image of a user's fundus captured while the spot pattern 200 is projected.

To avoid affecting normal viewing, the spot is preferably an infrared spot invisible to the eye. In that case, to reduce interference from other parts of the spectrum, a step of filtering the projected spot so that only light invisible to the eye is transmitted may be performed. Accordingly, the method of the present invention may further include the step of:

S1134. controlling the brightness of the projected spot according to the result of the analysis in step S1131, for example according to the characteristics of the images captured in step S111, including the contrast of image features and texture features.

Note that a special case of controlling the brightness of the projected spot is starting or stopping projection. For example, projection may be stopped periodically while the user keeps gazing at one point; and when the user's fundus is bright enough, projection may be stopped and the fundus information alone used to detect the distance from the current focus of the user's line of sight to the eye.

In addition, the brightness of the projected spot may also be controlled according to the ambient light.
Preferably, in the method of the embodiments of the present invention, step S113 further includes:

S1135. calibrating the fundus image to obtain at least one reference image corresponding to the image presented by the fundus. Specifically, the captured images are compared with the reference image to obtain the sharpest image, which may be the captured image with the smallest difference from the reference image. In the method of this implementation, the difference between the currently captured image and the reference image is computed with an existing image processing algorithm, for example a classical phase-difference autofocus algorithm.

The optical parameters of the user's eye obtained in step S1132 may include the optical axis direction of the user's eye, obtained from features of the user's eye when the sharpest image was captured. The features of the user's eye may be taken from the sharpest image or acquired separately. The optical axis direction of the user's eye indicates the gaze direction of the user's line of sight. Specifically, the optical axis direction may be obtained from features of the fundus when the sharpest image was captured; determining the optical axis direction of the user's eye from fundus features gives higher accuracy.

When a spot pattern is projected onto the fundus, its size may be larger or smaller than the visible fundus region, where:

when the area of the spot pattern is less than or equal to the visible fundus region, a classical feature-point matching algorithm (for example, the Scale Invariant Feature Transform (SIFT) algorithm) can determine the optical axis direction of the eye by detecting the position of the spot pattern on the image relative to the fundus;

when the area of the spot pattern is greater than or equal to the visible fundus region, the optical axis direction of the eye can be determined from the position of the spot pattern on the captured image relative to the original spot pattern (obtained by the image calibration module), and thus the direction of the user's line of sight. In another possible implementation of the method of the embodiments of the present invention, the optical axis direction of the eye may also be obtained from features of the pupil of the user's eye when the sharpest image was captured. The pupil features may be taken from the sharpest image or acquired separately. Obtaining the optical axis direction of the eye from pupil features is prior art and is not described further here.

In addition, the method of the embodiments of the present invention may further include a step of calibrating the optical axis direction of the user's eye, so as to determine the above optical axis direction more precisely.

In the method of this implementation, the known imaging parameters include fixed imaging parameters and real-time imaging parameters, the latter being the parameter information of the optical device when the sharpest image is acquired; this parameter information may be recorded in real time at the moment the sharpest image is obtained.

After the current optical parameters of the user's eye are obtained, the distance from the focus of the user's eye to the user's eye can be calculated (the specific process is detailed in the device part).
In the method of the embodiments of the present invention, after the user's focus position is detected in step S110, step S120 further includes:

S121. acquiring monitoring parameters according to the focus position.

According to the user's different states, the monitoring parameters may include one or more of: the distance from the focus position to the user's eyes; the angle between the user's current line of sight and a specific direction; the angle between the user's current line of sight and the normal passing through the focus point on a viewed object; and the change frequency of the focus position.

S122. reminding the user according to the monitoring parameters, for example by sound, vibration, a color change of a light source, flashing of a light source, and so on.

Specifically, the user may be reminded immediately when a monitoring parameter exceeds a preset range, for example when the distance from the focus position to the user's eyes exceeds a preset distance range, the angle between the user's current line of sight and a specific direction (for example, the user's direction of movement) exceeds a preset range, the angle between the user's current line of sight and the normal passing through the focus point on the viewed object exceeds a preset range, or the change frequency of the focus position exceeds a preset range.

The user may also be reminded when a monitoring parameter has exceeded a preset range for longer than a preset time. That is, when one or more of the above out-of-range situations occurs, the reminder is not issued immediately but only after a reasonable period, to further improve the accuracy of the reminder.

In the method of the embodiments of the present invention, one or more monitoring parameters may be selected, and the preset range and preset time set, according to the user's state. Specifically, the user's state allows the eye-use scenario to be judged, and a suitable preset range and preset time to be set accordingly. For example, if analysis of the user's state shows that the user is reading, the following may be selected: the distance from the focus position to the eyes exceeding a preset distance range (to monitor whether the user is too close to the reading target); the angle between the user's current line of sight and the normal passing through the focus point on the viewed object exceeding a preset range (to monitor whether the user's posture is too tilted); and the change frequency of the focus position exceeding a preset range (to monitor whether the user is being jolted too much to read), and so on.

The user's state may include the user's motion state, health state, and previous eye-use history (for example, the time already spent in a reading or driving state, according to which the preset range and preset time of subsequent reminders are adaptively adjusted), and so on.

When selecting monitoring parameters and setting the preset range and preset time, user data may also be taken into account, including one or more items of eye-related information such as the user's eyesight, age, gender, and occupation. These data may be entered manually by the user or others, or acquired automatically. Taking user data into account allows different preset ranges and preset times to be set for different users in a more targeted way. Accordingly, the method of the embodiments of the present invention further includes the steps of:

acquiring user data;

setting the preset range and/or the preset time according to the user data.

In summary, the method of the embodiments of the present invention can accurately detect the focus position of the user's line of sight and remind the user accurately on that basis, with a wider range of application. In addition, selecting monitoring parameters and setting reminder thresholds according to the user's state, user data, and so on allows more targeted, appropriate reminders.

Those skilled in the art will understand that, in the above methods of the specific embodiments of the present invention, the step numbers do not imply an order of execution; the order of execution should be determined by the steps' functions and internal logic, and does not limit the implementation of the specific embodiments of the present invention in any way.
As shown in Fig. 3, an embodiment of the present invention further provides a reminder device 300. The device 300 includes: a detection module 310 configured to detect the focus position of a user's line of sight;

a reminder module 320 configured to remind the user according to the focus position and the user's state.

By detecting the focus position of the user's line of sight, the device of the embodiments of the present invention can remind the user more accurately based on the object the eyes are actually gazing at, and has a wider range of application.

In the device of the embodiments of the present invention, the detection module 310 may detect the focus position of the user's line of sight in a number of ways, for example:

a) using a pupil direction detector to detect the optical axis direction of one eye and a depth sensor (e.g., infrared ranging) to obtain the depth of the gazed scene, thereby obtaining the focus position of the line of sight;

b) detecting the optical axis directions of the two eyes separately and finding the intersection of the two optical axes, thereby obtaining the focus position of the line of sight; this technique is also prior art and is not described further here;

c) obtaining the focus position of the user's line of sight from the optical parameters of the optical path between the image collection device and the eye when the sharpest image presented by the imaging surface of the eye is captured. In the device of this implementation, the detection module 310 may be one of the focus detection systems shown in Figs. 4(a)-4(d), Fig. 5, and Fig. 6.
As shown in Fig. 4(a), the focus detection system 400 includes:

an image collection device 410 configured to capture an image presented by the fundus of the eye;

an adjustable imaging device 420 configured to adjust the imaging parameters of the optical path between the eye and the image collection device 410 so that the image collection device 410 obtains the sharpest image;

an image processing device 430 configured to process the images obtained by the image collection device 410 and to calculate the focus position of the user's eye from the imaging parameters of the optical path between the image collection device 410 and the user's eye when the sharpest image was obtained, together with the optical parameters of the eye.

By analyzing and processing images of the fundus of the user's eye, the system 400 obtains the optical parameters of the eye at the moment the image collection device captures the sharpest image, and can thereby calculate the current focus position of the user's eye, providing a basis for further reminding the user based on that precise focus position.

As shown in Fig. 4(b), in one possible implementation the image collection device 410 is a micro camera; in another possible implementation of the embodiments of the present invention, the image collection device 410 may directly use a photosensitive imaging element, such as a CCD or CMOS device.

As shown in Fig. 4(b), in one possible implementation the adjustable imaging device 420 includes: an adjustable lens unit 421 located on the optical path between the user's eye and the image collection device 410, with an adjustable focal length and/or an adjustable position on the optical path. The adjustable lens unit 421 makes the equivalent focal length of the system between the user's eye and the image collection device 410 adjustable, and through its adjustment the image collection device 410 obtains the sharpest fundus image at a certain position or state of the adjustable lens unit 421. In this implementation, the adjustable lens unit 421 is adjusted continuously and in real time during detection.

In one possible implementation, the adjustable lens unit 421 is: a focal-length-adjustable lens that adjusts its own focal length by adjusting its refractive index and/or shape. Specifically: 1) the focal length is adjusted by adjusting the curvature of at least one surface of the focal-length-adjustable lens, for example by adding or removing liquid medium in a cavity formed by a double transparent layer; 2) the focal length is adjusted by changing the refractive index of the focal-length-adjustable lens, for example the lens is filled with a particular liquid-crystal medium, and the arrangement of the liquid-crystal medium is adjusted by adjusting the voltage of its corresponding electrodes, thereby changing the refractive index of the lens.

In another possible implementation, the adjustable lens unit 421 includes: a lens group that adjusts its own focal length by adjusting the relative positions of the lenses in the group.

Besides the above two ways of changing the optical path parameters of the system through the characteristics of the adjustable lens unit 421 itself, the optical path parameters of the system may also be changed by adjusting the position of the adjustable lens unit 421 on the optical path.

In addition, so as not to affect the user's viewing experience of the observed object, and so that the system can be portably applied to a wearable device, the adjustable imaging device 420 further includes: a beam splitting device 422 configured to form light transfer paths between the user's eye and the observed object and between the user's eye and the image collection device 410. This folds the optical path and reduces the system's volume while affecting the user's other experiences as little as possible.

The beam splitting device 422 may include: a first beam splitting unit located between the user's eye and the observed object, configured to transmit light from the observed object to the eye and to transfer light from the user's eye to the image collection device 410. The first beam splitting unit may be a beam splitter, a beam splitting optical waveguide (including optical fiber), or another suitable beam splitting device.

In addition, the image processing device 430 may include an optical path calibration unit configured to calibrate the optical path of the system, for example by aligning the optical axis of the optical path, to ensure measurement accuracy.
an image analysis unit 431 configured to analyze the images obtained by the image collection device 410 to find the sharpest image;

a parameter calculation unit 432 configured to calculate the optical parameters of the user's eye from the sharpest image and the imaging parameters known to the system when the sharpest image was obtained.

In this implementation, the adjustable imaging device 420 enables the image collection device 410 to obtain the sharpest image, but the image analysis unit 431 is needed to find it; the optical parameters of the eye can then be calculated from the sharpest image and the optical path parameters known to the system. Here the optical parameters of the eye may include the optical axis direction of the user's eye.

In one possible implementation, the system 400 further includes: a projection device 440 configured to project a light spot onto the fundus. The function of the projection device 440 may be implemented by a pico projector. The projected spot may have no particular pattern and serve only to illuminate the fundus, or it may include a feature-rich pattern; the rich features of the pattern ease detection and improve detection accuracy. The spot pattern and a fundus image captured while the spot is projected are shown in Figs. 2(a) and 2(b), respectively.

To avoid affecting normal viewing, the spot is preferably an infrared spot invisible to the eye. In that case, to reduce interference from other parts of the spectrum:

the exit surface of the projection device 440 may be provided with an eye-invisible-light transmission filter;

the incident surface of the image collection device 410 is provided with an eye-invisible-light transmission filter.

Preferably, in one possible implementation, the image processing device 430 further includes: a projection control unit 434 configured to control the brightness of the spot projected by the projection device 440 according to the result obtained by the image analysis unit 431.

For example, the projection control unit 434 may adaptively adjust the brightness according to the characteristics of the images obtained by the image collection device 410, including the contrast of image features and texture features. Note that a special case of controlling the brightness of the projected spot is turning the projection device 440 on or off; for example, the projection device 440 may be turned off periodically while the user keeps gazing at one point, and when the user's fundus is bright enough the light source may be turned off and the fundus information alone used to detect the distance from the current focus of the line of sight to the eye.

In addition, the projection control unit 434 may also control the brightness of the spot projected by the projection device 440 according to the ambient light.

Preferably, in one possible implementation, the image processing device 430 further includes: an image calibration unit 433 configured to calibrate the fundus image and obtain at least one reference image corresponding to the image presented by the fundus.

The image analysis unit 431 compares the images obtained by the image collection device 410 with the reference image to obtain the sharpest image, which may be the obtained image with the smallest difference from the reference image. In this implementation, the difference between the currently obtained image and the reference image is computed with an existing image processing algorithm, for example a classical phase-difference autofocus algorithm.
Preferably, in one possible implementation, the parameter calculation unit 432 includes: an eye optical axis direction determining subunit 4321 configured to obtain the optical axis direction of the user's eye from features of the user's eye when the sharpest image was obtained. The features of the user's eye may be taken from the sharpest image or acquired separately. The optical axis direction of the user's eye indicates the gaze direction of the user's line of sight.

In one possible implementation, the eye optical axis direction determining subunit 4321 includes: a first determining portion configured to obtain the optical axis direction of the user's eye from features of the fundus when the sharpest image was obtained. Compared with obtaining it from features of the pupil and the eyeball surface, determining the optical axis direction of the user's eye from fundus features is more accurate.

When a spot pattern is projected onto the fundus, its size may be larger or smaller than the visible fundus region, where:

when the area of the spot pattern is less than or equal to the visible fundus region, a classical feature-point matching algorithm (for example, the Scale Invariant Feature Transform (SIFT) algorithm) can determine the optical axis direction of the eye by detecting the position of the spot pattern on the image relative to the fundus; when the area of the spot pattern is greater than or equal to the visible fundus region, the optical axis direction of the eye can be determined from the position of the spot pattern on the captured image relative to the original spot pattern (obtained by the image calibration module), and thus the direction of the user's line of sight.

In another possible implementation, the eye optical axis direction determining subunit 4321 includes: a second determining portion configured to obtain the optical axis direction of the user's eye from features of the pupil of the user's eye when the sharpest image was obtained. The pupil features may be taken from the sharpest image or acquired separately. Obtaining the optical axis direction of the user's eye from pupil features is prior art and is not described further here.

In one possible implementation, the image processing device 430 further includes: an eye optical axis direction calibration unit 435 configured to calibrate the optical axis direction of the user's eye, so as to determine the above optical axis direction more precisely.

In this implementation, the imaging parameters known to the system include fixed imaging parameters and real-time imaging parameters, the latter being the parameter information of the adjustable lens unit when the sharpest image is acquired; this parameter information may be recorded in real time at the moment the sharpest image is obtained.

After the current optical parameters of the user's eye are obtained, the distance from the eye's focus to the user's eye can be calculated, specifically:
Fig. 4(c) is a schematic diagram of eye imaging. Combined with the lens imaging formula of classical optics, formula (1) can be obtained from Fig. 4(c):

    1/d_o + 1/d_e = 1/f_e                                      (1)

where d_o and d_e are, respectively, the distances from the eye's current observation object 4010 and from the real image 4020 on the retina to the eye equivalent lens 4030, f_e is the equivalent focal length of the eye equivalent lens 4030, and X is the optical axis direction of the eye (that is, the optical axis of the line of sight).

Fig. 4(d) is a schematic diagram of obtaining the distance from the eye's focus to the eye according to the optical parameters known to the system and the optical parameters of the eye. In Fig. 4(d), the spot 4040 forms a virtual image (not shown) through the adjustable lens unit 421. Assuming the distance from this virtual image to the adjustable lens unit 421 is x, the following system of equations can be obtained by combining formula (1):

    1/d_p - 1/x = 1/f_p
    1/(d_i + x) + 1/d_e = 1/f_e                                (2)

where d_p is the optical equivalent distance from the spot 4040 to the adjustable lens unit 421, d_i is the optical equivalent distance from the adjustable lens unit 421 to the eye equivalent lens 4030, and f_p is the focal length value of the adjustable lens unit 421.

From (1) and (2), the distance d_o from the current observation object 4010 (the eye's focus) to the eye equivalent lens 4030 is given by formula (3):

    d_o = d_i + d_p * f_p / (f_p - d_p)                        (3)

From the distance from the observation object 4010 to the eye calculated above, and since the optical axis direction of the eye is obtained as described earlier, the focus position of the eye can easily be obtained.
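To make this last step concrete: given the distance d_o and a vector along the eye's optical axis, the focus position is a point-along-ray computation. The sketch below is an assumed illustration (the text does not specify a coordinate convention):

```python
import math

def focus_point(eye_position, axis_direction, distance):
    """Return the 3-D focus position: the point at `distance` from the
    eye along the (not necessarily normalised) optical axis direction."""
    norm = math.sqrt(sum(c * c for c in axis_direction))
    if norm == 0:
        raise ValueError("axis direction must be non-zero")
    return tuple(p + distance * c / norm
                 for p, c in zip(eye_position, axis_direction))
```

Here `eye_position` would be the known location of the eye equivalent lens in the device's frame of reference, `axis_direction` the optical axis direction determined above, and `distance` the d_o of formula (3).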
Fig. 5 shows an embodiment in which an eye focus point detection system 500 of one possible implementation is applied to glasses A (here, glasses A may be the reminding device of the embodiments of the present invention). It includes the content recorded for the implementation shown in Fig. 4(b). Specifically, as can be seen from Fig. 5, in this implementation the system 500 is integrated on the right side of glasses A (without being limited thereto), and comprises:

a micro camera 510, whose function is the same as the image collection device recorded in the implementation of Fig. 4(b); to avoid affecting the user's normal viewing line of sight, it is arranged on the right outer side of eye B;

a first beam splitter 520, whose function is the same as the first beam splitting unit recorded in the implementation of Fig. 4(b); it is arranged at a certain inclination at the intersection of the gaze direction of eye B and the incident direction of the camera 510, transmitting the light from the observed object entering eye B and reflecting the light from the eye to the camera 510;

a focal-length-tunable lens 530, whose function is the same as the focal-length-tunable lens recorded in the implementation of Fig. 4(b); it is located between the first beam splitter 520 and the camera 510, and adjusts its focal length value in real time, so that at some focal length value the camera 510 can capture the sharpest image of the fundus.

In this implementation, the image processing device is not shown in Fig. 5; its function is the same as that of the image processing device shown in Fig. 4(b).

Since the brightness of the fundus is generally insufficient, it is best to illuminate the fundus; in this implementation, a light source 540 illuminates the fundus. To avoid affecting the user's experience, the light source 540 here is preferably one invisible to the eye, preferably a near-infrared light source that has little effect on eye B and to which the camera 510 is rather sensitive.

In this implementation, the light source 540 is located on the outer side of the right spectacle frame, so a second beam splitter 550 is needed to complete, together with the first beam splitter 520, the transfer of the light emitted by the light source 540 to the fundus. In this implementation, the second beam splitter 550 is also located in front of the incident surface of the camera 510, so it also needs to transmit the light from the fundus to the second beam splitter 550.

It can be seen that in this implementation, to improve the user experience and the capture sharpness of the camera 510, the first beam splitter 520 may preferably have the characteristics of high reflectivity for infrared and high transmissivity for visible light. For example, an infrared-reflective film may be arranged on the side of the first beam splitter 520 facing eye B to achieve these characteristics.

As can be seen from Fig. 5, since in this implementation the eye focus point detection system 500 is located on the side of the lens of glasses A away from eye B, the lens can also be regarded as part of the eye when computing the eye's optical parameters, in which case the optical characteristics of the lens need not be known.

In other implementations of the embodiments of the present invention, the eye focus point detection system 500 may be located on the side of the lens of glasses A close to eye B; in that case, the optical characteristic parameters of the lens need to be obtained in advance, and the influence of the lens is taken into account when computing the focus point distance.

The light emitted by the light source, after being reflected by the second beam splitter 550, projected by the focal-length-tunable lens 530, and reflected by the first beam splitter 520, passes through the lens of glasses A into the user's eye and finally reaches the retina of the fundus; the camera 510 captures an image of the fundus through the pupil of eye B via the optical path formed by the first beam splitter 520, the focal-length-tunable lens 530, and the second beam splitter 550.
Fig. 6 is a schematic structural diagram of an eye focus point detection system 600 of another implementation. As can be seen from Fig. 6, this implementation is similar to the one shown in Fig. 5, comprising a micro camera 610, a second beam splitter 620, and a focal-length-tunable lens 630. It differs in that the projection device 640 in this implementation projects a light spot pattern, and a curved beam splitter 650 serving as a curved beam splitting unit replaces the first beam splitter of the Fig. 5 implementation.

Here the curved beam splitter 650 is used to transfer the image presented on the fundus to the image collection device, corresponding respectively to the positions of the pupil when the optical axis direction of the eye differs. In this way the camera can capture mixed, superimposed images of the eyeball from various angles, but since only the fundus portion seen through the pupil can be imaged sharply on the camera, while other portions are out of focus and cannot be imaged sharply, they do not seriously interfere with the imaging of the fundus portion, and the features of the fundus portion can still be detected. Therefore, compared with the implementation shown in Fig. 5, this implementation can obtain good images of the fundus when the eye gazes in different directions, so that the eye focus point detection system of this implementation has a wider scope of application and higher detection precision.

As shown in Fig. 7, in the reminding device 300 of the embodiment of the present invention, the reminding module 320 further comprises:
a monitoring parameter acquisition unit 321, configured to acquire monitoring parameters according to the focus point position. Depending on the user's state, the monitoring parameters may include one or more of the following: the distance from the focus point position to the user's eye, the angle between the user's current line of sight and a specific direction, the angle between the user's current line of sight and a normal of a viewed object passing through the focus point, and the change frequency of the focus point position; and

a reminding unit 322, configured to remind the user according to the monitoring parameters. The reminding manner may be, for example: emitting a sound, vibrating, changing the color of a light source, blinking a light source, and the like.

Specifically, the user may be reminded immediately when a monitoring parameter exceeds a preset range. For example: the distance from the focus point position to the user's eye exceeds a preset distance range; the angle between the user's current line of sight and a specific direction (for example, the user's direction of movement) exceeds a preset range; the angle between the user's current line of sight and a normal of a viewed object passing through the focus point exceeds a preset range; or the change frequency of the focus point position exceeds a preset range.
The user may also be reminded when the time for which a monitoring parameter exceeds a preset range exceeds a preset time. That is, when one of the above out-of-range situations occurs, the reminder is not issued immediately but only after a certain reasonable period of time, to further improve the accuracy of the reminder.
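The two reminding policies above (immediate on out-of-range, or only after the out-of-range condition has lasted a preset time) can be sketched as a small state machine. The class and parameter names here are illustrative, not from the patent; setting `dwell = 0` reproduces the immediate policy.

```python
class DwellReminder:
    """Fires only after a monitored parameter has stayed outside
    [lo, hi] for at least `dwell` seconds (illustrative sketch)."""

    def __init__(self, lo, hi, dwell):
        self.lo, self.hi, self.dwell = lo, hi, dwell
        self._out_since = None   # timestamp when the value first left the range

    def update(self, value, now):
        """Feed one sample; returns True when a reminder should fire."""
        if self.lo <= value <= self.hi:
            self._out_since = None       # back in range: reset the timer
            return False
        if self._out_since is None:
            self._out_since = now        # just left the range
        return now - self._out_since >= self.dwell
```

A reading-distance monitor, for instance, could instantiate this with the preset distance range and preset time chosen by the setting module.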
The reminding unit 322 may include a reminding device and a timing device corresponding to the above functions, which are not described in detail here. In addition, the device 300 of the embodiment of the present invention may further comprise a setting module 330, configured to set the preset range and the preset time. The setting module 330 may select one or more monitoring parameters according to the user state, and set the preset range and the preset time. Specifically, the user's eye-use scenario can be judged from the user state, and a suitable preset range and preset time can then be set. For example, if analysis of the user state shows that the user is reading, the following may be selected: the distance from the focus point position to the eye exceeding a preset distance range (to monitor whether the user is too close to the target reading material); the angle between the user's current line of sight and the normal of the viewed object passing through the focus point exceeding a preset range (to monitor whether the user's posture is too inclined); and the change frequency of the focus point position exceeding a preset range (to monitor whether the user is in a situation too bumpy for reading), and so on.

The user state may include the user's motion state, health state, and previous eye-use history (for example, the time for which the user has already been reading, according to which the preset range and preset time for subsequent reminders are adaptively adjusted), and the like. The user state may be detected by the detection module 310. Depending on the user state to be detected, the detection module 310 may further be composed of different parts; for example, for the motion state, the detection module 310 may further include a GPS positioning device and a head sensor, to detect the user's motion state according to positioning information and/or head sensing information.

In selecting the monitoring parameters and setting the preset range and preset time, the setting module 330 may also take a user profile into account. The user profile may include one or more kinds of eye-use-related information, such as the user's eyesight condition, age, gender, and occupation. This data may be entered manually by the user or by others, or acquired automatically. Taking the user profile into account makes it possible to set different preset ranges and preset times for different users in a more targeted way. Accordingly, the device 300 of the embodiment of the present invention further comprises: a profile acquisition module 340, configured to acquire the user profile.

The method and device of the embodiments of the present invention are further described below through specific examples.

Take the driving scenario shown in Fig. 8 as an example. The driver wears a reminding device (glasses A) according to one possible implementation of an embodiment of the present invention, on which the focus point detection system 600 described in Fig. 6 is mounted. The reminding device of this example is provided with a timing device, which can separately record various time periods related to the user state.
In this scenario, the process of reminding the user with the reminding device is as follows: detect the user state. According to the GPS positioning information, it is determined that the user is in a moving state, and preset ranges suitable for safe driving are set: 1) a safe distance threshold S1 to the target object ahead; 2) a safe range R1 for the angle between the user's line of sight and the direction of travel, together with a first time threshold T1 (for example, 10 s); and 3) a second time threshold T2 (for example, 1 hour) for resetting the thresholds and ranges set in 1) and 2). When the focus point position detection determines that the distance between focus point C and the user's eye is smaller than the distance threshold S1, the user is reminded by vibration so as not to affect driving. When the angle between the user's line of sight and the direction of travel exceeds the safe range R1 and persists beyond the first time threshold T1, the user is likewise reminded.
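The gaze-versus-heading check above needs the angle between the current line of sight and the direction of travel. A minimal 2D sketch of that computation follows; a real system would work with calibrated 3D directions, and all names here are illustrative.

```python
import math

def gaze_heading_angle(gaze, heading):
    """Angle in degrees between the gaze direction and the direction
    of travel, both given as 2D vectors (illustrative sketch)."""
    dot = gaze[0] * heading[0] + gaze[1] * heading[1]
    norm = math.hypot(*gaze) * math.hypot(*heading)
    cos_a = max(-1.0, min(1.0, dot / norm))  # clamp rounding noise
    return math.degrees(math.acos(cos_a))
```

The reminder would fire when this angle leaves the safe range R1 and stays out longer than T1.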
After the user's accumulated driving time exceeds 1 hour, the reminding device automatically resets the thresholds and ranges set in 1) and 2); to ensure driving safety, S1, R1 and T2 may be correspondingly reduced or shortened.

It should be noted that if the GPS positioning information determines that the user is stationary (for example, waiting at a red light), the reminding function of the reminding device may be stopped, and restarted after it starts up again.

Take the reading scenario shown in Fig. 9 as an example. The user wears a reminding device (glasses A) according to one possible implementation of an embodiment of the present invention, on which the focus point detection system 500 described in Fig. 5 is mounted. The reminding device of this example is provided with a timing device, which can separately record various time periods related to the user state.

In this scenario, the process of reminding the user with the reminding device is as follows:

Preset ranges suitable for protecting eyesight are set: 1) a safe distance threshold S2 to the target object ahead; 2) a range R2 for the angle β between the user's current line of sight and the normal of the viewed object passing through focus point D, together with a third time threshold T3 (for example, 1 minute); 3) a change frequency F1 of the focus point position and a fourth time threshold T4 (for example, 5 minutes); and 4) a fifth time threshold T5 (for example, 1 hour) for resetting the above thresholds and ranges.
When the focus point position detection determines that the distance between focus point D and the user's eye is smaller than S2, the user is reminded by voice. When the angle β exceeds the range R2 and persists beyond T3, the user is reminded. When the change frequency of the focus point position exceeds F1 and this persists beyond T4, the user may be moving quickly, and the user is reminded to stop reading.
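The change frequency of the focus point position can be estimated from timestamped focus-distance samples. The sketch below counts large jumps inside a trailing window; the window length, jump threshold, and all names are illustrative assumptions, not values from the patent.

```python
def change_frequency(samples, window, min_jump):
    """Jumps per second among the (t, depth) samples in the trailing
    `window` seconds; a jump is a depth change larger than `min_jump`
    between successive samples (illustrative sketch)."""
    if len(samples) < 2:
        return 0.0
    t_end = samples[-1][0]
    recent = [(t, d) for t, d in samples if t_end - t <= window]
    jumps = sum(1 for (t0, d0), (t1, d1) in zip(recent, recent[1:])
                if abs(d1 - d0) > min_jump)
    return jumps / window
```

Comparing this estimate against F1, and timing how long it stays above F1 relative to T4, would implement the bumpy-reading reminder above.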
After the user's accumulated reading time exceeds T5, the reminding device automatically resets the above thresholds and ranges, reducing or shortening them appropriately.

Fig. 10 is a schematic structural diagram of a reminding device 1000 provided by an embodiment of the present invention; the specific embodiments of the present invention do not limit the specific implementation of the reminding device 1000. As shown in Fig. 10, the reminding device 1000 may comprise:

a processor 1100, a communications interface 1200, a memory 1300, and a communication bus 1400, wherein:

the processor 1100, the communications interface 1200, and the memory 1300 communicate with one another through the communication bus 1400;

the communications interface 1200 is configured to communicate with network elements such as clients; and

the processor 1100 is configured to execute a program 1320, and may specifically perform the relevant steps of the method embodiment shown in Fig. 1 above.

Specifically, the program 1320 may include program code, the program code including computer operation instructions.

The processor 1100 may be a central processing unit (CPU), an application-specific integrated circuit (ASIC), or one or more integrated circuits configured to implement the embodiments of the present invention.

The memory 1300 is configured to store the program 1320. The memory 1300 may contain high-speed RAM, and may also include non-volatile memory, for example at least one disk memory. The program 1320 may specifically cause the device 1000 to perform the following steps:

detecting a focus point position of a user's line of sight; and

reminding the user according to the focus point position and a user state.

For the specific implementation of each unit in the program 1320, reference may be made to the corresponding steps or units in the above embodiments, which are not described in detail here.

The embodiments of the present invention further provide a wearable optical device. The wearable optical device may be the frame glasses shown in Fig. 5 or Fig. 6, or may be contact lenses; the wearable optical device includes the reminding device recorded in the above embodiments. In other possible implementations of the embodiments of the present invention, the eye optical parameter detection system may also be applied to other eye-related equipment, for example non-wearable optical devices such as telescopes; or, the optical parameter detection system of the present invention may also be applied to image receiving apparatuses other than the eye, such as cameras.

A person of ordinary skill in the art may be aware that the units and method steps of the examples described in connection with the embodiments disclosed herein can be implemented by electronic hardware, or by a combination of computer software and electronic hardware. Whether these functions are performed in hardware or software depends on the particular application and design constraints of the technical solution. A person skilled in the art may use different methods to implement the described functions for each particular application, but such implementation should not be considered beyond the scope of the present invention.

If the functions are implemented in the form of software functional units and sold or used as independent products, they may be stored in a computer-readable storage medium. Based on such an understanding, the technical solution of the present invention in essence, or the part that contributes to the prior art, or a part of the technical solution, may be embodied in the form of a software product. The computer software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to perform all or some of the steps of the methods described in the embodiments of the present invention. The aforementioned storage medium includes various media that can store program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.

The above implementations are merely intended to illustrate the present invention and not to limit it. A person of ordinary skill in the relevant technical field may make various changes and variations without departing from the spirit and scope of the present invention, so all equivalent technical solutions also belong to the scope of the present invention, and the patent protection scope of the present invention shall be defined by the claims.

Claims

1. A reminding method, characterized in that the method comprises the steps of:

detecting a focus point position of a user's line of sight; and

reminding the user according to the focus point position and a user state.

2. The method according to claim 1, characterized in that the detecting a focus point position of a user's line of sight further comprises:

collecting an image presented by the fundus of the user's eye;

adjusting imaging parameters of the optical path between the user's eye and the collection position until a sharpest image is collected; and

computing the focus point position according to the imaging parameters of the optical path between the user's eye and the collection position at the time the sharpest image is collected, and optical parameters of the user's eye.

3. The method according to claim 2, characterized in that the optical parameters of the user's eye include the optical axis direction of the user's eye.

4. The method according to claim 2, characterized in that the adjusting imaging parameters of the optical path between the user's eye and the collection position comprises:

adjusting the focal length of an optical device in the optical path between the user's eye and the collection position and/or its position in the optical path.

5. The method according to claim 2, characterized in that the adjusting imaging parameters of the optical path between the user's eye and the collection position until a sharpest image is collected comprises:

transferring an image presented by the user's fundus to the collection position, corresponding respectively to the positions of the pupil when the optical axis direction of the user's eye differs.

6. The method according to claim 2, characterized in that the method further comprises: projecting a light spot pattern onto the user's fundus.

7. The method according to any one of claims 1 to 6, characterized in that the method further comprises the step of:

detecting the user state.

8. The method according to claim 7, characterized in that the step of reminding the user according to the focus point position and the user state comprises: obtaining monitoring parameters according to the focus point position; and

reminding the user according to the monitoring parameters.

9. The method according to claim 8, characterized in that in the step of reminding the user according to the monitoring parameters:

the user is reminded when a monitoring parameter exceeds a preset range.

10. The method according to claim 8, characterized in that in the step of reminding the user according to the monitoring parameters:

the user is reminded when the time for which a monitoring parameter exceeds a preset range exceeds a preset time.

11. The method according to claim 8, characterized in that the monitoring parameters include one or more of the following: the distance from the focus point position to the user's eye, the angle between the user's current line of sight and a specific direction, the angle between the user's current line of sight and a normal of a viewed object passing through the focus point, and the change frequency of the focus point position.

12. The method according to claim 8, characterized in that the monitoring parameters include the angle between the user's current line of sight and a specific direction, the specific direction being the user's direction of movement.

13. The method according to claim 9 or 10, characterized in that the method further comprises the step of: setting the preset range according to the user state.

14. The method according to claim 10, characterized in that the method further comprises the step of: setting the preset time according to the user state.

15. The method according to claim 9 or 10, characterized in that the method further comprises the steps of: acquiring a user profile; and

setting the preset range according to the user profile.

16. The method according to claim 10, characterized in that the method further comprises the steps of: acquiring a user profile; and

setting the preset time according to the user profile.

17. A reminding device, characterized in that the device comprises: a detection module, configured to detect a focus point position of a user's line of sight; and

a reminding module, configured to remind the user according to the focus point position and a user state.

18. The device according to claim 17, characterized in that the detection module further comprises:

an image collection device, configured to collect an image presented by the fundus of the user's eye;

an adjustable imaging device, configured to adjust imaging parameters of the optical path between the user's eye and the image collection device until the image collection device collects a sharpest image; and

an image processing device, configured to compute the focus point position according to the imaging parameters of the optical path between the user's eye and the image collection device at the time the sharpest image is collected, and optical parameters of the user's eye.

19. The device according to claim 18, characterized in that the adjustable imaging device comprises: a tunable lens unit, located in the optical path between the user's eye and the image collection device, whose focal length is adjustable and/or whose position in the optical path is adjustable.

20. The device according to claim 18, characterized in that the adjustable imaging device comprises: a curved beam splitting unit, configured to transfer an image presented by the user's fundus to the image collection device, corresponding respectively to the positions of the pupil when the optical axis direction of the user's eye differs.

21. The device according to claim 18, characterized in that the detection module further comprises: a projection device, configured to project a light spot pattern onto the user's fundus.

22. The device according to any one of claims 17 to 21, characterized in that the detection module is further configured to detect the user state.

23. The device according to claim 22, characterized in that the reminding module further comprises:

a monitoring parameter acquisition unit, configured to acquire monitoring parameters according to the focus point position; and a reminding unit, configured to remind the user according to the monitoring parameters.

24. The device according to claim 23, characterized in that the reminding unit reminds the user when a monitoring parameter exceeds a preset range.

25. The device according to claim 23, characterized in that the reminding unit reminds the user when the time for which a monitoring parameter exceeds a preset range exceeds a preset time.

26. The device according to claim 24 or 25, characterized in that the device further comprises: a setting module, configured to set the preset range according to the user state.

27. The device according to claim 26, characterized in that the device further comprises: a setting module, configured to set the preset time according to the user state.

28. The device according to claim 24 or 25, characterized in that the device further comprises: a profile acquisition module, configured to acquire a user profile; and

a setting module, configured to set the preset range according to the user profile.

29. The device according to claim 25, characterized in that the device further comprises: a profile acquisition module, configured to acquire a user profile; and

a setting module, configured to set the preset time according to the user profile.

30. The device according to claim 17, characterized in that the device is glasses.

31. A computer program product, characterized in that the computer program product causes a reminding device to perform the method of claim 1.

32. A computer-readable medium, characterized in that the computer-readable medium contains computer operation instructions which, when executed by a processor, cause the processor to perform the method of claim 1.

33. A reminding device, characterized in that the device comprises a processor, a memory, and a communications interface,

the memory storing computer operation instructions, and the processor, the memory, and the communications interface being connected through a communication bus,

wherein, when the device runs, the processor executes the computer operation instructions stored in the memory, causing the device to perform the method of claim 1.
PCT/CN2013/088549 2013-08-30 2013-12-04 Reminding method and reminding device WO2015027598A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US14/783,495 US10395510B2 (en) 2013-08-30 2013-12-04 Reminding method and reminding device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201310390569.9A CN103500331B (zh) 2013-08-30 2013-08-30 Reminding method and device
CN201310390569.9 2013-08-30

Publications (1)

Publication Number Publication Date
WO2015027598A1 true WO2015027598A1 (zh) 2015-03-05

Family

ID=49865536

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2013/088549 WO2015027598A1 (zh) 2013-08-30 2013-12-04 提醒方法及提醒装置

Country Status (3)

Country Link
US (1) US10395510B2 (zh)
CN (1) CN103500331B (zh)
WO (1) WO2015027598A1 (zh)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105763779A (zh) * 2016-03-31 2016-07-13 联想(北京)有限公司 一种电子设备及提示方法
CN113239754A (zh) * 2021-04-23 2021-08-10 泰山学院 一种应用于车联网的危险驾驶行为检测定位方法及系统


Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5537163A (en) * 1994-06-17 1996-07-16 Nikon Corporation Ophthalmologic apparatus with automatic focusing using two reference marks and having an in-focus detecting system
JPH09289973A (ja) * 1996-04-26 1997-11-11 Canon Inc 眼底カメラ
CN102008288A (zh) * 2010-12-17 2011-04-13 中国科学院光电技术研究所 一种线扫描共焦检眼镜的系统和方法
CN103190883A (zh) * 2012-12-20 2013-07-10 乾行讯科(北京)科技有限公司 一种头戴式显示装置和图像调节方法




Also Published As

Publication number Publication date
US10395510B2 (en) 2019-08-27
CN103500331A (zh) 2014-01-08
CN103500331B (zh) 2017-11-10
US20160180692A1 (en) 2016-06-23


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 13892212

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 14783495

Country of ref document: US

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 13892212

Country of ref document: EP

Kind code of ref document: A1