CN113759545A - Method for operating a pair of smart glasses - Google Patents


Info

Publication number
CN113759545A
Authority
CN
China
Prior art keywords
eye
laser beam
input unit
output unit
smart glasses
Prior art date
Legal status
Pending
Application number
CN202110599605.7A
Other languages
Chinese (zh)
Inventor
Andreas Petersen
Thomas Alexander Schlebusch
Johannes Meyer
Hans Spruit
Jochen Hellmig
Current Assignee
Tongkuai Optoelectronic Device Co ltd
Robert Bosch GmbH
Original Assignee
Tongkuai Optoelectronic Device Co ltd
Robert Bosch GmbH
Priority date
Filing date
Publication date
Application filed by Tongkuai Optoelectronic Device Co ltd, Robert Bosch GmbH filed Critical Tongkuai Optoelectronic Device Co ltd
Publication of CN113759545A

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 3/013 Eye tracking input arrangements
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 27/00 Optical systems or apparatus not provided for by any of the groups G02B 1/00 - G02B 26/00, G02B 30/00
    • G02B 27/0093 Optical systems or apparatus not provided for by any of the groups G02B 1/00 - G02B 26/00, G02B 30/00 with means for monitoring data relating to the user, e.g. head-tracking, eye-tracking
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 27/00 Optical systems or apparatus not provided for by any of the groups G02B 1/00 - G02B 26/00, G02B 30/00
    • G02B 27/01 Head-up displays
    • G02B 27/017 Head mounted
    • G02B 27/0172 Head mounted characterised by optical features
    • G PHYSICS
    • G02 OPTICS
    • G02C SPECTACLES; SUNGLASSES OR GOGGLES INSOFAR AS THEY HAVE THE SAME FEATURES AS SPECTACLES; CONTACT LENSES
    • G02C 11/00 Non-optical adjuncts; Attachment thereof
    • G02C 11/10 Electronic devices other than hearing aids
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 27/00 Optical systems or apparatus not provided for by any of the groups G02B 1/00 - G02B 26/00, G02B 30/00
    • G02B 27/01 Head-up displays
    • G02B 27/0101 Head-up displays characterised by optical features
    • G02B 2027/0118 Head-up displays characterised by optical features comprising devices for improving the contrast of the display / brilliance control visibility
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 27/00 Optical systems or apparatus not provided for by any of the groups G02B 1/00 - G02B 26/00, G02B 30/00
    • G02B 27/01 Head-up displays
    • G02B 27/0101 Head-up displays characterised by optical features
    • G02B 2027/0147 Head-up displays characterised by optical features comprising a device modifying the resolution of the displayed image
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 27/00 Optical systems or apparatus not provided for by any of the groups G02B 1/00 - G02B 26/00, G02B 30/00
    • G02B 27/01 Head-up displays
    • G02B 27/017 Head mounted
    • G02B 2027/0178 Eyeglass type

Abstract

The invention relates to a method for operating smart glasses (50), the smart glasses (50) comprising an input unit and/or an output unit (7) and a gaze detection arrangement (20), wherein the gaze detection arrangement (20) detects eye movements of an eye (10), the method comprising the steps of: irradiating at least one wavelength-modulated laser beam (1) onto the eye (10); detecting an optical path length (2) of the emitted laser beam (1) based on laser feedback interferometry of the emitted laser radiation and the radiation backscattered from the eye (10); detecting a Doppler shift between the emitted and the backscattered radiation based on laser feedback interferometry; and detecting an eye velocity based on the Doppler shift, wherein the input unit and/or the output unit (7) is operated based on the optical path length (2) and/or the eye velocity.

Description

Method for operating a pair of smart glasses
Technical Field
The invention relates to a method for operating a pair of smart glasses, and to the smart glasses themselves.
Background
It is known to use eye tracking or gaze detection (also known as oculography) to determine eye movements and estimate the gaze direction. It is also known to use smart glasses to project images onto, for example, the retina of a user. Gaze detection is used, for example, to adjust a control device of a projection unit of a pair of smart glasses, for example when displaying context-sensitive information. US 7,637,615 B2 discloses such smart glasses using retinal projection and prism-mirror-based eye tracking. Known systems for eye tracking are typically based on camera-based systems or on electrical or electromagnetic sensors in the area of the eye to detect information about the eye position. In addition, scanning laser systems are known, which use micromirrors, for example, to scan a laser spot across the eye. All these systems generally have high complexity and energy consumption, and often limited temporal resolution.
Disclosure of Invention
In contrast, the method according to the invention having the features of claim 1 is characterized by a particularly energy-saving and cost-effective way of operating a pair of smart glasses with high user convenience. This is achieved by a method for operating a pair of smart glasses comprising an input unit and/or an output unit and a gaze detection arrangement. The gaze detection arrangement is adapted to determine eye movements using the following steps:
illuminating the eye with at least one wavelength modulated laser beam,
detecting the optical path length of the emitted laser beam based on laser feedback interferometry of the emitted laser beam and the radiation backscattered from the eye,
detecting the Doppler shift of the emitted and backscattered radiation, in particular the Doppler shift between its frequencies, based on laser feedback interferometry, and
the eye velocity is detected based on the doppler shift,
wherein the input unit and/or the output unit operate based on the optical path length and/or the eye velocity.
Eye movement is considered to be any movement of the eyes, in particular relative to the head of the smart eyewear user. In particular, such eye movements correspond to a change of the viewing direction of the eyes. Furthermore, the eye movement preferably corresponds to a rotation of the eye (in particular of the eyeball) within the eye socket.
In other words, in the eye movement determination, in particular, the wavelength-modulated laser beam emitted from the wavelength-modulated laser light source is irradiated onto the eye of the user. The laser beam is at least partially backscattered at the eye surface. The backscattered radiation is a portion of the radiation scattered at the eye surface that is parallel to the emitted laser beam and therefore capable of interfering therewith. The backscattered portion interferes with the incident laser radiation, i.e. with the laser radiation propagating towards the eye. The backscattered portion of the irradiated laser beam may also be referred to as backscattered radiation. By so-called laser feedback interferometry, an overlap of the emitted laser beam and the backscattered radiation is produced, so that there is interference radiation produced thereby. This generated interference radiation can be detected and analyzed by means of a detector, for example.
Laser feedback interferometry is considered to be the detection and analysis of the overlap of the irradiated laser beam and its backscattered portion, i.e. the detection and analysis of the generated interference radiation. Based on laser feedback interferometry, the optical path length of the emitted laser beam is determined. The optical path length is considered to be the product of the geometric distance covered by the emitted laser beam from the laser source to the eye surface and the refractive index of the material present there. This means that if the laser beam is emitted in air from the laser source directly towards the eye (refractive index of about 1), the optical path length approximates the distance between the laser source and the eye very well. For example, if the wavelength of the emitted laser radiation is known, the optical path length may be estimated based on constructive or destructive interference.
Preferably, laser light whose wavelength is modulated with a triangular waveform within a wavelength range is emitted as the wavelength-modulated laser beam. The optical path length can then be determined by interferometric analysis of the emitted and backscattered radiation, in particular by averaging the interference frequencies obtained on the two edges of the triangular modulation signal.
Furthermore, if the eye moves relative to the irradiated laser radiation, a Doppler shift occurs between the frequency of the emitted radiation and the frequency of the backscattered radiation due to the Doppler effect. The Doppler shift can be detected by means of laser feedback interferometry and can then be used to determine the eye velocity. The eye velocity is considered to be the tangential velocity of the point on the eye surface at which the laser radiation impinges. Preferably, the eye velocity comprises both the absolute value and the direction of the current velocity.
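The two evaluations described above, path length from the averaged beat frequencies of the two triangular-modulation ramps and velocity from the Doppler shift, can be sketched as follows. This is a minimal illustration only; the variable names, the sign convention (positive velocity meaning the surface approaches the emitter), and the example parameters are assumptions, not taken from the patent:

```python
C = 299_792_458.0  # speed of light in m/s

def range_and_velocity(f_up, f_down, chirp_rate_hz_per_s, wavelength_m):
    """Recover distance and radial velocity from the beat frequencies of a
    triangular-wavelength-modulated laser feedback interferometer.

    f_up, f_down: beat frequencies (Hz) measured on the rising and falling ramps.
    chirp_rate_hz_per_s: slope S of the optical-frequency sweep (Hz/s).
    wavelength_m: emission wavelength (m).
    """
    f_range = 0.5 * (f_up + f_down)    # averaging the two ramps cancels the Doppler term
    f_doppler = 0.5 * (f_down - f_up)  # assumed sign convention: positive = approaching
    # round-trip delay 2d/c maps to a beat frequency of 2*d*S/c
    distance = C * f_range / (2.0 * chirp_rate_hz_per_s)
    # Doppler shift of backscattered light: f_D = 2*v/wavelength
    velocity = f_doppler * wavelength_m / 2.0
    return distance, velocity
```

A usage example with plausible numbers (2 cm emitter-to-eye distance, 850 nm VCSEL): construct `f_up = f_range - f_D` and `f_down = f_range + f_D` and the function recovers distance and velocity exactly.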
The input unit and/or the output unit is then operated based on the determined optical path length and/or eye velocity. The method thus allows the input unit and/or the output unit to be operated in a particularly simple and efficient manner by eye movements of the user. This particular method of detecting eye movements using laser feedback interferometry, in particular using the Doppler effect, has the advantage of a particularly high sampling rate, so that eye movements can be detected with particularly high temporal resolution. In addition, the method offers the advantage of using simple and inexpensive components and evaluation algorithms with low energy requirements. In particular, no computationally intensive image data processing is required. Further, advantageously, no moving parts such as a scanning device are required, which provides particularly flexible and robust applicability.
Preferred further embodiments of the invention are the subject matter of the dependent claims.
Preferably, the input unit and/or the output unit are activated or deactivated based on the optical path length and/or the eye velocity. That is, the gaze detection arrangement can initiate activation and/or deactivation of the input unit and/or the output unit. Preferably, the deactivated state is a state in which the input unit and/or the output unit consumes no energy. A particularly energy-efficient operation of the smart glasses can thus be provided, since the input unit and/or the output unit only needs to be supplied with power when input by the user and/or output to the user is actually required.
Particularly preferably, the gaze detection arrangement directs a single laser beam towards the eye, wherein, based on the detected optical path length, it is detected when the gaze direction of the eye is aligned such that the single laser beam enters the eye through the pupil. The input unit and/or the output unit and/or a part of the gaze detection arrangement is operated based on this detection. For example, the gaze detection arrangement may thus be set to a sleep mode with reduced energy consumption. Preferably, in the sleep mode, only the single laser beam is directed towards the eye. By a corresponding eye movement, for example looking in a particular direction such as downwards in the field of view, so that the laser beam enters the eye through the pupil, the input unit and/or the output unit and/or the gaze detection arrangement can be operated and, for example, fully activated. Preferably, the single laser beam is visible to the user, so that the user receives corresponding optical feedback during operation. This makes the smart glasses particularly power-saving and user-friendly to operate.
Preferably, an eye gesture is recognized based on the optical path length and/or the eye velocity. The input unit and/or the output unit is operated based on the recognized predetermined eye gesture. The eye gesture may be, for example, any predetermined movement pattern of the eye, such as a downward rotation followed by a lateral rotation. In particular, different actions may be performed based on different predetermined eye gestures. In particular, it may be sufficient to determine relative eye movements by means of the gaze detection arrangement, since predetermined eye gestures can, for example, be clearly distinguished without determining absolute eye positions. Thus, a particularly efficient and reliable operation of the smart glasses can be achieved, using the gaze detection arrangement for a particularly high-resolution and energy-efficient detection of eye movements.
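A gesture such as "downward rotation followed by a lateral rotation" could be matched against the measured velocity stream roughly as follows. This is an illustrative sketch: the gesture table, the speed threshold, and the coarse direction quantization are all hypothetical, not specified by the patent:

```python
import math

# Hypothetical gesture table: each gesture is a sequence of coarse movement
# directions derived from measured 2-D eye-velocity samples.
GESTURES = {
    ("down", "right"): "activate_output",
    ("up", "up"): "deactivate_output",
}

def direction(vx, vy, min_speed=20.0):
    """Quantize a 2-D eye-velocity sample (e.g. deg/s) into one of four
    coarse directions, or None while the eye is (nearly) at rest."""
    if math.hypot(vx, vy) < min_speed:
        return None
    angle = math.degrees(math.atan2(vy, vx)) % 360.0
    if angle < 45 or angle >= 315:
        return "right"
    if angle < 135:
        return "up"
    if angle < 225:
        return "left"
    return "down"

def recognize(samples):
    """Collapse a stream of velocity samples into its sequence of distinct
    movement directions and look that sequence up in the gesture table."""
    seq, prev = [], None
    for vx, vy in samples:
        d = direction(vx, vy)
        if d is not None and d != prev:
            seq.append(d)
        prev = d
    return GESTURES.get(tuple(seq))
```

Note that only relative velocities enter the matcher, consistent with the observation above that absolute eye positions are not needed to distinguish gestures.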
Preferably, the input unit and/or the output unit comprises an image projection unit arranged to project an image onto the retina of the eye. By operating the image projection unit using the eye tracking device, a particularly convenient and accurate image projection may be provided.
Preferably, image parameters of the image projected by the image projection unit are adjusted based on the determined optical path length and/or eye velocity. This enables a particularly high image quality of the image visible to the user, wherein the targeted adjustment of the image parameters means, for example, that areas in which a lower image quality is sufficient for anatomical reasons can be displayed with little display effort, resulting in a particularly efficient operation of the smart glasses.
Particularly preferably, the image sharpness and/or the image resolution and/or the brightness and/or the color distribution of the image projected by the image projection unit are adjusted. Preferably, these image parameters are adapted over the entire field of view of the user corresponding to the instantaneous viewing angle of the eye. Advantageously, the image parameters are adjusted according to the position of the fovea of the eye, which corresponds to the zone of sharpest vision. In particular, the image projected by the image projection unit is adjusted such that in the foveal region there is a higher image sharpness and/or a higher image resolution than in the surrounding regions of the field of view. Particularly preferably, the image sharpness and/or the image resolution decreases, in particular continuously, radially outwards in the field of view of the user. This enables the smart glasses to operate efficiently, since a lower display quality of the image projection unit is sufficient in areas where the eye's visual acuity is lower for anatomical reasons. This results in lower computational effort and lower energy consumption in the image projection unit.
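A continuously, radially decreasing resolution profile of the kind described could look like the following. The falloff shape and its constants (full resolution within 5° of the fovea, exponential decay beyond, a floor of 10%) are purely illustrative assumptions; the patent only requires that the profile decrease radially outwards:

```python
import math

def resolution_scale(eccentricity_deg, full_res_deg=5.0, min_scale=0.1):
    """Relative rendering resolution as a function of angular distance from
    the detected foveal direction: full resolution inside full_res_deg,
    then a smooth exponential falloff clamped at min_scale."""
    if eccentricity_deg <= full_res_deg:
        return 1.0
    decay = math.exp(-(eccentricity_deg - full_res_deg) / 15.0)
    return max(min_scale, decay)
```

The projection unit would evaluate this per image region, re-centering the profile whenever the gaze detection arrangement reports a new foveal direction.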
Preferably, the method further comprises the steps of: detecting a maximum eye velocity during an eye movement, and predicting an eye movement end position based on the maximum eye velocity. The operation of the input unit and/or the output unit is then performed based on the eye movement end position. Particularly preferably, the adjustment of the image parameters of the image projected by the image projection unit is performed based on the eye movement end position. It is thus possible to predict, in particular in the case of rapid eye movements, so-called saccades, where the eye movement will stop. Preferably, the prediction of the eye movement end position is based on the assumption that the eye performs a uniformly accelerated movement during such an eye movement. That is, there is a constant positive acceleration during the first half of such an eye movement, and a constant negative acceleration (in particular of the same magnitude) during the second half. By detecting the first half of the respective velocity profile, i.e. the velocity of the eye movement from rest until the maximum velocity during the movement, the second half of the velocity profile can be estimated, in particular on the assumption of mirror symmetry. In this way, the end point of the eye movement can be estimated, in particular by integrating the determined velocity profile. Preferably, the maximum velocity during the eye movement is determined by detecting a decreasing eye velocity following an increase in eye velocity. In particular, if the image parameters are adjusted to the predicted eye movement end position, any delay in the image parameter adjustment in response to a change in the user's gaze can advantageously be avoided, so that an accurate and optimally adapted display of the projected image can always be ensured.
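Under the mirror-symmetric, uniformly accelerated model above, the velocity profile is a triangle, so the landing position follows directly from the peak velocity and the time taken to reach it. A sketch (units, sampling interval, and function names are assumptions):

```python
def detect_peak(velocities):
    """Return (index, value) of the first velocity sample that is followed
    by a decrease, i.e. the detected maximum of the saccade profile, or
    None if the velocity has not yet started to fall."""
    for i in range(1, len(velocities)):
        if velocities[i] < velocities[i - 1]:
            return i - 1, velocities[i - 1]
    return None

def predict_saccade_end(start_pos_deg, v_max_deg_s, t_peak_s):
    """Predict the landing position: with a triangular (mirror-symmetric)
    velocity profile, the total amplitude equals the area under the
    triangle, v_max * t_peak, half covered before and half after the peak."""
    amplitude = v_max_deg_s * t_peak_s
    return start_pos_deg + amplitude
```

For a sampled triangular profile the prediction matches the trapezoidal integral of the full profile exactly, which is the point of the mirror-symmetry assumption: the second half never needs to be measured.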
Particularly preferably, the input unit and/or the output unit comprise a sound reproduction unit and/or an electronic user device. The electronic user device is preferably provided as a separate device connectable to the smart glasses, e.g. using a wireless connection. This enables a particularly flexible use of smart glasses.
Preferably, eyelid closure of the eye is detected based on the optical path length and/or the eye velocity. This means that in particular any blinking may be detected. In this case, the determined eye velocity is the velocity of the eyelid closure. In particular, eyelid closure may be determined based on a characteristic velocity, and preferably a characteristic direction of the velocity determined by using an eye tracking device. Preferably, the input unit and/or the output unit may be operated using detection of eyelid closure. For example, any particular action may be performed using a predetermined blink pattern or closing of the eyelids for a predetermined period of time. Furthermore, the detection of eyelid closure of the eye may be used for fatigue detection. Thus, a particularly broad range of applications of smart glasses may be provided.
Preferably, at least a part of the gaze detection arrangement is always active. Thus, for example, a specific eye gesture may be performed at any time to operate, in particular to activate, the input unit and/or the output unit, which may otherwise be switched off. As a result, the input unit and/or the output unit can be operated on demand, so that the energy consumption of the smart glasses is particularly low over their entire operating time. Owing to the particularly energy-saving gaze detection method, the gaze detection arrangement can remain active at all times without causing high energy consumption of the smart glasses.
Preferably, the method further comprises the steps of: the reflectivity of the eye is detected from the amplitude and phase of the radiation backscattered by the eye, so that the input unit and/or the output unit are operated on the basis of the reflectivity. Preferably, the reflectivity-based operation may be performed instead of or in addition to the optical path length and/or eye velocity-based operation. It is particularly advantageous if the image projection unit is operated on the basis of the determined optical path length and/or eye velocity and/or on the basis of the reflectivity, in particular wherein image parameters of the image projected by the image projection unit are adjusted on the basis of the determined optical path length and/or eye velocity and/or on the basis of the reflectivity. In this context, reflectivity especially refers to a complex reflectivity with amplitude and phase of radiation backscattered by the eye. By additionally measuring the reflectivity of the eye, the current position of the eye can thus be determined particularly precisely. Advantageously, based on the different reflectivities of different parts of the eye, it can be identified which part of the eye the laser beam is currently illuminating, e.g. any absolute eye position can be estimated from this. For example, when laser radiation impinges on different parts of the eye, the reflectivity may differ significantly and in characteristics. For example, significantly more scattering occurs when laser radiation impinges on the iris of the eye and passes through the cornea than does retinal illumination. Preferably, the determination of the reflectivity may thus be used to determine when the laser beam, which is in particular stationary relative to the user's head, moves past the anatomical boundary during the eye movement. 
In this way, any eye movement can be determined particularly easily and accurately, for example as a trigger for operating the input unit and/or the output unit.
Furthermore, the invention provides a pair of smart glasses comprising an input unit and/or an output unit and a gaze detection arrangement. The input unit and/or the output unit are arranged to receive input from a user and/or to provide output to a user. The gaze detection arrangement comprises a laser device arranged to direct at least one laser beam onto the eye, and a control device arranged to operate the laser device. The smart glasses are configured to perform the above-described method. The smart glasses are characterized by a particularly simple and inexpensive configuration with a high eye movement detection rate and low energy requirements.
Preferably, the laser device comprises at least one surface emitter (also called vertical cavity surface emitting laser, or VCSEL for short) with a photodiode integrated therein. Such a laser device may be used for detecting eye movements with an eye tracking apparatus of a particularly simple, compact and cost-effective design based on laser feedback interferometry. Such a laser device is particularly suitable for detection by the self-mixing effect. Preferably, in this case, the photodiode is used to directly detect any overlap of emitted and backscattered radiation within the laser cavity. Particularly preferably, the laser device may comprise a plurality of surface emitters, each surface emitter emitting a laser beam.
Preferably, the at least one surface emitter with a photodiode integrated therein is arranged on the spectacle frame and/or a temple. In this case, the spectacle frame is considered in particular to be the area of the smart glasses surrounding the spectacle lenses, and the temples are considered in particular to be the holding arms connected to the spectacle frame, for example extending to the user's ears. For example, several surface emitters with integrated photodiodes may be distributed on the spectacle frame around the spectacle lens to allow a particularly precise scanning of the eye over its entire range of motion. Alternatively or additionally, one or more laser sources may be integrated into, preferably molded into, the spectacle lens.
Drawings
The invention is described below by way of example embodiments with reference to the accompanying drawings. In the drawings, functionally identical components are each indicated by the same reference numeral, wherein:
fig. 1 is a simplified schematic diagram of a pair of smart glasses according to a first embodiment of the present invention.
Fig. 2 is a simplified schematic detail view of a gaze detection process using the smart glasses of fig. 1.
Fig. 3 is another simplified schematic detail view of gaze detection using the smart glasses of fig. 1.
Fig. 4 is a simplified schematic representation of measurement data from the smart glasses of fig. 1 during gaze detection.
Fig. 5 is another simplified schematic detail view of gaze detection using the smart glasses of fig. 1, and
Fig. 6 is a simplified schematic diagram of smart glasses according to a second embodiment of the present invention.
Detailed Description
Fig. 1 shows a simplified schematic diagram of a pair of smart glasses 50 according to a first embodiment of the present invention. The smart glasses 50 comprise spectacle lenses 52, a spectacle frame 51 accommodating the spectacle lenses 52, and temples 53 for holding the smart glasses 50 on the user's head. The smart glasses 50 are thus configured to be worn on the head of the user.
The smart glasses 50 comprise a gaze detection arrangement 20, by which the gaze direction of the user's eye 10 can be determined. To this end, the gaze detection arrangement 20 comprises a laser device 3 and a control device 4, the control device 4 being arranged to operate the laser device 3 to perform a corresponding method for detecting the gaze direction of the eye 10. For a compact design, the control device 4 is arranged in a temple 53 of the smart glasses 50.
The laser device 3 comprises a total of five surface emitters 3a, 3b, 3c, 3d, 3e as laser sources. Four of the five surface emitters 3a, 3b, 3c, 3d are distributed on the spectacle frame 51 surrounding a spectacle lens 52. The fifth surface emitter 3e is arranged on the temple 53. Each surface emitter 3a, 3b, 3c, 3d, 3e is arranged to irradiate a wavelength-modulated laser beam 1 onto the eye 10. In this case, laser light whose wavelength is modulated with a triangular waveform is emitted as the laser beam 1. For the sake of clarity, only the single laser beam 1 emitted by the first surface emitter 3a is shown in the figure. Each laser beam 1 is directed onto the eye surface 11 of the eye 10, forming separate laser spots 30a, 30b, 30c, 30d, 30e.
Fig. 2 shows the laser spots 30a, 30b, 30c, 30d of the first four surface emitters 3a, 3b, 3c, 3d arranged on the spectacle frame 51. In fig. 3, a fifth laser spot 30e generated by a fifth surface emitter 3e is shown on the side of the eye 10.
As shown in fig. 2 and 3, the laser spots 30a, 30b, 30c, 30d, 30e are preferably located within or near the area of the iris 12 of the eye 10. As a result, when the eye 10 moves, the pupil 13 of the eye 10 often moves close to or through one of the laser spots 30a, 30b, 30c, 30d, 30e, so that the position and movement of the pupil 13, and thus the gaze direction of the eye, can be determined with high accuracy.
The implementation of the method for detecting the gaze direction of the eye 10 is described in detail below, this description being based on only a single laser beam 1.
First, a laser beam 1 is irradiated onto an eye 10. At the eye surface 11, the laser beam 1 will be at least partially backscattered. As a result, an overlap of the irradiated laser beam 1 and the portion of the backscattered radiation propagating backwards in parallel in the direction of the surface emitter 3a occurs. Laser feedback interferometry is performed using photodiodes integrated in the surface emitter 3a to detect the resulting interference radiation, i.e. the overlap of the irradiated laser radiation 1 and the radiation backscattered in the opposite direction. Since the photodiode is directly integrated into the laser cavity of the surface emitter 3a, the detection of the generated interference radiation is performed by the so-called self-mixing effect.
An exemplary spectrum 25 of the generated interference radiation that can be detected by the integrated photodiode of the surface emitter 3a is shown schematically in fig. 4. Axis 25a corresponds to frequency and axis 25b to amplitude. Reference numeral 26 denotes the peak frequency of the detected interference radiation, which is determined, for example, by Fourier analysis. The peak frequency 26 depends on the optical path length 2 owing to the triangular-wave modulation of the wavelength of the emitted laser beam 1. The optical path length 2 (see fig. 1) corresponds to the distance covered by the laser beam 1 between the surface emitter 3a and the eye surface 11. When the laser beam 1 is irradiated directly onto the eye 10, the optical path length 2 corresponds, in the first embodiment of fig. 1, to the shortest distance between the surface emitter 3a and the eye surface 11. Thus, with the wavelength of the emitted laser beam 1 known, the optical path length 2 can be determined for a specific eye position (i.e. for a specific gaze direction) based on laser feedback interferometry.
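The determination of the peak frequency 26 by Fourier analysis can be sketched as follows, assuming NumPy is available; the window choice and the sampled photodiode signal format are assumptions:

```python
import numpy as np

def beat_peak_frequency(signal, sample_rate_hz):
    """Estimate the dominant beat frequency of the photodiode signal:
    windowed FFT magnitude spectrum, then the peak bin (DC excluded)."""
    spectrum = np.abs(np.fft.rfft(signal * np.hanning(len(signal))))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate_hz)
    peak_bin = 1 + np.argmax(spectrum[1:])  # skip the DC bin
    return freqs[peak_bin]
```

The resolution of this simple estimator is one FFT bin (sample rate divided by record length); interpolating around the peak bin could refine it if needed.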
Fig. 4 shows an exemplary frequency spectrum 25, which is recorded during a constant movement of the eye surface 11 relative to the laser beam 1 (i.e. during a rotation of the eye 10). During this movement, a shift 27 of the peak frequency 26 towards a shifted peak frequency 26' as shown by the dashed line occurs due to the doppler effect. The resulting doppler shift of the emitted laser radiation and the backscattered laser radiation can thus be determined on the basis of the frequency spectrum 25. Based on the doppler shift, the instantaneous eye velocity of the movement of the eye 10 and the direction of movement can be determined.
Preferably, in addition to detecting the frequency of the impinging and backscattered laser radiation, the reflectivity of the eye 10 may also be detected based on the amplitude and phase of the radiation backscattered by the eye 10. In this context, reflectivity is defined as a complex reflectivity having an amplitude and phase of radiation backscattered by the eye 10, wherein the reflectivity is different for different regions of the eye 10. In particular, the determined reflectivity may change when the laser beam 1 passes through an anatomical boundary of the eye 10 (e.g., the iris 12 or pupil 13). Thus, the reflectivity of the eye 10 can be used to estimate which region of the eye 10 is currently illuminated by the laser beam 1. Together with the determined optical path length 2, the instantaneous absolute eye position of the eye 10 can thus be determined.
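A minimal sketch of how such reflectivity differences could be used to detect anatomical boundary crossings: classify each backscatter-amplitude sample into a region and report where the classification changes. The thresholds and region names are hypothetical illustrations, not values from the patent:

```python
def classify_region(amplitude, sclera_min=0.8, iris_min=0.4):
    """Map a backscattered-amplitude sample (arbitrary units) to an eye
    region, assuming strong diffuse scattering from sclera/iris and a
    weak return where the beam passes through the pupil."""
    if amplitude >= sclera_min:
        return "sclera"
    if amplitude >= iris_min:
        return "iris"
    return "pupil"

def boundary_crossings(amplitudes):
    """Indices at which the laser spot crosses an anatomical boundary,
    detected as a change of the classified region between samples."""
    regions = [classify_region(a) for a in amplitudes]
    return [i for i in range(1, len(regions)) if regions[i] != regions[i - 1]]
```

Combined with the optical path length, such crossings could anchor the relative Doppler-based tracking to an absolute eye position, as the paragraph above describes.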
Thus, eye movements can be determined based on laser feedback interferometry using the gaze detection arrangement 20, wherein any movement of the eye 10 can be determined and tracked. In combination with a determination of the absolute eye position, e.g. performed only at predetermined times and additionally based on the reflectivity, the instantaneous gaze direction of the eye 10 can be determined. Owing to the simple components required for the gaze direction determination, a particularly high temporal resolution can be achieved with low energy requirements. In addition, particularly low-cost components can be used.
Further, the pair of smart glasses 50 comprises an input unit and/or output unit 7 configured to provide output to a user. The input unit and/or output unit 7 comprises a projection unit arranged to project an image onto the retina of the eye 10. The projection unit may, for example, be used for displaying augmented reality (AR) or virtual reality (VR). Preferably, the projection unit is coupled to the control means 4, wherein the control means 4 is arranged to operate the projection unit in accordance with the determined gaze direction.
By means of the control means 4, the projected image can thus be adjusted according to the gaze direction. For a particularly efficient and user-friendly operation, the image parameters of the image projected by the projection unit are adjusted depending on the position of the fovea of the eye 10, which corresponds to the zone of sharpest vision. The position of the fovea may be determined based on a prior calibration of the eye tracking apparatus 20.
The image projected by the image projection unit is therefore adjusted such that the image sharpness and image resolution are higher in the region of the fovea than in the peripheral region of the field of view. Preferably, the image sharpness and/or image resolution decreases radially outwards with respect to the user's field of view. In this way, a particularly efficient operation of the smart glasses 50 is provided, since a lower display quality suffices in regions where visual acuity is anatomically lower. This results in a low computational effort and a particularly low energy requirement of the smart glasses 50.
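The radial reduction of resolution can be sketched as a falloff function over the angular distance from the fovea. All parameter values (foveal radius, falloff rate, floor) are illustrative assumptions for this sketch, not values from the patent.

```python
def resolution_scale(angle_deg: float, fovea_radius_deg: float = 2.0,
                     falloff_per_deg: float = 0.05, floor: float = 0.1) -> float:
    """Relative rendering resolution versus angular distance from the fovea:
    full resolution inside the foveal region, then a linear radial falloff
    clamped to a minimum floor.  Parameter values are hypothetical."""
    if angle_deg <= fovea_radius_deg:
        return 1.0  # render at full resolution within the foveal region
    return max(floor, 1.0 - falloff_per_deg * (angle_deg - fovea_radius_deg))
```

The projection unit would evaluate this scale per image region, so that peripheral regions are rendered at a fraction of the foveal resolution.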
Further, eye gestures may be recognized using the eye tracking apparatus 20. For example, various predetermined eye gestures may be assigned to different predetermined actions of the smart glasses 50. An eye gesture may be defined as a predetermined sequence of movements of the eye 10, for example a vertical downward rotation followed by a horizontal rotation to the right, corresponding to the user first looking down and then to the right.
If such an eye gesture is detected based on the eye velocity and the optical path length 2 determined by the eye tracking apparatus 20, the image projection unit may be operated in a predetermined manner. For example, the image projection unit may be activated or deactivated, so that the smart glasses 50 can be operated as needed and thus in a particularly energy-efficient manner. Alternatively or additionally, eye gestures may be used to operate menu options of content displayed in augmented or virtual reality. It is particularly convenient, for example, if a specific eye gesture, preferably a rolling of the eye 10 upwards and/or downwards, activates scrolling of the displayed content.
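Matching a sequence of detected movements against predetermined gestures can be sketched as a small recognizer over a sliding history of movement directions. The gesture-to-action mapping and all names here are hypothetical examples, not taken from the patent.

```python
from collections import deque

# Hypothetical mapping from movement sequences to device actions.
GESTURES = {
    ("down", "right"): "toggle_projection",
    ("up", "up"): "scroll_up",
}

class GestureRecognizer:
    def __init__(self, max_len: int = 4):
        # keep only the most recent movements
        self.history = deque(maxlen=max_len)

    def feed(self, direction: str):
        """Feed one detected movement direction (derived from eye velocity
        and path length); return an action name if the recent history
        ends with a known gesture, else None."""
        self.history.append(direction)
        seq = tuple(self.history)
        for pattern, action in GESTURES.items():
            if seq[-len(pattern):] == pattern:
                self.history.clear()  # consume the gesture
                return action
        return None
```

Each recognized action would then be dispatched to the input unit and/or output unit, e.g. activating the projection unit or scrolling displayed content.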
A particularly simple and energy-efficient way of operating the smart glasses 50 is shown in fig. 5, which depicts an operating state in which the input unit and/or output unit 7 is deactivated and the eye tracking apparatus 20 is in a sleep mode. In the sleep mode, exactly one surface emitter 3b of the eye tracking apparatus 20 is operated, while all other surface emitters 3a, 3c, 3d, 3e are deactivated. By means of the single active surface emitter 3b, only a single laser beam 1 is directed onto the eye 10 at exactly one laser spot 30b, as shown in fig. 5.
In the sleep mode, a simplified eye detection is performed, in which, for example, only the optical path length 2 is determined. Based on the optical path length 2, it can be detected when the eye 10 moves into a position in which the single laser beam 1 enters the eye 10 through the pupil 13. In the operating state shown in fig. 5, this occurs when the user looks down, i.e. when the eye 10 performs a vertical downward rotation in the direction a. This particular eye gesture, i.e. the state in which the laser beam 1 enters the eye 10, is used here to activate the projection unit and the inactive parts of the eye tracking apparatus 20. If the surface emitter 3b emits the single laser beam 1 as visible light, feedback on the activation can additionally be indicated to the user.
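The wake-up condition of the sleep mode can be sketched as a threshold check on the measured path length: once the beam passes through the pupil it propagates into the eye, so the path length jumps well beyond the distance to the outer eye surface. The threshold values and the function name are illustrative assumptions.

```python
def beam_enters_pupil(path_length_mm: float,
                      surface_distance_mm: float = 25.0,
                      jump_mm: float = 5.0) -> bool:
    """Heuristic wake-up criterion for the sleep mode: the measured
    optical path length exceeds the (known) distance to the outer eye
    surface by a clear margin, indicating the beam has entered the eye
    through the pupil.  Both thresholds are hypothetical placeholders."""
    return path_length_mm > surface_distance_mm + jump_mm

# Sleep-loop sketch: poll the single emitter's path length and wake up
# the projection unit and remaining emitters on the first pupil entry.
```

In the sleep loop, a `True` result would trigger activation of the projection unit and the deactivated parts of the eye tracking apparatus.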
Fig. 6 shows a simplified schematic view of a pair of smart glasses 50 according to a second embodiment of the invention. The second embodiment substantially corresponds to the first embodiment of fig. 1 and differs in an alternative arrangement of the laser device 3. In the second embodiment of fig. 6, the laser device 3 of the eye tracking apparatus 20 comprises four surface emitters 3a, 3b, 3c, 3d with integrated photodiodes, all of which are arranged on the temple 53. The laser beams 1 emitted by the surface emitters 3a, 3b, 3c, 3d are thus directed indirectly onto the eye 10. Specifically, each laser beam 1 impinges on the spectacle lens 52, shown by way of example at the focal point 1' on the spectacle lens 52. A deflection element 54 in the form of a holographic optical element, which deflects the laser beam 1 towards the eye 10, is integrated in the spectacle lens 52. In this way, an alternative arrangement of the laser device 3 is provided, with which the optical path length 2 and the eye velocity can likewise be determined efficiently as a basis for operating the input unit and/or output unit 7.

Claims (15)

1. A method for operating a pair of smart glasses (50), the pair of smart glasses (50) comprising an input unit and/or an output unit (7) and a gaze detection arrangement (20), wherein the gaze detection arrangement (20) detects eye movements of an eye (10) by the method, the method comprising the steps of:
irradiating at least one wavelength-modulated laser beam (1) onto the eye (10),
detecting an optical path length (2) of the irradiated laser beam (1) based on laser feedback interferometry of the irradiated laser beam (1) and of a portion of the laser beam (1) backscattered from the eye (10),
detecting the Doppler shift of the irradiated laser beam (1) and of the backscattered portion based on the laser feedback interferometry, and
detecting an eye velocity based on the Doppler shift, and
wherein the input unit and/or output unit (7) operates based on the optical path length (2) and/or the eye velocity.
2. The method according to claim 1, wherein the input unit and/or output unit (7) is activated or deactivated based on the optical path length (2) and/or the eye velocity.
3. The method according to any one of the preceding claims, wherein the gaze detection arrangement (20) directs a single laser beam (1) towards the eye (10), wherein, based on the detected optical path length (2), it is detected when the single laser beam (1) enters the eye (10) through a pupil (13), and wherein, based on the detection of the single laser beam (1) entering the eye (10), the input unit and/or output unit (7) and/or a part of the gaze detection arrangement (20) is operated.
4. The method according to any of the preceding claims, wherein an eye gesture is detected based on the optical path length (2) and/or the eye velocity, and wherein the input unit and/or output unit (7) is operated based on the detection of a predetermined eye gesture.
5. The method according to any of the preceding claims, wherein the input unit and/or the output unit (7) comprises an image projection unit projecting an image onto the retina.
6. The method according to claim 5, wherein image parameters of the image projected by the image projection unit are adjusted based on the optical path length (2) and/or eye velocity.
7. The method according to claim 6, wherein the image sharpness and/or image resolution and/or brightness and/or color distribution of the image projected by the image projection unit is adjusted.
8. The method according to any of the preceding claims, further comprising the step of:
detecting a maximum eye velocity during an eye movement, and
predicting an eye-movement end position based on the maximum eye velocity,
wherein the input unit and/or output unit (7) is operated based on the eye-movement end position, preferably wherein image parameters of the image projected by the image projection unit are adjusted based on the eye-movement end position.
9. The method according to any of the preceding claims, wherein the input unit and/or output unit (7) comprises a sound reproduction unit and/or an electronic user device, preferably provided as a separate device.
10. The method according to any one of the preceding claims, wherein eyelid closure of the eye (10) is detected based on the optical path length (2) and/or the eye velocity.
11. The method according to any of the preceding claims, wherein at least a part of the gaze detection arrangement (20) is always active.
12. The method according to any of the preceding claims, further comprising the step of:
detecting a reflectivity of the eye (10) based on the amplitude and phase of radiation backscattered from the eye (10),
wherein the input unit and/or the output unit (7) is operated based on the reflectivity.
13. A pair of smart glasses comprising:
an input unit and/or an output unit (7) arranged to receive input from a user and/or to provide output to a user; and
gaze detection means (20) for detecting any eye movement of the eye (10),
wherein the gaze detection arrangement (20) comprises a laser device (3) adapted to irradiate at least one laser beam (1) towards the eye (10) and control means (4) adapted to operate the laser device (3), and
wherein the smart glasses (50) are adapted to perform the method according to any one of the preceding claims.
14. A pair of smart glasses according to claim 13, wherein the laser device (3) comprises at least one surface emitter (3a, 3b, 3c, 3d, 3e) in which a photodiode is integrated.
15. A pair of smart glasses according to claim 14, wherein the at least one surface emitter (3a, 3b, 3c, 3d, 3e) in which the photodiode is integrated is arranged on the spectacle frame (51), in particular around the spectacle lenses (52), and/or on the temples (53), and/or in the spectacle lenses (52).
CN202110599605.7A 2020-06-02 2021-05-31 Method for operating a pair of smart glasses Pending CN113759545A (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102020206822.4A DE102020206822A1 (en) 2020-06-02 2020-06-02 Procedure for operating data glasses
DE102020206822.4 2020-06-02

Publications (1)

Publication Number Publication Date
CN113759545A true CN113759545A (en) 2021-12-07

Family

ID=78508822

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110599605.7A Pending CN113759545A (en) 2020-06-02 2021-05-31 Method for operating a pair of smart glasses

Country Status (3)

Country Link
US (1) US11513594B2 (en)
CN (1) CN113759545A (en)
DE (1) DE102020206822A1 (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102020206821A1 (en) * 2020-06-02 2021-12-02 Robert Bosch Gesellschaft mit beschränkter Haftung Method for determining a line of sight of an eye
DE102020206822A1 (en) 2020-06-02 2021-12-02 Robert Bosch Gesellschaft mit beschränkter Haftung Procedure for operating data glasses
DE102021207058A1 (en) 2021-07-06 2023-01-12 Robert Bosch Gesellschaft mit beschränkter Haftung Human activity detection method
DE102022204107A1 (en) 2022-04-27 2023-11-02 Robert Bosch Gesellschaft mit beschränkter Haftung Method and device for determining the viewing direction of an eye and data glasses
DE102022208691A1 (en) 2022-08-23 2024-02-29 Robert Bosch Gesellschaft mit beschränkter Haftung Device for monitoring an eye position of a user's eye in a virtual retinal display, data glasses and method

Citations (6)

Publication number Priority date Publication date Assignee Title
CN102068237A (en) * 2004-04-01 2011-05-25 威廉·C·托奇 Controllers and Methods for Monitoring Eye Movement, System and Method for Controlling Calculation Device
US20120113431A1 (en) * 2009-06-03 2012-05-10 Kabushiki Kaisha Topcon optical image measurement apparatus
WO2016105282A1 (en) * 2014-12-26 2016-06-30 Koc University Near-to-eye display device with spatial light modulator and pupil tracker
CN109414166A (en) * 2016-07-15 2019-03-01 卡尔蔡司医疗技术股份公司 Method for delicately measuring very much distance and angle in human eye
CN109922710A (en) * 2016-11-10 2019-06-21 奇跃公司 The method and system of eyes tracking is executed using speckle pattern
US20200026350A1 (en) * 2018-07-20 2020-01-23 Avegant Corp. Relative Position Based Eye-tracking System

Family Cites Families (11)

Publication number Priority date Publication date Assignee Title
JP4608996B2 (en) 2004-08-19 2011-01-12 ブラザー工業株式会社 Pupil detection device and image display device including the same
DE102014206626A1 (en) 2014-04-07 2015-10-08 Bayerische Motoren Werke Aktiengesellschaft Fatigue detection using data glasses (HMD)
DE102014210892A1 (en) 2014-06-06 2015-12-17 Siemens Aktiengesellschaft A method of operating an input device and input device controllable by gestures performed with at least one body part of the human body
US20160274365A1 (en) 2015-03-17 2016-09-22 Thalmic Labs Inc. Systems, devices, and methods for wearable heads-up displays with heterogeneous display quality
JP2019506017A (en) * 2015-11-06 2019-02-28 フェイスブック・テクノロジーズ・リミテッド・ライアビリティ・カンパニーFacebook Technologies, Llc Eye tracking using optical flow
US10908279B2 (en) 2016-03-11 2021-02-02 Facebook Technologies, Llc Ultrasound/radar for eye tracking
DE102016212236A1 (en) 2016-07-05 2018-01-11 Siemens Aktiengesellschaft Interaction system and procedure
US10762810B2 (en) 2018-01-05 2020-09-01 North Inc. Augmented reality eyebox fitting optimization for individuals
DE102018214637A1 (en) 2018-08-29 2020-03-05 Robert Bosch Gmbh Method for determining a direction of view of an eye
CN115202037A (en) * 2019-03-06 2022-10-18 株式会社理光 Optical device, retina projection display device, and head-mounted display device
DE102020206822A1 (en) 2020-06-02 2021-12-02 Robert Bosch Gesellschaft mit beschränkter Haftung Procedure for operating data glasses


Also Published As

Publication number Publication date
US20210373659A1 (en) 2021-12-02
US11513594B2 (en) 2022-11-29
DE102020206822A1 (en) 2021-12-02

Similar Documents

Publication Publication Date Title
US11513594B2 (en) Method for operating a pair of smart glasses
US11567570B2 (en) Relative position based eye-tracking system
CN107533362B (en) Eye tracking device and method for operating an eye tracking device
US9498125B2 (en) Method for operating an eye tracking device and eye tracking device for providing an active illumination control for improved eye tracking robustness
US10395111B2 (en) Gaze-tracking system and method
EP2581034B1 (en) Eye-tracker illumination
US20180101229A1 (en) Systems, devices, and methods for proximity-based eye tracking
US11733774B2 (en) Eye-tracking arrangement
JP5689874B2 (en) Gaze control device, ophthalmic device, method of operating gaze control device for ophthalmic device for controlling eye gaze, computer program or evaluation unit
US11435578B2 (en) Method for detecting a gaze direction of an eye
JP2023550538A (en) Event camera system for pupil detection and eye tracking
US11625095B2 (en) Gaze sensors and display elements for detection of gaze vectors and user control at headset
US11675427B2 (en) Eye tracking based on imaging eye features and assistance of structured illumination probe light
CN110603513B (en) Eye tracking device and method for tracking eyes by utilizing optical imaging
JP2015066046A (en) Spectacles-wearing image analysis apparatus and spectacles-wearing image analysis program
KR20210031597A (en) Eye accommodation distance measuring device and method for head mounted display, head mounted display
Meyer et al. A Highly Integrated Ambient Light Robust Eye-Tracking Sensor for Retinal Projection AR Glasses Based on Laser Feedback Interferometry
US20230122222A1 (en) Device, system, and method for biometrically identifying a user of a device
Meyer Towards energy efficient mobile eye tracking for AR glasses through optical sensor technology
JP2615807B2 (en) Visual information analyzer
Meyer et al. A hybrid high speed eye tracking sensor by fusion of video oculography and laser feedback interferometry
CN117751316A (en) Determination of eye movement
KR20230101580A (en) Eye tracking method, apparatus and sensor for determining sensing coverage based on eye model
CN116406449A (en) Systems and methods for eye tracking in head-mounted devices using low coherence interferometry

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination