WO2019084595A1 - System and method for improving signal to noise ratio in object tracking under poor light conditions - Google Patents

System and method for improving signal to noise ratio in object tracking under poor light conditions

Info

Publication number
WO2019084595A1
Authority
WO
WIPO (PCT)
Prior art keywords
drive current
images
eye
image
illumination characteristics
Prior art date
Application number
PCT/AU2018/050776
Other languages
English (en)
Inventor
John Noble
Original Assignee
Seeing Machines Limited
Priority date
Filing date
Publication date
Priority claimed from AU2017904419A0
Application filed by Seeing Machines Limited
Priority to DE112018005191.9T (published as DE112018005191T5)
Priority to JP2020523296A (published as JP7138168B2)
Priority to US16/759,951 (published as US11386709B2)
Publication of WO2019084595A1

Links

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/90 Dynamic range modification of images or parts thereof
    • G06T5/94 Dynamic range modification of images or parts thereof based on local image properties, e.g. for local contrast enhancement
    • G PHYSICS
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B15/00 Special procedures for taking photographs; Apparatus therefor
    • G03B15/02 Illuminating scene
    • G03B15/03 Combinations of cameras with lighting apparatus; Flash units
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/20 Analysis of motion
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/10 Image acquisition
    • G06V10/12 Details of acquisition arrangements; Constructional details thereof
    • G06V10/14 Optical characteristics of the device performing the acquisition or on the illumination arrangements
    • G06V10/141 Control of illumination
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/50 Context or environment of the image
    • G06V20/59 Context or environment of the image inside of a vehicle, e.g. relating to seat occupancy, driver state or inner lighting conditions
    • G06V20/597 Recognising the driver's state or behaviour, e.g. attention or drowsiness
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/18 Eye characteristics, e.g. of the iris
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/18 Eye characteristics, e.g. of the iris
    • G06V40/19 Sensors therefor
    • G PHYSICS
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B2215/00 Special procedures for taking photographs; Apparatus therefor
    • G03B2215/05 Combinations of cameras with electronic flash units
    • G03B2215/0564 Combinations of cameras with electronic flash units characterised by the type of light source
    • G03B2215/0567 Solid-state light source, e.g. LED, laser
    • G PHYSICS
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B7/00 Control of exposure by setting shutters, diaphragms or filters, separately or conjointly
    • G03B7/16 Control of exposure by setting shutters, diaphragms or filters, separately or conjointly in accordance with both the intensity of the flash source and the distance of the flash source from the object, e.g. in accordance with the "guide number" of the flash bulb and the focusing of the camera
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10141 Special mode during image acquisition
    • G06T2207/10152 Varying illumination
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30004 Biomedical image processing
    • G06T2207/30041 Eye; Retina; Ophthalmic
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/20 Movements or behaviour, e.g. gesture recognition

Definitions

  • the present invention relates to illumination systems and in particular to a method and system for tracking eyes or a head of a subject in images having varying illumination characteristics. While some embodiments will be described herein with particular reference to that application, it will be appreciated that the invention is not limited to such a field of use, and is applicable in broader contexts.
  • Eye tracking systems may be quite robust under normal operating conditions. However, the systems often break down when the subject is a large distance from the camera or if the subject's eyes become partially occluded from the camera's view. An example where the subject's eyes become partially occluded is when the subject is wearing glasses in the presence of high glare (e.g. in a convertible vehicle) or wearing dark sunglasses. In these situations, the signal to noise ratio of an image becomes too low to accurately distinguish the subject's eye from surrounding objects in the image.
  • the signal to noise ratio can be considered as comprising visible eye features as the signal component, while the noise component includes, inter alia, image sensor dark current noise, reflections from the environment on the eye, reflections from the environment on glasses lenses, reflections from LEDs on glasses lenses, and motion blur.
  • one or more cameras for capturing images of a subject
  • one or more light emitting diodes (LEDs) for selectively illuminating the subject during image capture
  • a controller configured to send an LED control signal to the one or more LEDs to control the drive current amplitude and pulse time of the one or more LEDs, wherein the controller selectively adjusts the drive current amplitude and pulse time based on the determined illumination characteristics of a previous captured image or images.
  • the controller is also configured to control an image sensor gain value of the one or more cameras based on the determined illumination characteristics of a previous captured image or images.
  • the controller selectively adjusts the drive current amplitude and pulse time according to a predefined pulse handling curve specified by a manufacturer of the one or more LEDs.
  • the pulse handling curve includes a non-linear relationship between the drive current amplitude and pulse time for a given duty cycle.
  • the illumination characteristics of a captured image include a brightness measure of the captured image.
  • the brightness measure includes an average pixel intensity of each pixel in the captured image.
  • the processing of at least a subset of the captured images includes the detection of sunglasses on the subject.
  • the illumination characteristics include a darkness or reflectivity measure of the sunglasses.
  • an imaging method including:
  • the method includes the step of selectively adjusting an image sensor exposure time of the one or more cameras based on the determined illumination characteristics.
  • the drive current amplitude and pulse time are selectively adjusted according to a predefined pulse handling curve specified by a manufacturer of the one or more LEDs.
  • the pulse handling curve includes a non-linear relationship between the drive current amplitude and pulse time for a given duty cycle.
  • the illumination characteristics of a captured image include a brightness measure of the captured image.
  • the brightness measure includes an average pixel intensity of each pixel in the captured image.
  • the method is configured to image one or both eyes of the subject.
  • the method may include the step of:
  • the method may also include the step of:
  • the step of processing at least a subset of the captured images includes determining an eye pixel region corresponding to a localized region around the subject's eye.
  • the illumination characteristics include a brightness measure of the eye pixel region.
  • the brightness measure includes an average pixel intensity of the pixels within the eye pixel region.
  • the illumination characteristics include a measure of contrast of pixels within the eye pixel region.
  • the step of processing at least a subset of the captured images includes the detection of sunglasses on the subject.
  • the illumination characteristics include a darkness or reflectivity measure of the sunglasses.
  • step d) includes:
  • step d) further includes:
  • step d) further includes:
  • the method includes the steps:
  • processing the integrated image to determine one or more eye tracking parameters of the subject's eye.
  • the number of previous captured images used to generate the integrated image is dependent on the determined illumination characteristics of a previous captured image or images.
  • an object tracking method including:
  • the combined image is formed by integrating the pixel values of corresponding pixels within the object pixel regions of a plurality of previous captured images based on the determined illumination characteristics of a previous captured image or images.
  • the combined image is formed by averaging the pixel values of corresponding pixels within the object pixel regions of a plurality of previous captured images based on the determined illumination characteristics of a previous captured image or images.
  • the illumination characteristics may include a brightness measure of the object pixel region.
  • the number of previous captured images used to generate the combined image is determined based on the determined illumination characteristics. In other embodiments, the number of previous captured images used to generate the combined image is determined based on a detected level of motion blur in the images.
  • Figure 1 is a perspective view of an interior of a vehicle illustrating an imaging system according to an embodiment of the invention
  • Figure 3 is a schematic functional diagram of the eye tracking system of Figures 1 and 2;
  • Figure 5 is a graph illustrating exemplary LED pulse handling curves for 5% and 10% duty cycles
  • Figure 6 is a process flow diagram illustrating sub-steps in a process of selectively adjusting the drive current amplitude and/or pulse time of LEDs
  • Figure 7 is a schematic comparison of camera shutter time adjustment with LED drive current adjustment
  • Figure 9 is a process flow diagram illustrating steps in an imaging method utilizing a multi frame integration or averaging technique for improving tracking robustness.
  • the present invention relates to an illumination system and method, preferably for use in tracking objects.
  • the invention will be described with particular reference to an eye tracking system for use in a vehicle to track the eyes of a vehicle driver. However, it will be appreciated that the invention is applicable to tracking eyes, head movement or other characteristics of a subject in vehicles and other scenarios such as in aircraft, vehicle or aircraft simulators, air traffic control facilities and consumer attention monitoring scenarios.
  • Figures 1 and 2 illustrate a system 100 for tracking the eyes of a driver 102 of a vehicle 104.
  • System 100 includes an infrared camera 106 that is positioned on or in the vehicle dash instrument display to capture images of the eyes of driver 102 at wavelengths in the infrared range.
  • Two horizontally spaced apart infrared light sources in the form of light emitting diodes (LEDs) 108 and 110 are disposed symmetrically about camera 106 to selectively illuminate the driver's face with infrared radiation during image capture by camera 106.
  • LEDs 108 and 110 may be replaced with other types of light sources such as directional filament lights or fluorescent lights.
  • system 100 is able to operate using only a single infrared illumination device, at the expense of potential performance degradation in the presence of glare, or using more than two LEDs.
  • Camera 106 is preferably a two dimensional camera having an image sensor that is configured to sense electromagnetic radiation in the infrared range. In other embodiments, camera 106 may be replaced by a single two dimensional camera having depth sensing capability or a pair of like cameras operating in a stereo configuration and calibrated to extract depth. Although camera 106 is preferably configured to image in the infrared wavelength range, it will be appreciated that, in alternative embodiments, camera 106 may image in the visible range. As will be described below, in the present invention, camera 106 includes an image sensor employing a two dimensional array of photosensitive pixels. As shown in Figure 3, a system controller 112 acts as the central processor for system 100 and is configured to perform a number of functions as described below.
  • Controller 112 is located within the dash of vehicle 104 and may be connected to or integral with the vehicle on-board computer. In another embodiment, controller 112 may be located within a housing or module together with camera 106 and LEDs 108 and 110. The housing or module is able to be sold as an after-market product, mounted to a vehicle dash and subsequently calibrated for use in that vehicle. In further embodiments, such as flight simulators, controller 112 may be an external computer or unit such as a personal computer.
  • Controller 112 may be implemented as any form of computer processing device or portion of a device that processes electronic data, e.g., from registers and/or memory to transform that electronic data into other electronic data that, e.g., may be stored in registers and/or memory.
  • controller 112 includes a microprocessor 114, executing code stored in memory 116, such as random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), and other equivalent memory or storage systems as should be readily apparent to those skilled in the art.
  • Microprocessor 114 of controller 112 includes a vision processor 118 and a device controller 120.
  • Vision processor 118 and device controller 120 represent functional elements which are performed by microprocessor 114.
  • vision processor 118 and device controller 120 may be realized as separate hardware such as Field Programmable Gate Arrays or microprocessors in conjunction with custom or specialized circuitry.
  • Device controller 120 is configured to control camera 106 and to selectively actuate LEDs 108 and 110 in sync with the exposure time of camera 106.
  • LEDs 108 and 110 are preferably electrically connected to device controller 120 but may also be controlled wirelessly by controller 120 through wireless communication such as Bluetooth™ or WiFi™ communication.
  • device controller 120 activates camera 106 to capture images of the face of driver 102 in a video sequence.
  • LEDs 108 and 110 are alternately activated and deactivated in synchronization with alternate frames of the images captured by camera 106 to illuminate the driver during image capture, as in the sketch below.
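  • As a minimal illustration of this alternating scheme, the helper below selects which LED to strobe for a given frame index; the even/odd assignment is an arbitrary illustrative choice, not one specified in this description.

```python
def led_for_frame(frame_index: int) -> int:
    """Alternate LEDs 108 and 110 frame by frame so that each captured
    frame is lit by exactly one of the two LEDs (illustrative only)."""
    return 108 if frame_index % 2 == 0 else 110
```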
  • device controller 120 and vision processor 118 provide for capturing and processing images of the driver to obtain driver state information such as drowsiness, attention and gaze position during ordinary operation of vehicle 104.
  • System 100 operates in varying lighting conditions, including bright and dark conditions, and when the driver is wearing dark or reflective sunglasses.
  • the present invention relates to controlling LEDs 108 and 110 to increase the robustness of system 100 to operate under these varying lighting conditions.
  • system 100 is configured to perform active LED control to enhance the signal to noise ratio (SNR) of captured images in terms of object detection and tracking such as head, facial feature and eye tracking.
  • the relevant signal component of the SNR refers to the strength or visibility of image features of the driver's eye or eyes as these are the features to be identified and tracked by system 100.
  • the noise component of the SNR includes undesired signal components such as ambient brightness from the environment (e.g. sun), reflections from the environment on the eyes or sunglasses worn by the driver and image sensor dark current noise prominent in dark conditions.
  • the inventors have identified that the non-linear operation of illuminating LEDs can be leveraged to increase or maximize the SNR under varying light conditions.
  • the LEDs can be controlled to maximize the instantaneous output power or irradiance and minimize pulse duration.
  • the LEDs can be controlled to maximize energy by increasing the pulse duration within limits of motion blur.
  • system 100 performs an eye tracking method 400, as illustrated in Figure 4.
  • device controller 120 generates a camera control signal to control camera 106 to capture images of the driver's eye or eyes.
  • the camera control signal includes parameters such as an image sensor exposure time and an image sensor gain value of the camera.
  • device controller 120 performs an auto-exposure algorithm that determines an appropriate image sensor exposure time based on a detected level of residual or background light.
  • the camera control signal may also include other parameters such as an image resolution and a frame rate.
  • device controller 120 generates a light emitting diode (LED) control signal to control LEDs 108 and 110 to selectively illuminate the subject's eye during image capture by camera 106.
  • the LED control signal controls at least a drive current amplitude and pulse time of the one or more LEDs.
  • LED manufacturers specify control limits on LEDs to avoid damage by excessive heat. These control limits are often specified through a "pulse handling curve" as shown in Figure 5.
  • the key variables for a design are drive current, pulse duration and duty cycle.
  • the pulse handling curve represents a predefined relationship between these key drive parameters for efficient device operation. The curves for each duty cycle must consider inter alia the LED model, the ambient temperature, and the thermal performance of the design incorporating the LED. If the LED is operated outside the constraints specified by the pulse handling curve, the expected lifetime of the LED is reduced.
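  • As an illustration only, such a curve can be represented in software as a lookup table of (pulse time, maximum permitted drive current) pairs. The Python sketch below uses made-up values that loosely echo the 2.4 A / 3 ms and 4 A / 1.5 ms figures used later in this description; it is not the curve of any actual LED model.

```python
import bisect

# Hypothetical pulse handling curve for one LED model at a 5% duty cycle.
# Real curves come from the LED manufacturer's datasheet.
PULSE_HANDLING_5PC = [
    # (pulse_time_ms, max_drive_current_a)
    (0.5, 5.0),
    (1.5, 4.0),
    (3.0, 2.4),
    (6.0, 1.5),
]

def max_current_for_pulse(pulse_time_ms, curve=PULSE_HANDLING_5PC):
    """Return the maximum permitted drive current for a given pulse time.

    Pulse times between tabulated points conservatively take the limit of
    the next-longer pulse; pulse times beyond the table are not permitted.
    """
    times = [t for t, _ in curve]
    i = bisect.bisect_left(times, pulse_time_ms)
    if i >= len(curve):
        raise ValueError(f"pulse time {pulse_time_ms} ms exceeds curve limit")
    return curve[i][1]

# Example: a 3 ms pulse at 5% duty cycle may be driven at up to 2.4 A.
print(max_current_for_pulse(3.0))  # -> 2.4
```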
  • the drive current amplitude and pulse time of LEDs 108 and 110 are dynamically controlled to increase the SNR of the imaging system.
  • the manner in which the drive current amplitude and pulse time of the LEDs are controlled is described in detail below.
  • This step involves determining an illumination characteristic of the image such as a measure of brightness of the image.
  • an illumination characteristic indicative of brightness includes an average pixel intensity of each pixel in the captured image or an average over a plurality of past images.
  • Another example involves first determining an eye pixel region corresponding to a localized region of pixels around the driver's eye and subsequently obtaining a brightness measure of the eye pixel region.
  • the brightness measure may be an average pixel intensity of the pixels within the eye region or may be an average of the eye region pixels over a plurality of past captured images. This may be performed for one eye or both eyes independently.
  • the illumination characteristics may also include a measure of contrast of pixels within the eye pixel region as a proxy for brightness.
  • the average pixel intensity of the eye pixel region is used, but where the eyes cannot be distinguished, the average pixel intensity of the entire image is used, as in the sketch below.
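  • As an illustrative sketch only, the brightness measure described above might be computed as follows; the (top, bottom, left, right) box format for eye regions is an assumption made for the example.

```python
import numpy as np

def brightness_measure(image, eye_regions=None):
    """Average pixel intensity used as the illumination characteristic.

    `image` is a 2-D array of sensor pixel intensities. `eye_regions` is an
    optional list of (top, bottom, left, right) pixel boxes around each
    detected eye; when the eyes cannot be distinguished it is None and the
    whole image is used instead, as described above.
    """
    if eye_regions:
        patches = [image[t:b, l:r] for (t, b, l, r) in eye_regions]
        return float(np.mean(np.concatenate([p.ravel() for p in patches])))
    return float(image.mean())
```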
  • Step 403 may also involve determining one or more eye tracking parameters of the subject's eye by running an eye detection and gaze determination algorithm, such as that performed in Edwards et al. These algorithms can be used to determine the eye pixel regions.
  • the eye tracking parameters include two dimensional eye position in the images, eye closure, eye gaze direction and point of regard.
  • the eye tracking parameters may also include head pose measurements such as head position and orientation. If the object being tracked is a subject's head or other facial features, then a detection and tracking routine for those characteristics is performed at step 403.
  • This processing may be performed on every captured image or a subset such as every two images. In some circumstances the eye or eyes cannot be accurately distinguished and these images may be discarded or stored in conjunction with a flag indicating the eyes could not be identified.
  • the drive current amplitude and/or the pulse time of LEDs 108 and 110 is selectively adjusted. These adjusted parameters are applied to the LED control signal, which is applied to the LEDs during subsequent image capture.
  • Step 404 includes a number of sub-steps as illustrated in Figure 6.
  • the measured brightness of the image or eye pixel region determined in step 403 is compared with a target brightness such as an average pixel intensity of the entire image or eye pixel region.
  • the comparison may be measured as an exposure error and specified in terms of exposure stops.
  • An exposure stop is a power-of-two measure used in photography to determine the camera shutter and sensor parameters required to double or halve the amount of light received by a sensor. For example, if the current brightness is half the target brightness, then the brightness should be increased by one exposure stop.
  • the LEDs 108 and 110 can be driven to increase or maximize power in bright conditions or increase or maximize output energy in dark conditions.
  • the new parameters are applied to LEDs 108 and 110 in the LED control signals for illumination during subsequent image capture.
  • the amount by which the LED parameters are adjusted is dependent on the difference between the brightness measure and the target brightness, and on the practical control limits of the LED determined by the pulse handling curve.
  • if device controller 120 determines that the LED drive current amplitude should be increased to 2.4 A then, for an LED driven at a 5% duty cycle, the LED drive current pulse time must be limited to a maximum of 3 ms to meet the device handling requirements.
  • the dynamic control of the LED drive current pulse time and drive current amplitude is performed based on the predefined pulse handling curve relationship for the LED.
  • the LED control is preferably performed in conjunction with control of camera 106. This includes controlling both an image sensor exposure time (or shutter period) and an image sensor gain value of camera 106.
  • the device controller 120 increases the camera shutter period by a predetermined amount.
  • the device controller 120 also adjusts the LED drive current pulse time to match the increased camera shutter period. Based on the pulse handling curve for a given duty cycle (e.g. Fig. 5), the LED drive current is adjusted to the maximum permitted value for that LED drive current pulse time.
  • the LEDs are driven with a drive current amplitude of 4 A, the shutter period is set at 1.5 ms, and the LED drive current pulse time is matched to the shutter period.
  • the shutter period is increased to 3 ms to increase the brightness of subsequent images.
  • the LED drive current pulse time is adjusted to match the new shutter period.
  • the LED drive current amplitude is reduced to 2.4 A.
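  • To first order, treating optical output power as roughly proportional to drive current, this adjustment delivers more total optical energy per frame: 4 A over 1.5 ms is proportional to 6 units of energy, whereas 2.4 A over 3 ms is proportional to 7.2 units, and the LED's higher electrical-to-optical efficiency at the lower drive current (noted at the end of this description) increases the gain further.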
  • the gain of the image sensor may be dynamically adjusted in a similar manner based on the brightness comparison performed in step 404. These adjustments to camera parameters are applied through the camera control signal to subsequent images.
  • an exposure error is determined which relates to the number of exposure stops between the brightness measured in step 403 and a target brightness. For example, if the measured brightness is half that of the target brightness, then the exposure error is -1.
  • consistent with this definition, the exposure error can be calculated as: exposure error = log2(measured pixel intensity / target pixel intensity).
  • the pixel intensity may relate to the average intensity of the pixels of the entire image or the average intensity of the pixels of an eye pixel region within the image.
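  • A minimal sketch of this calculation, using the base-2 relationship implied by the half-brightness example above:

```python
import math

def exposure_error(measured_intensity: float, target_intensity: float) -> float:
    """Exposure error in stops: -1.0 when the measured brightness is half
    the target, +1.0 when it is double, 0.0 when on target."""
    return math.log2(measured_intensity / target_intensity)

print(exposure_error(40.0, 80.0))  # -> -1.0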
  • a damping factor is determined. This damping factor determines the number of subsequent image frames over which the adjustment to the camera parameters (sensor exposure time and sensor gain) and LED parameters will be made.
  • the damping factor is used in order to prevent unstable oscillation in exposure control updates. For example, the damping results in the camera sensor exposure time or shutter period being adjusted to be part-way between the current value and the value that would achieve the target brightness with the current scene.
  • the damping factor is scaled based on the exposure error, to allow the system to quickly adapt to large exposure errors.
  • the damping factor also uses timestamps so the dynamics of the system are invariant to frame-rate.
  • the damping factor is calculated as a function of a damping parameter n, which ranges from 0 to a predefined maximum damping value, and an error scaling factor derived from the magnitude of the exposure error.
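  • The specification's exact damping and error scaling equations are not reproduced here; the sketch below is one plausible damped update consistent with the behaviour described above (part-way adjustment, faster adaptation for large errors, frame-rate invariance via timestamps). The function names and the scaling form are assumptions.

```python
def damped_update(current, target, error_stops, dt_seconds,
                  max_damping=8.0, time_constant=0.5):
    """Move `current` part-way toward `target` (illustrative only).

    Damping shrinks as |error_stops| grows, so large exposure errors are
    corrected quickly; folding in the frame interval `dt_seconds` keeps
    the control dynamics invariant to frame rate.
    """
    n = max_damping / (1.0 + abs(error_stops))       # assumed scaling form
    alpha = min(1.0, dt_seconds / (time_constant * (1.0 + n)))
    return current + alpha * (target - current)
```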
  • a negative exposure error means the current brightness is less than the target brightness while a positive exposure error means the current brightness is greater than the target brightness.
  • the subsequent process flow is divided into two branches depending on whether the current brightness is darker or brighter than the target brightness.
  • At step 804, a determination is made as to whether the shutter period (image sensor exposure time) is at its maximum based on limits of motion blur. If the shutter period is determined to be at its maximum, no further increase is available and, at step 805, an appropriate image sensor gain level is determined which would increase the current brightness to the target level. If the shutter period is determined to be less than its maximum available value, at step 806, an appropriate longer shutter period is determined which would increase the current image brightness to the target level. Based on this determined shutter period, the corresponding drive current pulse period of LEDs 108 and 110 is adjusted to match the new shutter period. Finally, at step 807, the LED drive current amplitude is reduced to the maximum level permitted by the LED pulse handling curve, as illustrated in Figures 5 and 7.
  • At step 808, a determination is made as to whether the image sensor gain is currently at its maximum level. If the image sensor gain is less than its maximum, at step 809, an appropriate gain level is determined which would decrease the current brightness to the target level. If the image sensor gain is at its maximum level, at step 810, an appropriate shorter shutter period is determined which would decrease the current image brightness to the target level. The corresponding drive current pulse period of LEDs 108 and 110 is adjusted to match the new shutter period. Finally, at step 811, the LED drive current amplitude is increased to the maximum level permitted by the LED pulse handling curve, as illustrated in Figures 5 and 7.
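  • A compact sketch of this two-branch flow (steps 804 to 811) is given below. The parameter names and the factor-of-two-per-stop scaling are illustrative assumptions, and `curve_max_current` stands in for the pulse handling curve lookup (e.g. the `max_current_for_pulse` sketch earlier).

```python
from dataclasses import dataclass

@dataclass
class CameraParams:
    shutter_ms: float      # image sensor exposure time
    max_shutter_ms: float  # ceiling imposed by motion blur limits
    gain: float
    max_gain: float
    min_gain: float = 1.0  # assumed lower bound, not from the patent

@dataclass
class LedParams:
    pulse_ms: float        # drive current pulse time
    current_a: float       # drive current amplitude

def apply_exposure_branches(err_stops, cam, led, curve_max_current):
    """One pass of the two-branch flow of steps 804-811 (sketch only).

    `err_stops` is the exposure error (negative: image darker than target).
    One stop doubles or halves brightness, so linear quantities are scaled
    by 2 ** (-err_stops).
    """
    scale = 2.0 ** -err_stops
    if err_stops < 0:  # too dark: steps 804-807
        if cam.shutter_ms >= cam.max_shutter_ms:
            cam.gain = min(cam.gain * scale, cam.max_gain)           # 805
        else:
            cam.shutter_ms = min(cam.shutter_ms * scale,
                                 cam.max_shutter_ms)                 # 806
            led.pulse_ms = cam.shutter_ms   # pulse matched to shutter
        led.current_a = curve_max_current(led.pulse_ms)              # 807
    elif err_stops > 0:  # too bright: steps 808-811
        if cam.gain < cam.max_gain:
            cam.gain = max(cam.gain * scale, cam.min_gain)           # 809
        else:
            cam.shutter_ms *= scale                                  # 810
            led.pulse_ms = cam.shutter_ms
        led.current_a = curve_max_current(led.pulse_ms)              # 811
    return cam, led
```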
  • At step 405, subsequent images are captured under the new illumination conditions (LED drive current amplitude and LED drive current pulse time) with corresponding camera parameters.
  • At step 406, the one or more eye tracking parameters of the driver's eye or eyes are output.
  • the output parameters may be stored in memory 116 or in a separate database for subsequent eye tracking analysis.
  • the parameters may be stored in conjunction with a time stamp and optionally in conjunction with the original or processed images.
  • the above described method 400 provides for dynamically controlling LEDs 108 and 110 during eye tracking by assessing current image brightness or illumination conditions and adjusting the LED drive current amplitude and/or drive current pulse time to enhance visibility of the eyes.
  • the pixel region relates to a localized region around the detected object or objects.
  • the combined image is processed by vision processor 118 to determine one or more eye tracking parameters of the subject's eye.
  • the number of images used to generate the integrated image may be determined based on the determined illumination characteristics (e.g. brightness) of the previous captured images and also the motion blur of the imaged scene (e.g. driver head movement). If the eyes are not stationary during the period of integration, the eye regions may optionally be aligned using feature tracking from the eye or surrounding face, sunglasses or glasses to maximize the alignment of eye features such as the iris, pupil and eyelids. A sketch of this integration follows below.
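  • A minimal sketch of such multi-frame integration, assuming the eye-region crops are equally sized and already aligned; the rule for choosing the frame count from brightness is an illustrative heuristic, not the patent's.

```python
import numpy as np

def integrate_eye_regions(eye_patches, brightness, target_brightness,
                          max_frames=8):
    """Combine recent eye-region crops into one higher-SNR image (sketch).

    `eye_patches` is a list of equally sized arrays, most recent last.
    Darker scenes use more frames, up to `max_frames`.
    """
    if brightness <= 0:
        n = max_frames
    else:
        n = int(np.clip(round(target_brightness / brightness), 1, max_frames))
    stack = np.stack(eye_patches[-n:]).astype(np.float32)
    return stack.mean(axis=0)  # averaging reduces uncorrelated sensor noise
```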
  • the multi-frame integration technique may optionally be implemented as multiple eye-region inputs to a tracking algorithm (e.g. neural network), which combines the information in the eye region input images during the tracking calculation.
  • the above described system and method provide for more robust eye (or other object) tracking under varying illumination conditions, such as conditions of high brightness and reflection, and in dark conditions.
  • the eye tracking can be performed when there is strong contrast on the face, and when the subject is wearing dark or highly reflective glasses.
  • the tight control of the LEDs also reduces cost, size and heat dissipation.
  • the invention leverages the fact that an LED has a nonlinear response and is more efficient at converting electrical energy into light at low drive currents. Using this property, the invention provides two dimensional control in terms of both the drive current amplitude and drive current pulse time to compensate for varying illumination conditions. In dark image conditions, higher output optical energies are achieved at the camera sensor by reducing the LED drive current amplitude and increasing the drive current pulse time over a longer camera exposure time. Conversely, in bright image conditions, higher instantaneous optical power can be achieved by increasing the LED drive current amplitude and reducing the drive current pulse time over a shorter camera exposure time.
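  • The two regimes can be summarized in a small helper; the specific pulse times are placeholders, and `curve_max_current` is again the assumed pulse handling lookup from the earlier sketch.

```python
def led_operating_point(scene_is_bright: bool, curve_max_current,
                        short_pulse_ms: float = 1.5,
                        long_pulse_ms: float = 3.0):
    """Bright scene: short pulse at the highest permitted current, to
    maximize instantaneous irradiance against ambient light. Dark scene:
    long pulse at a lower permitted current, to maximize delivered energy
    within motion blur limits (illustrative values only)."""
    pulse_ms = short_pulse_ms if scene_is_bright else long_pulse_ms
    return pulse_ms, curve_max_current(pulse_ms)
```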
  • "controller" or "processor" may refer to any device or portion of a device that processes electronic data, e.g., from registers and/or memory to transform that electronic data into other electronic data that, e.g., may be stored in registers and/or memory.
  • a "computer” or a “computing machine” or a “computing platform” may include one or more processors.
  • the term memory unit as used herein also encompasses a storage system such as a disk drive unit.
  • the processing system in some configurations may include a sound output device, and a network interface device.
  • the memory subsystem thus includes a computer-readable carrier medium that carries computer-readable code (e.g., software) including a set of instructions to cause performing, when executed by one or more processors, one of more of the methods described herein.
  • any one of the terms comprising, comprised of or which comprises is an open term that means including at least the elements/features that follow, but not excluding others.
  • the term comprising, when used in the claims should not be interpreted as being limitative to the means or elements or steps listed thereafter.
  • the scope of the expression a device comprising A and B should not be limited to devices consisting only of elements A and B.
  • Any one of the terms including or which includes or that includes as used herein is also an open term that also means including at least the elements/features that follow the term, but not excluding others. Thus, including is synonymous with and means comprising.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Ophthalmology & Optometry (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Studio Devices (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Eye Examination Apparatus (AREA)
  • Fittings On The Vehicle Exterior For Carrying Loads, And Devices For Holding Or Mounting Articles (AREA)
  • Psychiatry (AREA)
  • Social Psychology (AREA)

Abstract

An eye tracking system and method are disclosed. One embodiment provides a system (100) including a camera (106) for capturing images of an eye of a driver (102) of a vehicle, and light emitting diodes (LEDs) (108, 110) for selectively illuminating the driver's eye during image capture by the camera (106). A processor (118) is configured to process at least a subset of the captured images to determine one or more eye tracking parameters of the subject's eye, and to determine one or more illumination characteristics of the images. A controller (120) is configured to send an LED control signal to the LEDs (108, 110) to control the drive current amplitude and pulse time of the LEDs (108, 110). The controller (120) selectively adjusts the drive current amplitude and/or the pulse time based on the determined illumination characteristics of one or more previously captured images.
PCT/AU2018/050776 2017-10-31 2018-07-27 System and method for improving signal to noise ratio in object tracking under poor light conditions WO2019084595A1 (fr)

Priority Applications (3)

Application Number Priority Date Filing Date Title
DE112018005191.9T 2017-10-31 2018-07-27 System and method for improving the signal-to-noise ratio in object tracking under poor light conditions
JP2020523296A 2017-10-31 2018-07-27 System and method for improving the signal-to-noise ratio in object tracking under low illumination light conditions
US16/759,951 US11386709B2 (en) 2017-10-31 2018-07-27 System and method for improving signal to noise ratio in object tracking under poor light conditions

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
AU2017904419A AU2017904419A0 (en) 2017-10-31 System and method for improving signal to noise ratio in eye tracking under poor light conditions
AU2017904419 2017-10-31

Publications (1)

Publication Number Publication Date
WO2019084595A1 (fr) 2019-05-09

Family

Family ID: 66331179

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/AU2018/050776 WO2019084595A1 (fr) System and method for improving signal to noise ratio in object tracking under poor light conditions

Country Status (4)

Country Link
US (1) US11386709B2 (fr)
JP (1) JP7138168B2 (fr)
DE (1) DE112018005191T5 (fr)
WO (1) WO2019084595A1 (fr)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2021087573A1 (fr) * 2019-11-07 2021-05-14 High performance bright pupil eye tracking
EP3890300A1 (fr) * 2020-04-03 2021-10-06 Hovering Solutions Ltd. Self-propelled vehicle
CN113490316A (zh) * 2021-06-30 2021-10-08 浙江大华技术股份有限公司 Fill light system and fill light lamp

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2020148747A (ja) * 2019-03-15 2020-09-17 株式会社リコー Object detection device
US11851080B2 (en) 2021-02-03 2023-12-26 Magna Mirrors Of America, Inc. Vehicular driver monitoring system with posture detection and alert
US11780372B2 (en) 2021-03-01 2023-10-10 Magna Mirrors Of America, Inc. Vehicular driver monitoring system with driver monitoring camera and near IR light emitter at interior rearview mirror assembly
US11639134B1 (en) 2021-03-01 2023-05-02 Magna Mirrors Of America, Inc. Interior rearview mirror assembly with driver monitoring system
US11766968B2 (en) 2021-05-18 2023-09-26 Magna Mirrors Of America, Inc. Vehicular interior rearview mirror assembly with video mirror display and VRLC stack
US11930264B2 (en) 2021-05-18 2024-03-12 Magna Electronics Inc. Vehicular driver monitoring system with camera view optimization
WO2023034956A1 (fr) 2021-09-03 2023-03-09 Magna Electronics Inc. Système de surveillance d'habitacle de véhicule avec émetteur de lumière qui peut fonctionner de façon sélective pour des fonctions dms et oms
SE2250765A1 (en) * 2022-06-22 2023-12-23 Tobii Ab An eye tracking system

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100219327A1 (en) * 2009-03-01 2010-09-02 Arbore Mark A High speed quantum efficiency measurement apparatus utilizing solid state lightsource
US20150199003A1 (en) * 2014-01-14 2015-07-16 Microsoft Corporation Eye gaze detection with multiple light sources and sensors
US9131150B1 (en) * 2014-06-06 2015-09-08 Amazon Technologies, Inc. Automatic exposure control and illumination for head tracking
US20160198091A1 (en) * 2013-09-03 2016-07-07 Seeing Machines Limited Low power eye tracking system and method
WO2016131075A1 (fr) * 2015-02-20 2016-08-25 Seeing Machines Limited Réduction d'éblouissement
US20160358009A1 (en) * 2014-04-29 2016-12-08 Microsoft Technology Licensing, Llc Handling glare in eye tracking

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
AUPQ896000A0 (en) 2000-07-24 2000-08-17 Seeing Machines Pty Ltd Facial image processing system
JP4780088B2 (ja) 2007-11-09 2011-09-28 Face image capturing device, face image capturing method, and program therefor
JP5761074B2 (ja) 2012-02-24 2015-08-12 Imaging control device and program
WO2017017896A1 (fr) 2015-07-29 2017-02-02 京セラ株式会社 Image processing device, image capturing device, driver monitoring system, moving body, and image processing method


Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2021087573A1 (fr) * 2019-11-07 2021-05-14 High performance bright pupil eye tracking
EP3890300A1 (fr) * 2020-04-03 2021-10-06 Self-propelled vehicle
WO2021198387A1 (fr) 2020-04-03 2021-10-07 Self-propelled vehicle
CN113490316A (zh) * 2021-06-30 2021-10-08 Fill light system and fill light lamp
CN113490316B (zh) 2021-06-30 2024-04-12 Fill light system and fill light lamp

Also Published As

Publication number Publication date
US11386709B2 (en) 2022-07-12
DE112018005191T5 (de) 2020-06-18
US20200327323A1 (en) 2020-10-15
JP2021501517A (ja) 2021-01-14
JP7138168B2 (ja) 2022-09-15

Similar Documents

Publication Publication Date Title
US11386709B2 (en) System and method for improving signal to noise ratio in object tracking under poor light conditions
US10521683B2 (en) Glare reduction
EP2288287B1 Driver imaging apparatus and driver imaging method
US8797394B2 (en) Face image capturing apparatus
JP5867355B2 State monitoring device and state monitoring program
JP6698963B2 Occupant state detection device, occupant state detection system and occupant state detection method
CN113542529B 940nm LED flash synchronization for DMS and OMS
US20200169678A1 (en) Driver's eye position detecting device and method, imaging device having image sensor with rolling shutter driving system, and illumination control method thereof
US11046327B2 (en) System for performing eye detection and/or tracking
CN109565549B (zh) 用于运行内饰摄像机的方法和设备
CN110709281A Vehicle lamp, and control device and control method therefor
TWI608735B Image capturing device and brightness adjustment method
US20220377223A1 (en) High performance bright pupil eye tracking
US20210374443A1 (en) Driver attention state estimation
JPWO2016167137A1 Imaging device, imaging method, signal processing device, signal processing method, and program
JP2022553501A Control of exposure changes in a low-light environment
US10089731B2 (en) Image processing device to reduce an influence of reflected light for capturing and processing images
US20190045100A1 (en) Image processing device, method, and program
CN111214205A Controlling a light emitter to obtain optimal flash
US11933599B2 (en) Electronic device and method for controlling same
US20190289186A1 (en) Imaging device
US20210235005A1 (en) Monitoring camera, camera parameter determining method and storage medium
US20220335648A1 (en) Determination of Gaze Direction
US20200221000A1 (en) Camera system, vehicle and method for configuring light source of camera system
WO2014196166A1 Image capture control apparatus

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18872633

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2020523296

Country of ref document: JP

Kind code of ref document: A

122 Ep: pct application non-entry in european phase

Ref document number: 18872633

Country of ref document: EP

Kind code of ref document: A1