EP4035060B1 - Distance determination between an image sensor and a target area - Google Patents
- Publication number
- EP4035060B1 (application EP20772082.2A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- light source
- distance
- steering wheel
- image sensor
- image
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
- G06T7/521—Depth or shape recovery from laser ranging, e.g. using interferometry; from the projection of structured light
- G06V20/59—Context or environment of the image inside of a vehicle, e.g. relating to seat occupancy, driver state or inner lighting conditions
- G06T5/70—Denoising; Smoothing
- G06T7/11—Region-based segmentation
- G06T7/20—Analysis of motion
- G06T7/246—Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
- G06T7/60—Analysis of geometric attributes
- G06T7/73—Determining position or orientation of objects or cameras using feature-based methods
- G06T2207/30268—Vehicle interior (indexing scheme for image analysis or image enhancement)
Definitions
- A steering wheel 8 is located between the light source and the face 5, so that the steering wheel casts a shadow 9 on the face 5.
- the processing circuitry may also be connected to receive various sensor input to facilitate the processing. For example, information about current steering wheel angle may be provided to the processing circuitry.
- Although the light source 2, the image sensor 3 and the processing circuitry 4 are illustrated as separate components, some or all of these components may form part of a single unit in a different example. Other combinations, such as integrating the light source 2 or the processing circuitry 4 with the image sensor 3, or a remote processing circuitry 4 arranged to wirelessly receive data from the image sensor 3, are readily conceivable for the person skilled in the art.
- A specific portion, here the rim 20, of the steering wheel 8 is used to determine the distance D light source. The specific portion 20 has a known width "r", and the distance between the LED 2 and the steering wheel 8 is "d". When illuminated by the LED 2, the rim 20 creates a corresponding shadow portion 22 with a width "R" on the target area 5. The processing circuitry 4 is arranged to detect and identify the size of the cast shadow portion 22 in an acquired image. The incident light from the LED 2 projected on the rim 20 and the corresponding shadow 22 create two similar triangles, △abc ∼ △ade, as shown in figure 2b.
- a position "p" of a specific contour here an edge 31 of a spoke 32 of the steering wheel 8 is determined with respect to the optical axis A of the LED 2.
- a position "P” of a portion 34 of the shadow 9 corresponding to the position p is detected by the processing circuitry 4 in an acquired images.
- the distance between the LED 2 and the steering wheel 8 is labelled d.
- In step S1, the light source 2 is arranged such that a shadow of the steering wheel 8 is cast on the face 5 of the user 6. Part of the steering wheel 8 will always be located between the light source 2 and the face 5, although the shadow may only be visible for some positions of the steering wheel.
- In step S2, the image sensor 3 acquires one or several images including the face 5 with the shadow 9 of the steering wheel 8. The acquired image(s) are provided to the processing circuitry 4 for suitable image processing.
- The processing circuitry 4 then determines the distance between the image sensor 3 and the face 5, based on the relationships discussed above.
- In step S3, the processing circuitry 4 extracts the width R and/or position P of the shadow 9, for example by providing the captured shadow of the steering wheel as input to the image processing algorithm, whereby the corresponding pixel information can be extracted from the captured images. The extracted pixel information may include pixel brightness or colour, pixel dimensions, or pixel location on the target area, which in turn can be used to compute the size or position of the shadow. Information about the current steering wheel angle may facilitate the identification of contours of the shadow, e.g. an edge of a spoke, which will have an inclination correlated to the steering wheel angle.
- In step S4, the information from step S3 is used to determine the distance D light source between the light source 2 and the target area 5. Finally, using the known geometrical relationship between the light source 2 and the image sensor 3, the distance D sensor between the image sensor 3 and the target area 5 can be calculated in step S5.
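Steps S1-S5 can be summarised numerically. The sketch below assumes the shadow width R has already been extracted from the image (step S3), and models the light-source/image-sensor relationship as a simple offset along the viewing direction, which is only one of the possible geometries covered by the claims:

```python
def distance_to_face(d_wheel, r_spoke, R_shadow, sensor_offset):
    """Steps S4-S5: shadow width -> D light source -> D sensor.

    d_wheel:       known distance between the light source and the steering wheel
    r_spoke:       known width of the steering-wheel portion casting the shadow
    R_shadow:      shadow width extracted from the acquired image (steps S2-S3)
    sensor_offset: signed offset of the image sensor relative to the light
                   source along the viewing direction (an assumed, simplified
                   stand-in for the "known geometric relationship")
    All lengths in the same unit, e.g. millimetres.
    """
    d_light_source = d_wheel * R_shadow / r_spoke   # step S4: similar triangles
    return d_light_source + sensor_offset           # step S5: translate to sensor

# Hypothetical values: wheel 400 mm from the LED, 30 mm spoke casting a
# 45 mm shadow, sensor mounted 20 mm behind the LED.
print(distance_to_face(400, 30, 45, -20))  # 580.0
```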
Description
- The present invention relates to determining a distance between a single image sensor and a target area, in particular in an eye-tracking system.
- Traditionally, in monitoring systems for vehicles, such as vehicle occupant identification arrangements or eye-tracking systems, techniques to determine the distance from a sensor device to a vehicle occupant have focused on identifying the position of a body part of the vehicle occupant, e.g. the driver's head or face.
- In most head or eye tracking systems, the face of a user is illuminated with a light source having a central wavelength outside the visible spectrum, and images of the face are acquired by one or several imaging sensors. In case of two image sensors ("stereo imaging") a distance to the face can be determined by triangulation. However, in systems with only one camera, determining the distance to the face may be difficult.
- One approach is to measure (in the image) the inter-pupillary distance or a similar parameter, and estimate the distance based on an expected value of that parameter. One major problem with this distance determination approach is the uncertainty in the estimation due to the natural distribution of the measured parameter, which leads to a range estimation error of around 10%. A different choice of parameter, such as iris size or cornea curvature, may reduce the uncertainty in the measurements, but as soon as the driver's eyes are covered, e.g. by glasses, measuring these features is affected by the refraction of light and becomes unreliable.
- WO 2019/036751 A1 (SEEING MACHINES LTD [AU]), 28 February 2019, relates to a method for determining a distance between an image sensor and the face of a driver of a vehicle, with a different configuration of the sensor and a different computation.
- Thus, there is a need for improved distance determination between a driver's face and a single image sensor.
- It is an object of the present invention to overcome the above-mentioned problems, and to provide a feasible way to satisfactorily determine a distance between a single image sensor and a target area in a vehicle.
- According to a first aspect of the invention, this and other objects are achieved by a method for determining a distance Dsensor between an image sensor and a target area of a driver of a vehicle, comprising arranging a point light source at a known geometric relationship with respect to the image sensor, such that a steering wheel of the vehicle at least occasionally casts a shadow in the target area, determining a distance Dlight source between the light source and the target area based on an image, acquired by the image sensor, of the target area including the shadow, and on a geometric relationship (e.g. distance) between the light source and the steering wheel, and determining the distance Dsensor based on the distance Dlight source and the geometric relationship between the light source and the image sensor.
- According to a second aspect of the invention, this and other objects are achieved by an arrangement for determining a distance Dsensor between an image sensor and a target area of a driver of a vehicle, comprising a point light source arranged at a known geometric relationship with respect to the image sensor and arranged such that a steering wheel of the vehicle at least occasionally casts a shadow in the target area, processing circuitry configured to determine a distance Dlight source between the light source and the target area based on an image acquired by the image sensor of the target area including the shadow and a geometric relationship between the light source and the steering wheel, and determine the distance Dsensor based on the distance Dlight source and the geometric relationship between the light source and the image sensor.
- The invention is based on the realization that a steering wheel located between the light source(s) and the target area will cast a shadow, which in turn may be used to determine a distance between the light source and the target area if the geometrical relationships between 1) the light source(s) and the steering wheel, and 2) the light source(s) and the image sensor are known.
- With this approach, a reliable determination of the distance to the target area may be provided using only one single image sensor.
- In principle, a known geometrical relationship (e.g. distance) between the light source and the steering wheel makes it possible to determine a first distance between the light source(s) and the target area based on the location and/or size of the shadow, while the known geometrical relationship between the light source and the image sensor makes it possible to translate this first distance to the required distance Dsensor. (Of course, an explicit calculation of the first distance may not be necessary.)
- It is noted that the "known geometrical relationship" between the light source and the steering wheel in the following description for purposes of illustration is referred to as "distance". In many cases, this is also a correct description. In some cases, however, it is a simplification, as the steering wheel may not be in the same plane as the target area (e.g. face) and there is no single "distance". In such a case, the geometric relationship may be more complex than a scalar distance.
- In some embodiments, the distance Dlight source may be determined by detecting, in the image, a width of a shadow of a specific portion of the steering wheel, and calculating the distance Dlight source as d × R/r, where d is a distance between the light source and the steering wheel, r is a known width of the portion, and R is the detected width.
- In other embodiments, the distance Dlight source may be determined by detecting, in the image, a position P of a shadow of a specific contour of the steering wheel, and calculating the distance Dlight source as d × P/p, where d is a distance between the light source and the steering wheel, p is a position of the contour with respect to the optical axis of the light source, and P is the detected position, wherein the position p and the detected position P are both determined with respect to an optical axis of the light source.
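Both relations follow from similar triangles (light source at the apex, steering-wheel portion and its shadow as the two parallel cross-sections) and can be sketched directly. The numeric values below are hypothetical:

```python
def distance_from_shadow_width(d, r, R):
    """D light source from the widened shadow of a steering-wheel portion.

    d: distance between the light source and the steering wheel
    r: known width of the portion (e.g. a spoke)
    R: width of that portion's shadow as detected in the image
    By similar triangles: D light source = d * R / r.
    """
    return d * R / r

def distance_from_shadow_position(d, p, P):
    """Same idea, using the position of a shadow contour instead of a width.

    p: position of the contour with respect to the light source's optical axis
    P: detected position of the corresponding shadow edge (same axis)
    """
    return d * P / p

# Wheel 400 mm from the LED; a 30 mm spoke casts a 45 mm shadow,
# so the face is about 600 mm from the light source.
print(distance_from_shadow_width(400, 30, 45))       # 600.0
print(distance_from_shadow_position(400, 150, 225))  # 600.0
```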
- The typical design of a steering wheel, with a ring and spokes, and its rotational motion during driving, makes it highly suitable for the purposes of the invention.
- The portion or contour is part of a spoke of the steering wheel.
- An advantage of using the spoke is that the steering wheel typically is not adjustable sideways. The horizontal component of the radially extending spoke will thus be unaffected by up/down steering wheel adjustments.
- The distance between the steering wheel and the image sensor may be previously known, or determined during a suitable calibration procedure. For example, the steering wheel may be provided with reflective markers with a known separation, making it possible to detect the distance between the image sensor and the steering wheel from an image acquired with the image sensor. Alternatively, the distance between the image sensor and the steering wheel is determined based on the value of a geometric parameter identified in an image acquired by said image sensor, and on a pre-identified value of the same geometric parameter at a known distance.
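The last alternative is ordinary pinhole scaling: the apparent size of a fixed feature is inversely proportional to its distance, so one measurement at a known distance calibrates all later ones. A minimal sketch, with illustrative numbers:

```python
def wheel_distance_from_image(pixels_now, pixels_ref, distance_ref):
    """Estimate the image-sensor-to-steering-wheel distance.

    pixels_ref:   apparent size (in pixels) of a steering-wheel feature,
                  e.g. the gap between two reflective markers, measured
                  during calibration at the known distance distance_ref
    pixels_now:   apparent size of the same feature in the current image
    Apparent size scales as 1/distance for a pinhole camera, so:
    """
    return distance_ref * pixels_ref / pixels_now

# Markers appeared 120 px apart at a calibrated 500 mm; they now appear
# 100 px apart, so the wheel is about 600 mm from the sensor.
print(wheel_distance_from_image(100, 120, 500))  # 600.0
```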
- In some embodiments, knowledge of the geometry (e.g. the width of a spoke) of the steering wheel is desirable to facilitate the determination of the distance. Such geometry may be known, e.g. from CAD information relating to the steering wheel of the vehicle. Alternatively, when such information is not available, the geometry of the steering wheel may be determined from an image acquired by the image sensor, based on a known distance between the image sensor and the steering wheel.
- In some embodiments, an angular position of the steering wheel is detected to facilitate the determination of the distance. If the geometry of the steering wheel and its angular position are known, the angle of the side of the spoke is known, which facilitates identification of the spoke shadow. Also, the position of the spoke is known, so it is not critical to be able to acquire an image of the steering wheel. This allows an implementation in which the image sensor is unable to "see" the steering wheel.
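As an illustration of how the angular position helps: for a point light source, a straight spoke edge rotates with the wheel, so its expected shadow inclination can be predicted before searching the image. The sketch assumes, for simplicity, that the shadow falls on a plane roughly parallel to the wheel; angles are in degrees and the zero-angle convention is hypothetical:

```python
def expected_edge_angle(spoke_edge_angle_at_zero, wheel_angle):
    """Predicted inclination of the spoke-shadow edge in the image.

    spoke_edge_angle_at_zero: edge inclination (e.g. from the wheel's CAD
                              geometry) when the wheel is centred
    wheel_angle:              current steering-wheel angle, as reported by
                              the vehicle's steering-angle sensor
    An undirected line repeats every 180 degrees, hence the modulo.
    """
    return (spoke_edge_angle_at_zero + wheel_angle) % 180.0

print(expected_edge_angle(10.0, 35.0))   # 45.0
print(expected_edge_angle(170.0, 30.0))  # 20.0
```

The predicted angle can then be used to restrict an edge-detection search to lines of approximately that inclination.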
- Alternatively, in some embodiments, the shadow pattern, geometry of the steering wheel, the distance between the steering wheel and the image sensor, or the angular position of the steering wheel may be determined by providing appropriate output of the measurements (e.g. geometric parameters identified in an acquired image) as a function over time to an artificial neural network trained on the variety of geometrical dimensions and locations of the steering wheel.
- The invention is defined in the appended claims.
- The present invention will be described in more detail with reference to the appended drawings, showing currently preferred embodiments of the invention.
- Figure 1 is a schematic illustration of an eye-tracking system in accordance with an embodiment of the present invention.
- Figure 2a is a side view of the eye-tracking system in figure 1.
- Figure 2b is a schematic illustration of geometrical relationships in figure 2a.
- Figure 3a is another schematic illustration of the eye-tracking system in figure 1.
- Figure 3b is a schematic illustration of geometrical relationships in figure 3a.
- Figure 4 is a flowchart of a method in accordance with an embodiment of the present invention.
- In the following detailed description, some embodiments of the present invention will be described. However, it is to be understood that features of the different embodiments are exchangeable between the embodiments and may be combined in different ways, unless anything else is specifically indicated. Even though in the following description numerous specific details are set forth to provide a more thorough understanding of the present invention, it will be apparent to one skilled in the art that the present invention may be practiced without these specific details.
- The basics and conventional techniques in electronics, sensor systems, image analysis, signal processing, data communication systems, image acquisition systems, and other components needed to carry out the invention are considered readily understood by the person skilled in the art, and for the sake of brevity further explanations and details are therefore omitted in this description.
- In other instances, well known constructions or functions are not described in detail, so as not to obscure the present invention.
-
Figure 1 ,2a ,3a show an eye-tracking system 1 for detecting eyes of a user. The system comprises alight source 2, animage sensor 3 andprocessing circuitry 4. Theimage sensor 3 is configured to acquire images of a target area, here theface 5 of auser 6. The processing circuitry is configured to identify and detect theeyes 7 of theuser 6. - Although a single light source is illustrated in
figures 1 ,2a ,3a , it is possible to provide more than one light source. Alternatingly illuminating the target area with at least two light sources may be advantageous e.g. for purposes of reflex reduction/elimination. Thelight source 2 may be any suitable type of light source, including but not limited to light emitting diode (LED)s operating in different wavelength domains preferably having a central wavelength in the infrared (IR) or near IR part of the light spectrum, eye-safe laser sources with divergence lenses or the like. In this preferred embodiment thelight source 2 is an IR LED. In order to eliminate or minimize interference from ambient light the light source is used in combination with a band-pass filter having a pass-band centered around the center IR wavelength. The center wavelength can be in the near IR region, e.g. 840 nm or 940 nm. The filter has a pass-band enabling capturing of most light emitted by the light source, while at the same time blocking most ambient light. - The
image sensor 3 can be a camera or a photodetector, such as a semiconductor image sensor, photodetector, thermal detector, PIN diode or avalanche diode. The camera 3 may further be a charge-coupled device (CCD) sensor array, a complementary metal-oxide-semiconductor (CMOS) sensor array, or similar. In the case of a narrow-band light source, such as an IR LED as mentioned above, the image sensor 3 may be provided with a pass-band filter (not shown) having a pass-band corresponding to the spectrum of the light source 2. - Further, the
light source 2 and image sensor 3 may be arranged so that the transmitted signal is synchronously demodulated to filter out ambient IR noise, thus improving the quality of captured images even with low-intensity IR LEDs. The IR light transmitted from the LED can be a continuous wave over a predetermined time period or light pulsed at a predetermined frequency. The sensitivity and resolution of the image sensor 3 may be selected such that the recorded image is suitable for further data processing, such as image processing, or for use as input to machine-vision algorithms. - The
light source 2 and image sensor 3 may be arranged to be time-synchronized. The processing circuitry 4 may be configured to operate computer-vision systems and image-processing algorithms for processing and analysing images acquired from the image sensor 3. - Further, the
processing circuitry 4 may be configured to perform other functions, such as determining the position of the head of the user. For example, when the driver moves his or her head, the processing circuitry 4 may be configured to use the position information to determine the direction in which the head moves and compute changes in the distance between the target area 7 and the image sensor 3. The processing circuitry 4 is also configured for sending and/or receiving data between different components of the vehicle or remote data centres. Further, the processing circuitry 4 is in this case connected to the IR image sensor 3 and other components via wired connections. However, it is conceivable to use wireless connections instead of wired connections to access remote components. - A
steering wheel 8 is located between the light source and the face 5, so that the steering wheel casts a shadow 9 on the face 5. - The processing circuitry may also be connected to receive various sensor inputs to facilitate the processing. For example, information about the current steering-wheel angle may be provided to the processing circuitry.
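The synchronous demodulation of a pulsed light source mentioned earlier can be sketched in a much simplified form: a frame captured with the LED off is subtracted from a frame captured with it on, leaving approximately only the light source's contribution and suppressing ambient IR. The function name and frame values below are illustrative assumptions, not part of the patent; real demodulation would operate on the sensor signal itself.

```python
def ambient_rejected(lit_frame, unlit_frame):
    """Crude sketch of ambient-IR rejection for a pulsed light source:
    subtract a frame taken with the LED off from one taken with it on.
    Frames are lists of rows of pixel intensities; negative differences
    are clamped to zero."""
    return [[max(on - off, 0) for on, off in zip(row_on, row_off)]
            for row_on, row_off in zip(lit_frame, unlit_frame)]

# Toy 2x2 frames: the shadowed pixels barely change between exposures.
lit   = [[120, 200], [115, 90]]
unlit = [[100, 100], [100, 80]]
print(ambient_rejected(lit, unlit))  # [[20, 100], [15, 10]]
```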
- As should be appreciated, even though in this example the
light source 2, the image sensor 3 and the processing circuitry 4 are illustrated as separate components, this may or may not be the case in a different example. Other combinations, such as integrating the light source 2 or the processing circuitry 4 with the image sensor 3, or a remote processing circuitry 4 arranged to wirelessly receive data from the image sensor 3, etc., are readily conceivable for the person skilled in the art. Thus, some or all of these components may form part of a single unit in a different example. - With reference to
figure 2, a specific portion, here the rim 20 of the steering wheel 8, is used to determine the distance Dlight source. In this example the specific portion 20 has a known width "r", and the distance between the LED 2 and the steering wheel 8 is "d". When illuminated by the LED 2, the rim 20 creates a corresponding shadow portion 22 with a width "R" on the target area 7. - In this embodiment, the
processing circuitry 4 is arranged to detect and identify the size of the cast shadow portion 22 in an acquired image. The incident light from the LED 2 projected on the rim 20 and the corresponding shadow 22 create two similar triangles △abc ~ △ade, as shown in figure 2b. By the triangle proportionality theorem, the corresponding altitudes and sides of the two similar triangles are proportional to each other, and in this example the processing circuitry 4 determines the distance Dlight source as Dlight source = d × R/r. - In the embodiment in
figure 3a, a position "p" of a specific contour, here an edge 31 of a spoke 32 of the steering wheel 8, is determined with respect to the optical axis A of the LED 2. A position "P" of a portion 34 of the shadow 9 corresponding to the position p is detected by the processing circuitry 4 in an acquired image. Just as in figure 2a, the distance between the LED 2 and the steering wheel 8 is labelled d. - Again, and with reference to
figure 3b, two similar triangles △apc ~ △aPe are formed, where c and e are located on the optical axis A. If the positions p and P are expressed as normal distances to the optical axis A, the distance Dlight source may be determined as Dlight source = d × P/p. It should be appreciated that determination of the distance Dlight source is also readily conceivable by choosing other similar triangles. - The operation of the
system 1 in figures 1, 2a and 3a will be described in the following with reference to the flowchart in figure 4. - In step S1, the
light source 2 is arranged such that a shadow of the steering wheel 8 is cast on the face 5 of the user 6. In some installations, part of the steering wheel 8 will always be located between the light source 2 and the face 5. In other installations, the shadow may only be visible for some positions of the steering wheel. - In step S2, the
image sensor 3 acquires one or several images including the face 5 with the shadow 9 of the steering wheel 8. The acquired image(s) is provided to the processing circuitry 4 for suitable image processing. - In steps S3-S5, the
processing circuitry 4 determines the distance between the image sensor 3 and the face 5, based on the relationships discussed above. First, in step S3, the processing circuitry 4 extracts the width R and/or position P of the shadow 9. For example, by providing the captured shadow of the steering wheel as input to the image-processing algorithm, the corresponding pixel information can be extracted from the captured images. The extracted pixel information may include pixel brightness or colour, pixel dimensions, or pixel location on the target area, which in turn can be used to compute the size or position of the shadow. Information about the current steering-wheel angle may facilitate the identification of contours of the shadow, e.g. an edge of a spoke, which will have an inclination correlated to the steering-wheel angle. - Then, in step S4, the information from S3 is used to determine the distance Dlight source between the
light source 2 and the target area 5. Finally, using a known geometrical relationship between the light source 2 and the image sensor 3, the distance Dsensor between the image sensor 3 and the target area 7 can be calculated accordingly in step S5. - It is noted that the process in
figure 4 typically only needs to be performed once per driving session. After an initial determination of the distance Dsensor, continuous distance measurement can be made using some invariable parameter (e.g. iris size). By measuring this parameter at the initially determined distance, any change of distance may be detected by monitoring this parameter. - The person skilled in the art realizes that the present invention is by no means limited to the preferred embodiments described above. On the contrary, many modifications and variations are possible within the scope of the appended claims. For example, the details of the eye-tracking system disclosed herein are not critical for the operation of the present invention.
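The geometric relations of figures 2b and 3b, the step-S5 conversion, and the iris-size tracking described above can be sketched as follows. The patent itself only fixes the two ratios Dlight source = d × R/r and Dlight source = d × P/p; all numeric values, the function names, and the baseline geometry assumed in dist_sensor (a law-of-cosines model of the light-source/sensor relationship) are illustrative assumptions.

```python
import math

def dist_from_shadow_width(d, r, R):
    """D_light_source = d * R / r (similar triangles, figure 2b):
    d = light-source-to-steering-wheel distance, r = known width of the
    rim/spoke, R = width of its shadow detected in the image."""
    return d * R / r

def dist_from_shadow_position(d, p, P):
    """D_light_source = d * P / p (similar triangles, figure 3b):
    p and P are normal distances of the spoke contour and of its shadow
    to the optical axis A."""
    return d * P / p

def dist_sensor(d_light, baseline, angle_deg=90.0):
    """Step S5 sketch: sensor-to-target distance from the
    light-source-to-target distance. Assumes (hypothetically) that the
    sensor sits at `baseline` from the light source, meeting the
    light-source-to-target line at angle_deg; the patent only requires
    that the geometric relationship be known."""
    a = math.radians(angle_deg)
    return math.sqrt(d_light**2 + baseline**2
                     - 2.0 * d_light * baseline * math.cos(a))

def track_distance(d_init, size_init, size_now):
    """Continuous tracking via an invariable parameter such as iris
    size: under a pinhole-camera approximation apparent size scales as
    1/distance, so D_now = D_init * size_init / size_now."""
    return d_init * size_init / size_now

# Illustrative numbers: a 30 mm spoke 0.35 m from the LED casting a
# 60 mm shadow puts the face 0.70 m from the light source; with a 5 cm
# perpendicular baseline the sensor-to-face distance is nearly equal.
d_ls = dist_from_shadow_width(0.35, 0.030, 0.060)   # 0.70 m
print(round(dist_sensor(d_ls, 0.05), 4))            # 0.7018
print(round(track_distance(d_ls, 24, 21), 3))       # 0.8
```

The same d_ls would be obtained from the position variant, e.g. dist_from_shadow_position(0.35, 0.10, 0.20), since both methods express the same pair of similar triangles.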
Claims (8)
- A method for determining a distance Dsensor between an image sensor and a face of a driver of a vehicle, comprising:
arranging a point light source at a known geometric relationship with respect to the image sensor, characterized by:
said point light source positioned such that spokes of a steering wheel of the vehicle at least occasionally cast a shadow in the face;
determining a distance Dlight source between the point light source and the face based on an image acquired by the image sensor of the face including the shadow, and a geometric relationship between said point light source and said steering wheel; and
determining the distance Dsensor based on said distance Dlight source and said known geometric relationship between the point light source and the image sensor,
wherein the distance Dlight source either is determined by:
- detecting, in the image, a width (R) of a shadow of a spoke of the steering wheel, and
- calculating the distance Dlight source as Dlight source = d × R/r, where d is a known distance between the point light source and the steering wheel, r is a known width of said spoke, and R is the detected width,
or is determined by:
- detecting, in the image, a position P of a shadow of a specific contour of a spoke of the steering wheel, and
- calculating the distance Dlight source as Dlight source = d × P/p, where d is a known distance between the point light source and the steering wheel, p is a known position of the contour, and P is the detected position,
wherein the position p and the detected position P are both determined with respect to an optical axis of the point light source. - The method according to claim 1, further comprising:
determining a distance between the image sensor and the steering wheel, and
determining the distance d between the point light source and the steering wheel based on said distance between the image sensor and the steering wheel and said known relationship between the point light source and the image sensor.
- The method according to claim 2, wherein the distance between said image sensor and said steering wheel is determined based on a value of a geometric parameter identified in an image acquired by said image sensor, and on a pre-identified value of said geometric parameter at a known distance.
- The method according to claim 2, wherein the distance between said image sensor and said steering wheel is determined based on positions of a set of markers in an image acquired by said image sensor, said markers being arranged on said structure at predefined positions.
- The method according to one of the preceding claims, wherein the method further includes determining a geometry of said steering wheel from an image acquired by said image sensor, and based on said distance between said image sensor and said structure.
- The method according to one of the preceding claims, wherein the method further includes detecting an angular position of the steering wheel.
- An arrangement for determining a distance Dsensor between an image sensor and a face of a driver of a vehicle, comprising:
a point light source arranged at a known geometric relationship with respect to the image sensor, characterized by:
said point light source arranged such that a steering wheel of the vehicle at least occasionally casts a shadow in the face,
processing circuitry configured to:
determine a distance Dlight source between the point light source and the face based on an image acquired by the image sensor of the face including the shadow, and a geometric relationship between said light source and said steering wheel, and
determine the distance Dsensor based on said distance Dlight source and said known geometric relationship between the point light source and the image sensor,
wherein the distance Dlight source either is determined by:
- detecting, in the image, a width (R) of a shadow of a spoke of the steering wheel, and
- calculating the distance Dlight source as Dlight source = d × R/r, where d is a known distance between the point light source and the steering wheel, r is a known width of said spoke, and R is the detected width,
or is determined by:
- detecting, in the image, a position P of a shadow of a specific contour of a spoke of the steering wheel, and
- calculating the distance Dlight source as Dlight source = d × P/p, where d is a known distance between the point light source and the steering wheel, p is a known position of the contour, and P is the detected position,
wherein the position p and the detected position P are both determined with respect to an optical axis of the point light source. - The arrangement according to claim 7, wherein the system further comprises an angular sensor for detecting an angular position of the steering wheel.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP19199873 | 2019-09-26 | ||
PCT/EP2020/076370 WO2021058455A1 (en) | 2019-09-26 | 2020-09-22 | Distance determination between an image sensor and a target area |
Publications (2)
Publication Number | Publication Date |
---|---|
EP4035060A1 EP4035060A1 (en) | 2022-08-03 |
EP4035060B1 true EP4035060B1 (en) | 2024-05-22 |
Family
ID=68072145
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP20772082.2A Active EP4035060B1 (en) | 2019-09-26 | 2020-09-22 | Distance determination between an image sensor and a target area |
Country Status (6)
Country | Link |
---|---|
US (1) | US12062199B2 (en) |
EP (1) | EP4035060B1 (en) |
JP (1) | JP7500714B2 (en) |
KR (1) | KR20220063267A (en) |
CN (1) | CN114467126A (en) |
WO (1) | WO2021058455A1 (en) |
Family Cites Families (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7106898B2 (en) | 1999-12-06 | 2006-09-12 | California Institute Of Technology | 3D scanning using shadows |
JP2007050834A (en) | 2005-08-19 | 2007-03-01 | Denso Corp | Light shielding device for vehicle |
JP2007285778A (en) | 2006-04-13 | 2007-11-01 | Roland Dg Corp | Distance measuring method and surface shape measuring device |
JP5516096B2 (en) | 2010-06-09 | 2014-06-11 | 横浜ゴム株式会社 | Steering angle detection device and steering angle detection method |
US8938099B2 (en) | 2010-12-15 | 2015-01-20 | Canon Kabushiki Kaisha | Image processing apparatus, method of controlling the same, distance measurement apparatus, and storage medium |
JP6107264B2 (en) | 2013-03-15 | 2017-04-05 | 株式会社リコー | Imaging unit, color measuring device, image forming apparatus, color measuring system, and distance measuring method |
JP6342874B2 (en) | 2015-11-24 | 2018-06-13 | 矢崎総業株式会社 | Image recognition device |
US10349032B2 (en) * | 2016-09-30 | 2019-07-09 | Veoneer Us, Inc. | Vehicle occupant head positioning system |
JP6737212B2 (en) | 2017-03-14 | 2020-08-05 | オムロン株式会社 | Driver state estimating device and driver state estimating method |
US20200210733A1 (en) | 2017-08-22 | 2020-07-02 | Seeing Machines Limited | Enhanced video-based driver monitoring using phase detect sensors |
-
2020
- 2020-09-22 WO PCT/EP2020/076370 patent/WO2021058455A1/en unknown
- 2020-09-22 US US17/762,602 patent/US12062199B2/en active Active
- 2020-09-22 KR KR1020227013477A patent/KR20220063267A/en active Search and Examination
- 2020-09-22 JP JP2022519232A patent/JP7500714B2/en active Active
- 2020-09-22 EP EP20772082.2A patent/EP4035060B1/en active Active
- 2020-09-22 CN CN202080067491.5A patent/CN114467126A/en active Pending
Also Published As
Publication number | Publication date |
---|---|
CN114467126A (en) | 2022-05-10 |
KR20220063267A (en) | 2022-05-17 |
EP4035060A1 (en) | 2022-08-03 |
US20220366585A1 (en) | 2022-11-17 |
JP2022549694A (en) | 2022-11-28 |
WO2021058455A1 (en) | 2021-04-01 |
US12062199B2 (en) | 2024-08-13 |
JP7500714B2 (en) | 2024-06-17 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10288734B2 (en) | Sensing system and method | |
EP1452127B1 (en) | Apparatus for detecting pupils | |
US20200210733A1 (en) | Enhanced video-based driver monitoring using phase detect sensors | |
EP3371781B1 (en) | Systems and methods for generating and using three-dimensional images | |
US9033502B2 (en) | Optical measuring device and method for capturing at least one parameter of at least one eye wherein an illumination characteristic is adjustable | |
US7695138B2 (en) | Safe eye detection | |
CN104127302B (en) | A kind of visually impaired people's walking along the street safety navigation method | |
JP4604190B2 (en) | Gaze detection device using distance image sensor | |
CN107894243A (en) | For carrying out the photoelectric sensor and method of optical detection to monitored area | |
US7583863B2 (en) | Method and system for wavelength-dependent imaging and detection using a hybrid filter | |
EP3218785A1 (en) | Eyewear-mountable eye tracking device | |
US20060279745A1 (en) | Color imaging system for locating retroreflectors | |
US20180110423A1 (en) | Optical laser speckle sensor for measuring a blood perfusion parameter | |
EP4035060B1 (en) | Distance determination between an image sensor and a target area | |
FI124966B (en) | Ophthalmic Apparatus and Eye Measurement Procedure | |
JP6555707B2 (en) | Pupil detection device, pupil detection method, and pupil detection program | |
WO2018170538A1 (en) | System and method of capturing true gaze position data | |
JP2016051317A (en) | Visual line detection device | |
EP3801196B1 (en) | Method and system for glint/reflection identification | |
CN113879321B (en) | Driver monitoring device and driver monitoring method | |
JP6597467B2 (en) | Face orientation measuring device | |
US20220121278A1 (en) | Connected device with eye tracking capabilities | |
CN118567103A (en) | Electronic device with nose tracking sensor | |
WO2023195872A1 (en) | Method and system for determining heartbeat characteristics | |
WO2016185486A1 (en) | Wheel alignment measurement system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: UNKNOWN |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE |
|
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE |
|
17P | Request for examination filed |
Effective date: 20220331 |
|
AK | Designated contracting states |
Kind code of ref document: A1 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
DAV | Request for validation of the european patent (deleted) | ||
DAX | Request for extension of the european patent (deleted) | ||
REG | Reference to a national code |
Ref country code: DE Ref legal event code: R079 Ref document number: 602020031330 Country of ref document: DE Free format text: PREVIOUS MAIN CLASS: G06K0009000000 Ipc: G06V0020590000
|
GRAP | Despatch of communication of intention to grant a patent |
Free format text: ORIGINAL CODE: EPIDOSNIGR1 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: GRANT OF PATENT IS INTENDED |
|
RIC1 | Information provided on ipc code assigned before grant |
Ipc: G06V 20/59 20220101AFI20231218BHEP |
|
INTG | Intention to grant announced |
Effective date: 20240110 |
|
GRAS | Grant fee paid |
Free format text: ORIGINAL CODE: EPIDOSNIGR3 |
|
GRAA | (expected) grant |
Free format text: ORIGINAL CODE: 0009210 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE PATENT HAS BEEN GRANTED |
|
RAP3 | Party data changed (applicant data changed or rights of an application transferred) |
Owner name: SMART EYE AB |
|
P01 | Opt-out of the competence of the unified patent court (upc) registered |
Effective date: 20240326 |
|
AK | Designated contracting states |
Kind code of ref document: B1 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
REG | Reference to a national code |
Ref country code: GB Ref legal event code: FG4D |
|
REG | Reference to a national code |
Ref country code: CH Ref legal event code: EP |
|
REG | Reference to a national code |
Ref country code: DE Ref legal event code: R096 Ref document number: 602020031330 Country of ref document: DE |
|
REG | Reference to a national code |
Ref country code: IE Ref legal event code: FG4D |
|
REG | Reference to a national code |
Ref country code: LT Ref legal event code: MG9D |