EP3900320A1 - Image capture device and associated system for monitoring a driver - Google Patents
Image capture device and associated system for monitoring a driver
- Publication number
- EP3900320A1 (application EP19791287.6A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- image
- capture device
- infrared
- image capture
- brightness
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K28/00—Safety devices for propulsion-unit control, specially adapted for, or arranged in, vehicles, e.g. preventing fuel supply or ignition in the event of potentially dangerous conditions
- B60K28/02—Safety devices for propulsion-unit control, specially adapted for, or arranged in, vehicles, e.g. preventing fuel supply or ignition in the event of potentially dangerous conditions responsive to conditions relating to the driver
- B60K28/06—Safety devices for propulsion-unit control, specially adapted for, or arranged in, vehicles, e.g. preventing fuel supply or ignition in the event of potentially dangerous conditions responsive to conditions relating to the driver responsive to incapacity of driver
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/70—Circuitry for compensating brightness variation in the scene
- H04N23/75—Circuitry for compensating brightness variation in the scene by influencing optical camera components
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/10—Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths
- H04N23/11—Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths for generating image signals from visible and infrared light wavelengths
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/10—Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths
- H04N23/12—Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths with one sensor only
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/50—Constructional details
- H04N23/55—Optical parts specially adapted for electronic image sensors; Mounting thereof
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/70—Circuitry for compensating brightness variation in the scene
- H04N23/71—Circuitry for evaluating the brightness variation
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/70—Circuitry for compensating brightness variation in the scene
- H04N23/74—Circuitry for compensating brightness variation in the scene by influencing the scene brightness using illuminating means
Definitions
- the present invention relates to an image capture device. It also relates to a device for monitoring a driver of a vehicle comprising an image capture device.
- Image capture devices for viewing a given scene both in the visible and in the infrared domain, with the same image sensor, have been developed recently.
- the image sensor of such a device, which is in a sense hybrid, is sometimes called an "RGB-IR" image sensor (for Red Green Blue - InfraRed).
- This image sensor comprises a matrix of photosensitive pixels and a network of elementary optical filters coinciding with these different photosensitive pixels.
- Part of these elementary optical filters are color filters. They each transmit part of the visible radiation received by the image sensor, this part corresponding in practice to a red, a green, or a blue. These various elementary color filters make it possible to acquire a color image, for example of the “RGB” type.
- the other elementary optical filters of the image sensor are transparent, at least in part, in the infrared domain. They make it possible to acquire an image of the scene in question in the infrared domain.
- the color image and the image in the infrared domain, which contain complementary information, are thus obtained with the same sensor, which is advantageous in particular in terms of cost and size.
- the ambient brightness in the environment of an image capture device is often clearly different in the visible range and in the infrared range. With such an image sensor, it is therefore generally not possible to obtain an optimal exposure both for the color image and for the image in the infrared range.
- the present invention provides an image capture device comprising an image sensor, which comprises: a network of optical filters receiving electromagnetic radiation and comprising first filter elements each capable of transmitting a first part of the electromagnetic radiation included in a given wavelength range of the infrared, as well as second filter elements each capable of transmitting at least one component of a second part of the electromagnetic radiation located in the visible, and
- a matrix of photosensitive pixels comprising first photosensitive pixels arranged so as to capture the first part of the electromagnetic radiation transmitted by the first filter elements, as well as second photosensitive pixels arranged so as to capture the component transmitted by the second filter elements, each of the first and second photosensitive pixels being able to generate an electrical signal representative of the power of the electromagnetic radiation it picks up.
- the image capture device further comprises:
- a lighting device configured to emit infrared radiation in a field of vision of the image capture device, said infrared radiation being located at least partly in said range of wavelengths transmitted by the first filter elements, and
- a computer programmed to execute the following steps: a) determining an ambient brightness in the infrared range, b) controlling the power of the infrared radiation emitted by the lighting device as a function of said ambient brightness in the infrared range, c) acquiring the electrical signals generated by the first and second photosensitive pixels, composing a first image from the electrical signals generated by the first photosensitive pixels, and composing a second image from the electrical signals generated by the second photosensitive pixels.
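Steps a) to c) amount to a simple acquisition-and-control loop. The sketch below is only an illustration of that loop under assumed names and gain values (a mean-based brightness estimate and a proportional power adjustment); the patent does not prescribe this implementation:

```python
def mean(values):
    # average brightness of a list of pixel values
    return sum(values) / len(values)

class IrLed:
    """Hypothetical infrared illuminator with a normalized power setting."""
    def __init__(self, power=0.5):
        self.power = power  # 0.0 (off) .. 1.0 (full power)

def capture_cycle(ir_pixels, visible_pixels, led, target_ir=128.0, k=0.002):
    # a) determine the ambient infrared brightness, here from the
    #    mean brightness of the previous infrared frame
    ambient_ir = mean(ir_pixels)
    # b) control the emitted infrared power as a function of that brightness
    led.power = min(1.0, max(0.0, led.power + k * (target_ir - ambient_ir)))
    # c) compose the two images (placeholder: the raw frames stand in
    #    for the composed first and second images)
    first_image, second_image = ir_pixels, visible_pixels
    return first_image, second_image
```

Repeating the cycle gradually raises the illuminator power when the scene is darker than the infrared target, and lowers it otherwise.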
- The infrared radiation which illuminates the scene located in the field of vision of the image capture device includes the radiation emitted by the lighting device, as well as the infrared radiation possibly coming from other surrounding sources (sunlight, for example).
- the lighting device, controlled as a function of the ambient brightness in the infrared domain, makes it possible to control the total power of the infrared radiation which illuminates the scene.
- the computer can also be programmed to execute the following steps: a') determining an ambient brightness in the visible range, and b') controlling at least one of the following exposure parameters as a function of said ambient brightness in the visible range: the integration time of the photosensitive pixels, the gain applied to the electrical signals, or the aperture of the optical system,
- each of said electrical signals being representative of an electric charge or voltage accumulated by the corresponding photosensitive pixel during said integration time
- controlling one or more of these exposure parameters makes it possible to obtain a suitable exposure of the second image, and thus to prevent it from being overexposed or underexposed.
- controlling the power of the infrared radiation emitted by the lighting device also makes it possible to obtain a suitable exposure of the first image (the "infrared" image), even when the exposure parameters mentioned above are set as a function of the ambient brightness in the visible range.
- the computer is programmed so that, step c) having been executed previously, it determines said ambient brightness in the visible range, in step a'), as a function of the brightness values of at least some of the image pixels of the second image produced during said previous execution of step c).
- the computer is programmed to, in step a'), determine said ambient brightness in the visible range so that it is representative of a second average brightness level in the second image produced during said previous execution of step c).
- the computer is programmed to, in step b '), correct at least one of said exposure parameters as a function of a difference between, on the one hand, a second target value of average brightness, and, on the other hand, the second level of average brightness in the second image produced during the previous execution of step c).
- the computer is programmed so that, step c) having been executed previously, it determines said ambient brightness in the infrared range, in step a), as a function of the brightness values of at least some of the image pixels of the first image produced during said previous execution of step c).
- the computer is programmed to, in step a), determine said ambient brightness in the infrared range so that it is representative of a first average brightness level in the first image produced during said previous execution of step c).
- the computer is programmed to, in step b), control the power of the infrared radiation emitted by the lighting device as a function of a difference between, on the one hand, a first target value of average brightness, and, on the other hand, the first level of average brightness in the first image produced during the previous execution of step c).
- said range of wavelengths transmitted by the first filter elements is between 700 nanometers and 1100 nanometers.
- the second filter elements include red filter elements having a red bandwidth transmitting wavelengths at least between 550 nanometers and 700 nanometers, green filter elements having a green bandwidth transmitting wavelengths at least between 450 nanometers and 650 nanometers, and blue filter elements having a blue bandwidth transmitting wavelengths for example between 400 nanometers and 550 nanometers.
- the different characteristics, variants and embodiments of the invention can be associated with each other in various combinations as long as they are not incompatible or mutually exclusive of each other.
- the invention also provides a system for monitoring a driver of a vehicle comprising an image capture device as described above, as well as a processing unit programmed to determine a level of unfitness to drive of the driver from at least said first image.
- the level of unfitness to drive can include a level of drowsiness and/or a level of distraction of the driver.
- FIG. 1 schematically represents a motor vehicle comprising an image capture device implementing the teachings of the invention, seen from the side,
- FIG. 2 shows in more detail certain elements of the image capture device of FIG. 1,
- FIG. 3 schematically represents a network of optical filters with which an image sensor of the image capture device of FIG. 1 is provided, seen from the front,
- FIG. 4 schematically represents this same image sensor, seen from the side
- FIG. 5 schematically represents an infrared image and a color image delivered by the image capture device of FIG. 1, and
- FIG. 6 schematically represents steps executed by a computer of the image capture device of FIG. 1.
- FIG. 1 shows a vehicle 5, here a motor vehicle, provided with a monitoring system 2 of a driver 3 of the vehicle.
- This monitoring system 2 comprises an image capture device 1 and a processing unit 20 which is programmed to determine a level of unfitness to drive l L of the driver 3, from one or more images delivered by the image capture device 1.
- the image capture device 1 comprises an image sensor 9 (Figure 2) and an optical system 10 such as a lens.
- the optical system 10 forms, on the image sensor 9, an image of the content of the field of vision 14 of the image capture device 1.
- the image capture device 1 is located in a passenger compartment 7 of the vehicle, in a region adjacent to the vehicle windshield. It is for example integrated in a dashboard or in a vehicle control console.
- the image capture device 1 is oriented so that its field of vision 14 covers the area usually occupied by the head of the driver 3 when the latter is seated in the driver's seat.
- the image capture device 1 can thus capture images of the face of the driver 3.
- the image sensor 9 is a hybrid sensor of sorts, making it possible to view the content of the field of vision 14 both in the infrared domain and in the visible domain.
- the image sensor 9 is provided with a particular network 170 of optical filters (FIG. 3) which allows, with the same matrix 19 of photosensitive pixels, to acquire both:
- a first image 31, hereinafter called the "infrared image", produced from a first part of the electromagnetic radiation collected by the optical system 10, this first part being located in the infrared domain, and
- a second image 35 produced from a second part of the electromagnetic radiation collected by the optical system 10, this second part being located in the visible range.
- this second image 35 is a color image (it is also called "color image” in the following).
- the image capture device 1 is able to capture the infrared image in question, as well as this color image, while remaining very compact.
- the infrared image 31 can for example be used to monitor the driver 3 and determine his level of unfitness to drive l L.
- the color image is more pleasing to the human eye and contains more information than an infrared image (usually displayed in grayscale).
- the color image can therefore be acquired for multiple purposes, for example it can be used to communicate with a remote electronic device, for example in the context of a teleconference, or even be kept in memory for security reasons or as a "souvenir photo" immortalizing a journey.
- the image capture device 1 also includes a computer 13, comprising at least one processor and an electronic memory, programmed to control one or more exposure parameters, such as the integration time and the gain used to obtain the images 31, 35 in question.
- the computer 13 is programmed to control this or these exposure parameters as a function of an ambient brightness in the visible range. This makes it possible to obtain a suitable exposure of the second image 35.
- the image capture device 1 also includes a lighting device 11 for emitting infrared radiation in the field of vision 14 of the device.
- This lighting device 11 is controlled by the computer 13 as a function of an ambient brightness in the infrared range.
- controlling the power of the infrared radiation emitted by the lighting device 11 also makes it possible to obtain a suitable exposure of the infrared image, even if the exposure parameter or parameters mentioned above are adjusted as a function of the ambient brightness in the visible domain, not in the infrared domain.
- the lighting device 11 controlled by the computer 13 therefore makes it possible, in a way, to avoid too great a difference between the ambient brightness in the visible range and that in the infrared range, which allows, for the same integration time ti or for the same gain G, an appropriate exposure to be obtained both for the infrared image 31 and for the color image 35.
- This image capture device 1 will now be described in more detail.
- the image sensor 9 will be described first.
- the way of controlling the exposure parameters (integration time ti, gain G, aperture Ap) will then be presented.
- the lighting device 11 and its control will be presented next, before describing how the level of unfitness to drive of the driver 3 is determined.
- the network of optical filters 170 of the image sensor 9 is arranged opposite the matrix 19 of photosensitive pixels of this sensor, so as to filter the electromagnetic radiation coming from the field of vision 14 of the image capture device 1 (radiation which has been collected by the optical system 10), before this radiation reaches the photosensitive pixels 21, 21' in question.
- the network of optical filters 170 comprises several filtering elements 171, 172, 173, 174, that is to say several elementary optical filters, each arranged opposite one of the photosensitive pixels 21, 21 '. Each photosensitive pixel thus captures part of the electromagnetic radiation which has been filtered by the filter element with which it is associated.
- these individual filter elements 171, 172, 173, 174 are of different types, for example of the blue, green, red and infrared type, which makes it possible to acquire the color image and the infrared image mentioned above.
- the network of optical filters 170 comprises:
- first filter elements 171 each capable of transmitting a first part of the electromagnetic radiation situated in a given wavelength range of the infrared
- Second filter elements 172, 173, 174 each capable of transmitting at least one component of a second part of the electromagnetic radiation located in the visible range, between 400 nanometers and 700 nanometers.
- the first filter elements 171 transmit only wavelengths located in said infrared wavelength range.
- This wavelength range mainly extends beyond 700 nanometers. It can for example extend from 700 nanometers to 1100 nanometers.
- the second filter elements 172, 173, 174 transmit only the wavelengths between 400 and 700 nanometers. As a variant, they could however transmit both wavelengths located in the visible range, and wavelengths located in the infrared range.
- the second filter elements 172, 173, 174 here include red filter elements 172, green filter elements 173 and blue filter elements 174.
- the terms “red”, “green”, “blue” are used in their sense common.
- the values of the red, green and blue bandwidths set out below are given by way of non-limiting example.
- the red filter elements 172 have a red bandwidth transmitting the component of the second part of the electromagnetic radiation having wavelengths comprised for example mainly between 550 nm and 700 nm.
- the green filter elements 173 have a green bandwidth transmitting the component of the second part of the electromagnetic radiation having wavelengths comprised for example mainly between 450 nm and 650 nm.
- the blue filter elements 174 have a blue passband transmitting the component of the second part of the electromagnetic radiation having wavelengths comprised for example mainly between 400 nm and 550 nm.
- the various filter elements 171, 172, 173, 174 of the network of optical filters are arranged with respect to each other so as to form a pattern 175, repeated regularly to form the network of optical filters 170.
- This pattern 175 here comprises four filter elements, in this case: one of the first filter elements 171, one of the red filter elements 172, one of the green filter elements 173, and one of the blue filter elements 174. These four adjacent filter elements form a square.
- the network of optical filters 170 is therefore comparable to a so-called "Bayer" network in which one of the green filter elements has been replaced by an element transparent in the infrared.
- the pattern of filter elements, repeated several times to form the network of optical filters could be constituted differently (by including, for example, more green filter elements than red or blue filter elements).
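The tiling of the repeated 2×2 pattern can be sketched as follows. The exact placement of the four filters within the square is an assumption made only for illustration (the text states merely that the four adjacent elements form a square):

```python
# one 2x2 pattern: an infrared element replaces one green of a Bayer square
PATTERN = [["R", "IR"],
           ["G", "B"]]

def filter_mosaic(rows, cols):
    # tile the 2x2 pattern over a rows x cols matrix of photosensitive pixels
    return [[PATTERN[r % 2][c % 2] for c in range(cols)] for r in range(rows)]
```

In any 2×2 block of the resulting mosaic, exactly one photosensitive pixel out of four is an infrared pixel, which matches the "one pixel out of four" ratio noted later in the text.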
- the matrix 19 of photosensitive pixels comprises:
- first photosensitive pixels 21 arranged so as to capture the first part of the electromagnetic radiation transmitted by the first filter elements 171, and
- second photosensitive pixels 21' arranged so as to capture the component transmitted by the second filter elements 172, 173, 174.
- Each of the first and second photosensitive pixels 21, 21' produces, by photoelectric effect, an electrical signal representative of the power of the electromagnetic radiation that it has captured (in a way, each photosensitive pixel behaves like a photodiode).
- This electrical signal is produced in the form of a charge, or of an electrical voltage present between the two terminals of the capacitance that the photosensitive pixel constitutes.
- This electrical signal is produced by the photosensitive pixel 21, 21 'considered at the end of a given integration time ti.
- each of these photosensitive pixels 21, 21' accumulates electrical charges, produced by photoelectric effect, during said integration time ti.
- the electrical signal produced by the photosensitive pixel 21, 21' considered corresponds to the charge, or to the electrical voltage between the terminals of this photosensitive pixel 21, 21', at the end of this integration time ti, sometimes called the exposure time or acquisition time.
- the electrical signal produced by each of the first and second photosensitive pixels 21, 21 ' is representative of the number of photons received by the photosensitive pixel considered during the integration time ti (for example proportional to this number of photons).
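This behaviour can be summarized by a simple linear model (an illustrative sketch: the strict linearity, the saturation level and all the names below are assumptions, not specifications from the text):

```python
def pixel_signal(photon_flux, ti, gain, saturation=255.0):
    # charge accumulated by photoelectric effect during the integration
    # time ti, assumed proportional to the number of photons received
    charge = photon_flux * ti
    # amplified electrical signal, clipped at the pixel's saturation level
    return min(gain * charge, saturation)
```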
- the semiconductor substrate of the matrix 19 of photosensitive pixels 21, 21' is made of (appropriately doped) silicon.
- the sensitivity of the photosensitive pixels in the infrared domain is thus limited to the near infrared domain: the sensitivity range of the first photosensitive pixels 21, provided with the first filter elements 171, is here between 700 nanometers and 1100 nanometers.
- the image sensor 9 is for example of the CMOS type (Complementary Metal Oxide Semiconductor).
- a picture taken by the image sensor 9 comprises an integration of the electrical charges over the integration time ti, followed by an amplification of the resulting electrical signals with the gain G.
- the values of the integration time ti and the gain G are fixed by the computer 13, which controls the image sensor 9.
- the amplified electrical signals produced during this shooting are processed by the computer 13, during a step c) (FIG. 6), to produce the infrared image 31 and the color image 35.
- Composition of the infrared image and the color image
- the infrared image 31 is formed of a matrix of image pixels 33 (FIG. 5), associated with the different photosensitive pixels 21, 21 ′ of the image sensor 9.
- the computer 13 is programmed to implement an interpolation algorithm making it possible to constitute a “complete” infrared image 31 although only one photosensitive pixel 21 out of four captures infrared radiation in the image sensor.
- the infrared image 31 here comprises as many image pixels 33 as the image sensor 9 has photosensitive pixels, first 21 and second 21' photosensitive pixels included.
- Each image pixel 33 of the infrared image 31 is associated with a brightness value, representative of the value of one or more of the amplified electrical signals originating from the first photosensitive pixels 21.
- this brightness value is determined as a function of the values of the amplified electrical signals coming from the first photosensitive pixels 21 which are closest to the position that corresponds, on the image sensor 9, to the image pixel 33 considered.
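A minimal version of such an interpolation, filling every position of the matrix from the nearest infrared sample, might look like this. Nearest-neighbour copying is far cruder than a production demosaicing algorithm, and the assumed position of the infrared pixel within the 2×2 pattern is hypothetical:

```python
def interpolate_ir(raw, ir_pos=(0, 1)):
    # raw: 2D list of sensor values; infrared pixels are assumed to sit at
    # rows r with r % 2 == ir_pos[0] and columns c with c % 2 == ir_pos[1]
    rows, cols = len(raw), len(raw[0])
    out = [[0.0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            # coordinates of the nearest infrared sample
            rr = r - ((r - ir_pos[0]) % 2)
            cc = c - ((c - ir_pos[1]) % 2)
            if rr < 0:
                rr += 2
            if cc < 0:
                cc += 2
            out[r][c] = raw[rr][cc]
    return out
```

Each of the four output pixels of a 2×2 block thus takes the brightness of the single infrared sample of that block, yielding a "complete" infrared image from one sample out of four.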
- the computer 13 is also programmed to compose the color image 35, shown diagrammatically in FIG. 5, from the amplified electrical signals coming from the second photosensitive pixels 21'.
- the color image 35 is also formed from a matrix of image pixels (not shown), associated with the different photosensitive pixels 21, 21 ′ of the image sensor 9.
- the computer 13 is programmed here to implement an interpolation algorithm making it possible to constitute a "complete" color image 35, although only one photosensitive pixel out of four captures radiation located in the red, or green, or blue bandwidth mentioned above.
- the color image here comprises as many image pixels as the image sensor 9 has photosensitive pixels, first 21 and second 21' photosensitive pixels included.
- Each image pixel of the color image 35 is associated with a brightness value representative of the values of some of the amplified electrical signals from the second photosensitive pixels 21'.
- This brightness value is representative of the intensity of the visible electromagnetic radiation received by the second photosensitive pixels 21' which, on the image sensor 9, are located in the immediate vicinity of the position which, on this sensor, is associated with the image pixel considered.
- the computer 13 is programmed here to initially compose three monochrome channels based on the amplified electrical signals from the second photosensitive pixels 21'.
- the computer 13 thus composes a red channel 37, from the amplified electric signals coming from the second photosensitive pixels located opposite the red filter elements 172. It also composes a green channel 39 from the amplified electric signals coming from the second photosensitive pixels located opposite the green filter elements 173. Finally, it composes a blue channel 41 from the amplified electrical signals coming from the second photosensitive pixels associated with the blue filter elements 174.
- Each of these channels, red 37, green 39 and blue 41, is a gray-level image (each image pixel of the channel considered has a brightness value, but no hue or chrominance value), associated with the color of the channel considered and of the same size as the color image 35 (that is to say comprising the same number of image pixels).
- the computer 13 then implements a fusion of these three chromatic channels to compose the color image 35, and can also determine:
- a global luminosity value for the visible range (for example equal to the average of the luminosities of the different channels),
- the computer 13 is programmed here to execute step c) several times in succession, during which it acquires the amplified electrical signals delivered by the image sensor 9, then composes the infrared image 31 and the color image 35.
- This step is carried out here by a pretreatment module 130 of the computer 13 (FIG. 2).
- the exposure parameters of the image capture device 1, namely the integration time ti, the gain G, and, optionally, the aperture Ap of a diaphragm 12 of the optical system 10, are controlled here by an exposure control module 132 of the computer 13 ( Figure 2), during steps a ') and b') ( Figure 6) described below.
- the exposure control module 132 is programmed to determine the ambient brightness in the visible range which was mentioned above, during step a ').
- the ambient brightness in the visible range is representative of the power of visible electromagnetic radiation (the components of which are mainly between 400 nanometers and 700 nanometers), coming from an environment of the image capture device 1 , here coming from the field of vision 14 of this device, and received by a light sensor.
- this brightness sensor is produced by means of the image sensor 9 which has been described above.
- the ambient brightness in the visible range is determined from a color image 35, delivered by the pretreatment module 130 after a previous execution of step c), and corresponding therefore to a shooting previously carried out by the image sensor 9.
- the exposure control module 132 is programmed to determine the ambient brightness in the visible range as a function of the brightness values of at least part of the image pixels of this color image 35, previously acquired by the image capture device 1.
- the ambient brightness in the visible range can, as here, be determined by calculating an average of the brightness values of the image pixels of this color image 35.
- the ambient brightness in the visible range is then representative of an average brightness level in this color image 35, hereinafter called the second brightness level, and denoted L v.
- the average in question can relate to all of the image pixels of the color image 35, or relate only to some of these image pixels, located in an area of interest of the image corresponding for example to the image of the face of the driver 3. Furthermore, it can be provided that this average takes into account only the brightness values of the image pixels which satisfy a given criterion, for example which are included in a given range of values.
- the second level of luminosity L v can, by way of example, be representative of the average luminosity of the zones of low luminosity of the color image 35, or of the zones of intermediate luminosity of this image.
- the ambient brightness in the visible range could be determined as a function of the brightness values of the various chromatic channels of the color image considered (red, green and blue channels), possibly weighted by different coefficients, instead of being determined as a function of the brightness values of the image pixels of the overall color image resulting from the fusion of these three chromatic channels.
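This per-channel variant can be sketched as a weighted mean of the channel brightnesses. The weights below are the classic luma coefficients, used purely as one example of "different weighting coefficients"; the text itself gives no values:

```python
def channel_mean(channel):
    # average brightness of one monochrome channel (2D list of values)
    return sum(sum(row) for row in channel) / (len(channel) * len(channel[0]))

def ambient_visible_brightness(red, green, blue, weights=(0.30, 0.59, 0.11)):
    # weighted combination of the red, green and blue channel averages
    wr, wg, wb = weights
    return wr * channel_mean(red) + wg * channel_mean(green) + wb * channel_mean(blue)
```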
- in step b'), the exposure control module 132 controls the aforementioned exposure parameters, as a function of the ambient brightness in the visible range determined in step a') above.
- the exposure control module 132 here corrects the values of the exposure parameters, which are then applied to the image sensor 9 and to the diaphragm 12, as a function of a difference s2 between:
- on the one hand, a target value of average brightness, hereinafter called the second target value Lv,o, and
- on the other hand, the second brightness level Lv, the average brightness in the color image 35 acquired previously.
- This correction is made so as to gradually bring the second brightness level Lv to the second target value Lv,o, during repetitions of steps a'), b') and c).
- This correction can consist, for example, of adding a corrective term to a previous value of the exposure parameter considered, this corrective term being proportional to the difference s2 mentioned above (proportional correction). More generally, this correction consists in slaving the second brightness level Lv to the second target value Lv,o.
- This control can in particular be of the proportional, proportional-integral, or proportional-integral-derivative type (that is to say of the "PID" type).
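A minimal sketch of such a servo loop, in Python; the class name and gain values are illustrative assumptions (the patent does not specify gains), with the integral and derivative terms disabled by default so the default behaviour is the proportional correction described above:

```python
class ExposureServo:
    """Slave a measured brightness level to a target value, PID-style."""

    def __init__(self, target, kp=0.01, ki=0.0, kd=0.0):
        self.target = target
        self.kp, self.ki, self.kd = kp, ki, kd
        self._integral = 0.0
        self._prev_error = None

    def update(self, measured, parameter):
        """Return a corrected exposure parameter (e.g. integration time or gain)."""
        error = self.target - measured  # the difference s2 described above
        self._integral += error
        derivative = 0.0 if self._prev_error is None else error - self._prev_error
        self._prev_error = error
        correction = (self.kp * error
                      + self.ki * self._integral
                      + self.kd * derivative)
        # Add the corrective term to the previous value of the parameter.
        return parameter + correction
```

Calling `update` once per captured image gradually brings the measured brightness to the target, as in the repetitions of steps a'), b') and c).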
- the second target value Lv,o corresponds for example to the average brightness in an image considered to be adequately exposed.
- An image is considered to be suitably exposed when, for example:
- the average brightness in this image is included in a given interval, this interval extending for example from a quarter to three quarters of the maximum brightness value which can be associated with an image pixel.
- the second target value Lv,o can for example be between a quarter and three quarters of the maximum brightness value mentioned above, or between a third and two thirds of this maximum value. For example, if the brightness values in question are coded on eight bits, being between 0 and 255, the maximum brightness value in question (high saturation) is equal to 255, and the second target value Lv,o can then be between 63 and 191, for example, or between 85 and 170.
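The "adequately exposed" criterion above amounts to a simple range check; this sketch uses the example values given in the text (8-bit coding, quarter to three-quarters interval), with the function name being an assumption:

```python
def is_adequately_exposed(mean_brightness, bit_depth=8, fraction=(1/4, 3/4)):
    """Return True if the mean brightness lies between the given fractions
    of the maximum brightness value (for 8-bit values, roughly 63 to 191)."""
    max_value = 2 ** bit_depth - 1  # 255 for brightness values coded on 8 bits
    lo, hi = fraction
    return lo * max_value <= mean_brightness <= hi * max_value
```

Passing `fraction=(1/3, 2/3)` gives the tighter 85-170 interval also mentioned above.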
- the exposure control module 132 controls the integration time ti, the gain G, and the aperture Ap of the diaphragm of the optical system 10 (aperture diaphragm).
- the aperture Ap could however be fixed (or possibly be manually adjustable).
- the exposure control module 132 could be programmed, the integration time ti being fixed, to control only the value of the gain G, as a function of the second brightness level Lv in the color image 35 (or, conversely, to control only the integration time ti).
- control of the gain G is sometimes called "automatic gain control" or "AGC" in the specialized literature.
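One AGC step, with the integration time fixed, might be sketched as follows; the proportional coefficient and the gain bounds are purely illustrative assumptions, not values from the patent:

```python
def agc_step(gain, measured, target, kp=0.05, g_min=1.0, g_max=64.0):
    """One automatic-gain-control step: only the gain G is corrected,
    proportionally to the error between the target and the measured
    brightness, then clamped to the sensor's usable gain range."""
    gain += kp * (target - measured)
    return min(max(gain, g_min), g_max)
```

Repeating this step on successive images drives the measured brightness toward the target without touching the integration time.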
- the lighting device 11 is capable of emitting infrared radiation located at least partly in the range of wavelengths transmitted by the first filter elements 171 of the array of optical filters of the image sensor 9 (a range which, it is recalled, extends here from 700 nanometers to 1100 nanometers).
- the lighting device 11 can for example be produced by means of light-emitting diodes.
- the infrared radiation emitted by the lighting device 11 is emitted in the form of a light beam 15, directed so as to illuminate at least part of the field of vision 14 of the image capture device 1 (FIG. 1). In this case, this light beam is directed towards the area usually occupied by the face of the driver 3, when seated in the driver's seat.
- the power of the infrared radiation emitted by the lighting device 11 is controlled by a lighting control module 131 of the computer 13 (Figure 2).
- this module here controls the electrical power P E which supplies the lighting device 11.
- the lighting control module 131 is programmed more precisely to control the power of the infrared radiation emitted as a function of the ambient brightness in the infrared range.
- the ambient brightness in the infrared range is defined here in a manner comparable to the ambient brightness in the visible range presented above, but for the infrared range.
- the control of the power of the infrared radiation emitted is comparable to the control of the exposure parameters mentioned above, but it is carried out as a function of the ambient brightness in the infrared range, instead of being carried out as a function of the ambient brightness in the visible range.
- the lighting control module 131 is programmed to control the power of the infrared radiation emitted during steps a) and b), shown in Figure 6 and described below.
- the lighting control module 131 determines the ambient brightness in the infrared range.
- the ambient brightness in the infrared range is representative of the power of infrared electromagnetic radiation (whose components extend beyond 700 nanometers) coming from the environment of the image capture device.
- this brightness sensor is produced by means of the image sensor 9.
- this brightness sensor could however be a sensor separate from the image sensor 9, such as an infrared photodiode, arranged to receive radiation coming from the field of vision of the image capture device.
- the ambient brightness in the infrared range is determined from an infrared image 31, delivered by the pretreatment module 130 after a previous execution of step c), and therefore corresponding to a shot previously taken by the image sensor 9.
- the lighting control module 131 is programmed to determine the ambient brightness in the infrared range as a function of the brightness values of at least part of the image pixels 33 of this infrared image 31, previously acquired by the image capture device 1.
- the ambient brightness in the infrared range can, as here, be determined by calculating an average of the brightness values of the image pixels 33 of this infrared image 31.
- the ambient brightness in the infrared range is then representative of an average brightness level in this infrared image 31, hereinafter called the first brightness level, and denoted LIR.
- the average in question can relate to all of the image pixels 33 of the infrared image 31, or relate only to some of these image pixels 33, located in an area of interest of the image corresponding for example to the image of the driver's face 3. Furthermore, it is possible to provide for this average to take into account only the brightness values of the image pixels 33 which satisfy a given criterion, for example which are included in a given range of values.
- The first brightness level LIR can, by way of example, be representative of the average brightness of the low-brightness areas of the infrared image 31, or of the areas of intermediate brightness of this image.
- in step b), the lighting control module 131 controls the electrical power P E which supplies the lighting device 11, as a function of the ambient brightness in the infrared range determined in step a) above.
- the lighting control module 131 here corrects the value of the electric power P E , as a function of a difference s1 between:
- on the one hand, a target value of average brightness in the infrared range, hereinafter called the first target value LIR,o, and
- on the other hand, the first brightness level LIR, the average brightness in the infrared image 31 acquired previously.
- This correction is made so as to gradually bring the first brightness level LIR to the first target value LIR,o, during repetitions of steps a), b) and c).
- This correction can consist, for example, of adding a corrective term to a previous value of the electrical power P E , this corrective term being proportional to the difference s1 mentioned above (proportional correction). More generally, this correction consists in slaving the first brightness level LIR to the first target value LIR,o.
- This control can in particular be of the proportional, proportional-integral, or proportional-integral-derivative type (that is to say of the "PID" type).
- the first target value LIR,o corresponds for example to the average brightness in an image considered to be adequately exposed (an example of the definition of an adequately exposed image was given above).
- the first target value LIR,o can for example be between a quarter and three quarters of the maximum brightness value that can be associated with an image pixel 33 of the infrared image 31, or be between a third and two thirds of this maximum value. For example, if the brightness values of the image pixels 33 are coded on eight bits, being between 0 and 255, the maximum brightness value mentioned above (high saturation) is equal to 255, and the first target value LIR,o can then be between 63 and 191, for example, or between 85 and 170.
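The power correction of step b) can be sketched in the same proportional form as the exposure control above; the coefficient and the power bounds (in arbitrary units) are illustrative assumptions:

```python
def control_ir_power(p_e, l_ir, l_ir_target, kp=0.02, p_min=0.0, p_max=5.0):
    """Correct the electrical power P_E supplying the lighting device,
    proportionally to the difference s1 = L_IR,o - L_IR, and clamp the
    result to an allowed power range."""
    correction = kp * (l_ir_target - l_ir)  # proportional corrective term
    return min(max(p_e + correction, p_min), p_max)
```

Applied once per captured infrared image, this gradually brings the first brightness level to the first target value, while the clamp keeps the lighting device within its rated power.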
- the electronic processing unit 20 of the monitoring system 2 is programmed to determine the level of inability to drive IL of the driver 3 from at least one of the infrared images 31 produced by the image capture device 1.
- the level of inability to drive IL comprises for example a level of drowsiness of the driver 3 and/or a level of distraction of the driver 3 (the level of inability to drive IL can in particular be the level of drowsiness of the driver 3, or his level of distraction).
- the processing unit 20 can for example be programmed so as to analyze the infrared image 31 in question, or a sequence of infrared images 31 produced by the image capture device 1, in order to identify the face of the driver 3 and / or certain areas of the face of the driver 3, in particular the areas of the infrared image 31 corresponding to the eyes of the driver 3.
- the processing unit 20 can then determine the level of drowsiness of the driver 3 by measuring the duration and / or the blinking frequency of the eyes of the driver 3, previously identified in the infrared image 31.
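As a toy illustration of the blink measurement described above, the following sketch derives blink count, mean blink duration and blink frequency from a per-frame eye-closed sequence; the detection of closed eyes in the infrared image 31 is assumed done upstream, and all names are assumptions:

```python
def blink_statistics(eye_closed, frame_rate):
    """From a per-frame eye-closed sequence (1 = closed, 0 = open), return
    (blink count, mean blink duration in seconds, blink frequency in Hz)."""
    blinks, current = [], 0
    for closed in eye_closed:
        if closed:
            current += 1          # extend the ongoing blink
        elif current:
            blinks.append(current)  # blink just ended; record its length
            current = 0
    if current:
        blinks.append(current)      # sequence ended mid-blink
    duration = len(eye_closed) / frame_rate
    mean_blink = (sum(blinks) / len(blinks) / frame_rate) if blinks else 0.0
    frequency = len(blinks) / duration if duration else 0.0
    return len(blinks), mean_blink, frequency
```

A drowsiness level could then be derived by thresholding these statistics, for example flagging unusually long or infrequent blinks.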
- the processing unit 20 can determine the distraction level of the driver 3 as a function of a posture of the head of the driver 3 deduced from the infrared image 31, and as a function of the evolution of this posture during time.
- the processing unit 20 can also evaluate (by analysis of the infrared image 31, or of a sequence of infrared images 31), and use to determine the level of distraction and / or the level of drowsiness, the viewing direction of the driver 3 or the evolution of this viewing direction over time.
- the processing unit 20 can also evaluate (by analysis of the infrared image 31, or of a sequence of infrared images 31), and use to determine the level of distraction and / or the level of drowsiness, the diameter of the pupil of at least one eye of the driver 3 (and more precisely the variations of this diameter).
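As a toy illustration of how such indicators might be aggregated, a distraction score could be the fraction of frames in which the head posture (here a hypothetical yaw angle deduced from the infrared images) deviates beyond a threshold; the threshold and function name are arbitrary assumptions:

```python
def distraction_level(yaw_angles, threshold_deg=30.0):
    """Fraction of frames where the driver's head yaw angle, in degrees,
    deviates from straight ahead by more than the given threshold."""
    if not yaw_angles:
        return 0.0
    off_road = sum(1 for a in yaw_angles if abs(a) > threshold_deg)
    return off_road / len(yaw_angles)
```

A real system would combine several such indicators (posture, viewing direction, pupil diameter) over a sliding time window rather than a single threshold.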
- the processing unit 20 can be programmed to, when determining the level of inability to drive IL of the driver, also take into account one or more color images 35 delivered by the image capture device 1.
- the color image 35 can be used in other applications.
- the computer 13 can for example transmit the color image 35, or a sequence of color images 35 to a telecommunication module 43 of the vehicle 5.
- This telecommunication module 43 is configured to transmit the color image 35, or the sequence of color images 35 received, to a remote electronic device, for example a multifunction mobile or a computer, for example via a Wi-Fi transmitter.
- the color image 35 or the color image sequence 35 can then be used within the framework of a teleconference, for example a videoconference.
- the computer 13 could also transmit the color image 35, or the sequence of color images 35 to a memory of the vehicle 5 so that it is stored there.
- the second image, composed from the electrical signals generated by the second photosensitive pixels of the image sensor, could be a monochrome image instead of a color image.
- the second filter elements of the array of optical filters could moreover all be of the same type (for example all being green filter elements), instead of comprising three different types of optical filters (respectively red, green and blue).
- the term "module" can designate an electronic circuit, or a part of an electronic circuit distinct from the other modules, or a specific group of instructions stored in the memory of the computer.
- the first and second images mentioned above could correspond to raw images (sometimes called "RAW" images), obtained without interpolation.
- the number of image pixels of the infrared image, for example, would then be equal to the number of said first photosensitive pixels, instead of being equal to the total number of photosensitive pixels of the image sensor.
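A sketch of extracting such a non-interpolated "RAW" infrared plane, assuming (hypothetically, for illustration) that the first photosensitive pixels occupy one fixed position in each 2x2 cell of the filter array:

```python
def extract_ir_plane(raw, pattern_pos=(0, 0)):
    """Extract the non-interpolated infrared image from a raw sensor frame
    (a list of rows): keep only the pixels at position pattern_pos within
    each 2x2 cell of the filter array. The result has one quarter of the
    sensor's pixel count, matching the remark above."""
    r0, c0 = pattern_pos
    # Slice every second row and every second column, starting at pattern_pos.
    return [row[c0::2] for row in raw[r0::2]]
```

With an actual RGB-IR mosaic the same slicing, with different `pattern_pos` values, would yield the individual chromatic planes.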
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
FR1873307A FR3091114B1 (en) | 2018-12-19 | 2018-12-19 | image capture device and associated driver monitoring system |
PCT/EP2019/079617 WO2020126179A1 (en) | 2018-12-19 | 2019-10-30 | Image capture device and associated system for monitoring a driver |
Publications (1)
Publication Number | Publication Date |
---|---|
EP3900320A1 true EP3900320A1 (en) | 2021-10-27 |
Family
ID=66542409
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP19791287.6A Withdrawn EP3900320A1 (en) | 2018-12-19 | 2019-10-30 | Image capture device and associated system for monitoring a driver |
Country Status (5)
Country | Link |
---|---|
US (1) | US11845335B2 (en) |
EP (1) | EP3900320A1 (en) |
CN (1) | CN113302912A (en) |
FR (1) | FR3091114B1 (en) |
WO (1) | WO2020126179A1 (en) |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111611977B (en) * | 2020-06-05 | 2021-10-15 | 吉林求是光谱数据科技有限公司 | Face recognition monitoring system and recognition method based on spectrum and multiband fusion |
CN112037732B (en) * | 2020-09-11 | 2021-12-07 | 广州小鹏自动驾驶科技有限公司 | Apparatus, system, and storage medium for brightness control of vehicle display based on infrared camera |
CN115802183B (en) * | 2021-09-10 | 2023-10-20 | 荣耀终端有限公司 | Image processing method and related device |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP0342708B1 (en) * | 1988-05-20 | 1995-01-11 | Sanyo Electric Co., Ltd. | Image sensing apparatus having automatic iris function of automatically adjusting exposure in response to video signal |
Family Cites Families (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20090268023A1 (en) * | 2008-04-27 | 2009-10-29 | Wen-Hsiung Hsieh | Surveillance camera device with a light source |
CN101803928A (en) | 2010-03-05 | 2010-08-18 | 北京智安邦科技有限公司 | Video-based driver fatigue detection device |
US20130222603A1 (en) * | 2012-02-28 | 2013-08-29 | Aptina Imaging Corporation | Imaging systems for infrared and visible imaging |
JP6089872B2 (en) * | 2013-03-28 | 2017-03-08 | 富士通株式会社 | Image correction apparatus, image correction method, and biometric authentication apparatus |
JP6191701B2 (en) * | 2013-11-11 | 2017-09-06 | 日本電気株式会社 | POS terminal device, product recognition method and program |
US9674465B2 (en) * | 2015-06-03 | 2017-06-06 | Omnivision Technologies, Inc. | Non-visible illumination scheme |
JP7043262B2 (en) * | 2015-06-30 | 2022-03-29 | スリーエム イノベイティブ プロパティズ カンパニー | Illuminator |
CN105306796A (en) * | 2015-10-10 | 2016-02-03 | 安霸半导体技术(上海)有限公司 | Night vision equipment with regular infrared illumination function and global shutter CMOS (Complementary Metal Oxide Semiconductor) sensor |
CN106231179B (en) * | 2016-07-29 | 2019-05-24 | 浙江大华技术股份有限公司 | A kind of day and night double optical-filter switcher switching methods and device |
US20180164156A1 (en) * | 2016-12-12 | 2018-06-14 | Microsoft Technology Licensing, Llc | Hybrid Sensor with Enhanced Infrared Detection Capabilities |
CN108965654B (en) * | 2018-02-11 | 2020-12-25 | 浙江宇视科技有限公司 | Double-spectrum camera system based on single sensor and image processing method |
JP7438127B2 (en) * | 2018-04-19 | 2024-02-26 | シーイング マシーンズ リミテッド | Infrared light source protection system |
US11881018B2 (en) * | 2018-07-10 | 2024-01-23 | Hangzhou Taro Positioning Technology Co., Ltd. | Detecting dual band infrared light source for object tracking |
ES2971615T3 (en) * | 2018-09-14 | 2024-06-06 | Zhejiang Uniview Tech Co Ltd | Automatic exposure method and apparatus for dual spectrum imaging and dual spectrum imaging camera and machine storage medium |
- 2018
- 2018-12-19 FR FR1873307A patent/FR3091114B1/en active Active
- 2019
- 2019-10-30 EP EP19791287.6A patent/EP3900320A1/en not_active Withdrawn
- 2019-10-30 US US17/414,607 patent/US11845335B2/en active Active
- 2019-10-30 WO PCT/EP2019/079617 patent/WO2020126179A1/en unknown
- 2019-10-30 CN CN201980089501.2A patent/CN113302912A/en active Pending
Patent Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP0342708B1 (en) * | 1988-05-20 | 1995-01-11 | Sanyo Electric Co., Ltd. | Image sensing apparatus having automatic iris function of automatically adjusting exposure in response to video signal |
Also Published As
Publication number | Publication date |
---|---|
FR3091114B1 (en) | 2021-06-11 |
CN113302912A (en) | 2021-08-24 |
US20220048386A1 (en) | 2022-02-17 |
US11845335B2 (en) | 2023-12-19 |
WO2020126179A1 (en) | 2020-06-25 |
FR3091114A1 (en) | 2020-06-26 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP3973693B1 (en) | Image capture device for multiple captures and surveillance system for an associated driver | |
EP3900320A1 (en) | Image capture device and associated system for monitoring a driver | |
EP3657784B1 (en) | Method for estimating a fault of an image capturing system and associated systems | |
CA2909554A1 (en) | Device for acquiring bimodal images | |
EP2710340B1 (en) | Camera arrangement for a vehicle and method for calibrating a camera and for operating a camera arrangement | |
FR2984664A1 (en) | DISPLAY DEVICE | |
CN110574368B (en) | Solid-state imaging device, imaging system, and object recognition system | |
FR2633475A1 (en) | LOW-LEVEL TV SYSTEM WITH COLOR IMAGES | |
FR2899696A1 (en) | METHOD FOR PROCESSING A RELATIVE LIGHT PHENOMENON ON A DIGITAL IMAGE AND ASSOCIATED TREATMENT SYSTEM | |
FR3104363A1 (en) | Image capture device | |
EP1351498A2 (en) | Real time processing method of an image signal | |
FR3101504A1 (en) | Image capture device and image capture method | |
EP3729796B1 (en) | Imaging process and imaging system for high and low light levels | |
FR3113165A1 (en) | Method and system for imaging a scene in space | |
FR3081585A1 (en) | SYSTEM FOR PRODUCING IMAGE OF A DRIVER OF A VEHICLE AND ASSOCIATED INBOARD SYSTEM | |
EP4090012B1 (en) | Videoconferencing system for reducing a parallax effect associated with the direction of the gaze of a user | |
FR2968876A1 (en) | System for acquisition of images of scene, has separation unit separating flow of beam from focusing optics into predetermined fractions, and merging unit merging images output from sensors to generate final image | |
WO2012001056A1 (en) | Low-noise bioccular digital vision device | |
WO2020193320A1 (en) | Image-capture device, system and method | |
FR3082032A1 (en) | CONDUCTOR MONITORING DEVICE AND RELATED ON-BOARD SYSTEM | |
FR3129803A1 (en) | Image capture system and imaging assembly comprising such a system | |
CA3161535A1 (en) | Document-analysing terminal and document-analysing method | |
WO2007101934A2 (en) | Method and device for formulating nonsaturated images by a charge transfer camera or the like |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: UNKNOWN |
| STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE |
| PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase | Free format text: ORIGINAL CODE: 0009012 |
| STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE |
| 17P | Request for examination filed | Effective date: 20210527 |
| AK | Designated contracting states | Kind code of ref document: A1; Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
| DAV | Request for validation of the european patent (deleted) | |
| DAX | Request for extension of the european patent (deleted) | |
| RAP3 | Party data changed (applicant data changed or rights of an application transferred) | Owner name: VALEO COMFORT AND DRIVING ASSISTANCE |
| P01 | Opt-out of the competence of the unified patent court (upc) registered | Effective date: 20230528 |
| STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: EXAMINATION IS IN PROGRESS |
| 17Q | First examination report despatched | Effective date: 20230824 |
| RAP3 | Party data changed (applicant data changed or rights of an application transferred) | Owner name: VALEO COMFORT AND DRIVING ASSISTANCE |
| STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN |
| 18D | Application deemed to be withdrawn | Effective date: 20240104 |