GB2611289A - An image processing system and method thereof - Google Patents
- Publication number
- GB2611289A GB2611289A GB2113545.4A GB202113545A GB2611289A GB 2611289 A GB2611289 A GB 2611289A GB 202113545 A GB202113545 A GB 202113545A GB 2611289 A GB2611289 A GB 2611289A
- Authority
- GB
- United Kingdom
- Prior art keywords
- image
- illumination module
- state
- wavelength
- images
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/013—Eye tracking input arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/18—Eye characteristics, e.g. of the iris
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
- B60K35/10—Input arrangements, i.e. from user to vehicle, associated with vehicle functions or specially adapted therefor
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/0093—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00 with means for monitoring data relating to the user, e.g. head-tracking, eye-tracking
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/10—Image acquisition
- G06V10/12—Details of acquisition arrangements; Constructional details thereof
- G06V10/14—Optical characteristics of the device performing the acquisition or on the illumination arrangements
- G06V10/143—Sensing or illuminating at different wavelengths
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/77—Processing image or video features in feature spaces; using data integration or data reduction, e.g. principal component analysis [PCA] or independent component analysis [ICA] or self-organising maps [SOM]; Blind source separation
- G06V10/80—Fusion, i.e. combining data from various sources at the sensor level, preprocessing level, feature extraction level or classification level
- G06V10/803—Fusion, i.e. combining data from various sources at the sensor level, preprocessing level, feature extraction level or classification level of input or preprocessed data
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/59—Context or environment of the image inside of a vehicle, e.g. relating to seat occupancy, driver state or inner lighting conditions
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/59—Context or environment of the image inside of a vehicle, e.g. relating to seat occupancy, driver state or inner lighting conditions
- G06V20/597—Recognising the driver's state or behaviour, e.g. attention or drowsiness
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/161—Detection; Localisation; Normalisation
- G06V40/166—Detection; Localisation; Normalisation using acquisition arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/18—Eye characteristics, e.g. of the iris
- G06V40/19—Sensors therefor
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K2360/00—Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
- B60K2360/149—Instrument input by detecting viewing direction not otherwise provided for
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
Abstract
A method and system for obtaining images suitable for eye tracking. An imaging module 104 captures at least one image while an illumination module 108 is on (204, fig. 2), and at least one consecutive image while the illumination module is off (206, fig. 2). The at least two consecutive frames are processed to produce a resultant image containing at least one eye across a selected wavelength range (optionally 380-800nm, RGB; 700-1100nm, NIR; or 400-700nm, visible light). The pixel intensities of the "on" and "off" images may be calculated, and the ratio of their average values used to control an analog dimming function of the imaging module. The images may be aligned by matching identical features across the frames, and the difference between the pixel intensities may be calculated to produce a resultant image in the chosen wavelength. The eye tracking may be an estimated line of sight for virtual reality (VR) applications, an estimated eye position for augmented reality (AR) applications, or a user-state estimation for driver monitoring. The illumination module may be a light-emitting diode (LED) or a vertical cavity surface emitting laser (VCSEL).
Description
AN IMAGE PROCESSING SYSTEM AND METHOD THEREOF
TECHNICAL FIELD
This disclosure relates to image processing and, more particularly, to a method, system and device for processing images for eye tracking, as commonly used in motor vehicle applications such as driver monitoring.
BACKGROUND
Driver monitoring systems (DMS) have been used in the automotive industry for some years to determine the status of vehicle operators. A DMS identifies a driver's facial characteristics, for example eye movement or head position, to determine the operator's status.
Increasingly, other types of automotive monitoring systems, such as cabin monitoring systems (CMS), are required. A CMS monitors the entire passenger cabin to determine, for example, the number of passengers on board, intruders while the vehicle is parked, and/or potential attacks such as robbery. Such applications require image analysis to identify different types of objects within the passenger cabin.
Due to the nature of lighting conditions within the passenger cabin of a motor vehicle, it is challenging to obtain clear, full-resolution images for such analysis.
Further, different types of objects within a passenger cabin have different textures and depths, which increases the complexity of identifying them.
There is therefore a need to provide a method and system for processing images suitable for eye tracking that overcomes, or at least ameliorates, the problems described above. Furthermore, other desirable features and characteristics will become apparent from the subsequent detailed description and the appended claims, taken in conjunction with the accompanying drawings and this background of the disclosure.
SUMMARY
A purpose of this disclosure is to ameliorate the problem of obtaining full resolution images for eye tracking, by providing the subject-matter of the independent claims.
Further, a purpose of this disclosure is to ameliorate the problem of identifying different types of objects captured in a monitoring system, by providing the subject-matter of the independent claims.
The objective of this disclosure is solved by a method of processing images for eye tracking, the method comprising: capturing at least one image, by way of an imaging module, when an illumination module is operating in an ON state; and consecutively capturing at least one image, by way of the imaging module, when the illumination module is operating in an OFF state; characterized by: receiving and processing, by way of an image processing unit, the at least one image captured when the illumination module is operating in the ON state and the at least one image captured when the illumination module is operating in the OFF state, such that one or more resultant images suitable for eye tracking are obtained, wherein the one or more resultant images comprise image information of at least one eye and of a selected range of wavelengths.
An advantage of the above described aspect of this disclosure is a method of processing images suitable for eye tracking, in which the image information obtained is selected from a range of wavelengths, such that specific details of the image information may be analysed for eye tracking.
Preferred is a method of processing images for eye tracking as described above or as described above as being preferred, in which: the image information of the one or more resultant images comprises at least one range of wavelengths of the group consisting of: * 380nm to 800nm in a red, blue, green (RGB) range; * 700nm to 1100nm in a near-infrared (NIR) range; and * 400nm to 700nm in a visible light range.
The advantage of the above aspect of this disclosure is to select image information falling within a selected range of wavelengths, such that the one or more resultant images processed will only contain image information between 380nm and 1100nm, and preferably between (1) 380nm and 800nm in an RGB range; (2) 700nm and 1100nm in an NIR range; and (3) 400nm and 700nm in a visible light range.
Preferred is a method of processing images for eye tracking as described above or as described above as being preferred, further comprising: determining, by way of the image processing unit, a first pixel intensity value (P1) from the at least one image captured when the at least one illumination module is operating in the ON state; and determining, by way of the image processing unit, a second pixel intensity value (P2) from the at least one image captured consecutively when the at least one illumination module is operating in the OFF state.
The advantage of the above aspect of this disclosure is to capture images alternately with controllable amplitude, such that image information is consecutively captured in different selected wavelengths.
Preferred is a method of processing images for eye tracking as described above or as described above as being preferred, further comprising: calculating a ratio between an average value of the first pixel intensity value (P1) from the at least one image captured when the at least one illumination module is operating in the ON state and an average value of the second pixel intensity value (P2) from the at least one image captured consecutively when the at least one illumination module is operating in the OFF state; and controlling an analog dimming function of the imaging module in response to the ratio calculated.
The advantage of the above aspect of this disclosure is to control analog dimming in response to a calculated ratio, where the ratio is between an average value of a first pixel intensity value of the first image captured and an average value of a second pixel intensity value of the second image consecutively captured. An advantage of obtaining this calculated ratio is that the brightness of the illumination module can be adjusted according to it, as a form of feedback between a previous frame and a subsequent frame, to achieve precise analog dimming control.
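As an illustrative sketch only, and not part of the claimed subject-matter, the ratio-based feedback could be computed along the following lines, assuming 8-bit frames held as NumPy arrays; the target ratio and gain constants are hypothetical, as the disclosure does not specify them.

```python
import numpy as np

TARGET_RATIO = 2.0  # hypothetical target; the disclosure does not fix a value

def dimming_feedback(frame_on: np.ndarray, frame_off: np.ndarray,
                     level: float) -> float:
    """Ratio of average pixel intensities P1/P2 between the illuminated
    and non-illuminated frames, folded into the analog dimming level."""
    p1_avg = float(frame_on.mean())          # average P1 (illumination ON)
    p2_avg = float(frame_off.mean())         # average P2 (illumination OFF)
    ratio = p1_avg / max(p2_avg, 1e-6)       # guard against division by zero
    error = TARGET_RATIO - ratio             # feedback for the next frame
    return float(np.clip(level + 0.1 * error, 0.0, 1.0))  # 0..1 dimming level
```

The returned level would then be written to the illuminator driver, for example via a DAC, before the next image frame interval.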
Preferred is a method of processing images for eye tracking as described above or as described above as being preferred, further comprising: obtaining the one or more resultant images comprising an image in a near-infrared (NIR) wavelength by subtracting the second pixel intensity value (P2), determined from the at least one image captured when the illumination module is operating in the OFF state, from the first pixel intensity value (P1), determined from the at least one image captured when the illumination module is operating in the ON state.
The advantage of the above aspect of this disclosure is to obtain one or more resultant images containing image information in the NIR wavelength, i.e. within a wavelength range of 700nm to 1100nm, by applying an image subtraction method. An advantage of obtaining a full-resolution resultant image containing image information in the NIR wavelength (700nm to 1100nm) is that further eye-tracking analysis can be performed in the aforesaid wavelength range.
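A minimal sketch of that subtraction, assuming co-registered 8-bit frames; the widening cast avoids unsigned wrap-around:

```python
import numpy as np

def nir_resultant(frame_on: np.ndarray, frame_off: np.ndarray) -> np.ndarray:
    """Resultant NIR image: (visible + NIR) ON frame minus visible-only
    OFF frame, clipped back to the valid 8-bit intensity range."""
    diff = frame_on.astype(np.int16) - frame_off.astype(np.int16)
    return np.clip(diff, 0, 255).astype(np.uint8)
```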
Preferred is a method of processing images for eye tracking as described above or as described above as being preferred, further comprising: aligning a first image frame captured and a second image frame captured consecutively by: determining, by way of the image processing unit, at least one identical feature between the first image frame captured and the second image frame captured consecutively; and matching the at least one identical feature determined between the first image frame captured and the second image frame captured consecutively, prior to obtaining the one or more resultant images.
The advantage of the above aspect of this disclosure is to process the images captured by the imaging module by feature matching and image alignment, to produce full-resolution resultant images. In particular, this aspect is advantageous for sensing fast-moving objects.
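The disclosure does not name a particular matching algorithm; as one plausible sketch, ORB features with a RANSAC homography (OpenCV) could align the OFF frame onto the ON frame before subtraction:

```python
import cv2
import numpy as np

def align_off_to_on(frame_on: np.ndarray, frame_off: np.ndarray) -> np.ndarray:
    """Match identical features across the two frames and warp the OFF
    frame onto the ON frame, so moving content lines up before subtraction."""
    orb = cv2.ORB_create(500)
    kp_on, des_on = orb.detectAndCompute(frame_on, None)
    kp_off, des_off = orb.detectAndCompute(frame_off, None)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des_on, des_off), key=lambda m: m.distance)
    src = np.float32([kp_off[m.trainIdx].pt for m in matches[:50]]).reshape(-1, 1, 2)
    dst = np.float32([kp_on[m.queryIdx].pt for m in matches[:50]]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)  # robust to outliers
    h, w = frame_on.shape[:2]
    return cv2.warpPerspective(frame_off, H, (w, h))
```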
Preferred is a method of processing images for eye tracking as described above or as described above as being preferred, in which: the eye tracking is selected from the group consisting of: estimation of line of sight of a user, for purposes of virtual reality (VR) applications; estimation of eye position of a user, for purposes of augmented reality (AR) applications; and estimation of a state of a user, for purposes of driver monitoring applications.
The advantage of the above aspect of this disclosure is to yield resultant images suitable for eye tracking, for purposes of virtual reality (VR) applications, augmented reality (AR) applications and driver monitoring applications.
Preferred is a method of processing images for eye tracking as described above or as described above as being preferred, further comprising: obtaining the one or more resultant images comprising an image in a red, blue, green (RGB) wavelength and an infrared (IR) wavelength by capturing the at least one image, by way of the imaging module, when the illumination module is operating in the ON state; and consecutively obtaining the one or more resultant images comprising an image in a red, blue, green (RGB) wavelength and a visible light wavelength by capturing the at least one image, by way of the imaging module, when the illumination module is operating in the OFF state.
The advantage of the above aspect of this disclosure is to obtain one or more resultant images containing image information in the RGB and IR wavelengths, by controlling the state of the illumination module so as to process at least two consecutive images in at least two different types of wavelength, more particularly within a range of 380nm to 800nm. An advantage of obtaining a full-resolution resultant image containing image information in the RGB and IR wavelengths, or within a range of 380nm to 1100nm, is that further eye-tracking analysis can be performed in the aforesaid wavelength range.
Preferred is a method of processing images for eye tracking as described above or as described above as being preferred, further comprising: subtracting, by way of the image processing unit, a visible light wavelength from the one or more resultant images obtained.
The advantage of the above aspect of this disclosure is to apply an image subtraction method to remove the visible light wavelength, such that the one or more resultant images contain image information within a range of selected wavelengths without visible light. A further advantage of this aspect is the suppression of undesirable noise signals.
The objective of this disclosure is solved by an image processing system for processing images for an eye tracking function, the system comprising: an illumination module operable to switch between an ON state and an OFF state; an imaging module operable to capture at least one image when the illumination module operates in the ON state, and to consecutively capture at least one image when the illumination module operates in the OFF state; and an image processing unit operable to process information of the at least one image captured; characterized in that: the image processing unit is further operable to obtain one or more resultant images in a selected group of wavelengths, such that the one or more resultant images are suitable for eye tracking.
An advantage of the above described aspect of this disclosure is an image processing system for processing images suitable for eye tracking, in which the image information obtained is selected from a range of wavelengths, such that specific details of the image information may be analysed for eye tracking.
Preferred is an image processing system as described above or as described above as being preferred, in which: the imaging module is selected from the group consisting of: an image sensor operable to capture images in a combination of RGB wavelength and IR wavelength; and a global shutter sensor.
The advantage of the above aspect of this disclosure is to yield an image processing system using only a single image sensor configuration suitable for capturing full-resolution images in multiple wavelengths. Advantageously, this yields an image processing system which requires minimal hardware.
Preferred is an image processing system as described above or as described above as being preferred, in which: the image information of the one or more resultant images comprises at least one range of wavelengths of the group comprising: * 380nm to 800nm in a red, blue, green (RGB) range; * 700nm to 1100nm in a near-infrared (NIR) range; and * 400nm to 700nm in a visible light range.
The advantage of the above aspect of this disclosure is to yield multiple images, each image containing information in a different wavelength, which may be processed by the image processing system disclosed herein.
Preferred is an image processing system as described above or as described above as being preferred, in which: the illumination module is selected from the group consisting of: * light-emitting diode (LED); and * vertical cavity surface emitting laser (VCSEL).
The advantage of the above aspect of this disclosure yields different types of illumination module suitable for use in the image processing system disclosed herein.
The objective of this disclosure is solved by a computer program product comprising instructions to cause the image processing system as described above or as described above as being preferred to execute the steps of the method as described above or as described above as being preferred.
The advantage of the above aspect of this disclosure is to yield a computer program product to cause the image processing system to execute the steps of the method as disclosed herein for processing images captured.
The objective of this disclosure is solved by a computer-readable medium having stored thereon the computer program product as described above or as described above as being preferred.
The advantage of the above aspect of this disclosure is to yield a computer-readable medium for storing the computer program product, to cause the image processing system to execute the steps of the method as disclosed herein for processing images captured.
BRIEF DESCRIPTION OF DRAWINGS
Other objects and aspects of this disclosure will become apparent from the following description of embodiments with reference to the accompanying drawings, in which: FIG. 1 shows a system block diagram in accordance with a preferred embodiment.
FIG. 2 shows exemplary image frame intervals of images captured in accordance with a preferred embodiment.
FIG. 3 shows a flowchart 300 for processing images for eye tracking in a preferred embodiment.
FIG. 4a-c shows an exemplary frame-by-frame image subtraction process in a preferred embodiment.
In various embodiments described by reference to the above figures, like reference signs refer to like components in several perspective views and/or configurations.
DETAILED DESCRIPTION OF EMBODIMENTS
The following detailed description is merely exemplary in nature and is not intended to limit the disclosure or the application and uses of the disclosure. Furthermore, there is no intention to be bound by any theory presented in the preceding background of the disclosure or the following detailed description. It is the intent of this disclosure to present an image processing method and system which yields full resolution images for eye tracking purposes.
Hereinafter, the terms "first", "second", "third" and the like used in the context of this disclosure may refer to modifications of different elements in accordance with various exemplary embodiments, but are not limited thereto. The expressions may be used to distinguish one element from another element, regardless of sequence or importance. By way of an example, "a first image" and "a second image" may indicate different images regardless of order or importance. On a similar note, a first image may be referred to as the second image and vice versa without departing from the scope of this disclosure.
FIG. 1 shows a system block diagram 100 in accordance with a preferred embodiment. In particular, system 100 shows an image processing system for eye tracking functions. In an embodiment, system 100 includes a control module 102 for executing image processing functions and an imaging module 104 for capturing images. The imaging module 104 further includes an image sensor 106 with a lens or imaging optics 110 for receiving light rays, and an illumination module 108 having illumination optics 112. The imaging module 104 may include circuitry, for example a driver circuit and/or a digital-to-analog converter (DAC), for driving the imaging module 104. The control module 102 and imaging module 104 may be in electrical communication.
As shown in FIG. 1, the control module 102 is a system-on-chip (SoC) operable to control the entire image processing system 100 and execute algorithms. The image sensor 106 may be a global shutter type image sensor. Examples of imaging optics 110 include lenses and/or optical filters suitable for working with the selected wavelengths operable by the image sensor 106. As shown in FIG. 1, the image processing system 100 further includes the illumination module 108. Suitable types of illumination module 108 include light-emitting diodes (LED) and vertical cavity surface emitting lasers (VCSEL). The image processing system 100 may further include illumination optics 112; examples of illumination optics 112 include diffusors or reflectors. In an embodiment, the image sensor 106 of the imaging module is operable to capture a first image when the illumination module 108 is operating in an ON state, and to consecutively capture at least a second image when the illumination module 108 is operating in an OFF state. By alternately switching the illumination module 108 between an ON mode and an OFF mode, the image sensor 106 will receive sensor signals within a certain wavelength at a predetermined time interval. This process of capturing a first image when the illumination module 108 is operating in an ON state, and consecutively capturing at least a second image when the illumination module 108 is operating in an OFF state, may be adjusted or designed according to exposure time and gain settings based on image processing requirements. In this disclosure, the focus is on eye tracking functions.
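Purely as an illustrative sketch of how the modules of FIG. 1 relate (the class and attribute names below are hypothetical, not taken from the disclosure):

```python
from dataclasses import dataclass

@dataclass
class IlluminationModule:            # 108, behind illumination optics 112
    """NIR emitter, e.g. an LED or VCSEL driven by a driver circuit."""
    on: bool = False
    dimming_level: float = 1.0       # analog dimming, 0..1

@dataclass
class ImagingModule:                 # 104, sensor 106 behind imaging optics 110
    """Global-shutter RGB-IR sensor with adjustable capture settings."""
    exposure_ms: float = 5.0
    gain: float = 1.0
```

A control module (the SoC 102) would own instances of both and sequence them as shown in FIG. 2.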
FIG. 2 shows a switching logic mode 200 in accordance with an exemplary embodiment. The aforesaid switching logic mode can be achieved by switching signals generated either by a controller or by logic circuit chips.
As shown in FIG. 2, a pulse wave (A) indicates an exemplary image frame interval 202, illustrating the images captured within one interval by an imaging module as disclosed herein. A pulse wave (B) illustrates the switching between an ON mode and an OFF mode by an illumination module, simultaneous with the capturing of images within one image frame interval 202. By way of an example, when the illumination module is operating in an ON mode 208, a first image frame 204 captured contains a combination of selected wavelengths. Preferably, the first image frame comprises at least two selected wavelengths. In this exemplary embodiment, the first image frame 204 comprises a combination of selected wavelengths, i.e. the red (R), green (G), blue (B) wavelengths and the infrared (IR) wavelength. In response to the illumination module switching to an OFF mode 210, a second image frame 206 is consecutively captured. In this exemplary embodiment, the second image frame 206, captured while the illumination module is in the OFF mode, contains only RGB wavelengths. The aforesaid image frame interval 202 repeats as the illumination module continues to switch between an ON mode 208' and an OFF mode 210', capturing further first image frames 204' and second image frames 206'. A brightness delta (Δ) may be obtained by finding the difference in brightness between an image frame and the corresponding frame of the subsequent interval, e.g. between the second image frame 206 and the subsequent second image frame 206'.

A main advantage of this switching logic mode 200 is that at least two types of images in different wavelengths are captured. The images captured may thus be processed to produce two distinct resultant images, each comprising image information of a selected range of wavelengths, i.e. a first image frame containing RGB and IR while a second image frame contains RGB only, so that more information may be obtained from the resultant images.

In this exemplary embodiment, the image sensor is operable to detect a combination of at least two types of wavelengths. A suitable example of an image sensor operable to capture sensing signals in dual wavelengths is an image sensor operable to sense red, green, blue (RGB) and infrared (IR) wavelengths. In this exemplary embodiment, the illumination module is a near-infrared (NIR) light source. An advantage of this embodiment is the production of high-resolution images captured under dimly lit ambient conditions; an exemplary scenario is capturing images for the eye tracking function of an operator sitting within the interior of a motor vehicle operating at night.
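One image frame interval 202 could then be sequenced as in the sketch below; `grab()` stands in for a hypothetical sensor-driver call and is not defined by the disclosure:

```python
def capture_interval(camera, illuminator):
    """One frame interval 202: frame 204 with NIR ON (RGB + IR),
    then frame 206 with NIR OFF (RGB only)."""
    illuminator.on = True            # ON mode 208
    frame_204 = camera.grab()        # RGB + IR wavelengths
    illuminator.on = False           # OFF mode 210
    frame_206 = camera.grab()        # RGB wavelengths only
    return frame_204, frame_206
```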
FIG. 3 shows a flowchart 300 for processing images for eye tracking in a preferred embodiment. At step 302, the control module 102 executes a command causing the illumination module 108 to operate in an ON mode and the imaging module 104 to capture a first image. While the illumination module 108 is operating in the ON mode, the image information captured in the first image comprises image information in dual wavelengths.
The first image may comprise image information in a range of 400nm to 700nm, the visible light range, and may further comprise image information in a range of 700nm to 1100nm, the NIR range. Accordingly, the control module 102 is operable to determine a pixel value (P1) of the first image, which includes both visible light optical power and NIR light optical power.
At the next step 304, the control module 102 consecutively executes a command causing the illumination module 108 to operate in an OFF mode and the imaging module 104 to consecutively capture at least one second image. When the illumination module 108 is operating in the OFF mode, the image information captured in the at least one second image comprises image information in a single wavelength range of 400nm to 700nm, the visible light range. Accordingly, the control module 102 may determine a pixel value (P2) of the second image, which will include visible light optical power and any (ambient) NIR light optical power.
Optionally, at step 306, the control module 102 may further process the at least one second image captured when the illumination module 108 is operating in the OFF mode, to obtain a high-resolution colour image containing image information in the 380nm to 800nm red, blue, green (RGB) range only. An example of a suitable image processing step is a demosaicing algorithm, such as a colour filter array or colour filter mosaic interpolation.

At step 308, the control module 102 may further process the first image and the at least one second image captured, by subtracting the second pixel value (P2) determined from the first pixel value (P1) determined, such that a resultant image comprising image information in the 700nm to 1100nm near-infrared (NIR) range is yielded. Advantageously, the resultant image produced is a full-resolution NIR image. Processing full-resolution NIR images for eye tracking is of importance in the field of machine vision applications, in particular where eye tracking is applicable; by way of example, estimation of line of sight of a user, for purposes of virtual reality (VR) applications; estimation of eye position of a user, for purposes of augmented reality (AR) applications; and estimation of a state of a user, for purposes of driver monitoring applications.
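For step 306, a conventional demosaicing call can stand in as a sketch; the Bayer pattern below is an assumption, since RGB-IR sensors use mosaic layouts that the disclosure does not specify:

```python
import cv2
import numpy as np

def demosaic_off_frame(frame_off: np.ndarray) -> np.ndarray:
    """Step 306: interpolate the raw OFF frame into a full colour image
    (assumes an RGGB-like pattern for illustration only)."""
    return cv2.cvtColor(frame_off, cv2.COLOR_BayerRG2BGR)
```

The step 308 subtraction itself follows the `nir_resultant` sketch given earlier in the summary.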
For clarity and brevity, the principles of the image subtraction process in this disclosure are explained in detail below. As mentioned above, one of the advantages of the configuration using a single image sensor 106, capturing RGB sensing signals and IR sensing signals, together with an NIR illumination module 108, is the ability to capture images for the eye tracking function under dimly lit conditions.
The aforesaid configuration addresses some of the problems in eye tracking image processing systems, i.e. the lack of high-quality images with which to accurately estimate the position of the eyes.
Under lit conditions, for example during daytime, visible light imaging pixels and NIR imaging pixels share the same exposure time during image capture, but with different quantum efficiency (QE) and irradiance at the pixel surface. The brightness difference between visible light optical power and NIR light optical power can lead to poor image quality. To counter this, the brightness of the NIR illumination module needs to be adjusted. This can be achieved by using a driver chip with an analog dimming function. However, analog dimming is not controllable via direct inter-chip communication; a control module for generating an analog signal, such as a controllable DAC chip or a pulse width modulation (PWM) based resistor-capacitor (RC) circuit, RC network or RC filter, may be necessary to control this analog dimming function.
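For orientation, the PWM-plus-RC approach rests on a simple relationship: when the RC time constant is much longer than the PWM period, the filter output settles near the duty-cycle-weighted supply voltage. A sketch (the 3.3 V supply is an assumption):

```python
def pwm_to_analog_volts(duty: float, v_supply: float = 3.3) -> float:
    """Steady-state RC-filtered PWM output: mean voltage = duty * supply,
    valid when the RC time constant >> PWM period (ripple neglected)."""
    if not 0.0 <= duty <= 1.0:
        raise ValueError("duty cycle must lie in [0, 1]")
    return duty * v_supply
```

A 40% duty cycle on a 3.3 V rail would thus set roughly 1.3 V as the analog dimming input.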
In contrast, the control module 102 of this disclosure calculates a ratio between an average value of the first pixel intensity value (P1), from the at least one image captured when the at least one illumination module is operating in the ON state, and an average value of the second pixel intensity value (P2), from the at least one image captured consecutively when the at least one illumination module is operating in the OFF state, and controls the analog dimming function of the imaging module 104 in response to the ratio calculated. The calculated ratio is used as additional feedback to control the analog dimming setting for the subsequent image frame.
Under extreme ambient lighting conditions, an NIR illumination module may not be able to supply sufficient brightness to meet the expected ratio for yielding a full-resolution resultant image. Under such circumstances, as a mitigation, an optical neutral density (ND) filter (not shown) may be combined with an NIR long-pass filter to reduce the ambient visible lighting brightness while maintaining the NIR light brightness.
Turning now to FIG. 4a-c, the generation of full-resolution NIR images at step 308 can be achieved using pixelwise processing. As shown in FIG. 4a, an image containing image information in the RGB wavelength and the IR wavelength is the minuend, while the corresponding image represented by FIG. 4b, containing image information in the RGB wavelength only, is the subtrahend. The respective images may be processed by the image processing system 100 at steps 302 and 308. Applying the image subtraction algorithm, the pixelwise post-processing formulas can be defined as follows:

* P3(IR)(1,1) = P1(B + IR)(1,1) - P2(B)(1,1)
* P3(IR)(1,2) = P1(G + IR)(1,2) - P2(G)(1,2)
* P3(IR)(1,3) = P1(R + IR)(1,3) - P2(R)(1,3)
* ...
* P3(IR)(2,2) = P1(IR)(2,2) - P2(IR)(2,2)

wherein P = pixel intensity value (e.g. P1 = first pixel intensity value), R = red colour value, B = blue colour value, G = green colour value, and IR = infrared value.

Thus, it can be seen that an image processing method and system having the advantage of yielding full resolution in selected wavelengths has been provided.
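Note that all the per-site formulas reduce to one elementwise subtraction over the raw mosaic, since each site in P1 carries its visible component plus NIR while the matching site in P2 carries the visible (or ambient IR) component alone. A tiny worked example on a single assumed 2x2 tile (values invented for illustration):

```python
>>> import numpy as np
>>> p1 = np.array([[150, 180], [170, 90]], dtype=np.int16)  # B+IR, G+IR / R+IR, IR
>>> p2 = np.array([[110, 120], [130, 25]], dtype=np.int16)  # B,    G    / R,    IR
>>> p1 - p2   # P3: the NIR component recovered at every pixel site
array([[40, 60],
       [40, 65]], dtype=int16)
```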
More advantageously, the image processing system is operable to consecutively capture images in different wavelengths, thus processing at least two images of distinct wavelengths within an image frame interval. By determining the pixel intensity value of each image pixel, the image processing method and system as disclosed herein yield full-resolution images under different ambient lighting conditions, achieving accurate eye tracking. While exemplary embodiments have been presented in the foregoing detailed description of the disclosure, it should be appreciated that a vast number of variations exist.
It should further be appreciated that the exemplary embodiments are only examples, and are not intended to limit the scope, applicability, operation, or configuration of the disclosure in any way. Rather, the foregoing detailed description will provide those skilled in the art with a convenient road map for implementing an exemplary embodiment of the disclosure, it being understood that various changes may be made in the function and arrangement of elements and method of operation described in the exemplary embodiment without departing from the scope of the disclosure as set forth in the appended claims.
Reference Signs
100 System
102 Control module
104 Imaging module
106 Image sensor
108 Illumination module
110 Imaging optics
112 Illumination optics
114 Object
116 Optical power
200 Switching logic mode
202 Image frame interval
204 RGB + IR wavelength image frame
206 RGB image frame
208 Illumination module in ON mode
210 Illumination module in OFF mode
300 Flowchart
302 Capturing dual wavelength image
304 Capturing single wavelength image
306 Demosaicing to yield image in a selected wavelength
308 Image subtraction to yield full resolution image in a selected wavelength
Claims (15)
- Patent claims 1. A method of processing images for eye tracking, the method comprising: capturing at least one image, by way of an imaging module, when an illumination module is operating in an ON state; and consecutively capturing at least one image, by way of the imaging module, when the illumination module is operating in an OFF state; characterized by: receiving and processing, by way of an image processing unit, the at least one image captured when the illumination module is operating in the ON state and the at least one image captured when the illumination module is operating in the OFF state, such that one or more resultant images suitable for eye tracking are obtained, wherein the one or more resultant images comprise image information of at least one eye and of a selected range of wavelengths.
- 2. The method of claim 1, wherein the image information of the one or more resultant images comprises at least one range of wavelengths of the group consisting of: * 380nm to 800nm in a red, blue, green (RGB) range; * 700nm to 1100nm in a near-infrared (NIR) range; and * 400nm to 700nm in a visible light range.
- 3. The method of claims 1-2, further comprising: determining, by way of the image processing unit, a first pixel intensity value (P1) from the at least one image captured when the at least one illumination module is operating in the ON state; and determining, by way of the image processing unit, a second pixel intensity value (P2) from the at least one image captured consecutively when the at least one illumination module is operating in the OFF state.
- 4. The method of claim 3, further comprising: calculating a ratio between an average value of the first pixel intensity value (P1) from the at least one image captured when the at least one illumination module is operating in the ON state and an average value of the second pixel intensity value (P2) from the at least one image captured consecutively when the at least one illumination module is operating in the OFF state; and controlling an analog dimming function of the imaging module in response to the ratio calculated.
- 5. The method of claims 1-4, further comprising: obtaining the one or more resultant images comprising an image in a near-infrared (NIR) wavelength by subtracting the second pixel intensity value (P2), determined from the at least one image captured when the illumination module is operating in the OFF state, from the first pixel intensity value (P1), determined from the at least one image captured when the illumination module is operating in the ON state.
- 6. The method according to any one of the preceding claims, further comprising aligning a first image frame captured and a second image frame captured consecutively by: determining, by way of the image processing unit, at least one identical feature between the first image frame captured and the second image frame captured consecutively; and matching the at least one identical feature determined between the first image frame captured and the second image frame captured consecutively, prior to obtaining the one or more resultant images.
- 7. The method according to any one of the preceding claims, wherein the eye tracking is selected from the group consisting of: estimation of line of sight of a user, for purposes of virtual reality (VR) applications; estimation of eye position of a user, for purposes of augmented reality (AR) applications; and estimation of a state of a user, for purposes of driver monitoring applications.
- 8. The method according to any one of the preceding claims, further comprising: obtaining the one or more resultant images comprising an image in a red, blue, green (RGB) wavelength and an infrared (IR) wavelength by capturing the at least one image, by way of the imaging module, when the illumination module is operating in the ON state; and consecutively obtaining the one or more resultant images comprising an image in a red, blue, green (RGB) wavelength and a visible light wavelength by capturing the at least one image, by way of the imaging module, when the illumination module is operating in the OFF state.
- 9. The method according to any one of the preceding claims, further comprising: subtracting, by way of the image processing unit, a visible light wavelength from the one or more resultant images obtained.
- 10. An image processing system for processing images for an eye tracking function, the system comprising: an illumination module operable to operate between an ON state and an OFF state; an imaging module operable to capture at least one image when the illumination module operates in the ON state, and to consecutively capture at least one image when the illumination module operates in the OFF state; and an image processing unit operable to process information of the at least one image captured; characterized in that: the image processing unit is further operable to receive and process the at least one image captured when the illumination module is operating in the ON state and the at least one image captured when the illumination module is operating in the OFF state, such that one or more resultant images suitable for eye tracking are obtained, wherein the one or more resultant images comprise image information of at least one eye and of a selected range of wavelengths.
- 11. The system of claim 10, wherein the imaging module is selected from the group consisting of: an image sensor operable to capture images in a combination of RGB wavelength and IR wavelength; and a global shutter sensor.
- 12. The system of claims 10-11, wherein the image information of the one or more resultant images comprises at least one range of wavelengths of the group comprising: * 380nm to 800nm in a red, blue, green (RGB) range; * 700nm to 1100nm in a near-infrared (NIR) range; and * 400nm to 700nm in a visible light range.
- 13. The system of claims 10-12, wherein the illumination module is selected from the group consisting of: * light-emitting diode (LED); and * vertical cavity surface emitting laser (VCSEL).
- 14. A computer program product comprising instructions to cause the image processing system of claims 10-13 to execute the steps of the method of claims 1 to 9.
- 15.A computer-readable medium having stored thereon the computer program of claim 14.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
GB2113545.4A GB2611289A (en) | 2021-09-23 | 2021-09-23 | An image processing system and method thereof |
PCT/EP2022/073654 WO2023046406A1 (en) | 2021-09-23 | 2022-08-25 | An image processing system and method thereof |
EP22769591.3A EP4405783A1 (en) | 2021-09-23 | 2022-08-25 | An image processing system and method thereof |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
GB2113545.4A GB2611289A (en) | 2021-09-23 | 2021-09-23 | An image processing system and method thereof |
Publications (2)
Publication Number | Publication Date |
---|---|
GB202113545D0 GB202113545D0 (en) | 2021-11-10 |
GB2611289A true GB2611289A (en) | 2023-04-05 |
Family
ID=78399700
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
GB2113545.4A Pending GB2611289A (en) | 2021-09-23 | 2021-09-23 | An image processing system and method thereof |
Country Status (3)
Country | Link |
---|---|
EP (1) | EP4405783A1 (en) |
GB (1) | GB2611289A (en) |
WO (1) | WO2023046406A1 (en) |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7522344B1 (en) * | 2005-12-14 | 2009-04-21 | University Of Central Florida Research Foundation, Inc. | Projection-based head-mounted display with eye-tracking capabilities |
EP3187100A1 (en) * | 2014-08-29 | 2017-07-05 | Alps Electric Co., Ltd. | Line-of-sight detection device |
WO2017134918A1 (en) * | 2016-02-01 | 2017-08-10 | Alps Electric Co., Ltd. | Line-of-sight detection device |
US20180270436A1 (en) * | 2016-04-07 | 2018-09-20 | Tobii Ab | Image sensor for vision based on human computer interaction |
WO2021087573A1 (en) * | 2019-11-07 | 2021-05-14 | Seeing Machines Limited | High performance bright pupil eye tracking |
- 2021-09-23: GB2113545.4A filed (GB); published as GB2611289A, status active, pending
- 2022-08-25: EP22769591.3A filed (EP); published as EP4405783A1, status active, pending
- 2022-08-25: PCT/EP2022/073654 filed (WO); published as WO2023046406A1, active application filing
Also Published As
Publication number | Publication date |
---|---|
EP4405783A1 (en) | 2024-07-31 |
GB202113545D0 (en) | 2021-11-10 |
WO2023046406A1 (en) | 2023-03-30 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10230873B2 (en) | Vehicle vision system with color correction | |
US10063786B2 (en) | Vehicle vision system with enhanced low light capabilities | |
US20200244929A1 (en) | Vehicular driving assistance system | |
US10356337B2 (en) | Vehicle vision system with gray level transition sensitive pixels | |
US7646884B2 (en) | Active night vision image intensity balancing system | |
CN110234545B (en) | Imaging system and method for vehicle occupant identification and monitoring | |
CN107709096A (en) | For handling streaming transmitting video image to correct the system and method for the flickering of amplitude modulated light | |
US20160119527A1 (en) | Vehicle vision system camera with dual filter | |
US20210127051A1 (en) | Camera fusion and illumination for an in-cabin monitoring system of a vehicle | |
CN110428637A (en) | A kind of road gate grasp shoot method and road gate capturing system | |
JP7057818B2 (en) | Low light imaging system | |
GB2611289A (en) | An image processing system and method thereof | |
JP6607128B2 (en) | Virtual image display device, virtual image display method, and control program | |
JP6655504B2 (en) | Image processing apparatus, image processing system, moving object, and image processing method | |
US12122298B2 (en) | Camera mirror system with IR LED night vision system | |
CN113168515A (en) | Scanning device for reducing search space of field of view | |
US20220055540A1 (en) | Camera mirror system with ir led night vision system | |
US20230342891A1 (en) | An image processing method for removal of glare spots and system for the same | |
US20240062388A1 (en) | Regional spatial enhancement of rgb-ir image | |
CN116263987A (en) | Method for capturing images in a vehicle interior and vehicle interior camera system | |
CN117333859A (en) | License plate recognition method, storage medium and camera system | |
WO2019003361A1 (en) | Imaging system and imaging device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| COOA | Change in applicant's name or ownership of the application | Owner name: CONTINENTAL AUTOMOTIVE TECHNOLOGIES GMBH. Free format text: FORMER OWNER: CONTINENTAL AUTOMOTIVE GMBH |