WO2013098708A2 - Multispectral data acquisition - Google Patents

Multispectral data acquisition

Info

Publication number
WO2013098708A2
WO2013098708A2 (PCT/IB2012/057406)
Authority
WO
WIPO (PCT)
Prior art keywords
image sensor
light
sensor
line
lighting conditions
Prior art date
Application number
PCT/IB2012/057406
Other languages
English (en)
Other versions
WO2013098708A3 (fr)
Inventor
Tommaso Gritti
Frederik Jan De Bruijn
Stephanus Joseph Johannes Nijssen
Ruben Rajagopalan
Constant Paul Marie Jozef Baggen
Lorenzo Feri
Robert James Davies
Original Assignee
Koninklijke Philips Electronics N.V.
Priority date
Filing date
Publication date
Application filed by Koninklijke Philips Electronics N.V. filed Critical Koninklijke Philips Electronics N.V.
Publication of WO2013098708A2 publication Critical patent/WO2013098708A2/fr
Publication of WO2013098708A3 publication Critical patent/WO2013098708A3/fr


Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01J - MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J 3/00 - Spectrometry; Spectrophotometry; Monochromators; Measuring colours
    • G01J 3/02 - Details
    • G01J 3/10 - Arrangements of light sources specially adapted for spectrometry or colorimetry
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 - Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60 - Control of cameras or camera modules
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01J - MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J 3/00 - Spectrometry; Spectrophotometry; Monochromators; Measuring colours
    • G01J 3/28 - Investigating the spectrum
    • G01J 3/2823 - Imaging spectrometer
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01J - MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J 3/00 - Spectrometry; Spectrophotometry; Monochromators; Measuring colours
    • G01J 3/28 - Investigating the spectrum
    • G01J 3/30 - Measuring the intensity of spectral lines directly on the spectrum itself
    • G01J 3/32 - Investigating bands of a spectrum in sequence by a single detector
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 25/00 - Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N 25/50 - Control of the SSIS exposure
    • H04N 25/53 - Control of the integration time
    • H04N 25/531 - Control of the integration time by controlling rolling shutters in CMOS SSIS
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 - Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/10 - Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths

Definitions

  • Embodiments of the present invention generally relate to the field of imaging, and more particularly, to multispectral data acquisition.
  • Image sensors are becoming more and more ubiquitous: from high-quality professional video cameras, to surveillance cameras robust to harsh weather, industrial cameras for machine vision, cameras for use in computer game interfaces, and, more recently, cameras in smart phones. Hence there is an extremely large set of applications in which cameras are the preferred sensor.
  • Example technologies for use in image sensors include Charge-Coupled Devices (CCD) and Complementary Metal Oxide Semiconductor (CMOS).
  • However, the scope for improving image sensing is by no means limited to ever-increasing spatial resolution.
  • Another possible extension is to increase the number of wavelengths separately captured by the sensor.
  • One useful property of the typical silicon substrates used in CCD and CMOS cameras is their sensitivity to a relatively large range of electromagnetic radiation around the visible wavelengths: the silicon sensitivity may not be constant, but may for example be sufficient to capture radiation from 360 nm, close to visible blue, to 900 nm, well into the near-infra-red (NIR) domain. This property is typically employed to capture color images.
  • A set of red (R), green (G) and blue (B) filters is formed over groups of neighbouring pixels, so as to form a respective group of three constituent pixels R, G and B for each image point to be captured (the individual constituent RGB pixels in a group are sometimes referred to as sub-pixels, and the group or image point is sometimes referred to as a color pixel).
  • Multispectral (MS) imaging is an extension over trichromatic image sensing which captures more than three wavelengths for a given spatial image point. Its implementation may greatly depend on the particular application to which it is targeted, which determines the number of separate wavelengths to be measured, the accuracy in the separation of each band (i.e. whether the separate sub-bands should completely separate or whether limited overlap is allowed), the range of distances of the objects to be measured, and, naturally, the price of the solution.
  • multispectral imaging refers to the imaging technology capable of capturing image data at more than three frequencies across the electromagnetic spectrum. Multispectral imaging allows extraction of additional information that the human eye fails to capture with its receptors for red, green and blue (RGB).
  • the multispectral data or information acquired with MS imaging can be used in a variety of fields to identify people or other objects or their constituent material(s) on the basis of their respective absorption or emission spectra, e.g. based on visible and/or infrared imaging.
  • Such fields include for example object recognition and classification, industrial vision, food verification, material recognition, people detection, tracking, and satellite remote sensing to be able to distinguish the material of objects extremely far from the sensor.
  • One way to acquire a multispectral image is to use a special instrument that is intrinsically designed to capture more than three frequencies for each image point, for example comprising separate dedicated sensors for each wavelength and/or dedicated optics for each wavelength.
  • Such solutions work by trading spatial resolution for spectral resolution.
  • a spectrometer or prism may be used to split an incoming light beam from each image point into its constituent spectrum, to be detected over a group of pixels each arranged to detect the light from a different part of the spectrum for a given image point.
  • Alternatively, such an instrument may comprise a camera with a special filter pattern designed with more filter frequencies than the conventional red, green and blue.
  • Such techniques may be used to acquire multispectral data with relatively high accuracy.
  • However, their cost is generally prohibitive.
  • Moreover, use and maintenance of these special instruments requires skilled specialists.
  • Other solutions enable the acquisition of multispectral data by means of a more common image sensor, for example a sensor embedded in a standard RGB camera, which does not in itself intrinsically capture light at more than three frequencies at once. Instead the object in question is illuminated using lighting units providing different colors, and different samples are captured for each image point under each of the different illuminations.
  • Such solutions trade off temporal resolution for spectral resolution. They require multiple light sources, with different spectral distributions, to sequentially illuminate the scene. For example, one such solution is presented in "Multispectral Imaging Using Multiplexed Illumination" (Park, J.
  • embodiments of the present invention provide apparatus for use in acquiring multispectral data from an object illuminated by a light source including a plurality of different light emitting elements, wherein at least one of the plurality of light emitting elements is activated at a time instant to create varying spectral lighting conditions over time; the apparatus comprising: an image sensor configured such that lines of the image sensor are exposed progressively to acquire light signal samples under the varying lighting conditions for a line of the image sensor; and a module configured to take a multispectral measurement of said object by collecting multiple light signal samples for a same one of said lines of the image sensor under different ones of said spectral lighting conditions, and storing the samples for the same line in a same data set.
  • embodiments of the present invention provide a method for acquiring multispectral data from an object illuminated by a light source including a plurality of different light emitting elements, wherein at least one of the plurality of light emitting elements is activated at a time instant to create varying spectral lighting conditions over time; the method comprising: progressively exposing lines of an image sensor to acquire light signal samples under the varying lighting conditions for a line of the image sensor; and collecting multiple light signal samples for a same one of said lines of the image sensor under different ones of said spectral lighting conditions, and storing the samples for the same line as a multispectral measurement of said object in a same data set.
  • embodiments of the present invention provide a computer program product for use in acquiring multispectral data from an object illuminated by a light source including a plurality of different light emitting elements, wherein at least one of the plurality of light emitting elements is activated at a time instant to create varying spectral lighting conditions over time.
  • the computer program product comprises code embodied on a computer-readable storage medium and configured so as when executed on a processing apparatus to perform operations of: progressively exposing lines of an image sensor to acquire light signal samples under the varying lighting conditions for a line of the image sensor; and collecting multiple light signal samples for a same one of said lines of the image sensor under different ones of said spectral lighting conditions, and storing the samples for the same line as a multispectral measurement of said object in a same data set.
  • embodiments of the present invention provide apparatus for use in acquiring multispectral data from an object illuminated by different spectral lighting conditions over time, the apparatus comprising: an image sensor and a separate light sensor arranged such that when the image sensor faces said object the light sensor faces said light source; and a multispectral measurement module configured to take a multispectral measurement of said object by capturing multiple light signal samples for a same spatial element of the said image sensor under different ones of said spectral lighting conditions, using the light sensor to determine under the influence of which of said lighting conditions the spatial element was exposed for each of said samples.
  • embodiments of the present invention provide a system for use in multispectral data acquisition.
  • the system comprises: a light source including a plurality of light emitting elements having different spectra, wherein at least one of the plurality of light emitting elements is activated at a time instant to create varying lighting conditions over time; and an image sensor configured such that lines of the image sensor are exposed progressively to acquire light signal samples under the varying lighting conditions for a line of the image sensor.
  • embodiments of the present invention provide a method for multispectral data acquisition.
  • the method comprises: creating varying lighting conditions over time by controlling a light source including a plurality of light emitting elements having different spectra, such that at least one of the plurality of light emitting elements is activated at a time instant; and progressively exposing lines of an image sensor to acquire light signal samples under the varying lighting conditions for a line of the image sensor.
  • embodiments of the present invention provide a computer program product comprising a computer program tangibly embodied on a computer-readable medium.
  • the computer program is configured to: create varying lighting conditions over time by controlling a light source including a plurality of light emitting elements having different spectra, such that at least one of the plurality of light emitting elements is activated at a time instant; and progressively expose lines of an image sensor to acquire light signal samples under the varying lighting conditions for a line of the image sensor.
  • the light source includes multiple light emitting elements each having a different spectrum, and is configured such that each light emitting element is, at least partially, emitting light while some of the other light emitting elements are not.
  • The image sensor is configured to work with a "rolling shutter effect." That is, exposure of the image sensor is done progressively line-by-line rather than globally. As such, it is possible to collect samples for any line of the image sensor illuminated by each individual light emitting element. In this way, meaningful pixel-level multispectral data encoded in the light source may be captured with sufficient accuracy, while removing any requirement of synchronization and/or calibration between the image sensor and lighting source. Accordingly, performance and ease of multispectral data acquisition may be significantly improved.
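The line-by-line exposure just described lends itself to a short simulation. The sketch below is illustrative only (the sensor line rate, frame height and the two-element alternating source are assumed example values, not taken from the patent): it shows how consecutive sensor lines end up exposed under alternating lighting conditions, producing strip patterns of the kind seen in rolling-shutter frames.

```python
# Illustrative sketch: a rolling shutter exposes sensor lines one after
# another, so each line samples whichever LED happens to be on at its
# exposure time. All numeric values below are assumed examples.

S_LR = 30000.0   # sensor line rate, lines per second (assumed)
L_MF = 298.0     # light modulation frequency in Hz (value used in FIG. 2 example)
NUM_LINES = 480  # lines per frame (assumed)

def active_led(t):
    """Return which of two alternating LEDs is on at time t (seconds)."""
    phase = (t * L_MF) % 1.0
    return 0 if phase < 0.5 else 1

def line_lighting(line_index, frame_start=0.0):
    """Lighting condition seen at the start of a line's exposure."""
    t = frame_start + line_index / S_LR
    return active_led(t)

# Consecutive lines exposed under the same LED form visible "strips".
strips = []
current = line_lighting(0)
count = 0
for i in range(NUM_LINES):
    led = line_lighting(i)
    if led == current:
        count += 1
    else:
        strips.append((current, count))
        current, count = led, 1
strips.append((current, count))
print(strips[:4])  # first few (LED index, consecutive-line count) strips
```

With these assumed numbers each strip spans roughly S_LR / (2 · L_MF) ≈ 50 lines, so every LED contributes samples to many lines in every frame without any synchronization.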
  • FIG. 1 is a high-level block diagram illustrating a system for use in multispectral data acquisition in accordance with an exemplary embodiment of the present invention
  • FIG. 2 is a schematic diagram illustrating an example of images captured with a standard image sensor under illumination with a light source including two light emitting elements with different spectra in accordance with an exemplary embodiment of the present invention
  • FIGS. 3A-3D are schematic diagrams illustrating the low-pass filtering characteristics of the acquisition process of an image sensor in accordance with an exemplary embodiment of the present invention
  • FIG. 4 is a schematic diagram illustrating examples of relationship between the light modulation time of the light emitting elements and the sensor line rate of the image sensor in accordance with an exemplary embodiment of the present invention
  • FIG. 5 is a flowchart illustrating a method for use in multispectral data acquisition in accordance with an exemplary embodiment of the present invention
  • FIG. 6 is a schematic diagram illustrating an example of a mobile user terminal capturing multispectral data from a scene
  • FIG. 7 is a schematic diagram illustrating an example of image data captured using front facing and rear facing image sensors
  • FIG. 8 is a schematic diagram illustrating example color filter transmission spectra for an RGB camera and example emission spectra of red, green and blue LEDs.
  • FIG. 9 is a schematic diagram illustrating examples of individual contributions of red, green and blue LEDs when captured with example red, green and blue color filters of a camera.
  • embodiments of the present invention provide a system, method, and computer program product for use in multispectral data acquisition.
  • exposure of the image sensor is performed progressively, while different light emitting elements are activated at different time instances so as to create time- varying lighting conditions.
  • The combination of such "rolling shutter" exposure and the varying lighting conditions is exploited to enable acquisition of multispectral data at pixel level, without any synchronization or calibration.
  • Referring to FIG. 1, a block diagram illustrating the high-level architecture of a system 100 for use in multispectral data acquisition in accordance with an exemplary embodiment of the present invention is shown.
  • the system 100 in accordance with embodiments of the present invention comprises a light source 102.
  • the light source 102 includes a plurality of light emitting elements 102-1, 102-2...102-N.
  • The light emitting elements may be, for example, light emitting diodes (LEDs) or organic light emitting diodes (OLEDs).
  • the light emitted by the light emitting elements 102-1, 102-2...102-N will illuminate the scene, and especially a target object 104. It is noted that though only one target object is shown in FIG. 1, there could be more than one object in the scene.
  • the light source is controlled such that at least one of the plurality of light emitting elements 102-1, 102-2...102- N is activated at each time instant to create light conditions that vary over time.
  • The light source may comprise a driver (not shown) responsible for activating different light emitting element(s) at different time instances according to a predefined control strategy.
  • For example, the light source 102 may be controlled so that the light emitting elements 102-1, 102-2...102-N are activated alternately or sequentially.
  • each of the light emitting elements 102-1, 102-2...102-N may be activated, at least for a short time span, while all the others are not active. Durations of the active times for different light emitting elements may or may not be equivalent.
  • At different time instants, the target object 104 will be illuminated by a different light emitting element included in the light source. As a result, lighting conditions that vary over time are created in the scene.
  • A combination in which the illuminations of different light emitting elements 102-1, 102-2...102-N overlap for some time is possible as well.
  • At a given time instant, one or more of the light emitting elements 102-1, 102-2...102-N may be activated.
  • At a different time instant, a different one or more of the light emitting elements 102-1, 102-2...102-N may be activated.
  • In this way, different combinations or subsets of the multiple light emitting elements are activated, thereby creating varying lighting conditions over time.
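The driver behaviour described above can be modelled as a repeating schedule of lighting conditions. The sketch below is a hypothetical illustration (the function name and condition tuples are not from the patent): each slot activates one element alone, optionally extended with overlapping pairs.

```python
# Hypothetical driver schedule: cycle through lighting conditions in which
# one or more of N emitters is active per time slot.
from itertools import cycle

def make_schedule(num_elements, include_overlaps=False):
    """Return an endless iterator of tuples of active element indices."""
    conditions = [(i,) for i in range(num_elements)]
    if include_overlaps:
        # Optionally add conditions where two adjacent emitters overlap.
        conditions += [(i, (i + 1) % num_elements) for i in range(num_elements)]
    return cycle(conditions)

schedule = make_schedule(3)
slots = [next(schedule) for _ in range(6)]
print(slots)  # [(0,), (1,), (2,), (0,), (1,), (2,)]
```

Because the sensor never needs to know this schedule, the driver can run free of any synchronization with the camera, exactly as the embodiments emphasize.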
  • the system 100 further comprises an image sensor 106 for capturing image data.
  • the image sensor 106 may be configured such that the lines thereof are exposed progressively to acquire light signal samples (i.e., pixel values). That is, the exposure of the image sensor 106 is not based on a global timing. Instead, each line of the image sensor 106 is reset after the previous one, giving rise to an effect known as "rolling shutter effect.”
  • CMOS sensors, such as most image sensors embedded in mobile phones or portable computers, may work with rolling shutter acquisition.
  • standard and cheap image sensors that are available in the market may be used as the image sensor 106 in embodiments of the present invention.
  • Lines of the image sensor may be exposed sequentially at a given rate, referred to as the sensor line rate.
  • more than one but less than all lines of the image sensor may be exposed at a time.
  • Other progressive patterns for exposure of the image sensor are also possible, and the scope of the invention is not limited in this regard.
  • In this example, a CMOS sensor functions as the image sensor, and the image sensor is equipped with three color filters for red (R), green (G), and blue (B), respectively. Lines of the image sensor (the CMOS sensor in this case) are exposed progressively one by one.
  • a first light emitting element is activated to illuminate the scene, thereby creating a first lighting condition.
  • a given line of the image sensor is exposed to capture light signal samples for this given line under the first lighting condition.
  • this given line is exposed again under a different second lighting condition, for example, created by the illumination of a different second light emitting element.
  • The mechanism works based on the fact that even if the frequency of switching among the different light emitting elements is much higher than the line rate of the image sensor, the image sensor is still able to collect light signal samples for a same line under different lighting conditions created by different light emitting elements.
  • For example, referring to FIG. 2, three consecutive image frames captured by a standard RGB-based image sensor are shown. There are two LEDs serving as the light emitting elements in the light source. As seen from the three consecutive image frames, there is a rolling shutter effect in which a series of strips representing illuminations by different LEDs are observed "rolling" over the target object, which is marked by a rectangle in the top row. That is, the contributions of different light emitting elements are available, even though in this case the light modulation frequency of the LEDs is set to 298 Hz, which is much higher than the frame rate of the image sensor and is also above the frequencies causing perceivable flicker in the captured images.
  • each light signal sample may be stored according to the lighting condition under which the current sample is acquired.
  • samples for a line that are acquired under the same lighting condition may be stored in a same dataset, and samples under different lighting conditions are stored in different datasets.
  • multispectral data for the line may be derived, for example, by combining the light signal samples acquired under different lighting conditions.
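The per-line bookkeeping described in the last few bullets can be sketched as follows. The data layout (sample tuples, dataset keyed by line and condition, per-condition averaging) is an assumed illustration, not the patent's specified implementation.

```python
# Sketch: store samples per (line, lighting condition) dataset, then
# combine the datasets into a multispectral vector for each sensor line.
from collections import defaultdict

def collect(samples):
    """Group (line, condition, value) samples into per-condition datasets."""
    datasets = defaultdict(list)  # (line, condition) -> list of values
    for line, condition, value in samples:
        datasets[(line, condition)].append(value)
    return datasets

def multispectral_line(datasets, line, conditions):
    """Combine samples into one multispectral vector for a sensor line
    by averaging the values gathered under each lighting condition."""
    return [sum(datasets[(line, c)]) / len(datasets[(line, c)])
            for c in conditions]

# Hypothetical samples for line 0, gathered across several frames.
samples = [(0, "red_led", 0.80), (0, "red_led", 0.84),
           (0, "green_led", 0.20), (0, "green_led", 0.24)]
ds = collect(samples)
print(multispectral_line(ds, 0, ["red_led", "green_led"]))  # ~[0.82, 0.22]
```

Averaging is just one simple combination rule; as the text notes, any suitable method of deriving multispectral data from the grouped samples could be substituted.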
  • the calculation of multispectral data may be performed by one or more processors 108 associated with the image sensor.
  • the processor(s) may be co-located with the image sensor in a camera.
  • the processor(s) may be located in a separate device from the image sensor.
  • the information on characteristics of the target object or its portion corresponding to the line(s) of the image sensor may be obtained.
  • material information of the target object may be extracted from the multispectral data for the corresponding sensor line(s).
  • A requirement for material information extraction may be that the image sensor and the target object shall not move for the number of image frames required to gather enough samples for the corresponding lines under different lighting conditions. Possible extensions would include motion estimation or other tracking algorithms based on computer vision to compensate for camera/object motions.
  • One example of characteristic information of the target object is the cooking time, when the target object is food. Specifically, it is possible to determine the ingredients of the target object (food in this case) based on the material information extracted from the multispectral data. Then the cooking time of the food may be evaluated automatically.
  • the light signal samples acquired for each line of the image sensor may include information on individual color channels of the current light primary. For example, in exemplary embodiments where the image sensor operates on RGB primaries, the image sensor may be configured to establish under the influence of which light primary the line was exposed.
  • Sampling values for each individual color channel, namely R, G and B, may be obtained and stored separately.
  • Those skilled in the art will readily appreciate that measuring contributions of individual light emitting elements with different color channels increases the information captured by the image sensor compared to what would be available under white light. For example, for an image sensor with three color filters for red, green and blue, respectively, and a light source including three light emitting elements, nine kinds of spectral samples may be acquired. Further, it is found in practice that the spectral samples for each color channel are sufficient to improve the performance of multispectral data acquisition and potential subsequent measurements. It is noted that embodiments of the present invention are, of course, also applicable to white light.
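The nine-sample count above follows directly from combining filters with emitters: with F color filters and E separately activated emitters, each sensor line can yield F × E distinct spectral measurements, versus only F under a single white light. A minimal sketch (the emitter names are assumed labels):

```python
# Counting the distinct spectral samples available per image point:
# every (filter, emitter) pairing is a separate spectral measurement.
filters = ["R", "G", "B"]
emitters = ["red_led", "green_led", "blue_led"]  # assumed names

spectral_samples = [(f, e) for e in emitters for f in filters]
print(len(spectral_samples))  # 9
```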
  • The system 100 allows the collaborative setting of the light emitting elements 102-1, 102-2...102-N in the light source 102 and the image sensor 106, in order to further improve the performance of multispectral data acquisition.
  • light emitting elements usually implement certain forms of time modulation.
  • most LEDs are typically configured with a light modulation frequency conforming to pulse-width-modulation (PWM) or other forms of modulation.
  • PWM pulse-width-modulation
  • the image sensor has its own exposure time and sensor line rate.
  • The image sensor may be configured such that the lines thereof are exposed sequentially at a certain sensor line rate, as described above. In each exposure, the line is exposed for a time interval referred to as the exposure time.
  • In the following, the exposure time of the image sensor 106 is denoted as C_ET.
  • The sensor line rate of the image sensor 106 is denoted as S_LR.
  • The light modulation frequency of a light emitting element is denoted as L_MF.
  • the light from each light emitting element 102- 1 , 102-2. . . 102-N may be time-modulated according to PWM.
  • FIG. 3A shows a schematic diagram of the light signal sent by a lamp with a light modulation frequency of L_MF. For simplicity only a single primary is illustrated, so the resulting signal is a single train of pulses, as shown in FIG. 3A, where the horizontal axis represents time (T) and the vertical axis represents the signal level, which pulses with frequency f.
  • the resulting signal may be a linear combination of various trains of pulses (one per primary), all with the same base frequency.
  • FIG. 3B shows the corresponding spectrum.
  • C_ET and L_MF are set so as to make sure the modulated code is visible.
  • S_LR is set to be larger than two times the highest light modulation frequency used in the system. More generally, it is desirable to set it as high as possible.
  • FIG. 3C shows the low-pass filter characteristic of the acquisition process of a rolling shutter camera with an exposure time C_ET. Referring to FIG. 3C, where the horizontal axis represents frequency (f) and the vertical axis represents signal level, it can be seen that the blind spots 302-1, 302-2...302-M correspond to multiples of 1/C_ET, where the low-pass filter has a value of zero.
  • Each of the multiple light emitting elements 102-1, 102-2...102-N is configured with a light modulation frequency (L_MF) that is determined at least partially based on the exposure time (C_ET) of the image sensor 106, to avoid the blind spots in signal detection.
  • Configuration of the system 100 preferably ensures that L_MF ≠ m · (1/C_ET) for any positive integer m. Other factors may also be taken into consideration when determining the light modulation frequency of the light emitting elements, for example the status and requirements of the light emitting elements themselves.
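The blind-spot constraint can be checked numerically. The sketch below models the rolling-shutter line exposure as a boxcar average of duration C_ET, whose frequency response is a sinc with nulls at every multiple of 1/C_ET, consistent with the characteristic shown in FIG. 3C. The 4 ms exposure time and the 0.05 gain threshold are assumed example values.

```python
# Sketch of the blind-spot check: a boxcar exposure of duration C_ET has
# magnitude response |sin(pi*f*C_ET) / (pi*f*C_ET)|, which is zero at
# multiples of 1/C_ET. A modulation frequency near such a null would be
# filtered out and go undetected by the sensor.
import math

C_ET = 1 / 250.0  # exposure time in seconds (assumed: 4 ms -> nulls at 250 Hz)

def exposure_gain(f):
    """Magnitude of the boxcar low-pass response at frequency f (Hz)."""
    x = f * C_ET
    return 1.0 if x == 0 else abs(math.sin(math.pi * x) / (math.pi * x))

def is_blind_spot(f, tol=0.05):
    """True if f falls near a multiple of 1/C_ET (gain close to zero)."""
    return exposure_gain(f) < tol

print(is_blind_spot(250.0))  # True: exactly 1/C_ET, the first null
print(is_blind_spot(298.0))  # False: 298 Hz clears the nulls here
```

Such a check could be run whenever the lamp settings are updated, which matches the text's point that only infrequent, loose coordination is needed rather than per-frame synchronization.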
  • The setting of the light modulation frequency based on the sensor line rate does not imply any synchronization or calibration between the light emitting elements and the image sensor as in the prior art; rather, only a much less frequent update is required, for example when the light hue, saturation, or brightness of the light emitting elements is changed.
  • the scope of the invention is not limited to the above constraint of the exposure time and light modulation frequency.
  • In exemplary embodiments where a light emitting element has an irregular time modulation pattern other than PWM, the sensor line rate S_LR of the image sensor 106 is set to be greater than the light modulation frequency L_MF of the light emitting elements 102-1...102-N.
  • Referring to FIG. 4, examples are shown where the light modulation frequency L_MF of the light emitting elements is decreased while keeping the sensor line rate S_LR unchanged (L_MF decreases from top-left to bottom-right in FIG. 4). It can be seen that the lower L_MF, the larger the number of consecutive lines that contain samples under the same lighting condition.
  • the benefit of having larger bands of uniform contributions is that the identification and acquisition of multispectral information is more reliable, since such information may be derived from more samples.
  • The sensor line rate S_LR of the image sensor is preferably set to be much higher than the light modulation frequency L_MF. As an example, S_LR may be set to be greater than twice L_MF. Any other suitable settings are also possible. In fact, the sensor line rate S_LR may be set as high as possible.
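The S_LR versus L_MF trade-off can be sized roughly: with PWM modulation, the number of consecutive sensor lines exposed during one half-period of the light is about S_LR / (2 · L_MF), so lowering L_MF (or raising S_LR) widens the uniform bands discussed above. The numeric values below are assumed examples.

```python
# Rough sizing of the uniform bands: lines exposed per PWM half-period.
def lines_per_band(sensor_line_rate, modulation_freq):
    """Approximate consecutive lines under one lighting state."""
    return sensor_line_rate / (2.0 * modulation_freq)

S_LR = 30000.0  # sensor line rate in lines per second (assumed)
for L_MF in (1000.0, 500.0, 298.0):
    print(L_MF, lines_per_band(S_LR, L_MF))
```

Wider bands mean each lighting condition is observed over more lines, which is why the text notes that identification of the multispectral information becomes more reliable.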
  • The light source and image sensor may be configured and controlled independently. Even though collaborative settings may be performed between the light source and the image sensor to further improve system performance, such settings are quite simple and infrequent.
  • Referring to FIG. 5, a flowchart illustrating a method for use in multispectral data acquisition in accordance with an exemplary embodiment of the present invention is shown.
  • At step S502, lighting conditions that vary over time are created by controlling a light source (e.g., the light source 102 shown in FIG. 1) including a plurality of light emitting elements (e.g., the light emitting elements 102-1, 102-2...102-N).
  • Each of the light emitting elements in the light source has a spectrum different from the others.
  • LEDs, OLEDs, or fluorescents may be used as the light emitting elements.
  • the light source is controlled such that at least one of the plurality of light emitting elements is activated at a time instant.
  • the light emitting elements may be configured to sequentially illuminate the scene. That is, each of the light emitting elements may be activated, at least for a short time span, while all the others are not active.
  • A combination in which the illuminations of different light emitting elements overlap for some time may be exploited as well.
  • one or more of the light emitting elements 102-1, 102-2...102-N may be activated at a time instant, and the varying lighting conditions are created by different combinations of the light emitting elements.
  • The light modulation frequency of any light emitting element may be determined at least partially based on the exposure time of the image sensor. In this way, the blind spots where light signal samples would go undetected may be avoided.
  • At step S504, lines of an image sensor are progressively exposed.
  • the acquired light signal samples may include information on individual color channels, e.g., R, G, B channels.
  • The sensor line rate may be set to be, for example, greater than two times the light modulation frequency of each of the light emitting elements.
  • The method 500 may proceed to step S506 to derive multispectral data for a line of the image sensor by combining samples for the line that were previously acquired under different lighting conditions.
  • Multispectral data may be derived accordingly by any suitable methods or algorithms, and the scope of the invention is not limited in this regard.
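  • As one possible illustration (not the patent's specified algorithm), the per-line combination of step S506 can be sketched as below; the frame layout and the `primary_of_line` labelling are assumptions introduced for the example:

```python
import numpy as np

def combine_line_samples(frames, primary_of_line, num_primaries):
    """Derive per-line multispectral data by combining samples of the same
    sensor line acquired under different light primaries.

    frames:          list of F arrays of shape (H, W, 3), RGB rolling-shutter frames
    primary_of_line: F x H nested sequence; primary_of_line[f][y] is the index
                     of the primary active while line y of frame f was exposed
    Returns an (H, W, num_primaries * 3) array: for each line, the RGB
    response under each primary, stacked along the last axis.
    """
    H, W, _ = frames[0].shape
    multi = np.zeros((H, W, num_primaries * 3))
    counts = np.zeros((H, num_primaries))
    for f, frame in enumerate(frames):
        for y in range(H):
            p = primary_of_line[f][y]
            multi[y, :, 3 * p:3 * p + 3] += frame[y]
            counts[y, p] += 1
    # Average lines sampled more than once under the same primary.
    for y in range(H):
        for p in range(num_primaries):
            if counts[y, p] > 0:
                multi[y, :, 3 * p:3 * p + 3] /= counts[y, p]
    return multi
```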
  • step S508 information on characteristics is determined, for example, material information of the captured target object corresponding to one or more lines of the image sensor.
  • the above has introduced a methodology capable of operating without any synchronization between image sensor and lighting unit, by exploiting the rolling shutter characteristics available in most image sensors.
  • Such a system allows for further reducing the cost of a multispectral imaging system, thanks to the possibility of adopting low cost image sensors which can be manufactured in large volumes.
  • the system comprises:
  • a light source 102 containing more than one light emitting elements 102-1 ... 102-N such that the spectrum of each light emitting element is not identical to any of the other, e.g. each emitting a respective light primary (Lp 1 );
  • each light emitting element 102-1 ... 102-N is, at least partially, emitting light while some of the other light emitting elements are not;
  • a processing unit 108 used to analyze the images captured by the image sensor, applying the following processing for each incoming frame
  • the processing for each frame comprises:
  • step (v) processing combined information (a set of C(Lp J )) to extract material information for each individual line.
  • there may be a limitation related to optional step (ii), where each acquired line is analysed to establish under which light primary it was acquired. That is, without any assumption on the material being observed, it may not be possible to unambiguously identify which light primary is affecting each line.
  • a pixel captured with a green value could either be the result of a white light primary on a green material, or of a green light primary on a white material.
  • the sensor 106 collects multispectral data for a given line of the image which increases the amount of information available to the skilled operator, for him or her to use as he or she finds useful.
  • the skilled operator may have predetermined knowledge of the spectra emitted by the light source 102, e.g. knowing it to comprise a sequence of certain red, green and blue LEDs.
  • the skilled operator may know that the material in question is one of a certain category or subset, e.g. is trying to distinguish between a few different types of plastic or foodstuffs having known properties. Given this context, it may be that only one of the six possible interpretations maps to the absorption or emission spectrum of one of the expected materials.
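  • The ambiguity and its resolution against a known subset of materials can be made concrete with a toy sketch; all spectra and material names below are purely illustrative assumptions, not values from the description:

```python
# Toy primaries and candidate materials as coarse RGB triples (illustrative).
PRIMARIES = {"white": (1.0, 1.0, 1.0), "green": (0.1, 1.0, 0.1), "red": (1.0, 0.1, 0.1)}
MATERIALS = {"white": (1.0, 1.0, 1.0), "green": (0.1, 1.0, 0.1)}

def expected_rgb(primary, material):
    # Observed colour ~ per-channel product of illumination and reflectance.
    return tuple(p * m for p, m in zip(PRIMARIES[primary], MATERIALS[material]))

def consistent_interpretations(observed, tol=0.05):
    """Return all (primary, material) pairs whose predicted response lies
    within `tol` of the observed RGB value."""
    hits = []
    for prim in PRIMARIES:
        for mat in MATERIALS:
            pred = expected_rgb(prim, mat)
            if all(abs(o - e) <= tol for o, e in zip(observed, pred)):
                hits.append((prim, mat))
    return hits

# A green-looking pixel is explained both by white light on a green material
# and by green light on a white material -- the ambiguity described above.
hits = consistent_interpretations((0.1, 1.0, 0.1))
```

Restricting `MATERIALS` to a known category, as the text suggests, shrinks the set of consistent interpretations, often to one.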
  • step (v) may be performed as described in WO 2012/020381, which provides a training approach which uses machine learning to mitigate the need for tedious calibration. More generally the skilled user can use the multispectral data collected by the present invention in any way he or she sees fit.
  • an apparatus comprising the image sensor and a separate light sensor, in which the light sensor is arranged to face the light source when the image sensor faces the object in question.
  • the apparatus may comprise a mobile user terminal housing the image sensor and light sensor, the image sensor being mounted on one face of the mobile user terminal and the light sensor being mounted on an opposing face of the mobile user terminal.
  • the light sensor may comprise a second image sensor configured such that lines of the second image sensor are also exposed progressively to acquire light signal samples under the varying lighting conditions.
  • the two sensors may be comprised by the front and back cameras of a mobile user terminal, e.g. a "smart phone" or other "smart device".
  • FIG. 6 gives an example usage scenario, illustrating the positions of a second sensor 604, which may be referred to as the direct-line sensor (e.g. front facing camera), and the first sensor 106, which may be referred to as the reflected-light sensor (e.g. back facing camera), mounted on opposing faces of a smart device 602.
  • the object 104 being studied, e.g. a fresh produce display, is illuminated by multiplexed illumination 606 from the light source 102, and this light 606 is reflected back from the object 104 into the reflected-light sensor 106 (e.g. rear facing camera of the device 602).
  • the light source 102 may be mounted on a ceiling 608 above the object 104.
  • the second, direct-line sensor 604 being mounted on the opposing face of the device 602, points in a direction of the multiplexed illumination 606 when the other sensor 106 is pointed in a direction of the object 104 from which that same light 606 is reflected.
  • each of the two image sensors 106, 604 is a separate "rolling shutter" type image sensor in which lines of the sensor are progressively exposed (rather than all lines being exposed globally at once).
  • the progressive exposure of the lines of the two image sensors 106, 604 may be synchronized so that a co-exposed line of the second image sensor 604 can be used to establish under the influence of which of the spectral lighting conditions a given line of the first sensor 106 was exposed for each of said samples.
  • the processing module 108 is configured, in taking the multispectral measurement, to use the second, direct-line facing image sensor 604 (or other separate light sensor) to determine information establishing under the influence of which of the spectral lighting conditions a given line of the first, reflected-light sensor 106 was exposed for each of the samples.
  • the processing module 108 stores this information with the samples for processing the multispectral measurement.
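  • One possible sketch of this use of the direct-line sensor follows; the helper names, nominal primary colours and the fixed `line_offset` model are assumptions for illustration, not details taken from the description:

```python
import numpy as np

def classify_primary_per_line(direct_frame, primary_rgbs):
    """For each line of the direct-line sensor (which images the light source),
    pick the primary whose nominal RGB is closest to the line's mean colour.

    direct_frame: (H, W, 3) frame from the sensor facing the light source
    primary_rgbs: (P, 3) nominal RGB response of each primary
    Returns an (H,) array of primary indices, one per line.
    """
    line_means = direct_frame.mean(axis=1)  # (H, 3) mean colour per line
    d = np.linalg.norm(line_means[:, None, :] - primary_rgbs[None, :, :], axis=2)
    return d.argmin(axis=1)

def tag_reflected_lines(direct_frame, primary_rgbs, line_offset=0):
    """Label each line of the synchronized reflected-light sensor with the
    primary under which it was exposed. `line_offset` models a fixed,
    known line-timing offset between the two sensors."""
    labels = classify_primary_per_line(direct_frame, primary_rgbs)
    return np.roll(labels, line_offset)
```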
  • the accuracy of the measurement is further enhanced by exploiting the possibility of simultaneously acquiring, by means of the front and back facing cameras of a smart device, data from the camera facing the light source directly, and data from the camera facing the object under analysis, containing the very same light modulation after reflection by the object.
  • the data acquired from the camera facing the light source directly allows the light primary affecting every line to be established, while the other camera is exploited in the same fashion as the methodology described above, with the difference of not having to solve for the unknown in step (ii).
  • the invention exploits the characteristics of the time modulation of LED based light sources.
  • Most LED drivers implement some form of pulse-width modulation (PWM) in order to achieve different light intensity settings. For RGB and, more generally, for multi-primary lights, each primary LED is modulated in order to obtain the desired color temperature.
  • Image sensors embedded in smartphones, tablets and compact cameras are based on CMOS sensors for which exposure is not based on a global timing.
  • each line of the sensor is reset after the previous one, giving rise to an effect known as the "rolling shutter effect". This effect is particularly noticeable when recording videos of objects moving horizontally across the camera field of view.
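  • The interaction between PWM-modulated light and line-by-line exposure can be sketched numerically; the timing values and square-wave light model below are illustrative assumptions:

```python
def simulate_rolling_shutter(num_lines, line_time, exposure_time, light_at):
    """Simulate per-line exposure of a rolling-shutter sensor under a
    time-varying light. `light_at(t)` returns the instantaneous light level;
    each line y integrates it over [y * line_time, y * line_time + exposure_time).

    Returns the mean light level seen by each line -- the horizontal banding
    that appears when the light is modulated faster than the frame rate.
    """
    steps = 100  # numeric integration resolution per line
    out = []
    for y in range(num_lines):
        t0 = y * line_time
        samples = [light_at(t0 + exposure_time * k / steps) for k in range(steps)]
        out.append(sum(samples) / steps)
    return out

# A PWM-like square wave with a 2 ms period, seen by 8 successive lines:
pwm = lambda t: 1.0 if (t % 0.002) < 0.001 else 0.0
bands = simulate_rolling_shutter(8, line_time=0.0005, exposure_time=0.0004, light_at=pwm)
# Successive lines alternate between bright and dark bands.
```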
  • the system comprises the following:
  • a light source 102 containing more than one light emitting element 102-1...102-N such that the spectrum of each light emitting element is not identical to any of the others, e.g. each emitting a respective light primary (Lp 1 );
  • each light emitting element 102-1...102-N is, at least partially, emitting light while some of the other light emitting elements are not;
  • a device 602 embedding two image sensors 106, 604, both with rolling shutter acquisition, both with the same sensor characteristics, facing in two opposite directions, and for which acquisition is synchronized (i.e. the time at which a line is exposed on one sensor can be related to the time at which another line on the other sensor is exposed);
  • a processing unit 108 used to analyze the images captured by both image sensors 106, 604, applying the following processing for each incoming frame.
  • identifying which of the two image sensors 106, 604 is facing the light source directly, i.e. which is the direct-light sensor, and as a consequence which is capturing reflected light, i.e. which is the reflected-light-sensor;
  • FIG.7 illustrates an image of the ceiling 608 directly above the device 602, captured by the smart device front-facing camera 604, i.e. the direct line sensor.
  • the horizontal bars represent the light modulation effect captured by the rolling shutter of the camera 604, in which one or more light primaries (Lp 1 ) are visible in each captured line.
  • the right hand side of FIG. 7 illustrates an image of the produce 104 (or other object or objects) directly below the device 602, captured by the device back-facing camera 106, i.e. the reflected light sensor.
  • the information related to the light primaries (Lp 1 ) extracted from the direct line sensor 604 is applicable to the reflected light sensor 106. This can advantageously be used in overcoming the ambiguities related to the derivation of the light primary (Lp 1 ).
  • FIG. 8 and FIG. 9 illustrate a reason why measuring several LED contributions with different color channels increases the information captured by the camera, compared to what would be available under white light.
  • FIG. 8 illustrates typical color-filter transmission spectra for RGB cameras, and the emission spectra of the most common red (R), green (G) and blue (B) LEDs.
  • FIG. 9 illustrates the individual contributions of the R, G and B LEDs when captured with the blue color filter of a camera having the color filters shown in FIG. 8, corresponding to C 3 (Lp 1 ), C 3 (Lp 2 ) and C 3 (Lp 3 ).
  • the values corresponding to the red and green color filter of the camera can be computed in a similar way. For the case shown in FIG.
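  • The quantity C i (Lp j ) can be sketched as the filter-weighted sum of the LED emission spectrum; the five-band spectra below are coarse, purely illustrative assumptions, not the spectra of FIG. 8:

```python
def channel_response(filter_transmission, led_emission):
    """Camera colour-channel value C_i(Lp_j): the emission spectrum of LED j
    weighted by the transmission of colour filter i, summed over wavelength.
    Spectra are equally spaced samples in arbitrary units."""
    return sum(f * e for f, e in zip(filter_transmission, led_emission))

# Toy 5-band spectra covering roughly 400-700 nm (illustrative only):
blue_filter = [0.9, 0.6, 0.1, 0.0, 0.0]
blue_led    = [0.2, 1.0, 0.1, 0.0, 0.0]
green_led   = [0.0, 0.2, 1.0, 0.2, 0.0]
red_led     = [0.0, 0.0, 0.1, 0.8, 1.0]

c3_lp3 = channel_response(blue_filter, blue_led)   # strong response
c3_lp2 = channel_response(blue_filter, green_led)  # weaker
c3_lp1 = channel_response(blue_filter, red_led)    # near zero
```

The red and green channel values follow by swapping in the corresponding filter curves, as the text notes.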
  • multispectral imaging is well suited for all kinds of material recognition and can strongly increase the accuracy of object classification in the wider field of computer vision, compared to, e.g., capturing an image with a standard trichromatic sensor.
  • One example application is food recognition for optimal cleaning, e.g. for applying a proper amount of cleaning depending on the recognized fruit and/or vegetables. Overly aggressive cleaning of a vegetable in which pesticides typically do not penetrate deeply may also remove vitamins. On the other hand, too gentle cleaning of a vegetable which has absorbed too much pesticide would be insufficient. Recognizing fruits and/or vegetables solely from shape, texture or color can be extremely challenging, given the variation in appearance. Classifying the types using embodiments of the present invention would allow for an easier and at the same time more robust approach.
  • Another example application is food recognition for optimal cooking time, e.g. automatic cooking time detection becomes possible with a simple camera. Recognition of ingredients based on embodiments of the present invention would greatly improve the robustness of the system.
  • Another example application is material recognition for floor care. Different floors require different treatment, and recognizing them on the basis of colour/texture alone can be challenging (especially given that plastic based floors are often printed to mimic different materials). Embodiments of the present invention would allow a much more robust detection, with the addition of commoditized hardware. Yet another example application is material recognition for automatic lighting calibration. Light sensors are commonly used to sense the amount of light present in the environment. A typical installation procedure requires a series of steps aimed at measuring the reflectivity coefficient of the objects at which the light sensor is pointing. With embodiments of the present invention the different types of materials may be recognized, thus allowing for an automatic setup.
  • the idea may be expanded to determine lighting conditions under which any spatial element of an image sensor is exposed, from a global exposure even down to individual pixels.
  • an apparatus comprising an image sensor and a separate light sensor arranged such that when the image sensor faces the object in question the light sensor faces the light source illuminating that object.
  • the apparatus also comprises a multispectral measurement module configured to take a multispectral measurement of said object by capturing multiple light signal samples for a same spatial element of the said image sensor under different ones of said spectral lighting conditions, using the light sensor to determine under the influence of which of said lighting conditions the spatial element was exposed for each of said samples.
  • the apparatus may comprise a mobile user terminal housing said image sensor and light sensor, the image sensor being mounted on one face of the mobile user terminal and the light sensor being mounted on an opposing face of the mobile user terminal; and again the light sensor may comprise a second image sensor, e.g. one of the image sensors comprising a front-facing camera of the mobile user terminal and the other comprising a rear-facing camera of that same terminal.
  • the invention need not be dealt with as an entire system comprising both light source and image sensor - rather, as the light source and sensor parts of the system are capable of being operated separately, so they may be provided separately and/or implemented by different parties.
  • the invention is embodied in a separate sensing device (e.g. smart device 602) which can operate in the presence of whatever lighting happens to already be present in the environment, as long as that lighting creates different lighting conditions, e.g. using the inherent LED modulation in the existing, everyday lighting of a room.
  • the option of dedicated lighting being provided for the multispectral imaging is not excluded either.
  • some specific embodiments thereof have been described above.
  • any suitable light emitting elements can be used to put the invention into practice.
  • any suitable image sensor, such as CMOS-based or Charge-Coupled Device (CCD) based ones, may be used in embodiments of the invention, whether currently known or developed in the future.
  • the image sensor may work under any color model other than the RGB model.
  • Multispectral data may be easily captured by configuring a system (for example, the system 100 as shown in FIG.1) comprising a light source and an image sensor.
  • the light source includes multiple light emitting elements each having a different spectrum. Each light emitting element is controlled to be activated while at least some of the other light emitting elements are not.
  • the image sensor is configured to work with the "rolling shutter effect," namely, performing exposure progressively line-by-line. By use of this simple configuration, it is possible to collect samples for every line of the image sensor illuminated by each individual light emitting element. In this way, meaningful pixel level multispectral information encoded in the light source may be captured with sufficient accuracy, while removing any requirement of synchronization or calibration between the image sensor and the light source. Accordingly, performance of multispectral data acquisition may be significantly improved.
  • the various exemplary embodiments may be implemented in hardware or special purpose circuits, software, logic or any combination thereof.
  • some aspects may be implemented in hardware, while other aspects may be implemented in firmware or software which may be executed by a controller, microprocessor or other computing device, although the invention is not limited thereto.
  • firmware or software which may be executed by a controller, microprocessor or other computing device, although the invention is not limited thereto.
  • While various aspects of the exemplary embodiments of this invention may be illustrated and described as block diagrams, flowcharts, or using some other pictorial representation, it is well understood that these blocks, apparatus, systems, techniques or methods described herein may be implemented in, as non-limiting examples, hardware, software, firmware, special purpose circuits or logic, general purpose hardware or controller or other computing devices, or some combination thereof.
  • various blocks shown in FIG. 5 may be viewed as method steps, and/or as operations that result from operation of computer program code, and/or as a plurality of coupled logic circuit elements constructed to carry out the associated function(s).
  • At least some aspects of the exemplary embodiments of the invention may be practiced in various components such as integrated circuit chips and modules, and the exemplary embodiments of this invention may be realized in an apparatus that is embodied as an integrated circuit, FPGA or ASIC that is configurable to operate in accordance with the exemplary embodiments of the present invention.

Landscapes

  • Physics & Mathematics (AREA)
  • Spectroscopy & Molecular Physics (AREA)
  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • General Physics & Mathematics (AREA)
  • Spectrometry And Color Measurement (AREA)
  • Investigating Or Analysing Materials By Optical Means (AREA)

Abstract

The present invention relates to the acquisition of multispectral data from an object illuminated by a light source that includes a plurality of light emitting elements, each light emitting element having a different spectrum. Lines of an image sensor are progressively exposed, and a multispectral measurement is taken by collecting multiple light signal samples for a same line among the lines of the image sensor under different spectral lighting conditions.
PCT/IB2012/057406 2011-12-30 2012-12-18 Acquisition de données multispectrales WO2013098708A2 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201161581796P 2011-12-30 2011-12-30
US61/581,796 2011-12-30

Publications (2)

Publication Number Publication Date
WO2013098708A2 true WO2013098708A2 (fr) 2013-07-04
WO2013098708A3 WO2013098708A3 (fr) 2013-09-12

Family

ID=47664373

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2012/057406 WO2013098708A2 (fr) 2011-12-30 2012-12-18 Acquisition de données multispectrales

Country Status (1)

Country Link
WO (1) WO2013098708A2 (fr)

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9060113B2 (en) 2012-05-21 2015-06-16 Digimarc Corporation Sensor-synchronized spectrally-structured-light imaging
WO2016190884A1 (fr) * 2015-05-28 2016-12-01 Empire Technology Development Llc Activation simultanée de multiples sources d'éclairage pour analyse d'échantillon
US9593982B2 (en) 2012-05-21 2017-03-14 Digimarc Corporation Sensor-synchronized spectrally-structured-light imaging
US9979853B2 (en) 2013-06-07 2018-05-22 Digimarc Corporation Information coding and decoding in spectral differences
WO2018067212A3 (fr) * 2016-06-20 2018-06-14 Massachusetts Institute Of Technology Procédés et systèmes d'imagerie multiplexée à codage temporel
US10113910B2 (en) 2014-08-26 2018-10-30 Digimarc Corporation Sensor-synchronized spectrally-structured-light imaging
US10452935B2 (en) 2015-10-30 2019-10-22 Microsoft Technology Licensing, Llc Spoofed face detection
CN110927073A (zh) * 2019-11-06 2020-03-27 广东弓叶科技有限公司 多光谱成像方法、电子装置及存储介质
WO2020178052A1 (fr) * 2019-03-01 2020-09-10 Basf Coatings Gmbh Procédé et système de reconnaissance d'objets par l'intermédiaire d'une application de vision artificielle
CN113777063A (zh) * 2021-09-07 2021-12-10 福州大学 一种果蔬农残快速实时检测多光谱探测系统及其使用方法
US11295152B2 (en) * 2019-03-01 2022-04-05 Basf Coatings Gmbh Method and system for object recognition via a computer vision application

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2012020381A1 (fr) 2010-08-11 2012-02-16 Koninklijke Philips Electronics N.V. Procédé et appareil permettant de reconnaître un objet intéressant

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7557924B2 (en) * 2005-08-15 2009-07-07 X-Rite, Inc. Apparatus and methods for facilitating calibration of an optical instrument
JP2010517460A (ja) * 2007-01-29 2010-05-20 ジョンイル パク マルチスペクトル映像取得方法およびその装置
US8253824B2 (en) * 2007-10-12 2012-08-28 Microsoft Corporation Multi-spectral imaging

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2012020381A1 (fr) 2010-08-11 2012-02-16 Koninklijke Philips Electronics N.V. Procédé et appareil permettant de reconnaître un objet intéressant

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
DU, H. ET AL.: "A Prism- based System for Multispectral Video Acquisition", IEEE INTERNATIONAL CONFERENCE ON COMPUTER VISION, 2009
PARK, J. ET AL.: "Multispectral Imaging Using Multiplexed Illumination", IEEE INTERNATIONAL CONFERENCE ON COMPUTER VISION, 2007

Cited By (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9060113B2 (en) 2012-05-21 2015-06-16 Digimarc Corporation Sensor-synchronized spectrally-structured-light imaging
US10498941B2 (en) 2012-05-21 2019-12-03 Digimarc Corporation Sensor-synchronized spectrally-structured-light imaging
US9593982B2 (en) 2012-05-21 2017-03-14 Digimarc Corporation Sensor-synchronized spectrally-structured-light imaging
US9979853B2 (en) 2013-06-07 2018-05-22 Digimarc Corporation Information coding and decoding in spectral differences
US10447888B2 (en) 2013-06-07 2019-10-15 Digimarc Corporation Information coding and decoding in spectral differences
US10113910B2 (en) 2014-08-26 2018-10-30 Digimarc Corporation Sensor-synchronized spectrally-structured-light imaging
WO2016190884A1 (fr) * 2015-05-28 2016-12-01 Empire Technology Development Llc Activation simultanée de multiples sources d'éclairage pour analyse d'échantillon
US10753794B2 (en) 2015-05-28 2020-08-25 Empire Technology Development Llc Concurrent activation of multiple illumination sources for sample analysis
US10452935B2 (en) 2015-10-30 2019-10-22 Microsoft Technology Licensing, Llc Spoofed face detection
WO2018067212A3 (fr) * 2016-06-20 2018-06-14 Massachusetts Institute Of Technology Procédés et systèmes d'imagerie multiplexée à codage temporel
US10425598B2 (en) 2016-06-20 2019-09-24 Massachusetts Institute Of Technology Methods and systems for time-encoded multiplexed imaging
WO2020178052A1 (fr) * 2019-03-01 2020-09-10 Basf Coatings Gmbh Procédé et système de reconnaissance d'objets par l'intermédiaire d'une application de vision artificielle
US11295152B2 (en) * 2019-03-01 2022-04-05 Basf Coatings Gmbh Method and system for object recognition via a computer vision application
CN110927073A (zh) * 2019-11-06 2020-03-27 广东弓叶科技有限公司 多光谱成像方法、电子装置及存储介质
CN113777063A (zh) * 2021-09-07 2021-12-10 福州大学 一种果蔬农残快速实时检测多光谱探测系统及其使用方法

Also Published As

Publication number Publication date
WO2013098708A3 (fr) 2013-09-12

Similar Documents

Publication Publication Date Title
WO2013098708A2 (fr) Acquisition de données multispectrales
TWI667918B (zh) 監測方法及攝影機
CN106152937B (zh) 空间定位装置、系统及其方法
US8942471B2 (en) Color sequential flash for digital image acquisition
RU2447471C2 (ru) Цветная последовательная вспышка для получения цифровых изображений
JP2022522822A (ja) コンピュータビジョンアプリケーションを介する物体認識方法及びシステム
WO2020245441A1 (fr) Système et procédé de reconnaissance d'objets utilisant des outils de mappage tridimensionnels dans une application de vision artificielle
US20220319149A1 (en) System and method for object recognition under natural and/or artificial light
JP7277615B2 (ja) 光の3dマッピングとモデリングを使用した物体認識システム及び方法
WO2012020381A1 (fr) Procédé et appareil permettant de reconnaître un objet intéressant
US20220307981A1 (en) Method and device for detecting a fluid by a computer vision application
US11079277B2 (en) Spectral imaging device and method
US20220230340A1 (en) System and method for object recognition using 3d mapping and modeling of light
WO2024127274A1 (fr) Améliorations dans la détection spectrale
WO2024127273A1 (fr) Améliorations relatives à la détection spectrale
CA3219510A1 (fr) Systeme et procede de reconnaissance d'objets utilisant un blocage de lumiere reflechissant

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 12821188

Country of ref document: EP

Kind code of ref document: A2

122 Ep: pct application non-entry in european phase

Ref document number: 12821188

Country of ref document: EP

Kind code of ref document: A2