NL2031092B1 - Multispectral imaging system and method for using the system - Google Patents

Multispectral imaging system and method for using the system

Info

Publication number
NL2031092B1
Authority
NL
Netherlands
Prior art keywords
pixel
red
blue
pixel group
band
Prior art date
Application number
NL2031092A
Other languages
Dutch (nl)
Inventor
Gerhardus Damen René
Original Assignee
Db2 Vision B V
Priority date
Filing date
Publication date
Application filed by Db2 Vision B V filed Critical Db2 Vision B V
Priority to NL2031092A priority Critical patent/NL2031092B1/en
Priority to PCT/EP2023/054857 priority patent/WO2023161477A1/en
Application granted granted Critical
Publication of NL2031092B1 publication Critical patent/NL2031092B1/en

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01NINVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/17Systems in which incident light is modified in accordance with the properties of the material investigated
    • G01N21/25Colour; Spectral properties, i.e. comparison of effect of material on the light at two or more different wavelengths or wavelength bands
    • G01N21/31Investigating relative effect of material at wavelengths characteristic of specific elements or molecules, e.g. atomic absorption spectrometry
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01NINVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/17Systems in which incident light is modified in accordance with the properties of the material investigated
    • G01N21/25Colour; Spectral properties, i.e. comparison of effect of material on the light at two or more different wavelengths or wavelength bands
    • G01N21/251Colorimeters; Construction thereof
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01NINVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/17Systems in which incident light is modified in accordance with the properties of the material investigated
    • G01N21/25Colour; Spectral properties, i.e. comparison of effect of material on the light at two or more different wavelengths or wavelength bands
    • G01N21/255Details, e.g. use of specially adapted sources, lighting or optical systems
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/0002Inspection of images, e.g. flaw detection
    • G06T7/0012Biomedical image inspection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/40Extraction of image or video features
    • G06V10/58Extraction of image or video features relating to hyperspectral data
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/10Terrestrial scenes
    • G06V20/17Terrestrial scenes taken from planes or by drones
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/10Terrestrial scenes
    • G06V20/188Vegetation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/10Terrestrial scenes
    • G06V20/194Terrestrial scenes using hyperspectral data, i.e. more or other wavelengths than RGB
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/60Type of objects
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/10Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths
    • H04N23/11Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths for generating image signals from visible and infrared light wavelengths
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/80Camera processing pipelines; Components thereof
    • H04N23/84Camera processing pipelines; Components thereof for processing colour signals
    • H04N23/843Demosaicing, e.g. interpolating colour pixel values
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/10Circuitry of solid-state image sensors [SSIS]; Control thereof for transforming different wavelengths into image signals
    • H04N25/11Arrangement of colour filter arrays [CFA]; Filter mosaics
    • H04N25/13Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements
    • H04N25/131Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements including elements passing infrared wavelengths
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/10Circuitry of solid-state image sensors [SSIS]; Control thereof for transforming different wavelengths into image signals
    • H04N25/11Arrangement of colour filter arrays [CFA]; Filter mosaics
    • H04N25/13Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements
    • H04N25/135Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements based on four or more different wavelength filter elements
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01JMEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J3/00Spectrometry; Spectrophotometry; Monochromators; Measuring colours
    • G01J3/28Investigating the spectrum
    • G01J3/2803Investigating the spectrum using photoelectric array detector
    • G01J2003/2806Array and filter array
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01NINVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/17Systems in which incident light is modified in accordance with the properties of the material investigated
    • G01N2021/1793Remote sensing
    • G01N2021/1797Remote sensing in landscape, e.g. crops
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01NINVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/17Systems in which incident light is modified in accordance with the properties of the material investigated
    • G01N21/25Colour; Spectral properties, i.e. comparison of effect of material on the light at two or more different wavelengths or wavelength bands
    • G01N21/31Investigating relative effect of material at wavelengths characteristic of specific elements or molecules, e.g. atomic absorption spectrometry
    • G01N21/314Investigating relative effect of material at wavelengths characteristic of specific elements or molecules, e.g. atomic absorption spectrometry with comparison of measurements at specific and non-specific wavelengths
    • G01N2021/3155Measuring in two spectral ranges, e.g. UV and visible
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01NINVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/17Systems in which incident light is modified in accordance with the properties of the material investigated
    • G01N21/25Colour; Spectral properties, i.e. comparison of effect of material on the light at two or more different wavelengths or wavelength bands
    • G01N21/31Investigating relative effect of material at wavelengths characteristic of specific elements or molecules, e.g. atomic absorption spectrometry
    • G01N21/314Investigating relative effect of material at wavelengths characteristic of specific elements or molecules, e.g. atomic absorption spectrometry with comparison of measurements at specific and non-specific wavelengths
    • G01N2021/3166Investigating relative effect of material at wavelengths characteristic of specific elements or molecules, e.g. atomic absorption spectrometry with comparison of measurements at specific and non-specific wavelengths using separate detectors and filters
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01NINVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/17Systems in which incident light is modified in accordance with the properties of the material investigated
    • G01N21/25Colour; Spectral properties, i.e. comparison of effect of material on the light at two or more different wavelengths or wavelength bands
    • G01N21/31Investigating relative effect of material at wavelengths characteristic of specific elements or molecules, e.g. atomic absorption spectrometry
    • G01N21/314Investigating relative effect of material at wavelengths characteristic of specific elements or molecules, e.g. atomic absorption spectrometry with comparison of measurements at specific and non-specific wavelengths
    • G01N2021/317Special constructive features
    • G01N2021/3177Use of spatially separated filters in simultaneous way
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01NINVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N2201/00Features of devices classified in G01N21/00
    • G01N2201/02Mechanical
    • G01N2201/021Special mounting in general
    • G01N2201/0214Airborne
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01NINVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N2201/00Features of devices classified in G01N21/00
    • G01N2201/02Mechanical
    • G01N2201/021Special mounting in general
    • G01N2201/0216Vehicle borne
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10024Color image
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30004Biomedical image processing

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • General Physics & Mathematics (AREA)
  • Spectroscopy & Molecular Physics (AREA)
  • Theoretical Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Signal Processing (AREA)
  • Pathology (AREA)
  • Immunology (AREA)
  • Biochemistry (AREA)
  • Analytical Chemistry (AREA)
  • Chemical & Material Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Remote Sensing (AREA)
  • Medical Informatics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Radiology & Medical Imaging (AREA)
  • Quality & Reliability (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Image Processing (AREA)

Abstract

The invention is related to a multispectral imaging system for providing an image for determining a state of vegetation. The invention is further related to a method for providing an image for determining a state of vegetation wherein use is made of a multispectral imaging system of the invention. The multispectral imaging system of the invention comprises both a processor and an imaging device, e.g. a camera, e.g. a single optical path camera. The imaging device is suitable to be mounted on a vehicle, such as a remote-controlled vehicle, for example a drone or a remote-controlled car.

Description

P35525NLO0/SBI
Title: MULTISPECTRAL IMAGING SYSTEM AND METHOD FOR USING THE SYSTEM
The invention is related to a multispectral imaging system for providing an image for determining a state of vegetation. The invention is further related to a method for providing an image for determining a state of vegetation wherein use is made of a multispectral imaging system of the invention.
Imaging systems are used in the art to help determine a state of vegetation. During the lifecycle of a vegetation, such as a crop or a fruit-bearing tree, light in different bandwidths is reflected in amounts that vary over the lifecycle of the vegetation. For example, older vegetation may appear more brown than younger vegetation, which may appear more green. A downside of judging a state of vegetation using visible light is that a vegetation may already become unhealthy before a change in visible wavelengths occurs.
It is known that healthy vegetation has a different reflectivity in the infrared (IR) band compared to unhealthy vegetation, e.g. as a result of dehydration of the vegetation. This difference in reflectivity is, in practice, often measured compared to the reflectivity in the RGB (red, green, blue) bands.
An example of a vegetation index that provides an indication of a state of the vegetation is the Normalized Difference Vegetation Index (NDVI), which takes both the R and IR bands into account. Determination of a vegetation index such as the NDVI requires precise analysis of the different color components comprised in an image of the vegetation.
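For reference, the NDVI mentioned here has a standard, well-known definition that is not specific to this patent: it is computed per location from the reflectance in the near-infrared band (NIR) and the red band (RED) as

NDVI = (NIR - RED) / (NIR + RED)

which yields values between -1 and 1, with higher values typically indicating denser or healthier vegetation. This is why accurate red and infrared pixel values are essential for the analysis described below.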
Imaging systems for providing images suitable for determining a vegetation index are known, for example, from WO 2021/014764 A1, wherein an imaging system is disclosed that includes a multiband pass filter that selectively transmits light in specific color bands and a color filter that transmits the band light in the specific band per pixel to an image pickup device. The system of WO 2021/014764 A1 may be used to determine a state of vegetation by analysing the color components of the vegetation in an image provided with the system.
An aim of the invention is to provide an improved, or at least an alternative, imaging system for providing an image for determining a state of vegetation.
The aim of the invention is achieved by a multispectral imaging system according to claim 1.
The multispectral imaging system of the invention comprises both a processor and an imaging device, e.g. a camera, e.g. a single optical path camera. The imaging device is suitable to be mounted on a vehicle, such as a remote-controlled vehicle, for example a drone or a remote-controlled car. The imaging device may also be mounted on an automated vehicle, such as a drone performing automated flights. In practice, the imaging device may be mounted on the vehicle.
The processor may be a microcomputer that is integrated with the imaging device such that both the processor and the imaging device are mountable or mounted on a vehicle, e.g. wherein the imaging device and the processor are connected through a wired connection. In other embodiments the processor and the imaging device may be connected through a wireless connection, e.g. a radio connection, that allows data to be sent from the imaging device to the processor. The processor is configured to receive information, e.g. pixel values, from the imaging device and process this data, e.g. to output an image. The outputted image may, for example, be stored on an SD card or other storage medium.
The imaging device comprises an optical sensor comprising a plurality of pixel groups. Each pixel group is comprised of four adjacent pixels. A pixel group of the optical sensor has four pixels that are adjacent, e.g. in a square formation, such that the pixels of a single pixel group may collect light coming from the same, e.g. point, source. Thus pixel values of pixels in the same pixel group may be assumed to come from the same source, which allows the pixels of the same pixel group to be placed on top of each other to determine the true color of the source.
The imaging device further comprises a multiband bandpass filter provided in the light path of the imaging device between the optical sensor and an aperture of the imaging device. For example the aperture comprises a lens and/or a shutter for guiding incoming light and/or closing the aperture. The multiband bandpass filter is configured to selectively transmit light in a red band, a blue band, a green band, and an infrared band to the optical sensor. The bandpass filter allows light in relatively narrow bands to be transmitted towards the optical sensor such that there is reduced cross talk between colors. Cross talk between colors of different bands reduces the effectiveness of the imaging system in providing images for determining a state of vegetation because the cross talk reduces the accuracy of the vegetation index determined based on the image. The bandpass filter may be a global filter which allows light in the four bands to be transmitted globally over the filter while being opaque to light having other wavelengths.
The imaging device further comprises a color filter which is provided between the optical sensor and the multiband bandpass filter. The color filter is configured to transmit light in a single band to a corresponding pixel of the imaging device such that, for each pixel group, a first pixel of the corresponding pixel group receives light from the green band, a second pixel of the corresponding pixel group receives light from the infrared band, a third pixel of the corresponding pixel group receives light from the green band, and a fourth pixel of the corresponding pixel group receives light from either the red band or the blue band. Thus each pixel group comprises two green pixels, one IR pixel and either one red or one blue pixel.
Pixel groups that receive light from the blue band are called blue pixel groups and pixel groups that receive light from the red band are called red pixel groups. The pixel groups are ordered such that blue pixel groups are adjacent to red pixel groups. For example, the blue pixel groups and red pixel groups can be provided alternatingly in a first direction and/or a perpendicular second direction across the optical sensor.
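To make the layout concrete, the following sketch builds such a mosaic of 2x2 pixel groups with alternating red and blue groups. It is illustrative only; the placement of the pixels inside a group and the helper name are assumptions, and the exact arrangement used by the invention is the one shown in the figures.

```python
import numpy as np

def build_group_mosaic(rows, cols):
    """Build a hypothetical channel map for a sensor made of 2x2 pixel groups:
    each group holds two green (G) pixels, one infrared (IR) pixel, and either
    a red (R) or a blue (B) pixel, with red and blue groups alternating in
    both directions (a checkerboard of groups).

    rows, cols: number of pixel groups in each direction.
    Returns an array of shape (2*rows, 2*cols) with channel labels.
    """
    mosaic = np.empty((2 * rows, 2 * cols), dtype="<U2")
    for gr in range(rows):
        for gc in range(cols):
            fourth = "R" if (gr + gc) % 2 == 0 else "B"  # alternate red/blue groups
            # The in-group placement below is an assumption for illustration only.
            mosaic[2 * gr, 2 * gc] = "G"
            mosaic[2 * gr, 2 * gc + 1] = "IR"
            mosaic[2 * gr + 1, 2 * gc] = "G"
            mosaic[2 * gr + 1, 2 * gc + 1] = fourth
    return mosaic

print(build_group_mosaic(2, 2))
```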
When the imaging device is used to measure light, the imaging device is configured to assign a pixel value to each pixel that receives light, wherein the pixel value corresponds to the amount of light that is received by the corresponding pixel. For example, a pixel that receives little light is assigned a low pixel value, whereas a pixel that receives a lot of light is assigned a high pixel value.
The processor is configured to receive the pixel values from the optical sensor so that the processor may output the image. The processor is configured to determine a virtual red or blue pixel value for each blue or red pixel group, such that each pixel group comprises a pixel value, either measured or virtual, for each band. In order to determine a virtual red pixel value for a blue pixel group the processor is configured to: - determine a contrast between the blue pixel group and the red pixel groups adjacent to the blue pixel group by comparing green pixel values of the red pixel groups adjacent to the blue pixel group with the green pixel values of the blue pixel group; - determine which of the adjacent red pixel groups has the lowest contrast with the blue pixel group; and - determine the corresponding virtual red pixel value based on the red pixel value of the red pixel group that has the lowest contrast with the blue pixel group.
By determining a contrast between the pixel groups, i.e. by comparing green pixel values of pixel groups that are adjacent to the pixel group for which the virtual pixel value is determined, it may be determined whether an edge, e.g. a sudden change in light intensity over the pixel groups, is present. If an edge is present in a direction for the green pixels, it is likely that an edge will also be present in that direction for the red pixels. Determining a virtual red pixel value based on pixels that have a higher likelihood of changes in intensity reduces the accuracy of the virtual pixel value. Thus the processor is configured to determine which of the adjacent pixel groups have green pixel values that are closest to the green pixel values of the current blue pixel group. Since the adjacent pixel groups are red pixel groups, a virtual red pixel value may be determined based on the red pixel value of the pixel group whose green pixel values are closest to the green pixel values of the blue pixel group.
The processor is similarly configured to determine a virtual blue pixel value for each red pixel group by: - determining a contrast between the red pixel group and the blue pixel groups adjacent to the red pixel group by comparing green pixel values of the blue pixel groups adjacent to the red pixel group with the green pixel values of the red pixel group; - determining which of the adjacent blue pixel groups has the lowest contrast with the red pixel group; and - determining the corresponding virtual blue pixel value based on the blue pixel value of the blue pixel group that has the lowest contrast with the red pixel group.
By comparing green pixel values of pixel groups that are adjacent to the pixel group for which the virtual pixel value is determined it may be determined if an edge, e.g. a sudden change in light intensity over the pixel groups, is present. If an edge is present in a direction for the green pixels it is likely that an edge will also be present in that direction for the blue pixels.
Determining a virtual blue pixel value based on pixels that have a higher likelihood of changes in intensity reduces the accuracy of the virtual pixel value. Thus the processor is configured to determine which of the adjacent pixel groups have green pixel values that are closest to the green pixel values of the current red pixel group. Since the adjacent pixel groups are all blue pixel groups, a virtual blue pixel value may be determined based on the blue pixel value of the pixel group whose green pixel values are closest to the green pixel values of the red pixel group.
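A minimal sketch of this contrast-guided selection, covering both the virtual red value for a blue pixel group and the virtual blue value for a red pixel group, could look as follows. The data layout, the helper names and the use of the mean of the two green pixels as the green value of a group are assumptions made only for illustration; the actual processor logic is defined by the description and claims.

```python
import numpy as np

def virtual_value(groups, r, c):
    """Return a virtual red value for a blue pixel group (or a virtual blue
    value for a red pixel group) at group position (r, c).

    Assumed data layout (illustrative only):
      groups["g"][r, c]    : mean of the two green pixel values of the group
      groups["rb"][r, c]   : the measured red or blue pixel value of the group
      groups["kind"][r, c] : "R" for a red pixel group, "B" for a blue one
    Because red and blue groups alternate, the four direct neighbours of a
    group are groups of the other colour.
    """
    rows, cols = groups["g"].shape
    best, best_contrast = None, None
    for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
        if not (0 <= nr < rows and 0 <= nc < cols):
            continue
        if groups["kind"][nr, nc] == groups["kind"][r, c]:
            continue  # only neighbours of the other colour are candidates
        # Contrast between the two groups, judged on their green pixel values.
        contrast = abs(groups["g"][nr, nc] - groups["g"][r, c])
        if best_contrast is None or contrast < best_contrast:
            best, best_contrast = (nr, nc), contrast
    # The virtual value is based on the red/blue value of the neighbour with
    # the lowest green contrast; a refinement could interpolate several groups
    # in that direction, as the description mentions further on.
    return groups["rb"][best] if best is not None else None
```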
By increasing the number of green pixels, an image that is perceived as sharper may be displayed on a display device. Human eyes perceive sharpness based on the contrast of the image, which is determined by the black/white signal sent to the display device. The black/white signal is largely based on the pixel values of green pixels, thus green pixels determine in large part the perceived sharpness of the image. Images that have a high number of green pixels are perceived to be sharper compared to images having a lower number of green pixels, which makes the images more suitable for visual inspection.
A downside of increasing the number of green pixels is the reduction in red and blue pixels, which leads to a reduced effectiveness in determining a state of vegetation if no further steps are taken. The above-mentioned steps of determining the virtual red pixel values and virtual blue pixel values alleviate this problem by accurately determining the virtual red and blue pixel values based on the pixel groups that have the lowest contrast with the pixel group for which the virtual pixel value is to be determined.
The processor is further configured to generate an image based on the pixel values and the virtual pixel values and to output the image, for example to an image processor for determining the state of the vegetation or to a storage device, such as an SD card.
The imaging system of the invention allows for providing images with increased sharpness that are suitable for determining a state of the vegetation, e.g. that may be used in determining a vegetation index. The images are better suited for visual inspection while still allowing the state of the vegetation to be determined.
In embodiments the processor is further configured to: - use an s-curve to correct the pixel values and the virtual pixel values by compressing pixels and virtual pixels having higher and lower pixel values compared to pixels and virtual pixels having intermediate pixel values; and - generate the image based on the corrected pixel values and the corrected virtual pixel values.
In these embodiments, a further step is taken to improve the resulting image for use in determining a state of vegetation. The processor is configured to use an s-curve to correct the pixel values and virtual pixel values by compressing pixels and virtual pixels having higher and lower pixel values, e.g. by giving them a lower amplification factor, compared to pixels and virtual pixels having intermediate pixel values. The s-curve differs from a traditional gamma correction, which was traditionally used to compensate for the nonlinear behaviour of old TV tubes.
Pixels with higher values are pixels that received more light and pixels with lower values are pixels that received less light. It was found that pixels that are oversaturated or undersaturated contribute less to determining of the state of the vegetation than pixels having intermediate values. By correcting the pixel values the resulting image is more suitable for determining a state of the vegetation because pixel values that are not suitable for determining the state of the vegetation, e.g. because they correspond to areas that are too bright or too dim, are compressed which results in an increased dynamic range for pixels having intermediate pixel values. This results in more usable values in the range suitable for determining the state of the vegetation. The combination of determining virtual pixel values and using an s-curve results in outputted images that are more suitable for determining a state of vegetation.
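The patent does not prescribe a particular curve; as an illustration only, a logistic-shaped s-curve applied to normalised pixel values could look as follows, with a steeper slope (higher local gain) around intermediate values and compression at both extremes. The logistic form and the steepness parameter are assumptions for this sketch.

```python
import numpy as np

def s_curve(x, steepness=8.0):
    """Apply an illustrative s-curve to normalised pixel values in [0, 1].

    Values near 0.5 receive a higher local gain (steeper slope), while very
    dark and very bright values are compressed (flatter slope), increasing
    the dynamic range available for intermediate values. The exact curve used
    by the system may differ and may be adjustable.
    """
    raw = 1.0 / (1.0 + np.exp(-steepness * (x - 0.5)))
    lo = 1.0 / (1.0 + np.exp(steepness * 0.5))
    hi = 1.0 / (1.0 + np.exp(-steepness * 0.5))
    return (raw - lo) / (hi - lo)  # rescale so 0 maps to 0 and 1 maps to 1

values = np.linspace(0.0, 1.0, 5)
print(np.round(s_curve(values), 3))  # extremes stay at 0 and 1, midtones are stretched
```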
In embodiments, the processor is configured to use an adjustable s-curve when correcting the pixel values and virtual pixel values. Adjusting the s-curve makes it possible to adjust the dynamic range of pixels having values relevant for determining the vegetation index, e.g. pixels that are not oversaturated or undersaturated. This improves the flexibility of the system for use for different purposes or under different conditions. The correction may be adjusted by the processor, e.g. based on additional measurements, or by a user, e.g. based on experience.
In embodiments, the processor is configured to determine the corresponding virtual blue pixel value based on the blue pixel group that has the lowest contrast with the red pixel group by interpolating blue pixel values of blue pixel groups in the direction of the blue pixel group that has the lowest contrast with the red pixel group, and the processor is configured to determine the corresponding virtual red pixel value based on the red pixel group that has the lowest contrast with the blue pixel group by interpolating red pixel values of red pixel groups in the direction of the red pixel group that has the lowest contrast with the blue pixel group. For example, in these embodiments the virtual pixel values are interpolated based on a direction perpendicular to an edge, e.g. a direction which has a large change in contrast, in the image. Interpolation may be based on averaging pixel values or on fitting the pixel values with some kind of fit function.
The interpolation may be based on interpolation techniques or extrapolation techniques.
In embodiments the multiband bandpass filter is configured such that the blue band of the multiband bandpass filter has a wavelength range in which the color filter transmits a lower amount of light from the green band, red band and infrared band; the red band of the multiband bandpass filter has a wavelength range in which the color filter transmits a lower amount of light from the blue band, green band and infrared band; the green band of the multiband bandpass filter has a wavelength range in which the color filter transmits a lower amount of light from the blue band, red band and infrared band; and the infrared band of the multiband bandpass filter has a wavelength range in which the color filter transmits a lower amount of light from the blue band, green band and red band. Leakage between colors from different bands may occur due to the spectral response of the color filter, wherein each band allows some light of other bands to be transmitted. This leakage is reduced by configuring the multiband bandpass filter such that it allows narrow bands of light to pass, wherein the range of each band is chosen such that the leakage of the color filter in that range is lower, e.g. compared to another range.
In a further embodiment the blue band has wavelengths in the range of 410-450 nm, the green band has wavelengths in the range of 555-585 nm, the red band has wavelengths in the range of 645-675 nm, and/or the infrared band has wavelengths in the range of 850-870 nm.
The color filter alone may not be enough to eliminate cross talk between the colors, because the color filter may allow some light of another band, e.g. the red band, to be transmitted in, e.g., the blue band, thus allowing cross talk.
By using a bandpass filter that transmits light in the ranges of this embodiment, preferably in combination with a color filter which has a minimal amount of leakage in these ranges, cross talk between the different colors is further reduced. Additionally, the width of the ranges is such that sufficient light, e.g. during daylight, enters the pixels to allow for an image to be taken which is suitable for determining a state of the vegetation.
In embodiments, the imaging device has a rolling shutter and the imaging device is configured to have a frame rate between 50 and 60 Hz and a shutter time of less than half a frame time. A rolling shutter allows a controlled amount of light to reach the optical sensor. A downside of using a rolling shutter is that pixels are illuminated at different times, which results in the so-called rolling shutter effect when the camera or the object is moving. This distorts the image, which reduces the effectiveness of the image for determining a state of the vegetation. In order to have sufficient overlap between images when the camera is placed on a moving vehicle, e.g. a drone, a relatively low frame rate, smaller than 5 fps, is required.
However, at such low frame rates the rolling shutter effect is relevant. In order to reduce the rolling shutter effect, the imaging device has a high frame rate, e.g. between 50 and 60 Hz, and a fast shutter time, e.g. of less than half a frame time, e.g. between 1/100 and 1/120 of a second. Additionally, e.g. for this embodiment, the imaging device may have a frame rate that results in images with substantial overlap such that not all images are required to determine a state of vegetation, e.g. in a field. In this case the processor may be configured to only output a subset of the images taken, e.g. at an external rate or based on some external factor.
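As a quick check of the numbers mentioned above (assuming frame time = 1 / frame rate):

```python
# Frame time and maximum shutter time for the frame rates mentioned above.
for frame_rate_hz in (50, 60):
    frame_time_s = 1.0 / frame_rate_hz   # 20 ms at 50 Hz, about 16.7 ms at 60 Hz
    max_shutter_s = frame_time_s / 2.0   # "less than half a frame time"
    print(f"{frame_rate_hz} Hz: frame time {frame_time_s * 1e3:.1f} ms, "
          f"shutter < {max_shutter_s * 1e3:.1f} ms (about 1/{2 * frame_rate_hz} s)")
```

This reproduces the shutter times of roughly 1/100 s (at 50 Hz) to 1/120 s (at 60 Hz) stated in the text.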
In embodiments the imaging system is configured to output data relating to one or more of a location of the imaging device when the image was taken, a height of the imaging device when the image was taken, a speed of the imaging device when the image was taken, information relating to lighting conditions when the image was taken, and an angle of the imaging device when the image was taken. This data may be stored together with the image to provide additional information about the image, for example when the image was taken, where the image was taken, and the conditions under which the image was taken.
In embodiments the processor is configured to determine whether the image should be outputted based on one or more of a location of the imaging device when the image was taken, a height of the imaging device when the image was taken, a speed of the imaging device when the image was taken, information relating to lighting conditions when the image was taken, and an angle of the imaging device when the image was taken. This embodiment may reduce overlap between images that are outputted.
In embodiments, the processor is configured to apply a gain factor to each color band such that at a predetermined color temperature the color bands have the same level.
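A minimal sketch of such a per-band gain is given below. The choice of the green band as reference and the example levels are assumptions; in practice the gains would be calibrated for the predetermined colour temperature.

```python
def band_gains(mean_levels, reference_band="G"):
    """Compute gain factors that bring all colour bands to the same level.

    `mean_levels` maps band name -> mean pixel value measured for a calibration
    scene at the predetermined colour temperature. Using the green band as the
    reference is an assumption for this illustration.
    """
    ref = mean_levels[reference_band]
    return {band: ref / value for band, value in mean_levels.items()}

# Hypothetical calibration levels measured at the chosen colour temperature.
levels = {"R": 180.0, "G": 200.0, "B": 150.0, "IR": 120.0}
gains = band_gains(levels)
balanced = {band: levels[band] * gains[band] for band in levels}
print(gains)     # gain factor per band
print(balanced)  # all bands now at the reference level
```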
The invention is further related to a method for providing an image for determining a state of vegetation wherein use is made of a multispectral imaging system according to the invention.
In an embodiment of the method, the method comprises:
- determining pixel values by performing a measurement;
- determining a virtual red pixel value for each blue pixel group, by:
  o determining a contrast between the blue pixel group and red pixel groups adjacent to the blue pixel group by comparing green pixel values of the red pixel groups adjacent to the blue pixel group with the green pixel values of the blue pixel group;
  o determining which of the adjacent red pixel groups has the lowest contrast with the blue pixel group; and
  o determining the corresponding virtual red pixel value based on the red pixel value of the red pixel group that has the lowest contrast with the blue pixel group;
- determining a virtual blue pixel value for each red pixel group, by:
  o determining a contrast between the red pixel group and blue pixel groups adjacent to the red pixel group by comparing green pixel values of the blue pixel groups adjacent to the red pixel group with the green pixel values of the red pixel group;
  o determining which of the adjacent blue pixel groups has the lowest contrast with the red pixel group; and
  o determining the corresponding virtual blue pixel value based on the blue pixel value of the blue pixel group that has the lowest contrast with the red pixel group;
- generating an image based on the pixel values and the virtual pixel values; and
- outputting the image.
In a further embodiment of the method, the method further comprises the steps of: - using an s-curve to correct the pixel values and the virtual pixel values by compressing pixels and virtual pixels having higher and lower pixel values compared to pixels and virtual pixels having intermediate pixel values; and - generating the image based on the corrected pixel values and the corrected virtual pixel values.
In embodiments, the method comprises taking images with a frame rate between 50 and 60 Hz and with a shutter time of less than half a frame time.
The method according to the present invention may comprise one or more of the features and/or benefits that are disclosed herein in relation to the multispectral imaging system according to the present invention, in particular as recited in the appended claims.
The invention is further related to a vehicle, such as a drone, comprising the multispectral imaging system according to the invention.
The invention will be explained with reference to the drawing below, in which:
- Fig. 1a shows an example of a pixel group;
- Fig. 1b shows an example of a color filter comprising various pixel groups;
- Fig. 2 shows an example of the multispectral imaging system;
- Fig. 3 shows an effect of the s-curve on the sensor output; and
- Fig. 4 shows a flow chart of a method.
Figure 1a shows an example of a pixel group 5 that comprises two green pixels 6, G, an infrared pixel IR, and either a red pixel R or a blue pixel B. A pixel group 5 comprising a red pixel R is called a red pixel group RG and a pixel group 5 comprising a blue pixel B is called a blue pixel group BG. Figure 1b shows how the pixel group 5 of figure 1a may be arranged in the color filter 8 for use in the multispectral imaging system 1 of the invention.
Figure 1b shows a color filter 8 that comprises twelve pixel groups, wherein a red pixel group RG is adjacent to a blue pixel group BG. In practical embodiments, an optical sensor may comprise many more, for example more than a thousand, pixel groups.
The optical sensor 4 may have a spectral response wherein a pixel 6, for example a green pixel G, allows some light having a different wavelength than green light to pass. This reduces the effectiveness of the use of the optical sensor 4 in an imaging system 1 for use in determining a state of a vegetation. For this reason, a multiband bandpass filter 7 is used which allows light in narrow bands to pass, to reduce crosstalk between the colors.
As can be seen in figure 1a, the pixels 6 of the pixel group 5 are adjacent to each other in a square formation, such that the pixels of a single pixel group may collect light coming from the same, e.g. point, source. Further, the pixel groups 5 themselves are arranged in a square formation in the optical sensor 8.
Figure 2 shows an example of the multispectral imaging system 1 comprising a processor 2 and an imaging device 3. The processor 2 and the imaging device 3 are connected, e.g. via a wireless or wired connection, such that the processor 2 may receive pixel values measured by the imaging device 3.
The imaging device 3 comprises an optical sensor 4, a multiband bandpass filter 7 and a color filter 8. The multiband bandpass filter 7 and the color filter 8 are arranged in a light path of the imaging device 3 such that light may enter the imaging device 3 and travel through the bandpass filter 7 and the color filter 8 towards the optical sensor 4. For example, the imaging device 3 may comprise a lens to collect light, e.g. coming from a field of crops.
The color filter 8 is provided on the optical sensor 4. In embodiments, the color filter 8 may be integrated with the optical sensor 4. In other embodiments, the color filter may be separate from the optical sensor 4. The color filter 8 may be a color filter 8 as shown in figure 1b. The color filter 8 comprises pixel groups 5 which each comprise 4 pixels 6.
The optical sensor 4 may have a resolution of four megapixels, so that the infrared color band has a resolution of one megapixel, the red and blue color bands each have a resolution of 0.5 megapixels and the green color band has a resolution of two megapixels. The processor 2 is configured to determine virtual red and blue pixel values such that the resolution of the red and blue pixels in combination with the virtual pixels is one megapixel for each band.
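These resolutions follow directly from the group layout; a quick check, assuming a four-megapixel sensor divided into 2x2 pixel groups:

```python
total_pixels = 4_000_000
pixels_per_group = 4
groups = total_pixels // pixels_per_group   # 1,000,000 pixel groups

green_pixels = 2 * groups                   # 2 green pixels per group -> 2 MP
ir_pixels = 1 * groups                      # 1 IR pixel per group     -> 1 MP
red_pixels = groups // 2                    # red groups are half of all groups -> 0.5 MP
blue_pixels = groups // 2                   # blue groups are the other half    -> 0.5 MP

# After the processor adds one virtual red or blue value per group,
# red and blue each have a value in every group: 1 MP per band.
red_incl_virtual = red_pixels + groups // 2
blue_incl_virtual = blue_pixels + groups // 2
print(green_pixels, ir_pixels, red_pixels, blue_pixels, red_incl_virtual, blue_incl_virtual)
```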
The multiband bandpass filter 7 reduces crosstalk between the bands by transmitting only narrow bands of light in the four bands and blocking light having other wavelengths.
The imaging device 3 may be mounted on a vehicle such as a drone or an automatic land vehicle. The processor 2 may be a microcomputer which is in radio contact with the imaging device 3. In other embodiments, the processor 2 may be a microcomputer that is integrated with the imaging device 3, e.g. with a wired connection. The processor 2 is configured to output an image, for example to a display, a storage medium such as an SD card, or to an image printer.
The color filter 8 is configured to transmit light in a single band to a corresponding pixel 6 of the imaging device 3, such that, for each pixel group 5, a first pixel of the corresponding pixel group receives light from the green band, a second pixel of the corresponding pixel group receives light from the infrared band, a third pixel of the corresponding pixel group receives light from the green band, and a fourth pixel of the corresponding pixel group receives light from either the red band or the blue band. Thus each pixel group 5 comprises two green pixels G, one IR pixel IR and either one red pixel R or one blue pixel B.
The imaging device 3 is provided with a virtual shutter which, in this embodiment, is a rolling shutter. In order to reduce the rolling shutter effect, the imaging device 3 is configured to have a frame rate between 50 and 60 Hz and a shutter time of less than half a frame time. The rolling shutter allows a controlled amount of light to reach the optical sensor 4. A downside of using the rolling shutter is that pixels 6 are illuminated at different times, which results in the rolling shutter effect when the camera or the object is moving, such as when the imaging device 3 is mounted on a vehicle. This may distort the image, which reduces the effectiveness of the image for determining a state of the vegetation. In order to have sufficient overlap between images when the camera is placed on a moving vehicle, e.g. a drone, a relatively low frame rate, smaller than 5 fps, is required. However, at such low frame rates the rolling shutter effect is relevant. In order to reduce the rolling shutter effect, the imaging device 3 has a high frame rate, e.g. between 50 and 60 Hz, and a fast shutter time, e.g. of less than half a frame time, e.g. between 1/100 and 1/120 of a second. Additionally, e.g. for this embodiment, the imaging device 3 has a frame rate that results in images with substantial overlap such that not all images are required to determine a state of vegetation, e.g. in a field.
In this case the processor 2 may be configured to only output a subset of images taken, e.g. at an external rate or based on some external factor.
Figure 3 shows an effect of the s-curve on the output of the optical sensor 4. The first graph shows a linear curve, wherein a constant s-curve is used, which translates into a corrected output which is linear with respect to the measured values. The second graph 11 shows a traditional gamma curve at 0.45 gamma, which was traditionally used to compensate for nonlinear behaviour of old TV tubes. The third graph 12 is an example of an s-curve as used by the invention. The processor 2 may be configured to use the s-curve to correct the pixel values and the virtual pixel values by compressing pixels and virtual pixels having higher and lower pixel values compared to pixels and virtual pixels having intermediate pixel values. The image is then generated based on the corrected pixel values.
This allows the resulting image to be further improved for use in determining a state of vegetation.
The processor 2 is configured to use an s-curve to correct the pixel values and virtual pixel values by compressing pixels and virtual pixels, e.g. by giving them a lower amplification factor, having higher and lower pixel values compared to pixels and virtual pixels having intermediate pixel values.
The s-curve 12 shown may be an adjustable s-curve such that the relation between the sensor output and the corrected output may vary depending on the needs of the user.
Adjusting the s-curve makes it possible to adjust the dynamic range of pixels 6 having values relevant for determining the vegetation index, e.g. pixels 6 that are not oversaturated or undersaturated.
This improves the flexibility of the system 1 for use for different purposes or different conditions. The correction may be adjusted by the processor 2, e.g. based on additional measurements, or the correction may be adjusted by a user, e.g. based on experience.
Figure 4 shows a flow chart of a method for outputting an image. The method comprises determining pixel values 100 by performing a measurement with the imaging system 1. The measurement may be taking an image of a field of crops. By allowing light to fall on the optical sensor 4 during a measurement the pixels 6 receive light and pixel values are determined.
The next step is to determine virtual pixel values 103 for both red pixel groups and blue pixel groups by first determining a contrast 101 between a pixel group 5 and adjacent pixel groups by comparing green pixel values of these groups 5. This makes it possible to determine which of the, e.g., red pixel groups 5 has the green pixel value closest to that of the adjacent, e.g., blue pixel group 5, which in turn allows the contrast between these groups 5 to be determined. Next it is determined which of these groups 5 has the lowest contrast 102 with the group for which the virtual pixel value is to be determined.
Subsequently, the corresponding virtual pixel value is determined 103 based on the corresponding pixel value of the pixel group 5 having the lowest contrast with the group 5 for which the virtual pixel is to be determined. For example, the processor 2 may determine the corresponding virtual pixel value based on the pixel group 5 that has the lowest contrast with the pixel group 5 for which the virtual pixel is to be determined by interpolating pixel values of pixel groups 5 in the direction of the pixel group that has the lowest contrast. This allows the virtual pixel values to be interpolated based on a direction perpendicular to an edge, e.g. a direction which has a large change in contrast, in the image. Interpolation may be based on averaging pixel values or on fitting the pixel values with some kind of fit function. The interpolation may be based on interpolation techniques or extrapolation techniques.
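A minimal illustration of such a directional interpolation is given below, here simply averaging the nearest available values along the chosen direction; the actual weighting or fit function used by the processor is not specified in the description and may differ.

```python
import numpy as np

def interpolate_along_direction(values, r, c, direction, max_steps=4):
    """Average the nearest available values along `direction`.

    `values` is a 2D array with one red (or blue) pixel value per pixel group
    and NaN for groups that have no measured value in that band.
    `direction` is a unit step such as (0, 1) for "to the right" or (-1, 0)
    for "upwards", chosen as the direction of lowest green contrast.
    """
    rows, cols = values.shape
    samples = []
    step = 1
    while len(samples) < 2 and step <= max_steps:
        nr, nc = r + step * direction[0], c + step * direction[1]
        if 0 <= nr < rows and 0 <= nc < cols and not np.isnan(values[nr, nc]):
            samples.append(values[nr, nc])
        step += 1
    return float(np.mean(samples)) if samples else float("nan")

# Hypothetical example: blue values per group, NaN where the group is red.
blue = np.array([[10.0, np.nan, 12.0, np.nan],
                 [np.nan, 11.0, np.nan, 13.0]])
print(interpolate_along_direction(blue, 0, 1, (0, 1)))  # -> 12.0, the only blue value found to the right
```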
Next, optionally, an s-curve is used to correct 104 the pixel values and virtual pixel values by compressing pixels and virtual pixels having higher and lower pixel values compared to pixels and virtual pixels having intermediate pixel values. By correcting the pixel values, the resulting image is more suitable for determining a state of the vegetation, because pixel values that are not suitable for determining the state of the vegetation, e.g. because they correspond to areas that are too bright or too dim, are compressed, which results in an increased dynamic range for pixels having intermediate pixel values. This results in more usable values in the range suitable for determining the state of the vegetation.
Next, an image is generated 105 based on the (corrected) pixel values and the (corrected) virtual pixel values and the image is outputted 106.

Claims (14)

CONCLUSIESCONCLUSIONS 1. Multispectraal afbeeldingssysteem (1) voor het verkrijgen van een afbeelding voor het vaststellen van een toestand van vegetatie waarbij het afbeeldingssysteem (1) een processor (2) en een afbeeldingsinrichting (3) omvat die op een voertuig, bijvoorbeeld een drone, aanbrengbaar is, waarbij de afbeeldingsinrichting (3) omvat: - een optische sensor (4) die meerdere pixelgroepen (5) omvat, waarbij elke pixelgroep (5) vier naburige pixels (6) omvat; - een multiband bandpass filter (7) die tussen de optische sensor (4) en een opening van de afbeeldingsinrichting (3) is aangebracht, waarbij het multiband bandpass filter (7) is ingericht om selectief licht in een rode band, een blauwe band, een groene band, en een infrarode band door te laten richting de optische sensor (4); en - een kleurfilter (8) dat tussen de optische sensor (4) en het multiband bandpass filter (7) is aangebracht, waarbij het kleurfilter (8) is ingericht om licht dat van het multiband bandpass filter (7) wordt ontvangen door te laten naar de optische sensor (4) zo dat, voor elke pixelgroep (5), een eerste pixel (6) van de corresponderende pixelgroep (5) licht van de groene band ontvangt, een tweede pixel (6) van de corresponderende pixelgroep (5) licht van de infrarode band ontvangt, een derde pixel (8) van de corresponderende pixelgroep (5) licht van de groene band ontvangt, en een vierde pixel (6) van de corresponderende pixelgroep (5) licht van of de rode band of de blauwe band ontvangt, waarbij pixelgroepen (5) die licht ontvangen van de blauwe band blauwe pixelgroepen (BG) heten en waarbij pixelgroepen (5) die licht van de rode bant ontvangen rode pixelgroepen (RG) heten, waarbij blauwe pixelgroepen (BG) naburig zijn aan rode pixelgroepen (RG), waarbij het afbeeldingssyteem (1) is ingericht om pixelwaarden aan elke pixel (6) toe te kennen wanneer licht op de optische sensor (4) valt, bijvoorbeeld tijdens een meting, waarbij de processor (2) verbonden is met de optische sensor (4) om de pixelwaarden daarvan te ontvangen, en waarbij de processor (2) is ingericht om: - een virtuele rode pixelwaarde vast te stellen voor elke blauwe pixelgroep (BG), door: o het vaststellen van een contrast tussen de blauwe pixelgroep (BG) en rode pixelgroepen (RG) die naburig zijn aan de blauwe pixelgroep (BG) door groene pixelwaarden van de rode pixelgroepen (RG) die naburig zijn met de blauwe pixelgroep (BG) te vergelijken met groene pixelwaarden van de blauwe pixelgroep (BG);1. 
Multispectral imaging system (1) for obtaining an image for determining a vegetation condition, wherein the imaging system (1) comprises a processor (2) and an imaging device (3) that can be mounted on a vehicle, for example a drone wherein the imaging device (3) comprises: - an optical sensor (4) comprising several pixel groups (5), each pixel group (5) comprising four neighboring pixels (6); - a multiband bandpass filter (7) arranged between the optical sensor (4) and an opening of the imaging device (3), wherein the multiband bandpass filter (7) is designed to selectively transmit light into a red band, a blue band, a green band, and an infrared band to pass towards the optical sensor (4); and - a color filter (8) arranged between the optical sensor (4) and the multiband bandpass filter (7), wherein the color filter (8) is adapted to transmit light received from the multiband bandpass filter (7) to the optical sensor (4) so that, for each pixel group (5), a first pixel (6) of the corresponding pixel group (5) receives light from the green band, a second pixel (6) of the corresponding pixel group (5) receives light from the infrared band, a third pixel (8) of the corresponding pixel group (5) receives light from the green band, and a fourth pixel (6) of the corresponding pixel group (5) receives light from either the red band or the blue band, where pixel groups (5) that receive light from the blue band are called blue pixel groups (BG) and where pixel groups (5) that receive light from the red band are called red pixel groups (RG), where blue pixel groups (BG) are adjacent to red pixel groups (RG), where the imaging system (1) is designed to assign pixel values to each pixel (6) when light falls on the optical sensor (4), for example during a measurement, where the processor (2) is connected to the optical sensor (4) to receive the pixel values thereof, and wherein the processor (2) is arranged to: - determine a virtual red pixel value for each blue pixel group (BG), by: o determining a contrast between the blue pixel group (BG) and red pixel groups (RG) adjacent to the blue pixel group (BG) by comparing green pixel values of the red pixel groups (RG) adjacent to the blue pixel group (BG) with green pixel values of the blue pixel group (BG); o het vaststellen welke van de naburige rode pixelgroepen (RG) het laagste contrast heeft met de blauwe pixelgroep (BG); en o het vaststellen van de corresponderende virtuele rode pixelwaarde gebaseerd op de rode pixelwaarde van de rode pixelgroep (RG) die het laagste contrast heeft met de blauwe pixelgroep (BG); - een virtuele blauwe pixelwaarde vast te stellen voor elke rode pixelgroep (RG), door: o het vaststellen van een contrast tussen de rode pixelgroep (RG) en blauwe pixelgroepen (BG) die naburig zijn aan de rode pixelgroep (RG) door groene pixelwaarden van de blauwe pixelgroepen (BG) die naburig zijn met de rode pixelgroep (RG) te vergelijken met groene pixelwaarden van de rode pixelgroep (RG); o het vaststellen welke van de naburige blauwe pixelgroepen (BG) het laagste contrast heeft met de rode pixelgroep (RG); en o het vaststellen van de corresponderende virtuele blauwe pixelwaarde gebaseerd op de blauwe pixelwaarde van de blauwe pixelgroep (BG) die het laagste contrast heeft met de rode pixelgroep (RG); - een afbeelding te genereren gebaseerd op de pixelwaarde en de virtuele pixelwaarden; en - het outputten van de afbeelding.o determining which of the neighboring red pixel groups (RG) has the lowest contrast with the blue pixel group 
(BG); and o determining the corresponding virtual red pixel value based on the red pixel value of the red pixel group (RG) that has the lowest contrast with the blue pixel group (BG); - establish a virtual blue pixel value for each red pixel group (RG), by: o establishing a contrast between the red pixel group (RG) and blue pixel groups (BG) adjacent to the red pixel group (RG) by green pixel values comparing the blue pixel groups (BG) adjacent to the red pixel group (RG) with green pixel values of the red pixel group (RG); o determining which of the neighboring blue pixel groups (BG) has the lowest contrast with the red pixel group (RG); and o determining the corresponding virtual blue pixel value based on the blue pixel value of the blue pixel group (BG) that has the lowest contrast with the red pixel group (RG); - generate an image based on the pixel value and the virtual pixel values; and - outputting the image. 2. Multispectraal afbeeldingssysteem volgens conclusie 1, waarbij de processor (2) verder is ingericht om: — een s-curve te gebruiken om de pixelwaarden en virtuele pixelwaarden van pixels en virtuele pixels met hogere en lagere pixelwaarden te corrigeren door deze te comprimeren ten opzichte van pixels met tussenliggende pixelwaarden; — de afbeelding te genereren gebaseerd op de gecorrigeerde pixelwaarden en de gecorrigeerde virtuele pixelwaarden.A multispectral imaging system according to claim 1, wherein the processor (2) is further arranged to: - use an s-curve to correct the pixel values and virtual pixel values of pixels and virtual pixels with higher and lower pixel values by compressing them with respect to of pixels with intermediate pixel values; — generate the image based on the corrected pixel values and the corrected virtual pixel values. 3. Multispectraal afbeeldingssysteem volgens conclusie 2, waarbij de processor (2) is ingericht om een aanpasbare s-curve te gebruiken om de pixelwaarden en virtuele pixelwaarden te corrigeren.Multispectral imaging system according to claim 2, wherein the processor (2) is arranged to use an adjustable s-curve to correct the pixel values and virtual pixel values. 4. 
Multispectraal afbeeldingssysteem volgens een of meer van de voorgaande conclusies, waarbij de processor (2) is ingericht om de corresponderende virtuele blauwe pixelwaarde vast te stellen gebaseerd op de blauwe pixelgroep (BG) die het laagste contrast heeft met de rode pixelgroep (RG) door de blauwe pixelwaarden van de blauwe pixelgroepen (BG) te interpoleren in de richting van de blauwe pixelgroep (BG) die het laagste contrast met de rode pixelgroep (RG) heeft,Multispectral imaging system according to one or more of the preceding claims, wherein the processor (2) is arranged to determine the corresponding virtual blue pixel value based on the blue pixel group (BG) that has the lowest contrast with the red pixel group (RG) by interpolating the blue pixel values of the blue pixel groups (BG) in the direction of the blue pixel group (BG) that has the lowest contrast with the red pixel group (RG), En waarbij de processor (2) is ingericht om de corresponderende virtuele rode pixelwaarde vast te stellen gebaseerd op de rode pixelgroep (RG) die het laagste contrast heeft met de blauwe pixelgroep (BG) door de rode pixelwaarden van de rode pixelgroepen (RG) te interpoleren in de richting van de rode pixelgroep (RG) die het laagste contrast met de blauwe pixelgroep (BG) heeft.And wherein the processor (2) is arranged to determine the corresponding virtual red pixel value based on the red pixel group (RG) that has the lowest contrast with the blue pixel group (BG) by comparing the red pixel values of the red pixel groups (RG). interpolate in the direction of the red pixel group (RG) that has the lowest contrast with the blue pixel group (BG). 5. Multispectraal afbeeldingssysteem volgens een of meer van de voorgaande conclusies, waarbij het multiband bandpass filter (7) is ingericht zo dat de blauwe band van het multiband bandpass filter (7) een golflengte bereik heeft waarin het kleurfilter (8) een lagere hoeveelheid licht van de groene band, de rode band en de infrarode band door laat, de rode band van het multiband bandpass filter (7) een golflengte bereik heeft waarin het kleurfilter (8) een lagere hoeveelheid licht van de groene band, de blauwe band en de infrarode band door laat, de groene band van het multiband bandpass filter (7) een golflengte bereik heeft waarin het kleurfilter (8) een lagere hoeveelheid licht van de blauwe band, de rode band en de infrarode band door laat, en zo dat de infrarode band van het multiband bandpass filter (7) een golflengte bereik heeft waarin het kleurfilter (8) een lagere hoeveelheid licht van de groene band, de rode band en de blauwe band door laat.5. Multispectral imaging system according to one or more of the preceding claims, wherein the multiband bandpass filter (7) is arranged such that the blue band of the multiband bandpass filter (7) has a wavelength range in which the color filter (8) receives a lower amount of light. 
5. Multispectral imaging system according to one or more of the preceding claims, wherein the multiband bandpass filter (7) is arranged so that the blue band of the multiband bandpass filter (7) has a wavelength range in which the color filter (8) passes a lower amount of light from the green band, the red band and the infrared band, the red band of the multiband bandpass filter (7) has a wavelength range in which the color filter (8) passes a lower amount of light from the green band, the blue band and the infrared band, the green band of the multiband bandpass filter (7) has a wavelength range in which the color filter (8) passes a lower amount of light from the blue band, the red band and the infrared band, and so that the infrared band of the multiband bandpass filter (7) has a wavelength range in which the color filter (8) passes a lower amount of light from the green band, the red band and the blue band.

6. Multispectral imaging system according to one or more of the preceding claims, wherein the blue band transmits light with a wavelength in the range of 410-450 nm, the green band transmits light with a wavelength in the range of 555-585 nm, the red band transmits light with a wavelength in the range of 645-675 nm, and/or the infrared band transmits light with a wavelength in the range of 850-870 nm.

7. Multispectral imaging system according to one or more of the preceding claims, wherein the imaging device (3) has a rolling shutter and wherein the imaging device (3) is arranged to have a frame rate between 50 and 60 Hz and a shutter time of less than half a frame time.

8. Multispectral imaging system according to one or more of the preceding claims, wherein the imaging system (1) is arranged to output data related to one or more of a location of the imaging device (3) when the image was taken, a height of the imaging device (3) when the image was taken, a speed of the imaging device (3) when the image was taken, information regarding the lighting conditions when the image was taken, and an angle of the imaging device (3) when the image was taken.
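For reference only, the passbands of claim 6 and the timing constraint of claim 7 can be written out as in the sketch below. The band limits come directly from claim 6; the dictionary, function names and the explicit millisecond figures are illustrative additions.

```python
# Nominal passbands from claim 6 (nm); the names and helper are illustrative.
PASSBANDS_NM = {
    "blue": (410, 450),
    "green": (555, 585),
    "red": (645, 675),
    "infrared": (850, 870),
}

def band_of(wavelength_nm):
    """Return which claim-6 passband (if any) a wavelength falls into."""
    for name, (lo, hi) in PASSBANDS_NM.items():
        if lo <= wavelength_nm <= hi:
            return name
    return None

# Claim 7 worked out: at 50-60 Hz the frame time is 1/50 to 1/60 s
# (20.0 to 16.7 ms), so "less than half a frame time" means a shutter
# time below roughly 8.3-10 ms depending on the frame rate used.
def max_shutter_s(frame_rate_hz):
    return 0.5 * (1.0 / frame_rate_hz)
```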
9. Multispectral imaging system according to one or more of the preceding claims, wherein the processor (2) is arranged to determine whether an image should be output based on one or more of a location of the imaging device (3) when the image was taken, a height of the imaging device (3) when the image was taken, a speed of the imaging device (3) when the image was taken, information regarding the lighting conditions when the image was taken, and an angle of the imaging device (3) when the image was taken.

10. Multispectral imaging system according to one or more of the preceding claims, wherein the processor (2) is arranged to apply a gain factor to each color band so that at a predetermined color temperature the color bands have the same level.

11. Method for providing an image for determining a condition of vegetation, using a multispectral imaging system (1) according to one or more of the preceding claims.
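The per-band gain of claim 10 might look like the sketch below. The choice of the green band as the common reference and the example numbers are assumptions made for illustration; the claim only requires that the bands are equalized at a predetermined color temperature.

```python
def band_gains(measured_levels, reference_level=None):
    """Compute a gain per color band so that all bands reach the same level
    (claim 10). `measured_levels` maps band name -> mean level measured
    under the predetermined color temperature; using the green band as the
    common reference is an assumption.
    """
    if reference_level is None:
        reference_level = measured_levels["green"]
    return {band: reference_level / level for band, level in measured_levels.items()}

# Example with hypothetical levels: equalize all bands to the green level.
gains = band_gains({"blue": 0.62, "green": 0.80, "red": 0.74, "infrared": 0.55})
```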
12. Method according to claim 11, wherein the method comprises:
- determining pixel values for the pixels (6) of the optical sensor (4) by performing a measurement with the imaging device (3);
- establishing a virtual red pixel value for each blue pixel group (BG), by:
o establishing a contrast between the blue pixel group (BG) and the red pixel groups (RG) adjacent to the blue pixel group (BG) by comparing green pixel values of the red pixel groups (RG) adjacent to the blue pixel group (BG) with green pixel values of the blue pixel group (BG);
o determining which of the neighboring red pixel groups (RG) has the lowest contrast with the blue pixel group (BG); and
o determining the corresponding virtual red pixel value based on the red pixel value of the red pixel group (RG) that has the lowest contrast with the blue pixel group (BG);
- establishing a virtual blue pixel value for each red pixel group (RG), by:
o establishing a contrast between the red pixel group (RG) and the blue pixel groups (BG) adjacent to the red pixel group (RG) by comparing green pixel values of the blue pixel groups (BG) adjacent to the red pixel group (RG) with green pixel values of the red pixel group (RG);
o determining which of the neighboring blue pixel groups (BG) has the lowest contrast with the red pixel group (RG); and
o determining the corresponding virtual blue pixel value based on the blue pixel value of the blue pixel group (BG) that has the lowest contrast with the red pixel group (RG);
- generating an image based on the pixel values and the virtual pixel values; and
- outputting the image.
13. Method according to claim 12, wherein the method further comprises:
- using an s-curve to correct the pixel values and virtual pixel values of pixels and virtual pixels with higher and lower pixel values by compressing them relative to pixels with intermediate pixel values;
- generating the image based on the corrected pixel values and the corrected virtual pixel values.

14. Vehicle comprising a multispectral imaging system according to one or more of claims 1-10.
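To make the tone correction of claims 2, 3 and 13 concrete, one possible adjustable s-curve is sketched below. The logistic form and the `strength` parameter are assumptions; the claims only require that the highest and lowest values are compressed relative to intermediate values.

```python
import numpy as np

def s_curve(values, strength=6.0):
    """Adjustable s-curve on values normalized to [0, 1] (claims 2/3/13).

    A logistic curve has its steepest slope at mid-tones, so it compresses
    the lowest and highest values relative to intermediate values;
    `strength` tunes how strong that compression is. The logistic choice
    itself is an illustrative assumption.
    """
    v = np.clip(np.asarray(values, dtype=float), 0.0, 1.0)
    s = 1.0 / (1.0 + np.exp(-strength * (v - 0.5)))
    # Rescale so the curve still maps 0 -> 0 and 1 -> 1.
    s0 = 1.0 / (1.0 + np.exp(strength * 0.5))
    s1 = 1.0 / (1.0 + np.exp(-strength * 0.5))
    return (s - s0) / (s1 - s0)
```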
NL2031092A 2022-02-28 2022-02-28 Multispectral imaging system and method for using the system NL2031092B1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
NL2031092A NL2031092B1 (en) 2022-02-28 2022-02-28 Multispectral imaging system and method for using the system
PCT/EP2023/054857 WO2023161477A1 (en) 2022-02-28 2023-02-27 Multispectral imaging system and method for using the system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
NL2031092A NL2031092B1 (en) 2022-02-28 2022-02-28 Multispectral imaging system and method for using the system

Publications (1)

Publication Number Publication Date
NL2031092B1 2023-09-07

Family

ID=82100119

Family Applications (1)

Application Number Title Priority Date Filing Date
NL2031092A NL2031092B1 (en) 2022-02-28 2022-02-28 Multispectral imaging system and method for using the system

Country Status (2)

Country Link
NL (1) NL2031092B1 (en)
WO (1) WO2023161477A1 (en)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180040104A1 (en) * 2016-08-04 2018-02-08 Intel Corporation Restoring Color and Infrared Images from Mosaic Data
EP3761638A1 (en) * 2018-03-23 2021-01-06 Sony Corporation Signal processing apparatus, signal processing method, image capture apparatus, and medical image capture apparatus
WO2021014764A1 (en) 2019-07-24 2021-01-28 Sony Corporation Image processing apparatus, image pickup apparatus, method and program

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
ORIT SKORKA ET AL: "Color correction for RGB sensors with dual-band filters for in-cabin imaging applications", ELECTRONIC IMAGING, vol. 2019, no. 15, 13 January 2019 (2019-01-13), US, pages 46 - 1, XP055715959, ISSN: 2470-1173, DOI: 10.2352/ISSN.2470-1173.2019.15.AVM-046 *

Also Published As

Publication number Publication date
WO2023161477A1 (en) 2023-08-31

Similar Documents

Publication Publication Date Title
KR101824290B1 (en) High resolution multispectral image capture
US10419685B2 (en) Image processing apparatus, image processing method, and computer-readable recording medium
CA2783538C (en) Device and method for detecting vehicle license plates
US10368063B2 (en) Optical test device for a vehicle camera and testing method
KR101666137B1 (en) Method for estimating a defect in an image-capturing system, and associated systems
TWI684365B (en) Camera and method of producing color images
US20120169995A1 (en) Method and device for producing high-quality fundus images
CN103124332A (en) Image processing apparatus and image processing method
CN104285173A (en) Focus detection device
US9420197B2 (en) Imaging device, imaging method and imaging program
JP6872137B2 (en) Signal processing equipment, signal processing methods, and programs
KR101685887B1 (en) Image processing device, image processing method, and computer-readable storage medium having image processing program
US20200111193A1 (en) Image acquisition using time-multiplexed chromatic illumination
US20110254974A1 (en) Signal processing apparatus, solid-state image capturing apparatus, electronic information device, signal processing method, control program and storage medium
CN102894957A (en) Image processing apparatus for fundus image, and image processing method for fundus image
US10416026B2 (en) Image processing apparatus for correcting pixel value based on difference between spectral sensitivity characteristic of pixel of interest and reference spectral sensitivity, image processing method, and computer-readable recording medium
KR101695246B1 (en) Device for estimating light source and method thereof
JP6013284B2 (en) Imaging apparatus and imaging method
US20160360093A1 (en) Imaging apparatus, imaging method, and storage medium
JP2020176873A (en) Calibration device, calibration method, spectroscopic camera, and display device
CN111510692A (en) Image processing method, terminal and computer readable storage medium
NL2031092B1 (en) Multispectral imaging system and method for using the system
US20080252748A1 (en) System and computer-readable medium for automatic white balancing
US20230028674A1 (en) Imaging method for separating spectral signal components, and associated endoscopy system
US12078538B2 (en) Detection of light source distortion in an imaging system