WO2023161477A1 - Multispectral imaging system and method for using the system - Google Patents
- Publication number
- WO2023161477A1 (application PCT/EP2023/054857)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- pixel
- red
- blue
- pixel group
- band
- Prior art date
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N21/00—Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
- G01N21/17—Systems in which incident light is modified in accordance with the properties of the material investigated
- G01N21/25—Colour; Spectral properties, i.e. comparison of effect of material on the light at two or more different wavelengths or wavelength bands
- G01N21/31—Investigating relative effect of material at wavelengths characteristic of specific elements or molecules, e.g. atomic absorption spectrometry
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N21/00—Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
- G01N21/17—Systems in which incident light is modified in accordance with the properties of the material investigated
- G01N21/25—Colour; Spectral properties, i.e. comparison of effect of material on the light at two or more different wavelengths or wavelength bands
- G01N21/251—Colorimeters; Construction thereof
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N21/00—Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
- G01N21/17—Systems in which incident light is modified in accordance with the properties of the material investigated
- G01N21/25—Colour; Spectral properties, i.e. comparison of effect of material on the light at two or more different wavelengths or wavelength bands
- G01N21/255—Details, e.g. use of specially adapted sources, lighting or optical systems
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0012—Biomedical image inspection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/40—Extraction of image or video features
- G06V10/58—Extraction of image or video features relating to hyperspectral data
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/10—Terrestrial scenes
- G06V20/17—Terrestrial scenes taken from planes or by drones
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/10—Terrestrial scenes
- G06V20/188—Vegetation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/10—Terrestrial scenes
- G06V20/194—Terrestrial scenes using hyperspectral data, i.e. more or other wavelengths than RGB
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/60—Type of objects
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/10—Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths
- H04N23/11—Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths for generating image signals from visible and infrared light wavelengths
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/80—Camera processing pipelines; Components thereof
- H04N23/84—Camera processing pipelines; Components thereof for processing colour signals
- H04N23/843—Demosaicing, e.g. interpolating colour pixel values
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N25/00—Circuitry of solid-state image sensors [SSIS]; Control thereof
- H04N25/10—Circuitry of solid-state image sensors [SSIS]; Control thereof for transforming different wavelengths into image signals
- H04N25/11—Arrangement of colour filter arrays [CFA]; Filter mosaics
- H04N25/13—Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements
- H04N25/131—Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements including elements passing infrared wavelengths
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N25/00—Circuitry of solid-state image sensors [SSIS]; Control thereof
- H04N25/10—Circuitry of solid-state image sensors [SSIS]; Control thereof for transforming different wavelengths into image signals
- H04N25/11—Arrangement of colour filter arrays [CFA]; Filter mosaics
- H04N25/13—Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements
- H04N25/135—Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements based on four or more different wavelength filter elements
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01J—MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
- G01J3/00—Spectrometry; Spectrophotometry; Monochromators; Measuring colours
- G01J3/28—Investigating the spectrum
- G01J3/2803—Investigating the spectrum using photoelectric array detector
- G01J2003/2806—Array and filter array
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N21/00—Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
- G01N21/17—Systems in which incident light is modified in accordance with the properties of the material investigated
- G01N2021/1793—Remote sensing
- G01N2021/1797—Remote sensing in landscape, e.g. crops
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N21/00—Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
- G01N21/17—Systems in which incident light is modified in accordance with the properties of the material investigated
- G01N21/25—Colour; Spectral properties, i.e. comparison of effect of material on the light at two or more different wavelengths or wavelength bands
- G01N21/31—Investigating relative effect of material at wavelengths characteristic of specific elements or molecules, e.g. atomic absorption spectrometry
- G01N21/314—Investigating relative effect of material at wavelengths characteristic of specific elements or molecules, e.g. atomic absorption spectrometry with comparison of measurements at specific and non-specific wavelengths
- G01N2021/3155—Measuring in two spectral ranges, e.g. UV and visible
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N21/00—Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
- G01N21/17—Systems in which incident light is modified in accordance with the properties of the material investigated
- G01N21/25—Colour; Spectral properties, i.e. comparison of effect of material on the light at two or more different wavelengths or wavelength bands
- G01N21/31—Investigating relative effect of material at wavelengths characteristic of specific elements or molecules, e.g. atomic absorption spectrometry
- G01N21/314—Investigating relative effect of material at wavelengths characteristic of specific elements or molecules, e.g. atomic absorption spectrometry with comparison of measurements at specific and non-specific wavelengths
- G01N2021/3166—Investigating relative effect of material at wavelengths characteristic of specific elements or molecules, e.g. atomic absorption spectrometry with comparison of measurements at specific and non-specific wavelengths using separate detectors and filters
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N21/00—Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
- G01N21/17—Systems in which incident light is modified in accordance with the properties of the material investigated
- G01N21/25—Colour; Spectral properties, i.e. comparison of effect of material on the light at two or more different wavelengths or wavelength bands
- G01N21/31—Investigating relative effect of material at wavelengths characteristic of specific elements or molecules, e.g. atomic absorption spectrometry
- G01N21/314—Investigating relative effect of material at wavelengths characteristic of specific elements or molecules, e.g. atomic absorption spectrometry with comparison of measurements at specific and non-specific wavelengths
- G01N2021/317—Special constructive features
- G01N2021/3177—Use of spatially separated filters in simultaneous way
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N2201/00—Features of devices classified in G01N21/00
- G01N2201/02—Mechanical
- G01N2201/021—Special mounting in general
- G01N2201/0214—Airborne
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N2201/00—Features of devices classified in G01N21/00
- G01N2201/02—Mechanical
- G01N2201/021—Special mounting in general
- G01N2201/0216—Vehicle borne
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10024—Color image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30004—Biomedical image processing
Definitions
- the invention is related to a multispectral imaging system for providing an image for determining a state of vegetation.
- the invention is further related to a method for providing an image for determining a state of vegetation wherein use is made of a multispectral imaging system of the invention.
- Imaging systems are used in the art to help determine a state of vegetation.
- for a vegetation such as a crop or a fruit bearing tree, light in different bandwidths is reflected in amounts that vary over the lifecycle of the vegetation. For example, older vegetation may appear more brown than younger vegetation, which may appear more green.
- a downside of judging a state of vegetation using visible light is that a vegetation may already become unhealthy before a change in visible wavelengths occurs.
- a vegetation index such as the Normalized Difference Vegetation Index (NDVI), which compares reflected red and near-infrared light, may therefore be used to detect changes in vegetation health earlier than visual inspection of visible wavelengths alone.
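As context for the bands discussed below: the patent description does not define the NDVI, but its standard formula compares near-infrared and red reflectance. A minimal sketch of that conventional definition (not taken from the patent itself):

```python
def ndvi(nir: float, red: float) -> float:
    """Normalized Difference Vegetation Index from NIR and red reflectance."""
    if nir + red == 0:
        return 0.0  # avoid division by zero for completely dark pixels
    return (nir - red) / (nir + red)

# Healthy vegetation reflects much more NIR than red light:
print(ndvi(0.50, 0.08))  # high NDVI: likely healthy vegetation
print(ndvi(0.30, 0.25))  # low NDVI: stressed vegetation or bare soil
```

This is why the filter described below passes both a red and an infrared band in addition to the visible bands.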
- Imaging systems for providing images suitable for determining a vegetation index are known, for example, from WO 2021/014764 A1, which discloses an imaging system including a multiband pass filter that selectively transmits band light in a specific color band and a color filter that transmits the band light of a specific band per pixel to an image pickup device.
- the system of WO 2021/014764 A1 may be used to determine a state of vegetation by analysing the color components of the vegetation in an image provided with the system.
- An aim of the invention is to provide an improved, or at least an alternative, imaging system for providing an image for determining a state of vegetation.
- the aim of the invention is achieved by a multispectral imaging system according to claim 1.
- the multispectral imaging system of the invention comprises both a processor and an imaging device, e.g. a camera, e.g. a single optical path camera.
- the imaging device is suitable to be mounted on a vehicle, such as a remote controlled vehicle such as a drone or a remote controlled car.
- the imaging device may also be mounted on an automated vehicle, such as a drone performing automated flights. In practice the imaging device may be mounted on the vehicle.
- the processor may be a microcomputer that is integrated with the imaging device such that both the processor and the imaging device are mountable or mounted on a vehicle, e.g. wherein the imaging device and the processor are connected through a wired connection.
- the processor and the imaging device may be connected through a wireless connection, e.g. a radio connection, that allows data to be sent from the imaging device to the processor.
- the processor is configured to receive information, e.g. pixel values, from the imaging device and process this data, e.g. to output an image.
- the outputted image may, for example, be stored on an SD card or other storage medium.
- the imaging device comprises an optical sensor comprising a plurality of pixel groups.
- Each pixel group is comprised of four adjacent pixels.
- a pixel group of the optical sensor has four pixels that are adjacent, e.g. in a square formation, such that the pixels of a single pixel group may collect light coming from the same, e.g. point, source.
- pixel values of pixels in the same pixel group may be assumed to come from the same source, which allows the pixels of the same pixel group to be placed on top of each other to determine the true color of the source.
- the imaging device further comprises a multiband bandpass filter provided in the light path of the imaging device between the optical sensor and an aperture of the imaging device.
- the aperture comprises a lens and/or a shutter for guiding incoming light and/or closing the aperture.
- the multiband bandpass filter is configured to selectively transmit light in a red band, a blue band, a green band, and an infrared band to the optical sensor.
- the bandpass filter allows light in relatively narrow bands to be transmitted towards the optical sensor such that there is reduced cross talk between colors. Cross talk between colors of different bands reduces the effectiveness of the imaging system in providing images for determining a state of vegetation because the cross talk reduces the accuracy of the vegetation index determined based on the image.
- the bandpass filter may be a global filter which allows light in the four bands to be transmitted globally over the filter while being opaque to light having other wavelengths.
- the imaging device further comprises a color filter which is provided between the optical sensor and the multiband bandpass filter.
- the color filter is configured to transmit light in a single band to a corresponding pixel of the imaging device such that, for each pixel group, a first pixel of the corresponding pixel group receives light from the green band, a second pixel of the corresponding pixel group receives light from the infrared band, a third pixel of the corresponding pixel group receives light from the green band, and a fourth pixel of the corresponding pixel group receives light from either the red band or the blue band.
- each pixel group comprises two green pixels, one IR pixel and either one red or one blue pixel.
- Pixel groups that receive light from the blue band are called blue pixel groups and pixel groups that receive light from the red band are called red pixel groups.
- the pixel groups are ordered such that blue pixel groups are adjacent to red pixel groups.
- the blue pixel groups and red pixel groups can be provided alternatingly in a first direction and/or a perpendicular second direction across the optical sensor.
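The pixel-group layout described above can be sketched as follows. The placement of the colors inside each 2x2 group and the checkerboard phase are assumptions, since the description only fixes the per-group counts (two green, one infrared, one red or blue) and the alternation of red and blue groups:

```python
def build_cfa(rows: int, cols: int) -> list[list[str]]:
    """Build a colour-filter-array layout of 2x2 pixel groups.

    Each group holds two green pixels, one infrared pixel and either a
    red or a blue pixel; red and blue groups alternate in both directions
    (checkerboard). The in-group position of each colour is an assumption.
    """
    cfa = [[""] * (2 * cols) for _ in range(2 * rows)]
    for gr in range(rows):
        for gc in range(cols):
            fourth = "R" if (gr + gc) % 2 == 0 else "B"  # alternate groups
            r, c = 2 * gr, 2 * gc
            cfa[r][c], cfa[r][c + 1] = "G", "IR"
            cfa[r + 1][c], cfa[r + 1][c + 1] = "G", fourth
    return cfa

for row in build_cfa(2, 2):
    print(" ".join(f"{p:>2}" for p in row))
```

Printed for a 2x2 arrangement of groups, this shows red and blue groups alternating in both directions while every group keeps its two green pixels and one infrared pixel.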
- when the imaging device is used to measure light, the imaging device is configured to assign a pixel value to each pixel that receives light, wherein the pixel value corresponds to the amount of light that is received by the corresponding pixel. For example, a pixel that receives little light is assigned a low pixel value, whereas a pixel that receives a lot of light is assigned a high pixel value.
- the processor is configured to receive the pixel values from the optical sensor so that the processor may output the image.
- the processor is configured to determine a virtual red or blue pixel value for each blue or red pixel group, such that each pixel group comprises a pixel value, either measured or virtual, for each band.
- the processor is configured to: determine a contrast between the blue pixel group and the red pixel groups adjacent to the blue pixel group by comparing green pixel values of the red pixel groups adjacent to the blue pixel group with the green pixel values of the blue pixel group; determine which of the adjacent red pixel groups has the lowest contrast with the blue pixel group; and determine the corresponding virtual red pixel value based on the red pixel value of the red pixel group that has the lowest contrast with the blue pixel group.
- the processor is configured to determine which of the adjacent pixel groups have green pixel values that are closest to the green pixel values of the current blue pixel group. Since the adjacent pixel groups are red pixel groups, a virtual red pixel value may be determined based on the red pixel value of the pixel group whose green pixel values are closest to the green pixel values of the blue pixel group.
- the processor is similarly configured to determine a virtual blue pixel value for each red pixel group by: determining a contrast between the red pixel group and blue pixel groups adjacent to the red pixel group by comparing green pixel values of the blue pixel groups adjacent to the red pixel group with the green pixel values of the red pixel group; determining which of the adjacent blue pixel groups has the lowest contrast with the red pixel group; and determining the corresponding virtual blue pixel value based on the blue pixel value of the blue pixel group that has the lowest contrast with the red pixel group.
- the processor is configured to determine which of the adjacent pixel groups have green pixel values that are closest to the green pixel values of the current red pixel group. Since the adjacent pixel groups are all blue pixel groups, a virtual blue pixel value may be determined based on the blue pixel value of the pixel group whose green pixel values are closest to the green pixel values of the red pixel group.
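The virtual-pixel-value steps above can be sketched as follows. The grid-of-dicts representation and the use of the absolute difference of summed green values as the contrast measure are assumptions; the description only requires comparing green pixel values of adjacent groups and taking the band value from the lowest-contrast neighbour:

```python
def virtual_value(groups, i, j):
    """Estimate the missing red (or blue) value of the pixel group at (i, j).

    `groups` is a 2-D list of dicts like {"G": (g1, g2), "R": r} or
    {"G": (g1, g2), "B": b} -- a simplified stand-in for the sensor's
    pixel groups. The contrast measure (absolute difference of summed
    green values) is an assumed concrete choice.
    """
    missing = "B" if "R" in groups[i][j] else "R"
    own_green = sum(groups[i][j]["G"])
    best, best_contrast = None, None
    for di, dj in ((-1, 0), (1, 0), (0, -1), (0, 1)):  # 4-neighbours
        ni, nj = i + di, j + dj
        if 0 <= ni < len(groups) and 0 <= nj < len(groups[0]):
            neigh = groups[ni][nj]
            if missing in neigh:  # only neighbours that measured the band
                contrast = abs(sum(neigh["G"]) - own_green)
                if best_contrast is None or contrast < best_contrast:
                    best, best_contrast = neigh[missing], contrast
    return best

# Hypothetical 2x2 arrangement of pixel groups (checkerboard of R/B groups):
groups = [
    [{"G": (100, 102), "R": 80}, {"G": (20, 22), "B": 30}],
    [{"G": (98, 101), "B": 35}, {"G": (99, 100), "R": 82}],
]
print(virtual_value(groups, 1, 0))  # virtual red for the blue group at (1, 0)
```

The blue group at (1, 0) has green sum 199; the red group at (1, 1) matches it exactly (contrast 0), so its red value 82 is used rather than the red value 80 of the higher-contrast group at (0, 0).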
- an image which is perceived as sharper may be displayed on a display device.
- Human eyes perceive sharpness based on the contrast of the image, which is determined by the black/white signal sent to the display device.
- the black/white signal is largely based on the pixel values of green pixels, thus green pixels determine in large part the perceived sharpness of the image. Images which have a high number of green pixels are perceived to be sharper compared to images having a lower number of green pixels, making the images more suitable for visual inspection.
- a downside of increasing the number of green pixels is the reduction in red and blue pixels which leads to a reduced effectiveness in determining a state of vegetation if no further steps are taken.
- the above-mentioned steps of determining the virtual red pixel values and virtual blue pixel values alleviate this problem by accurately determining the virtual red and blue pixel values based on the pixel groups that have the lowest contrast with the pixel group for which the virtual pixel value is to be determined.
- the processor is further configured to generate an image based on the pixel values and the virtual pixel values and to output the image, for example to an image processor for determining the state of the vegetation or to a storage device, such as an SD card.
- the imaging system of the invention allows for providing images with increased sharpness that are suitable for determining a state of the vegetation, e.g. that may be used in determining a vegetation index.
- the images are better suitable for a visual inspection while allowing determining of a state of the vegetation.
- the processor is further configured to: use an s-curve to correct the pixel values and the virtual pixel values by compressing pixels and virtual pixels having higher and lower pixel values compared to pixels and virtual pixels having intermediate pixel values; and generate the image based on the corrected pixel values and the corrected virtual pixel values.
- a further step is taken to improve the resulting image for use in determining a state of vegetation.
- the processor is configured to use an s-curve to correct the pixel values and virtual pixel values by compressing pixels and virtual pixels having higher and lower pixel values, e.g. by giving them a lower amplification factor, compared to pixels and virtual pixels having intermediate pixel values.
- the s-curve may take the form of a gamma correction, which was traditionally used to compensate for the nonlinear behaviour of old TV tubes.
- Pixels with higher values are pixels that received more light and pixels with lower values are pixels that received less light. It was found that pixels that are oversaturated or undersaturated contribute less to determining of the state of the vegetation than pixels having intermediate values.
- the resulting image is more suitable for determining a state of the vegetation because pixel values that are not suitable for determining the state of the vegetation, e.g. because they correspond to areas that are too bright or too dim, are compressed which results in an increased dynamic range for pixels having intermediate pixel values. This results in more usable values in the range suitable for determining the state of the vegetation.
- the combination of determining virtual pixel values and using an s-curve results in outputted images that are more suitable for determining a state of vegetation.
- the processor is configured to use an adjustable s-curve when correcting the pixel values and virtual pixel values. Adjusting the s-curve allows to adjust the dynamic range of pixels having values relevant for determining the vegetation index, e.g. pixels that are not oversaturated or undersaturated. This improves the flexibility of the system for use for different purposes or different conditions.
- the correction may be adjusted by the processor, e.g. based on additional measurements, or the correction may be adjusted by a user, e.g. based on experience.
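The s-curve correction above can be sketched with a logistic curve. The specific function and the `steepness` tuning parameter are assumptions; the description does not prescribe a particular s-curve, only that extreme values are compressed relative to intermediate values and that the curve is adjustable:

```python
import math

def s_curve(value: float, steepness: float = 8.0) -> float:
    """Map a normalised pixel value in [0, 1] through an adjustable s-curve.

    Values near 0 and 1 (under/oversaturated pixels) are compressed while
    the dynamic range of intermediate values is stretched. The logistic
    form and the `steepness` parameter are assumed, not from the patent.
    """
    raw = 1.0 / (1.0 + math.exp(-steepness * (value - 0.5)))
    lo = 1.0 / (1.0 + math.exp(steepness * 0.5))   # raw output at value 0
    hi = 1.0 / (1.0 + math.exp(-steepness * 0.5))  # raw output at value 1
    return (raw - lo) / (hi - lo)  # rescale so 0 maps to 0 and 1 maps to 1

# Midtones gain dynamic range, extremes lose it:
print(s_curve(0.55) - s_curve(0.45))  # larger than the linear step of 0.1
print(s_curve(0.95) - s_curve(0.85))  # smaller than the linear step of 0.1
```

Adjusting `steepness` changes how strongly the extremes are compressed, corresponding to the adjustable s-curve described above.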
- the processor is configured to determine the corresponding virtual blue pixel value based on the blue pixel group that has the lowest contrast with the red pixel group by interpolating blue pixel values of blue pixel groups in the direction of the blue pixel group that has the lowest contrast with the red pixel group and wherein the processor is configured to determine the corresponding virtual red pixel value based on the red pixel group that has the lowest contrast with the blue pixel group by interpolating red pixel values of red pixel groups in the direction of the red pixel group that has the lowest contrast with the blue pixel group.
- the virtual pixel values are interpolated based on a direction perpendicular to an edge, e.g. a direction which shows a large change in contrast, in the image. The interpolation may be based on averaging pixel values or on using some kind of fit function to fit the pixel values, and may rely on interpolation or extrapolation techniques.
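One simple reading of the directional interpolation above is to average the nearest measured values along the chosen direction. Representing unmeasured groups as `None` and averaging the first two samples found (which makes this one-sided, i.e. closer to the extrapolation variant mentioned above) are simplifying assumptions:

```python
def directional_interpolate(grid, i, j, di, dj):
    """Average the two nearest same-band values along direction (di, dj).

    `grid` holds band values where the band was measured and None where it
    was not. The direction is assumed to point towards the lowest-contrast
    neighbour, i.e. along an edge-perpendicular direction where values
    change smoothly.
    """
    samples, step = [], 1
    while len(samples) < 2:
        ni, nj = i + step * di, j + step * dj
        if not (0 <= ni < len(grid) and 0 <= nj < len(grid[0])):
            break  # ran off the sensor before finding two samples
        if grid[ni][nj] is not None:
            samples.append(grid[ni][nj])
        step += 1
    return sum(samples) / len(samples) if samples else None

# One row of red values; None marks blue groups where red was not measured:
grid = [[None, 10, None, 14, None]]
print(directional_interpolate(grid, 0, 0, 0, 1))  # average of 10 and 14
```

A fit function over more samples, as the description also allows, would replace the simple average here.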
- the multiband bandpass filter is configured such that: the blue band of the multiband bandpass filter has a wavelength range in which the color filter transmits a lower amount of light from the green band, red band and infrared band; the red band of the multiband bandpass filter has a wavelength range in which the color filter transmits a lower amount of light from the blue band, green band and infrared band; the green band of the multiband bandpass filter has a wavelength range in which the color filter transmits a lower amount of light from the blue band, red band and infrared band; and the infrared band of the multiband bandpass filter has a wavelength range in which the color filter transmits a lower amount of light from the blue band, green band and red band.
- Leakage between colors from different bands may occur due to the spectral response of the color filter, wherein each band allows some light of other bands to be transmitted. This leakage is reduced by configuring the multiband bandpass filter such that it allows narrow bands of light to pass wherein the range of each band is such that the leakage in that range of the color filter is lower, e.g. compared to for another range.
- the blue band has wavelengths in the range of 410-450 nm
- the green band has wavelengths in the range of 555-585 nm
- the red band has wavelengths in the range of 645-675 nm
- the infrared band has wavelengths in the range of 850-870 nm.
- the color filter alone may not be enough to eliminate cross talk between the colors because the color filter may, in one band, e.g. the blue band, allow some light of another band, e.g. the red band, to be transmitted, thus allowing cross talk.
- the width of the ranges is such that sufficient light, e.g. during daylight, enters the pixels to allow for an image to be taken which is suitable for determining a state of the vegetation.
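The four passbands listed above can be captured in a small lookup table; `transmitted_band` is a hypothetical helper (not part of the patent) illustrating how the multiband bandpass filter blocks wavelengths outside the four narrow bands:

```python
# Passbands of the multiband bandpass filter as given in the description (nm).
BANDS = {
    "blue": (410, 450),
    "green": (555, 585),
    "red": (645, 675),
    "infrared": (850, 870),
}

def transmitted_band(wavelength_nm: float):
    """Return the band the filter would transmit, or None if blocked."""
    for name, (lo, hi) in BANDS.items():
        if lo <= wavelength_nm <= hi:
            return name
    return None

print(transmitted_band(660))  # inside the red passband
print(transmitted_band(700))  # blocked: outside all bands, reducing cross talk
```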
- the imaging device has a rolling shutter, wherein the imaging device is configured to have a frame rate between 50 and 60 Hz and a shutter time of less than half a frame time.
- a rolling shutter allows a controlled amount of light to reach the optical sensor.
- a downside of using a rolling shutter is that pixels are illuminated at different times which results in the so-called rolling shutter effect when the camera or the object is moving. This distorts the image which reduces the effectiveness of the image for determining a state of the vegetation.
- a relatively low frame rate, smaller than 5 fps, is required.
- the rolling shutter effect is relevant.
- the imaging device has a high frame rate, e.g. between 50 and 60 Hz and a fast shutter time, e.g. of less than half a frame time, e.g. between 1/100 and 1/120 of a second.
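The timing constraint above is simple arithmetic: at a frame rate of 50-60 Hz, half a frame time works out to the 1/100 to 1/120 of a second quoted in the description:

```python
def max_shutter_time(frame_rate_hz: float) -> float:
    """Shutter time limit: less than half a frame time, per the description."""
    return 0.5 / frame_rate_hz

print(max_shutter_time(50))  # shutter must be faster than 1/100 s
print(max_shutter_time(60))  # shutter must be faster than 1/120 s
```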
- the imaging device may have a frame rate that results in images with substantial overlap such that not all images are required to determine a state of vegetation, e.g. in a field.
- the processor may be configured to only output a subset of images taken, e.g. at an external rate or based on some external factor.
- the imaging system is configured to output data relating to one or more of a location of the imaging device when the image was taken, a height of the imaging device when the image was taken, a speed of the imaging device when the image was taken, information relating to lighting conditions when the image was taken, and an angle of the imaging device when the image was taken.
- This data may be stored together with the image to provide additional information about the image, for example when the image was taken, where the image was taken, the conditions when the image was taken.
- the processor is configured to determine if the image should be outputted based on one or more of a location of the imaging device when the image was taken, a height of the imaging device when the image was taken, a speed of the imaging device when the image was taken, information relating to lighting conditions when the image was taken, and an angle of the imaging device when the image was taken. This embodiment may reduce overlap between images that are outputted.
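One way such an output decision could use these inputs is to estimate image overlap from flight height, speed and frame rate. This rule, the field-of-view parameter and the overlap threshold are all hypothetical; the description lists the inputs but does not specify a rule:

```python
import math

def should_output(height_m, speed_mps, frame_rate_hz,
                  fov_deg=60.0, max_overlap=0.8):
    """Decide whether a frame adds enough new ground to be worth outputting.

    Hypothetical selection rule: the ground footprint of one image follows
    from height and field of view; the distance travelled between frames
    from speed and frame rate. Frames overlapping the previous frame by
    more than `max_overlap` are skipped.
    """
    footprint = 2.0 * height_m * math.tan(math.radians(fov_deg) / 2.0)
    advance = speed_mps / frame_rate_hz
    overlap = max(0.0, 1.0 - advance / footprint)
    return overlap <= max_overlap
```

With these assumed parameters, a drone at 50 m and 5 m/s would skip nearly every frame taken at 50 Hz, consistent with outputting only a subset of images with substantial overlap.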
- the processor is configured to apply a gain factor to each color band such that at a predetermined color temperature the color bands have the same level.
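The per-band gain correction above can be sketched as a normalisation. The example raw band levels and the choice of the strongest band as the common reference level are assumptions; the description only requires that the bands end up at the same level at the predetermined color temperature:

```python
def band_gains(levels: dict) -> dict:
    """Compute per-band gain factors that equalise the band levels.

    `levels` holds the raw sensor response of each band for a reference
    target at the predetermined colour temperature; gains bring every
    band up to the strongest one (an assumed choice of reference).
    """
    target = max(levels.values())
    return {band: target / value for band, value in levels.items()}

# Hypothetical raw responses at the predetermined colour temperature:
gains = band_gains({"red": 0.8, "green": 1.0, "blue": 0.6, "ir": 0.9})
print(gains)  # each raw level times its gain now equals 1.0
```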
- the invention is further related to a method for providing an image for determining a state of vegetation wherein use is made of a multispectral imaging system according to the invention.
- the method comprises: determining pixel values by performing a measurement; determining a virtual red pixel value for each blue pixel group, by:
o determining a contrast between the blue pixel group and red pixel groups adjacent to the blue pixel group by comparing green pixel values of the red pixel groups adjacent to the blue pixel group with the green pixel values of the blue pixel group;
o determining which of the adjacent red pixel groups has the lowest contrast with the blue pixel group; and
o determining the corresponding virtual red pixel value based on the red pixel value of the red pixel group that has the lowest contrast with the blue pixel group;
and determining a virtual blue pixel value for each red pixel group, by:
o determining a contrast between the red pixel group and blue pixel groups adjacent to the red pixel group by comparing green pixel values of the blue pixel groups adjacent to the red pixel group with the green pixel values of the red pixel group;
o determining which of the adjacent blue pixel groups has the lowest contrast with the red pixel group; and
o determining the corresponding virtual blue pixel value based on the blue pixel value of the blue pixel group that has the lowest contrast with the red pixel group.
- the method further comprises the steps of: using an s-curve to correct the pixel values and the virtual pixel values by compressing pixels and virtual pixels having higher and lower pixel values compared to pixels and virtual pixels having intermediate pixel values; and generating the image based on the corrected pixel values and the corrected virtual pixel values.
- the method comprises taking images with a frame rate between 50 and 60 Hz and with a shutter time of less than half a frame time.
- the method according to the present invention may comprise one or more of the features and/or benefits that are disclosed herein in relation to the multispectral imaging system according to the present invention, in particular as recited in the appended claims.
- the invention is further related to a vehicle, such as a drone, comprising the multispectral imaging system according to the invention.
- Fig. 1a shows an example of a pixel group
- Fig. 1b shows an example of a color filter comprising various pixel groups
- Fig. 2 shows an example of the multispectral imaging system
- Fig. 3 shows an effect of the s-curve on the sensor output
- Fig. 4 shows a flow chart of a method.
- Figure 1a shows an example of a pixel group 5 that comprises two green pixels 6, G, an infrared pixel IR, and either a red pixel R or a blue pixel B.
- a pixel group 5 comprising a red pixel R is called a red pixel group RG and a pixel group 5 comprising a blue pixel B is called a blue pixel group BG. Figure 1b shows how the pixel group 5 of figure 1a may be arranged in the color filter 8 for use in the multispectral imaging system 1 of the invention.
- Figure 1b shows a color filter 8 that comprises twelve pixel groups, wherein each red pixel group RG is adjacent to a blue pixel group BG.
- an optical sensor may comprise many more, for example more than a thousand, pixel groups.
- the optical sensor 4 may have a spectral response wherein a pixel 6, for example a green pixel G, allows some light having a different wavelength than green light to pass. This reduces the effectiveness of the optical sensor 4 in an imaging system 1 for use in determining a state of a vegetation. For this reason, a multiband bandpass filter 7 is used which allows light in narrow bands to pass to reduce crosstalk between the colors.
- the pixels 6 of the pixel group 5 are adjacent to each other in a square formation, such that the pixels of a single pixel group may collect light coming from the same, e.g. point, source. Further, the pixel groups 5 themselves are arranged in a square formation in the optical sensor 8.
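The pixel-group arrangement of figures 1a and 1b can be sketched in code. This is a minimal illustration only: the patent fixes the contents of each 2x2 group (two green, one infrared, one red or blue) and the alternation of red and blue groups, but not the exact position of each color inside the group, so the ordering below is an assumption.

```python
import numpy as np

def build_filter_layout(groups_y, groups_x):
    """Sketch of a color filter built from 2x2 pixel groups: two green
    pixels, one infrared pixel, and either a red or a blue pixel, with
    red and blue pixel groups alternating in both directions (cf. Fig. 1b)."""
    layout = np.empty((2 * groups_y, 2 * groups_x), dtype=object)
    for gy in range(groups_y):
        for gx in range(groups_x):
            rb = "R" if (gy + gx) % 2 == 0 else "B"  # alternate red/blue groups
            y, x = 2 * gy, 2 * gx
            layout[y, x] = "G"         # first pixel: green band
            layout[y, x + 1] = "IR"    # second pixel: infrared band
            layout[y + 1, x] = "G"     # third pixel: green band
            layout[y + 1, x + 1] = rb  # fourth pixel: red or blue band
    return layout
```

For a 2x3 arrangement of groups this yields twelve pixel groups, half of the pixels green, a quarter infrared, and an eighth each red and blue.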
- Figure 2 shows an example of the multispectral imaging system 1 comprising a processor 2 and an imaging device 3.
- the processor 2 and the imaging device 3 are connected, e.g. via a wireless or wired connection, such that the processor 2 may receive pixel values measured by the imaging device 3.
- the imaging device 3 comprises an optical sensor 4, a multiband bandpass filter 7 and a color filter 8.
- the multiband bandpass filter 7 and the color filter 8 are arranged in a light path of the imaging device 3 such that light may enter the imaging device 3, travel through the bandpass filter 7 and the color filter 8 towards the optical sensor 4.
- the imaging device 3 may comprise a lens to collect light, e.g. coming from a field of crops.
- the color filter 8 is provided on the optical sensor 4. In embodiments, the color filter 8 may be integrated with the optical sensor 4. In other embodiments, the color filter may be separate from the optical sensor 4.
- the color filter 8 may be a color filter 8 as shown in figure 1b.
- the color filter 8 comprises pixel groups 5 which each comprise 4 pixels 6.
- the optical sensor 4 may have a resolution of four megapixels, so that the infrared color band has a resolution of one megapixel, the red and blue color bands each have a resolution of 0.5 megapixels and the green color band has a resolution of two megapixels.
- the processor 2 is configured to determine virtual red and blue pixel values such that the resolution of the red and blue pixels in combination with the virtual pixels is one megapixel for each band.
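The arithmetic behind these per-band resolutions follows directly from the group layout; a quick sketch:

```python
sensor_pixels = 4_000_000
groups = sensor_pixels // 4     # each pixel group 5 holds 4 pixels 6
green_res = 2 * groups          # two green pixels per group
infrared_res = groups           # one IR pixel per group
red_res = groups // 2           # a red pixel in every other group
blue_res = groups // 2          # a blue pixel in the remaining groups
# Virtual red values fill the blue groups (and vice versa), so the red
# and blue bands each end up with one value per pixel group:
red_res_with_virtual = groups
```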
- the multiband bandpass filter 7 reduces crosstalk between the bands by allowing light of a narrow width, in the four bands, to be transmitted and stopping light having other wavelengths.
- the imaging device 3 may be mounted on a vehicle such as a drone or an automatic land vehicle.
- the processor 2 may be a microcomputer which is in radio contact with the imaging device 3.
- the processor 2 may be a microcomputer that is integrated with the imaging device 3, e.g. with a wired connection.
- the processor 2 is configured to output an image, for example to a display, a storage medium such as an SD card, or to an image printer.
- the color filter 8 is configured to transmit light in a single band to a corresponding pixel 6 of the imaging device 3, such that, for each pixel group 5, a first pixel of the corresponding pixel group receives light from the green band, a second pixel of the corresponding pixel group receives light from the infrared band, a third pixel of the corresponding pixel group receives light from the green band, and a fourth pixel of the corresponding pixel group receives light from either the red band or the blue band.
- each pixel group 5 comprises two green pixels G, one IR pixel IR and either one red R or one blue pixel B.
- the imaging device 3 is provided with a virtual shutter which, in this embodiment, is a rolling shutter.
- the imaging device 3 is configured to have a frame rate between 50 and 60 Hz and a shutter time of less than half a frame time.
- the rolling shutter allows a controlled amount of light to reach the optical sensor 4.
- a downside of using the rolling shutter is that pixels 6 are illuminated at different times which results in the rolling shutter effect when the camera or the object is moving such as when the imaging device 3 is mounted on a vehicle. This may distort the image which reduces the effectiveness of the image for determining a state of the vegetation.
- to reduce the rolling shutter effect when mounted on a moving vehicle, e.g. a drone, the imaging device 3 has a high frame rate, e.g. between 50 and 60 Hz, and a fast shutter time, e.g. of less than half a frame time, e.g. between 1/100 and 1/120 of a second. Additionally, e.g. for this embodiment, the imaging device 3 has a frame rate that results in images with substantial overlap such that not all images are required to determine a state of vegetation, e.g. in a field.
- the processor 2 may be configured to only output a subset of images taken, e.g. at an external rate or based on some external factor.
- Figure 3 shows an effect of the s-curve on the output of the optical sensor 4.
- the first graph 10 shows a linear graph wherein a constant s-curve is used, which translates into a corrected output which is linear with respect to the measured values.
- the second graph 11 shows a traditional gamma curve at 0.45 gamma which was traditionally used to compensate for nonlinear behaviour of old TV tubes.
- the third graph 12 shown is an example of an s-curve as used by the invention.
- the processor 2 may be configured to use the s-curve to correct the pixel values and the virtual pixel values by compressing pixels and virtual pixels having higher and lower pixel values compared to pixels and virtual pixels having intermediate pixel values. The image is then generated based on the corrected pixel values.
- the processor 2 is configured to use an s-curve to correct the pixel values and virtual pixel values by compressing pixels and virtual pixels, e.g. by giving them a lower amplification factor, having higher and lower pixel values compared to pixels and virtual pixels having intermediate pixel values.
- the s-curve 12 shown may be an adjustable s-curve such that the relation between the sensor output and the corrected output may vary depending on the needs of the user. Adjusting the s-curve makes it possible to adjust the dynamic range of pixels 6 having values relevant for determining the vegetation index, e.g. pixels 6 that are not oversaturated or undersaturated. This improves the flexibility of the system 1 for use for different purposes or different conditions.
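The patent specifies the behaviour of the adjustable s-curve (compress the extremes, expand the mid-range) but not its functional form. As a hedged sketch, a rescaled logistic curve reproduces that behaviour, with the steepness parameter standing in for the user adjustment:

```python
import math

def s_curve(x, steepness=8.0, midpoint=0.5):
    # Hypothetical adjustable s-curve (the formula is an assumption, not
    # taken from the patent): a logistic rescaled so that a normalized
    # sensor value of 0 maps to 0 and 1 maps to 1. Values near the
    # extremes are compressed; mid-range values get an expanded range.
    raw = lambda t: 1.0 / (1.0 + math.exp(-steepness * (t - midpoint)))
    lo, hi = raw(0.0), raw(1.0)
    return (raw(x) - lo) / (hi - lo)
```

Raising `steepness` widens the usable mid-range further at the cost of stronger compression of over- and undersaturated values; lowering it approaches the linear curve 10 of figure 3.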
- the correction may be adjusted by the processor 2, e.g. based on additional measurements, or the correction may be adjusted by a user, e.g. based on experience.
- Figure 4 shows a flow chart of a method for outputting an image.
- the method comprises determining pixel values 100 by performing a measurement with the imaging system 1.
- the measurement may be taking an image of a field of crops.
- the pixels 6 receive light and pixel values are determined.
- the next step is to determine virtual pixel values 103 for both red pixel groups and blue pixel groups by first determining a contrast 101 between a pixel group 5 and adjacent pixel groups 5 by comparing the green pixel values of these groups 5. This makes it possible to determine which of the, e.g., red pixel groups 5 has the green pixel values closest to those of the adjacent, e.g., blue pixel group 5, which in turn gives the contrast between these groups 5. Next, it is determined which of these groups 5 has the lowest contrast 102 with the group for which the virtual pixel value is to be determined.
- the corresponding virtual pixel value is determined 103 based on the corresponding pixel value of the pixel group 5 having the lowest contrast with the group 5 for which the virtual pixel is to be determined.
- the processor 2 may determine the corresponding virtual pixel value based on the pixel group 5 that has the lowest contrast with the pixel group 5 for which the virtual pixel is to be determined by interpolating pixel values of pixel groups 5 in the direction of the pixel group that has the lowest contrast. This allows the virtual pixel values to be interpolated based on a direction perpendicular to an edge, e.g. a direction which has a large change in contrast, in the image. Interpolation may be based on averaging pixel values or on fitting the pixel values with a fit function. The interpolation may be based on interpolation techniques or extrapolation techniques.
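Steps 101-103 can be sketched as follows. The array layout is an assumption for illustration: `green` holds one averaged green value per pixel group, and `red` holds the measured red value for red groups and NaN for blue groups; the simplest variant of step 103 (copying the red value of the lowest-contrast neighbour rather than interpolating) is shown.

```python
import math

def virtual_red(green, red, gy, gx):
    """For the blue pixel group at (gy, gx), pick the adjacent red pixel
    group whose green value differs least (steps 101-102) and take its
    red value as the virtual red pixel value (step 103)."""
    rows, cols = len(green), len(green[0])
    best_value, best_contrast = None, None
    for ny, nx in ((gy - 1, gx), (gy + 1, gx), (gy, gx - 1), (gy, gx + 1)):
        if 0 <= ny < rows and 0 <= nx < cols and not math.isnan(red[ny][nx]):
            contrast = abs(green[ny][nx] - green[gy][gx])       # step 101
            if best_contrast is None or contrast < best_contrast:  # step 102
                best_value, best_contrast = red[ny][nx], contrast
    return best_value                                           # step 103
```

Determining a virtual blue value for a red pixel group works symmetrically, with the roles of the red and blue arrays swapped.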
- an s-curve is used to correct 104 the pixel values and virtual pixel values by compressing pixels 6 and virtual pixels having higher and lower pixel values compared to pixels and virtual pixels having intermediate pixel values.
- an image is generated 105 based on the (corrected) pixel values and the (corrected) virtual pixel values and the image is outputted 106.
Abstract
The invention is related to a multispectral imaging system for providing an image for determining a state of vegetation. The invention is further related to a method for providing an image for determining a state of vegetation wherein use is made of a multispectral imaging system of the invention. The multispectral imaging system of the invention comprises both a processor and an imaging device, e.g. a camera, e.g. a single optical path camera. The imaging device is suitable to be mounted on a vehicle, such as a remote controlled vehicle such as a drone or a remote controlled car.
Description
Title: MULTISPECTRAL IMAGING SYSTEM AND METHOD FOR USING THE SYSTEM
The invention is related to a multispectral imaging system for providing an image for determining a state of vegetation. The invention is further related to a method for providing an image for determining a state of vegetation wherein use is made of a multispectral imaging system of the invention.
Imaging systems are used in the art to help determine a state of vegetation. During a lifecycle of a vegetation, such as a crop or a fruit bearing tree, light in different bandwidths is reflected in amounts that vary over the lifecycle of the vegetation. For example, older vegetation may appear more brown than younger vegetation which may appear more green. A downside of judging a state of vegetation using visible light is that a vegetation may already become unhealthy before a change in visible wavelengths occurs.
It is known that healthy vegetation has a different reflectivity in the infrared (IR) band compared to unhealthy vegetation, e.g. as a result of dehydration of the vegetation. This difference in reflectivity is, in practice, often measured compared to the reflectivity in the RGB (red, green, blue) bands.
An example of a vegetation index that provides an indication of a state of the vegetation is a Normalized Difference Vegetation Index (NDVI), which takes both R and IR bands into account. Determination of a vegetation index such as the NDVI requires precise analysis of the different color components comprised in an image of the vegetation.
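The NDVI is computed from the reflectance in the near-infrared and red bands:

```python
def ndvi(nir, red):
    # Normalized Difference Vegetation Index: values approaching +1
    # indicate dense healthy vegetation; values near 0 indicate bare
    # soil or unhealthy vegetation.
    return (nir - red) / (nir + red)
```

With illustrative reflectance values (assumed, not taken from the patent), healthy vegetation reflecting 0.50 in the IR band and 0.08 in the red band yields an NDVI of about 0.72.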
Imaging systems for providing images suitable for determining a vegetation index are known, for example, from WO 2021/014764 A1 wherein an imaging system is disclosed including a multiband pass filter that selectively transmits band light in a specific color band, a color filter that transmits the band light in the specific band per pixel to an image pickup device. The system of WO 2021/014764 A1 may be used to determine a state of vegetation by analysing the color components of the vegetation in an image provided with the system.
An aim of the invention is to provide an improved, or at least an alternative, imaging system for providing an image for determining a state of vegetation.
The aim of the invention is achieved by a multispectral imaging system according to claim 1.
The multispectral imaging system of the invention comprises both a processor and an imaging device, e.g. a camera, e.g. a single optical path camera. The imaging device is suitable to be mounted on a vehicle, such as a remote controlled vehicle, e.g. a drone or a remote controlled car, or an automated vehicle, such as a drone performing automated flights.
The processor may be a microcomputer that is integrated with the imaging device such that both the processor and the imaging device are mountable or mounted on a vehicle, e.g. wherein the imaging device and the processor are connected through a wired connection. In other embodiments the processor and the imaging device may be connected through a wireless connection, e.g. a radio connection, that allows data to be sent from the imaging device to the processor. The processor is configured to receive information, e.g. pixel values, from the imaging device and process this data, e.g. to output an image. The outputted image may, for example, be stored on an SD card or other storage medium.
The imaging device comprises an optical sensor comprising a plurality of pixel groups. Each pixel group is comprised of four adjacent pixels. A pixel group of the optical sensor has four pixels that are adjacent, e.g. in a square formation, such that the pixels of a single pixel group may collect light coming from the same, e.g. point, source. Thus pixels values of pixels in the same pixel group may be assumed to come from the same source which allows the pixels of the same pixel group to be placed on top of each other to determine the true color of the source.
The imaging device further comprises a multiband bandpass filter provided in the light path of the imaging device between the optical sensor and an aperture of the imaging device. For example the aperture comprises a lens and/or a shutter for guiding incoming light and/or closing the aperture. The multiband bandpass filter is configured to selectively transmit light in a red band, a blue band, a green band, and an infrared band to the optical sensor. The bandpass filter allows light in relatively narrow bands to be transmitted towards the optical sensor such that there is reduced cross talk between colors. Cross talk between colors of different bands reduces the effectiveness of the imaging system in providing images for determining a state of vegetation because the cross talk reduces the accuracy of the vegetation index determined based on the image. The bandpass filter may be a global filter which allows light in the four bands to be transmitted globally over the filter while being opaque to light having other wavelengths.
The imaging device further comprises a color filter which is provided between the optical sensor and the multiband bandpass filter. The color filter is configured to transmit light in a single band to a corresponding pixel of the imaging device such that, for each pixel group, a first pixel of the corresponding pixel group receives light from the green band, a second pixel of the corresponding pixel group receives light from the infrared band, a third pixel of the corresponding pixel group receives light from the green band, and a fourth pixel of the corresponding pixel group receives light from either the red band or the blue band. Thus each pixel group comprises two green pixels, one IR pixel and either one red or one blue pixel.
Pixel groups that receive light from the blue band are called blue pixel groups and pixel groups that receive light from the red band are called red pixel groups. The pixel groups are ordered such that blue pixel groups are adjacent to red pixel groups. For example, the blue pixel groups and red pixel groups can be provided alternatingly in a first direction and/or a perpendicular second direction across the optical sensor.
When the imaging device is used to measure light, the imaging device is configured to assign a pixel value to each pixel that receives light, wherein the pixel value corresponds to the amount of light that is received by the corresponding pixel. For example, a pixel that receives little light is assigned a low pixel value, whereas a pixel that receives a lot of light is assigned a high pixel value.
The processor is configured to receive the pixel values from the optical sensor so that the processor may output the image. The processor is configured to determine a virtual red or blue pixel value for each blue or red pixel group, such that each pixel group comprises a pixel value, either measured or virtual, for each band. In order to determine a virtual red pixel value for a blue pixel group the processor is configured to: determine a contrast between the blue pixel group and red pixel groups adjacent to the blue pixel group by comparing green pixel values of the red pixel groups adjacent to the blue pixel group with the green pixel values of the blue pixel group; determine which of the adjacent red pixel groups has the lowest contrast with the blue pixel group; and determine the corresponding virtual red pixel value based on the red pixel value of the red pixel group that has the lowest contrast with the blue pixel group.
By determining a contrast between the pixel groups by comparing green pixel values of pixel groups that are adjacent to the pixel group for which the virtual pixel value is determined it may be determined if an edge, e.g. a sudden change in light intensity over the pixel groups, is
present. If an edge is present in a direction for the green pixels it is likely that an edge will also be present in that direction for the red pixels. Determining a virtual red pixel value based on pixels that have a higher likelihood of having changes in intensity reduces the accuracy of the virtual pixel value. Thus the processor is configured to determine which of the adjacent pixel groups have green pixel values that are closest to the green pixel values of the current blue pixel group. Since the adjacent pixel groups are red pixel groups, a virtual red pixel value may be determined based on the red pixel value of the pixel group which has the green pixel values closest to the green pixel values of the blue pixel group.
The processor is similarly configured to determine a virtual blue pixel value for each red pixel group by: determining a contrast between the red pixel group and blue pixel groups adjacent to the red pixel group by comparing green pixel values of the blue pixel groups adjacent to the red pixel group with the green pixel values of the red pixel group; determining which of the adjacent blue pixel groups has the lowest contrast with the red pixel group; and determining the corresponding virtual blue pixel value based on the blue pixel value of the blue pixel group that has the lowest contrast with the red pixel group.
By comparing green pixel values of pixel groups that are adjacent to the pixel group for which the virtual pixel value is determined it may be determined if an edge, e.g. a sudden change in light intensity over the pixel groups, is present. If an edge is present in a direction for the green pixels it is likely that an edge will also be present in that direction for the blue pixels. Determining a virtual blue pixel value based on pixels that have a higher likelihood of having changes in intensity reduces the accuracy of the virtual pixel value. Thus the processor is configured to determine which of the adjacent pixel groups have green pixel values that are closest to the green pixel values of the current red pixel group. Since the adjacent pixel groups are all blue pixel groups, a virtual blue pixel value may be determined based on the blue pixel value of the pixel group which has the green pixel values closest to the green pixel values of the red pixel group.
By increasing the number of green pixels an image which is perceived as sharper may be displayed on a display device. Human eyes perceive sharpness based on the contrast of the image, which is determined by the black/white signal sent to the display device. The black/white signal is largely based on the pixel values of green pixels, so green pixels determine in large part the perceived sharpness of the image. Images which have a high number of green pixels are perceived to be sharper compared to images having a lower number of green pixels, making the images more suitable for visual inspection.
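The dominance of green in the black/white signal is visible in standard luma weightings; the ITU-R BT.601 coefficients are given here for context (the patent does not cite a specific standard):

```python
def luma_bt601(r, g, b):
    # ITU-R BT.601 luma: green carries well over half of the black/white
    # signal, which is why green pixel density drives perceived sharpness.
    return 0.299 * r + 0.587 * g + 0.114 * b
```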
A downside of increasing the number of green pixels is the reduction in red and blue pixels, which leads to a reduced effectiveness in determining a state of vegetation if no further steps are taken. The above mentioned steps of determining the virtual red pixel values and virtual blue pixel values alleviate this problem by accurately determining the virtual red and blue pixel values based on the pixel groups that have the lowest contrast with the pixel group for which the virtual pixel value is to be determined.
The processor is further configured to generate an image based on the pixel values and the virtual pixel values and to output the image, for example to an image processor for determining the state of the vegetation or to a storage device, such as an SD card.
The imaging system of the invention allows for providing images with increased sharpness that are suitable for determining a state of the vegetation, e.g. that may be used in determining a vegetation index. The images are better suited for visual inspection while still allowing a state of the vegetation to be determined.
In embodiments the processor is further configured to: use an s-curve to correct the pixel values and the virtual pixel values by compressing pixels and virtual pixels having higher and lower pixel values compared to pixels and virtual pixels having intermediate pixel values; and generate the image based on the corrected pixel values and the corrected virtual pixel values.
In these embodiments, a further step is taken to improve the resulting image for use in determining a state of vegetation. The processor is configured to use an s-curve to correct the pixel values and virtual pixel values by compressing pixels and virtual pixels, e.g. by giving them a lower amplification factor, having higher and lower pixel values compared to pixels and virtual pixels having intermediate pixel values. This differs from a traditional gamma correction, which was used to compensate for the nonlinear behaviour of old TV tubes.
Pixels with higher values are pixels that received more light and pixels with lower values are pixels that received less light. It was found that pixels that are oversaturated or undersaturated contribute less to determining of the state of the vegetation than pixels having intermediate values. By correcting the pixel values the resulting image is more suitable for determining a state of the vegetation because pixel values that are not suitable for
determining the state of the vegetation, e.g. because they correspond to areas that are too bright or too dim, are compressed which results in an increased dynamic range for pixels having intermediate pixel values. This results in more usable values in the range suitable for determining the state of the vegetation. The combination of determining virtual pixel values and using an s-curve results in outputted images that are more suitable for determining a state of vegetation.
In embodiments, the processor is configured to use an adjustable s-curve when correcting the pixel values and virtual pixel values. Adjusting the s-curve makes it possible to adjust the dynamic range of pixels having values relevant for determining the vegetation index, e.g. pixels that are not oversaturated or undersaturated. This improves the flexibility of the system for use for different purposes or different conditions. The correction may be adjusted by the processor, e.g. based on additional measurements, or the correction may be adjusted by a user, e.g. based on experience.
In embodiments, the processor is configured to determine the corresponding virtual blue pixel value based on the blue pixel group that has the lowest contrast with the red pixel group by interpolating blue pixel values of blue pixel groups in the direction of the blue pixel group that has the lowest contrast with the red pixel group and wherein the processor is configured to determine the corresponding virtual red pixel value based on the red pixel group that has the lowest contrast with the blue pixel group by interpolating red pixel values of red pixel groups in the direction of the red pixel group that has the lowest contrast with the blue pixel group. For example, in these embodiments the virtual pixel values are interpolated based on a direction perpendicular to an edge, e.g. a direction which has a large change in contrast, in the image. Interpolation may be based on averaging pixel values or on fitting the pixel values with a fit function. The interpolation may be based on interpolation techniques or extrapolation techniques.
In embodiments the multiband bandpass filter is configured such that the blue band of the multiband bandpass filter has a wavelength range in which the color filter transmits a lower amount of light from the green band, red band and infrared band, the red band of the multiband bandpass filter has a wavelength range in which the color filter transmits a lower amount of light from the blue band, green band and infrared band, the green band of the multiband bandpass filter has a wavelength range in which the color filter transmits a lower amount of light from the blue band, red band and infrared band, the infrared band of the multiband bandpass filter has a wavelength range in which the color filter transmits a lower
amount of light from the blue band, green band and red band. Leakage between colors from different bands may occur due to the spectral response of the color filter, wherein each band allows some light of other bands to be transmitted. This leakage is reduced by configuring the multiband bandpass filter such that it allows narrow bands of light to pass, wherein the range of each band is chosen such that the leakage of the color filter in that range is lower, e.g. compared to another range.
In a further embodiment the blue band has wavelengths in the range of 410-450nm, the green band has wavelengths in the range of 555-585nm, the red band has wavelengths in the range of 645-675nm, and/or the infrared band has wavelengths in the range of 850-870nm. The color filter alone may not be enough to eliminate cross talk between the colors because the color filter may, e.g. in the blue band, allow some light of another band, e.g. the red band, to be transmitted, thus allowing cross talk.
By using a bandpass filter that transmits light in the ranges of this embodiment, preferably in combination with a color filter which has a minimal amount of leakage in these ranges, cross talk between the different colors is further reduced. Additionally, the width of the ranges is such that sufficient light, e.g. during daylight, enters the pixels to allow for an image to be taken which is suitable for determining a state of the vegetation.
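The passbands of this embodiment can be expressed as a simple lookup; a sketch:

```python
# Passbands of the multiband bandpass filter in this embodiment (nm).
BANDS = {
    "blue": (410, 450),
    "green": (555, 585),
    "red": (645, 675),
    "infrared": (850, 870),
}

def band_of(wavelength_nm):
    # Return the band in which a wavelength is transmitted,
    # or None if the filter blocks it.
    for name, (lo, hi) in BANDS.items():
        if lo <= wavelength_nm <= hi:
            return name
    return None
```

Note that the four ranges are disjoint, which is what limits cross talk between the color bands.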
In embodiments, the imaging device has a rolling shutter and wherein the imaging device is configured to have a frame rate between 50 and 60 Hz and a shutter time of less than half a frame time. A rolling shutter allows a controlled amount of light to reach the optical sensor. A downside of using a rolling shutter is that pixels are illuminated at different times which results in the so-called rolling shutter effect when the camera or the object is moving. This distorts the image which reduces the effectiveness of the image for determining a state of the vegetation. In order to have sufficient overlap between images, when the camera is placed on a moving vehicle, e.g. a drone, a relatively low frame rate, smaller than 5 fps, is required. However, at such low frame rates the rolling shutter effect is relevant. In order to reduce the effect of the rolling shutter effect, the imaging device has a high frame rate, e.g. between 50 and 60 Hz and a fast shutter time, e.g. of less than half a frame time, e.g. between 1/100 and 1/120 of a second. Additionally, e.g. for this embodiment, the imaging device may have a frame rate that results in images with substantial overlap such that not all images are required to determine a state of vegetation, e.g. in a field. In this case the processor may be configured to only output a subset of images taken, e.g. at an external rate or based on some external factor.
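The timing figures in this paragraph are consistent: at a frame rate of 50-60 Hz, half a frame time is 1/100-1/120 of a second.

```python
def max_shutter_time(frame_rate_hz):
    # Upper bound on the shutter time claimed in this embodiment:
    # less than half of one frame time.
    return 0.5 / frame_rate_hz
```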
In embodiments the imaging system is configured to output data relating to one or more of a location of the image device when the image was taken, a height of the imaging device when the image was taken, a speed of the imaging device when the image was taken, information relating to lighting conditions when the image was taken, and an angle of the imaging device when the image was taken. This data may be stored together with the image to provide additional information about the image, for example when the image was taken, where the image was taken, the conditions when the image was taken.
In embodiments the processor is configured to determine if the image should be outputted based on one or more of a location of the image device when the image was taken, a height of the imaging device when the image was taken, a speed of the imaging device when the image was taken, information relating to lighting conditions when the image was taken, and an angle of the imaging device when the image was taken. This embodiment may reduce overlap between images that are outputted.
In embodiments, the processor is configured to apply a gain factor to each color band such that at a predetermined color temperature the color bands have the same level.
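A minimal sketch of such a per-band gain normalization, assuming the bands are equalized to the green level using raw levels measured on a neutral (grey) target at the calibration colour temperature. The grey-target procedure, the band names and the example levels are assumptions for illustration; the text only states that the bands reach the same level at a predetermined colour temperature:

```python
def band_gains(band_levels, reference="green"):
    """Compute per-band gain factors so that every band reaches the level
    of the reference band for a scene at the calibration colour temperature.

    `band_levels`: mean raw pixel value per band measured on a neutral
    target at the predetermined colour temperature (an assumed procedure)."""
    ref = band_levels[reference]
    return {band: ref / level for band, level in band_levels.items()}

# Hypothetical raw levels from a grey card at the calibration temperature.
levels = {"red": 180.0, "green": 200.0, "blue": 160.0, "ir": 100.0}
gains = band_gains(levels)
# After applying the gains, every band sits at the green level.
balanced = {band: levels[band] * g for band, g in gains.items()}
```

Applying these gains equalizes the bands only at the calibration colour temperature; at other temperatures the band ratios shift again.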
The invention is further related to a method for providing an image for determining a state of vegetation wherein use is made of a multispectral imaging system according to the invention.
In an embodiment of the method, the method comprises:
determining pixel values by performing a measurement;
determining a virtual red pixel value for each blue pixel group, by:
o determining a contrast between the blue pixel group and red pixel groups adjacent to the blue pixel group by comparing green pixel values of the red pixel groups adjacent to the blue pixel group with the green pixel values of the blue pixel group;
o determining which of the adjacent red pixel groups has a lowest contrast with the blue pixel group; and
o determining the corresponding virtual red pixel value based on the red pixel value of the red pixel group that has the lowest contrast with the blue pixel group;
determining a virtual blue pixel value for each red pixel group, by:
o determining a contrast between the red pixel group and blue pixel groups adjacent to the red pixel group by comparing green pixel values of the blue pixel groups adjacent to the red pixel group with the green pixel values of the red pixel group;
o determining which of the adjacent blue pixel groups has a lowest contrast with the red pixel group; and
o determining the corresponding virtual blue pixel value based on the blue pixel value of the blue pixel group that has the lowest contrast with the red pixel group;
generating an image based on the pixel values and the virtual pixel values; and
outputting the image.
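The contrast-based selection in the steps above can be sketched as follows for a single pixel group. The grid representation, the dictionary keys, the use of the mean of the two green pixels as the group's green value, and the 4-neighbour adjacency are illustrative assumptions:

```python
def virtual_value(groups, row, col):
    """Determine the virtual red value for a blue pixel group (or the
    virtual blue value for a red pixel group) following the contrast rule.

    `groups` is a 2D list of dicts with keys 'kind' ('red'/'blue'),
    'green' (mean of the group's two green pixels) and 'value' (the
    group's red or blue pixel value). Assumes at least one adjacent
    group of the opposite kind exists."""
    me = groups[row][col]
    other = "red" if me["kind"] == "blue" else "blue"
    best = None
    for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):  # 4-neighbours
        r, c = row + dr, col + dc
        if 0 <= r < len(groups) and 0 <= c < len(groups[0]):
            n = groups[r][c]
            if n["kind"] == other:
                contrast = abs(n["green"] - me["green"])  # green-based contrast
                if best is None or contrast < best[0]:
                    best = (contrast, n["value"])
    return best[1]  # value of the lowest-contrast opposite-kind neighbour

# A 2x2 patch: a blue group at (0, 0) with two adjacent red groups.
patch = [
    [{"kind": "blue", "green": 100, "value": 40},
     {"kind": "red", "green": 180, "value": 90}],
    [{"kind": "red", "green": 105, "value": 70},
     {"kind": "blue", "green": 150, "value": 55}],
]
vr = virtual_value(patch, 0, 0)  # red neighbour at (1, 0) is closest in green
```

In this example the red group below the blue group has the smaller green contrast (5 versus 80), so its red value is used for the virtual red pixel — consistent with taking the value across the direction of least contrast, e.g. along an edge rather than across it.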
In a further embodiment of the method, the method further comprises the steps of:
using an s-curve to correct the pixel values and the virtual pixel values by compressing pixels and virtual pixels having higher and lower pixel values compared to pixels and virtual pixels having intermediate pixel values; and
generating the image based on the corrected pixel values and the corrected virtual pixel values.
In embodiments, the method comprises taking images with a frame rate between 50 and 60 Hz and with a shutter time of less than half a frame time.
The method according to the present invention may comprise one or more of the features and/or benefits that are disclosed herein in relation to the multispectral imaging system according to the present invention, in particular as recited in the appended claims.
The invention is further related to a vehicle, such as a drone, comprising the multispectral imaging system according to the invention.
The invention will be explained with reference to the drawing below, in which:
Fig. 1a shows an example of a pixel group;
Fig. 1 b shows an example of a color filter comprising various pixel groups;
Fig. 2 shows an example of the multispectral imaging system;
Fig. 3 shows an effect of the s-curve on the sensor output; and
Fig. 4 shows a flow chart of a method.
Figure 1a shows an example of a pixel group 5 that comprises two green pixels 6, G, an infrared pixel IR, and either a red pixel R or a blue pixel B. A pixel group 5 comprising a red pixel R is called a red pixel group RG and a pixel group 5 comprising a blue pixel B is called a blue pixel group BG. Figure 1b shows how the pixel group 5 of figure 1a may be arranged in the color filter 8 for use in the multispectral imaging system 1 of the invention.
Figure 1b shows a color filter 8 that comprises twelve pixel groups, wherein each red pixel group RG is adjacent to a blue pixel group BG. In practical embodiments, a color filter may comprise many more, for example more than a thousand, pixel groups.
The optical sensor 4 may have a spectral response wherein a pixel 6, for example a green pixel G, allows some light having a different wavelength than green light to pass. This reduces the effectiveness of the optical sensor 4 in an imaging system 1 for use in determining a state of vegetation. For this reason, a multiband bandpass filter 7 is used, which allows light to pass only in narrow bands, to reduce crosstalk between the colors.
As can be seen in figure 1a, the pixels 6 of the pixel group 5 are adjacent to each other in a square formation, such that the pixels of a single pixel group may collect light coming from the same source, e.g. a point source. Further, the pixel groups 5 themselves are arranged in a square formation in the optical sensor 4.
Figure 2 shows an example of the multispectral imaging system 1 comprising a processor 2 and an imaging device 3. The processor 2 and the imaging device 3 are connected, e.g. via a wireless or wired connection, such that the processor 2 may receive pixel values measured by the imaging device 3.
The imaging device 3 comprises an optical sensor 4, a multiband bandpass filter 7 and a color filter 8. The multiband bandpass filter 7 and the color filter 8 are arranged in a light path of the imaging device 3 such that light may enter the imaging device 3 and travel through the bandpass filter 7 and the color filter 8 towards the optical sensor 4. For example, the imaging device 3 may comprise a lens to collect light, e.g. coming from a field of crops.
The color filter 8 is provided on the optical sensor 4. In embodiments, the color filter 8 may be integrated with the optical sensor 4. In other embodiments, the color filter may be separate from the optical sensor 4. The color filter 8 may be a color filter 8 as shown in figure 1b. The color filter 8 comprises pixel groups 5 which each comprise 4 pixels 6.
The optical sensor 4 may have a resolution of four megapixels, so that the infrared color band has a resolution of one megapixel, the red and blue color bands each have a resolution of 0.5 megapixel and the green color band has a resolution of two megapixels. The processor 2 is configured to determine virtual red and blue pixel values such that the resolution of the red and blue pixels, in combination with the virtual pixels, is one megapixel for each band.
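The resolution bookkeeping above follows directly from the mosaic: four pixels per group, two of them green, one infrared, and one red or blue with red and blue groups alternating. A small sketch of the arithmetic:

```python
def band_resolutions(sensor_megapixels):
    """Per-band resolutions for the 2x2 pixel-group mosaic described
    above: two green pixels, one IR pixel and one red *or* blue pixel
    per group, with red and blue groups alternating."""
    groups = sensor_megapixels / 4          # four pixels per group
    return {
        "green": groups * 2,                # two green pixels per group
        "ir": groups,                       # one IR pixel per group
        "red": groups / 2,                  # red pixel in half the groups
        "blue": groups / 2,                 # blue pixel in the other half
        "red_with_virtual": groups,         # after adding virtual red pixels
        "blue_with_virtual": groups,        # after adding virtual blue pixels
    }

res = band_resolutions(4)  # the four-megapixel sensor of the example
```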
The multiband bandpass filter 7 reduces crosstalk between the bands by transmitting light only within the four narrow bands and blocking light having other wavelengths.
The imaging device 3 may be mounted on a vehicle such as a drone or an automatic land vehicle. The processor 2 may be a microcomputer which is in radio contact with the imaging device 3. In other embodiments, the processor 2 may be a microcomputer that is integrated with the imaging device 3, e.g. with a wired connection. The processor 2 is configured to output an image, for example to a display, a storage medium such as an SD card, or to an image printer.
The color filter 8 is configured to transmit light in a single band to a corresponding pixel 6 of the imaging device 3, such that, for each pixel group 5, a first pixel of the corresponding pixel group receives light from the green band, a second pixel of the corresponding pixel group receives light from the infrared band, a third pixel of the corresponding pixel group receives light from the green band, and a fourth pixel of the corresponding pixel group receives light from either the red band or the blue band. Thus each pixel group 5 comprises two green pixels G, one IR pixel IR and either one red pixel R or one blue pixel B.
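The filter pattern just described can be sketched as a simple band-splitting step over the raw mosaic. The within-group pixel positions and the checkerboard rule for red/blue groups below are illustrative assumptions — the text fixes only which bands occur in each group, not their spatial positions:

```python
def split_bands(raw):
    """Split a raw mosaic (a list of rows of pixel values) into per-band
    planes indexed by pixel-group coordinates. Assumed layout per 2x2
    group: top-left and bottom-right green, top-right IR, bottom-left
    red or blue, with red and blue groups on a checkerboard."""
    rows, cols = len(raw) // 2, len(raw[0]) // 2
    green, ir, red, blue = {}, {}, {}, {}
    for gr in range(rows):
        for gc in range(cols):
            r, c = 2 * gr, 2 * gc
            green[(gr, gc)] = (raw[r][c] + raw[r + 1][c + 1]) / 2  # mean of both greens
            ir[(gr, gc)] = raw[r][c + 1]
            target = red if (gr + gc) % 2 == 0 else blue           # assumed alternation
            target[(gr, gc)] = raw[r + 1][c]
    return green, ir, red, blue

# A 2x4 raw frame holding one red group and one blue group.
raw = [[10, 200, 12, 210],
       [50, 14, 60, 16]]
green, ir, red, blue = split_bands(raw)
```

The red and blue planes produced this way each cover only half of the pixel groups, which is exactly the gap the virtual pixel values are meant to fill.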
The imaging device 3 is provided with a virtual shutter which, in this embodiment, is a rolling shutter. In order to reduce the rolling shutter effect, the imaging device 3 is configured to have a frame rate between 50 and 60 Hz and a shutter time of less than half a frame time. The rolling shutter allows a controlled amount of light to reach the optical sensor 4. A downside of using the rolling shutter is that pixels 6 are illuminated at different times, which results in the rolling shutter effect when the camera or the object is moving, such as when the imaging device 3 is mounted on a vehicle. This may distort the image, which reduces its usefulness for determining a state of the vegetation. In order to have sufficient overlap between images when the camera is placed on a moving vehicle, e.g. a drone, a relatively low frame rate, smaller than 5 fps, is sufficient. However, at such low frame rates the rolling shutter effect is significant. In order to reduce the rolling shutter effect, the imaging device 3 has a high frame rate, e.g. between 50 and 60 Hz, and a fast shutter time, e.g. of less than half a frame time, e.g. between 1/100 and 1/120 of a second. Additionally, e.g. for this embodiment, the imaging device 3 has a frame rate that results in images with substantial overlap, such that not all images are required to determine a state of vegetation, e.g. in a field.
In this case the processor 2 may be configured to only output a subset of images taken, e.g. at an external rate or based on some external factor.
Figure 3 shows the effect of the s-curve on the output of the optical sensor 4. The first graph 10 shows a linear curve, in which no s-curve correction is applied, so the corrected output is linear with respect to the measured values. The second graph 11 shows a traditional gamma curve with a gamma of 0.45, which was traditionally used to compensate for the nonlinear behaviour of old TV tubes.
The third graph 12 shown is an example of an s-curve as used by the invention. The processor 2 may be configured to use the s-curve to correct the pixel values and the virtual pixel values by compressing pixels and virtual pixels having higher and lower pixel values compared to pixels and virtual pixels having intermediate pixel values. The image is then generated based on the corrected pixel values.
This further improves the resulting image for use in determining a state of vegetation. The processor 2 is configured to use an s-curve to correct the pixel values and virtual pixel values by compressing pixels and virtual pixels having higher and lower pixel values, e.g. by giving them a lower amplification factor, compared to pixels and virtual pixels having intermediate pixel values.
The s-curve 12 shown may be an adjustable s-curve, such that the relation between the sensor output and the corrected output may vary depending on the needs of the user. Adjusting the s-curve makes it possible to adjust the dynamic range of pixels 6 having values relevant for determining the vegetation index, e.g. pixels 6 that are not oversaturated or undersaturated. This improves the flexibility of the system 1 for use for different purposes or under different conditions. The correction may be adjusted by the processor 2, e.g. based on additional measurements, or by a user, e.g. based on experience.
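One possible adjustable s-curve is a rescaled logistic function; the text does not prescribe a formula, so the function and its parameters below are illustrative assumptions. Increasing the steepness compresses the extremes more strongly and widens the dynamic range around the midpoint:

```python
import math

def s_curve(x, steepness=8.0, midpoint=0.5):
    """Logistic s-curve mapping sensor output in [0, 1] to corrected
    output in [0, 1]. One of many possible adjustable s-curves; the
    `steepness` and `midpoint` knobs are the adjustable parameters."""
    lo = 1 / (1 + math.exp(steepness * midpoint))         # raw value at x = 0
    hi = 1 / (1 + math.exp(-steepness * (1 - midpoint)))  # raw value at x = 1
    y = 1 / (1 + math.exp(-steepness * (x - midpoint)))
    return (y - lo) / (hi - lo)  # rescale so that 0 -> 0 and 1 -> 1

# Mid-tones gain contrast while the extremes are compressed.
mid_gain = s_curve(0.55) - s_curve(0.45)
end_gain = s_curve(1.0) - s_curve(0.9)
```

Here a 0.1-wide step around the midpoint maps to a much larger output step than the same step near saturation, which is the compression behaviour described for graph 12.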
Figure 4 shows a flow chart of a method for outputting an image. The method comprises determining pixel values 100 by performing a measurement with the imaging system 1. The measurement may be taking an image of a field of crops. By allowing light to fall on the optical sensor 4 during a measurement the pixels 6 receive light and pixel values are determined.
The next step is to determine virtual pixel values 103 for both red pixel groups and blue pixel groups by first determining a contrast 101 between a pixel group 5 and adjacent pixel groups 5 by comparing green pixel values of these groups 5. This makes it possible to determine which of the, e.g., red pixel groups 5 has the green pixel value closest to that of the adjacent, e.g., blue pixel group 5, which in turn gives the contrast between these groups 5. Next, it is determined which of these groups 5 has the lowest contrast 102 with the group for which the virtual pixel value is to be determined.
Subsequently, the corresponding virtual pixel value is determined 103 based on the corresponding pixel value of the pixel group 5 having the lowest contrast with the group 5 for which the virtual pixel is to be determined. For example, the processor 2 may determine the corresponding virtual pixel value by interpolating pixel values of pixel groups 5 in the direction of the pixel group 5 that has the lowest contrast. This allows the virtual pixel values to be interpolated with respect to an edge in the image, e.g. a direction which has a large change in contrast. Interpolation may be based on average pixel values or on some kind of fit function fitted to the pixel values. The interpolation may be based on interpolation techniques or extrapolation techniques.
Next, optionally, an s-curve is used to correct 104 the pixel values and virtual pixel values by compressing pixels 6 and virtual pixels having higher and lower pixel values compared to pixels and virtual pixels having intermediate pixel values. By correcting the pixel values, the resulting image becomes more suitable for determining a state of the vegetation: pixel values that are not suitable for this purpose, e.g. because they correspond to areas that are too bright or too dim, are compressed, which results in an increased dynamic range for pixels having intermediate pixel values. This yields more usable values in the range suitable for determining the state of the vegetation.
Next, an image is generated 105 based on the (corrected) pixel values and the (corrected) virtual pixel values and the image is outputted 106.
Claims
1. Multispectral imaging system (1) for providing an image for determining a state of vegetation wherein the imaging system (1) comprises a processor (2) and an imaging device (3) that is mountable on a vehicle, e.g. a drone, wherein the imaging device (3) comprises: an optical sensor (4) comprising a plurality of pixel groups (5), wherein each pixel group (5) is comprised of four adjacent pixels (6); a multiband bandpass filter (7) provided between the optical sensor (4) and an aperture of the imaging device (3), wherein the multiband bandpass filter (7) is configured to selectively transmit light in a red band, a blue band, a green band, and an infrared band to the optical sensor; and a color filter (8) provided between the optical sensor (4) and the multiband bandpass filter (7), wherein the color filter (8) is configured to transmit light received from the multiband bandpass filter to the optical sensor such that, for each pixel group (5), a first pixel (6) of the corresponding pixel group (5) receives light from the green band, a second pixel (6) of the corresponding pixel group (5) receives light from the infrared band, a third pixel (6) of the corresponding pixel group (5) receives light from the green band, and a fourth pixel (6) of the corresponding pixel group (5) receives light from either the red band or the blue band, wherein pixel groups (5) that receive light from the blue band are called blue pixel groups (BG) and wherein pixel groups (5) that receive light from the red band are called red pixel groups (RG), wherein blue pixel groups (BG) are adjacent to red pixel groups (RG), wherein the imaging system (1) is configured to assign pixel values to each pixel (6) when light falls on the optical sensor (4), e.g. 
during a measurement, wherein the processor (2) is connected to the optical sensor (4) to receive the pixel values thereof, and wherein the processor (2) is configured to:
determine a virtual red pixel value for each blue pixel group (BG), by:
o determining a contrast between the blue pixel group (BG) and red pixel groups (RG) adjacent to the blue pixel group (BG) by comparing green pixel values of the red pixel groups (RG) adjacent to the blue pixel group (BG) with the green pixel values of the blue pixel group (BG);
o determining which of the adjacent red pixel groups (RG) has a lowest contrast with the blue pixel group (BG); and
o determining the corresponding virtual red pixel value based on the red pixel value of the red pixel group (RG) that has the lowest contrast with the blue pixel group (BG);
determine a virtual blue pixel value for each red pixel group (RG), by:
o determining a contrast between the red pixel group (RG) and blue pixel groups (BG) adjacent to the red pixel group (RG) by comparing green pixel values of the blue pixel groups (BG) adjacent to the red pixel group (RG) with the green pixel values of the red pixel group (RG);
o determining which of the adjacent blue pixel groups (BG) has a lowest contrast with the red pixel group (RG); and
o determining the corresponding virtual blue pixel value based on the blue pixel value of the blue pixel group (BG) that has the lowest contrast with the red pixel group (RG);
generate an image based on the pixel values and the virtual pixel values; and
output the image.
2. Multispectral imaging system according to claim 1, wherein the processor (2) is further configured to:
use an s-curve to correct the pixel values and the virtual pixel values by compressing pixels and virtual pixels having higher and lower pixel values compared to pixels and virtual pixels having intermediate pixel values; and
generate the image based on the corrected pixel values and the corrected virtual pixel values.
3. Multispectral imaging system according to claim 2, wherein the processor (2) is configured to use an adjustable s-curve when correcting the pixel values and virtual pixel values.
4. Multispectral imaging system according to one or more of the preceding claims, wherein the processor (2) is configured to determine the corresponding virtual blue pixel value based on the blue pixel group (BG) that has the lowest contrast with the red pixel group (RG) by interpolating blue pixel values of blue pixel groups (BG) in the direction of the blue pixel group (BG) that has the lowest contrast with the red pixel group (RG) and wherein the processor (2) is configured to determine the corresponding virtual red pixel value based on the red pixel group (RG) that has the lowest contrast with the blue pixel group
(BG) by interpolating red pixel values of red pixel groups (RG) in the direction of the red pixel group (RG) that has the lowest contrast with the blue pixel group (BG).
5. Multispectral imaging system according to one or more of the preceding claims, wherein the multiband bandpass filter (7) is configured such that the blue band of the multiband bandpass filter (7) has a wavelength range wherein the color filter (8) transmits a lower amount of light from the green band, red band and infrared band, the red band of the multiband bandpass (7) filter has a wavelength range wherein the color filter (8) transmits a lower amount of light from the blue band, green band and infrared band, the green band of the multiband bandpass filter (7) has a wavelength range wherein the color filter (8) transmits a lower amount of light from the blue band, red band and infrared band, the infrared band of the multiband bandpass filter (7) has a wavelength range wherein the color filter (8) transmits a lower amount of light from the blue band, green band and red band.
6. Multispectral imaging system according to one or more of the preceding claims, wherein the blue band transmits light with wavelengths in the range of 410-450 nm, the green band transmits light with wavelengths in the range of 555-585 nm, the red band transmits light with wavelengths in the range of 645-675 nm, and/or the infrared band transmits light with wavelengths in the range of 850-870 nm.
7. Multispectral imaging system according to one or more of the preceding claims, wherein the imaging device (3) has a rolling shutter and wherein the imaging device (3) is configured to have a frame rate between 50 and 60 Hz and a shutter time of less than half a frame time.
8. Multispectral imaging system according to one or more of the preceding claims, wherein the imaging system (1) is configured to output data relating to one or more of a location of the imaging device (3) when the image was taken, a height of the imaging device (3) when the image was taken, a speed of the imaging device (3) when the image was taken, information relating to lighting conditions when the image was taken, and an angle of the imaging device (3) when the image was taken.
9. Multispectral imaging system according to one or more of the preceding claims, wherein the processor (2) is configured to determine if the image should be outputted based on one or more of a location of the imaging device (3) when the image was taken, a height of the imaging device (3) when the image was taken, a speed of the imaging device (3) when the image was taken, information relating to lighting conditions when the image was taken, and an angle of the imaging device (3) when the image was taken.
10. Multispectral imaging system according to one or more of the preceding claims, wherein the processor (2) is configured to apply a gain factor to each color band such that at a predetermined color temperature the color bands have the same level.
11. Method for providing an image for determining a state of vegetation wherein use is made of a multispectral imaging system (1) according to one or more of the preceding claims.
12. Method according to claim 11, wherein the method comprises:
determining pixel values for pixels of the optical sensor (4) by taking a measurement with the imaging device (3);
determining a virtual red pixel value for each blue pixel group (BG), by:
o determining a contrast between the blue pixel group (BG) and red pixel groups (RG) adjacent to the blue pixel group (BG) by comparing green pixel values of the red pixel groups (RG) adjacent to the blue pixel group (BG) with the green pixel values of the blue pixel group (BG);
o determining which of the adjacent red pixel groups (RG) has a lowest contrast with the blue pixel group (BG); and
o determining the corresponding virtual red pixel value based on the red pixel value of the red pixel group (RG) that has the lowest contrast with the blue pixel group (BG);
determining a virtual blue pixel value for each red pixel group (RG), by:
o determining a contrast between the red pixel group (RG) and blue pixel groups (BG) adjacent to the red pixel group (RG) by comparing green pixel values of the blue pixel groups (BG) adjacent to the red pixel group (RG) with the green pixel values of the red pixel group (RG);
o determining which of the adjacent blue pixel groups (BG) has a lowest contrast with the red pixel group (RG); and
o determining the corresponding virtual blue pixel value based on the blue pixel value of the blue pixel group (BG) that has the lowest contrast with the red pixel group (RG);
generating an image based on the pixel values and the virtual pixel values; and
outputting the image.
13. Method according to claim 12, wherein the method further comprises the steps of:
using an s-curve to correct the pixel values and the virtual pixel values by compressing pixels and virtual pixels having higher and lower pixel values compared to pixels and virtual pixels having intermediate pixel values; and
generating the image based on the corrected pixel values and the corrected virtual pixel values.
14. Vehicle comprising the multispectral imaging system according to one or more of the claims 1 - 10.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
NL2031092 | 2022-02-28 | ||
NL2031092A NL2031092B1 (en) | 2022-02-28 | 2022-02-28 | Multispectral imaging system and method for using the system |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2023161477A1 true WO2023161477A1 (en) | 2023-08-31 |
Family
ID=82100119
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/EP2023/054857 WO2023161477A1 (en) | 2022-02-28 | 2023-02-27 | Multispectral imaging system and method for using the system |
Country Status (2)
Country | Link |
---|---|
NL (1) | NL2031092B1 (en) |
WO (1) | WO2023161477A1 (en) |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20180040104A1 (en) * | 2016-08-04 | 2018-02-08 | Intel Corporation | Restoring Color and Infrared Images from Mosaic Data |
EP3761638A1 (en) * | 2018-03-23 | 2021-01-06 | Sony Corporation | Signal processing apparatus, signal processing method, image capture apparatus, and medical image capture apparatus |
WO2021014764A1 (en) | 2019-07-24 | 2021-01-28 | Sony Corporation | Image processing apparatus, image pickup apparatus, method and program |
Non-Patent Citations (1)
Title |
---|
ORIT SKORKA ET AL: "Color correction for RGB sensors with dual-band filters for in-cabin imaging applications", ELECTRONIC IMAGING, vol. 2019, no. 15, 13 January 2019 (2019-01-13), US, pages 46 - 1, XP055715959, ISSN: 2470-1173, DOI: 10.2352/ISSN.2470-1173.2019.15.AVM-046 * |
Also Published As
Publication number | Publication date |
---|---|
NL2031092B1 (en) | 2023-09-07 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 23708720 Country of ref document: EP Kind code of ref document: A1 |