CN115297268B - Imaging system and image processing method

Info

Publication number: CN115297268B (granted publication; published application: CN115297268A)
Application number: CN202210824473.8A
Authority: CN (China)
Legal status: Active
Original language: Chinese (zh)
Inventors: 聂鑫鑫, 於敏杰
Assignee: Hangzhou Hikvision Digital Technology Co Ltd
Prior art keywords: channel, image signal, type, channels, sub

Classifications

    • H ELECTRICITY; H04 ELECTRIC COMMUNICATION TECHNIQUE; H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70 Circuitry for compensating brightness variation in the scene
    • H04N23/72 Combination of two or more compensation controls
    • H04N23/73 Circuitry for compensating brightness variation in the scene by influencing the exposure time
    • H04N23/74 Circuitry for compensating brightness variation in the scene by influencing the scene brightness using illuminating means
    • H04N23/76 Circuitry for compensating brightness variation in the scene by influencing the image signals


Abstract

Embodiments of the present application provide an imaging system and an image processing method. The imaging system comprises an image sensor, a statistics unit and an exposure control unit. The statistics unit counts the image data of each type of channel separately. The exposure control unit calculates the exposure parameters corresponding to each type of channel from that channel type's statistical data and, based on the calculated exposure parameters, controls the brightness adjustment of the image data of that channel type. Because the different channel types respond to light components of different energies, exposing the image data of each channel type independently keeps its brightness within a suitable range and improves the final imaging effect.

Description

Imaging system and image processing method
Technical Field
The present disclosure relates to the field of computer vision, and in particular, to an imaging system and an image processing method.
Background
In current imaging systems, an image sensor is typically employed for imaging. The image sensor uses a photoelectric conversion device such as a CCD (Charge-Coupled Device) or CMOS (Complementary Metal Oxide Semiconductor) device. The image sensor comprises a plurality of types of channels arranged to correspond to the pixels in the pixel array; each channel responds to the light component passing through it and yields a corresponding pixel, so that the optical signal is converted into an image signal.
However, the image signals of the various channels may affect each other. For example, the RGB (color) channels sense part of the energy of NIR (Near Infrared) light in addition to the energy of the color light, which degrades the imaging effect. Corresponding imaging systems therefore use interpolation: the spatial information of the RGB channels is used to interpolate the NIR channel, the spatial information of the NIR channel is used to interpolate the RGB channels, and the correlation between the RGB and NIR channels is analyzed during interpolation, so that the interpolated image has higher resolution and the imaging effect is improved.
However, in a real environment the energies of the light components to which the various channels respond differ greatly. Interpolation can analyze the correlation between channels, but a uniform imaging strategy is still applied during imaging, so the actual imaging effect is not ideal.
Disclosure of Invention
An object of an embodiment of the present application is to provide an imaging system and an image processing method, so as to improve an imaging effect of the imaging system. The specific technical scheme is as follows:
in a first aspect, embodiments of the present application provide an imaging system, the system comprising: an image sensor, a statistics unit and an exposure control unit; the image sensor includes a plurality of types of channels;
The image sensor is used for converting an optical signal into an image signal, wherein the optical signal comprises light components in various wave band ranges;
the statistics unit is used for acquiring the image signals; extracting image data of various channels in the image signal; respectively counting the image data of the various channels to obtain the statistical data of the various channels; the statistical data of the various channels are sent to the exposure control unit;
the exposure control unit is used for receiving the statistical data of the various channels sent by the statistics unit; and, for any type of channel, calculating the exposure parameters corresponding to that type of channel according to its statistical data, and controlling the brightness adjustment of the image data of that type of channel based on the exposure parameters.
Optionally, the image sensor includes: a first type of channel that is responsive to light components in the visible band and a second type of channel that is responsive to light components in the near infrared band.
Optionally, the first type of channel includes a plurality of color channels, and the second type of channel includes a near infrared channel;
the statistics unit is specifically configured to:
calculating the image data statistic of the first type of channel as the statistic of the first type of channel according to the image data of at least one color channel in the plurality of color channels;
And calculating the image data statistic value of the second type of channels as the statistic data of the second type of channels according to the image data of the near infrared channels.
Optionally, the statistics unit is specifically configured to:
extracting image data of each color channel and image data of the near infrared channel from the image signal; calculating the image data average value of each color channel and the image data average value of the near infrared channel according to the image data of each color channel and the image data of the near infrared channel respectively; carrying out weighted summation on the image data average value of each color channel; taking the result of the weighted summation as the statistical data of the first type channel, and taking the image data average value of the near infrared channel as the statistical data of the second type channel;
or,
partitioning the image signal to obtain a plurality of image signal blocks; extracting image data of each color channel and the image data of the near infrared channel from any image signal block; calculating the image data average value of each color channel and the image data average value of the near infrared channel according to the image data of each color channel and the image data of the near infrared channel in each image signal block; carrying out weighted summation on the image data average value of each color channel; taking the result of the weighted summation as the statistical data of the first type channel, and taking the image data average value of the near infrared channel as the statistical data of the second type channel;
Or,
extracting image data of each color channel and image data of the near infrared channel from the image signal; obtaining a histogram of each color channel and a histogram of the near infrared channel according to the image data of each color channel and the image data of the near infrared channel respectively; respectively carrying out weighted average calculation on the gray scale numbers in the histogram of each color channel and the histogram of the near infrared channel to obtain an image data average value of each color channel and an image data average value of the near infrared channel; carrying out weighted summation on the image data average value of each color channel; and taking the result of the weighted summation as the statistical data of the first type channel, and taking the image data average value of the near infrared channel as the statistical data of the second type channel.
Optionally, the system further comprises: a light filtering unit;
the filtering unit is used for filtering out, from the input light signal, the light components other than those in the specified wave band range, and transmitting the filtered light signal to the image sensor.
Optionally, the optical filtering unit includes a switching device;
the switching device is used for switching the filtering state of the filtering unit;
The filtering unit is used for filtering other light components except for the light components in the specified wave band range in the input light signal when the filtering state is on, and transmitting the filtered light signal to the image sensor; and transmitting all light components in the light signal to the image sensor when the filtering state is closed.
Optionally, the system further comprises: a light supplementing unit;
the light supplementing unit is used for carrying out near infrared light supplementing on the scene so that the input light signal comprises near infrared light.
Optionally, the image sensor includes a second type of channel responsive to light components in the near infrared band;
the exposure control unit is further configured to control the light supplementing unit to adjust the light supplementing intensity according to the statistical data of the second type channel.
Optionally, the exposure control unit is specifically configured to:
if the statistical data of the second type of channel is larger than a first preset threshold value, controlling the light supplementing unit to reduce the intensity of the emitted near infrared light;
and if the statistical data of the second type of channel is smaller than a second preset threshold value, controlling the light supplementing unit to increase the intensity of the emitted near infrared light.
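For illustration only, this threshold comparison can be sketched as follows in Python; the threshold values and the light supplementing unit interface (decrease_intensity / increase_intensity) are assumptions for the sketch, not part of this disclosure:

    def control_fill_light(nir_stat, light_unit, high_thresh=200.0, low_thresh=60.0):
        # Adjust the near infrared fill-light intensity from the statistical data
        # of the second type of channel. Thresholds are illustrative placeholders.
        if nir_stat > high_thresh:
            light_unit.decrease_intensity()   # NIR response already strong, reduce fill light
        elif nir_stat < low_thresh:
            light_unit.increase_intensity()   # NIR response too weak, raise fill light
        # otherwise keep the current fill-light intensity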
Optionally, the image sensor further includes: a first type of channel responsive to light components in the visible band; the exposure control unit is specifically configured to:
acquiring a first exposure time of the first type channel, first target data corresponding to the first type channel, a second exposure time of the second type channel and second target data corresponding to the second type channel;
calculating a first data offset of the first type channel according to the statistical data of the first type channel and the first target data, and calculating a first exposure gain according to the statistical data of the first type channel and the first target data if the first data offset is not in a first preset range;
calculating a second data offset of the second class channel according to the statistical data of the second class channel and the second target data, and calculating a second exposure gain according to the statistical data of the second class channel and the second target data if the second data offset is not within a second preset range;
if the first exposure time is equal to the second exposure time, controlling the light supplementing unit to reduce the intensity of the emitted near infrared light when the second exposure gain is smaller than a first preset gain threshold value, and controlling the light supplementing unit to increase the intensity of the emitted near infrared light when the second exposure gain is larger than a second preset gain threshold value;
if the first exposure time is not equal to the second exposure time, reducing the second exposure time when the second exposure gain is smaller than the first preset gain threshold value, and increasing the second exposure time when the second exposure gain is larger than the second preset gain threshold value.
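A minimal Python sketch of this adjustment flow for the second type of channel is given below. The way the data offset and exposure gain are derived from the statistics and target data is not fixed by the text, so the ratio-based gain, the tolerance value, the gain thresholds and the step sizes are illustrative assumptions:

    def adjust_second_type_channel(stat2, target2, exp_time1, exp_time2,
                                   light_unit, tol2=10.0,
                                   gain_low=1.0, gain_high=8.0):
        # Second data offset: deviation of the second-type statistics from its target.
        offset2 = stat2 - target2
        if abs(offset2) <= tol2:
            return exp_time2                      # within the preset range, nothing to adjust
        # Second exposure gain (one plausible form): factor needed to reach the target.
        gain2 = target2 / max(stat2, 1e-6)
        if exp_time1 == exp_time2:
            # Shared exposure time: steer the near infrared fill light instead.
            if gain2 < gain_low:
                light_unit.decrease_intensity()
            elif gain2 > gain_high:
                light_unit.increase_intensity()
        else:
            # Independent exposure times: adjust the second-type exposure time.
            if gain2 < gain_low:
                exp_time2 *= 0.9
            elif gain2 > gain_high:
                exp_time2 *= 1.1
        return exp_time2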
Optionally, the system further comprises: a processing unit;
the processing unit is used for acquiring image signals output by the image sensor, current exposure parameters corresponding to the various channels and associated information among the various channels; determining the correlation between every two types of channels according to the current exposure parameters corresponding to the various channels and the correlation information between the various channels; and removing light components of the other channel type contained in the channel type according to the correlation between the two channel types.
Optionally, the image sensor includes: a first type of channel responsive to light components in the visible band and a second type of channel responsive to light components in the near infrared band;
the processing unit is specifically configured to:
acquiring an image signal output by the image sensor, current exposure parameters corresponding to the first type channel and the second type channel, and associated information between the color of the first type channel and the brightness of the second type channel; or, acquiring an image signal output by the image sensor, current exposure parameters corresponding to the first type channel and the second type channel, and associated information between the brightness of the first type channel and the brightness of the second type channel;
Normalizing the image data of the first type channel and the image data of the second type channel in the image signal to the same exposure parameter according to the current exposure parameters corresponding to the first type channel and the second type channel;
determining the weight of the first type channel and the weight of the second type channel based on the association information;
and removing light components of the second type channels contained in the first type channels according to the weights of the first type channels and the weights of the second type channels, and the normalized image data of the first type channels and the normalized image data of the second type channels.
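One way to realize this removal is sketched below in Python with NumPy. The exact normalization and subtraction formula is not specified in the text; the linear form and the clipping shown here are assumptions, and the weights w_color and w_nir stand in for the pre-stored association information:

    import numpy as np

    def remove_nir_from_color(color, nir, exp_color, exp_nir, w_color=1.0, w_nir=1.0):
        # Normalize both channel types to the same exposure parameter
        # (here, the exposure of the first type of channel).
        nir_norm = nir * (exp_color / exp_nir)
        # Remove the weighted near infrared contribution from the color data.
        corrected = w_color * color - w_nir * nir_norm
        return np.clip(corrected, 0, None)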
Optionally, the image sensor includes: a first type of channel responsive to light components in the visible band and a second type of channel responsive to light components in the near infrared band;
the processing unit is specifically configured to:
acquiring an image signal output by the image sensor, current exposure parameters corresponding to the first type channel and the second type channel, and associated information between the color of the first type channel and the brightness of the second type channel; or, acquiring an image signal output by the image sensor, current exposure parameters corresponding to the first type channel and the second type channel, and associated information between the brightness of the first type channel and the brightness of the second type channel;
Determining the weight of the first type channel and the weight of the second type channel according to the current exposure parameters corresponding to the first type channel and the second type channel and the association information;
and removing light components of the second type channels contained in the first type channels according to the weights of the first type channels, the weights of the second type channels, the image data of the first type channels and the image data of the second type channels.
Optionally, the processing unit includes a signal decomposition module and a post-processing module;
the signal decomposition module is used for acquiring an image signal, decomposing the visible light signal and the near infrared light signal of the image signal, and outputting a first decomposed image signal and a second decomposed image signal, wherein the first decomposed image signal is the visible light image signal and the second decomposed image signal is the near infrared light image signal;
the post-processing module is configured to obtain the first decomposed image signal, the second decomposed image signal, a first current exposure parameter corresponding to the first type channel, a second current exposure parameter corresponding to the second type channel, and association information between the first type channel and the second type channel; determining the correlation between the first type channel and the second type channel according to the first current exposure parameter, the second current exposure parameter and the correlation information; and determining a first output image signal and/or a second output image signal according to the correlation, wherein the first output image signal is the first decomposed image signal with the near infrared light component removed.
Optionally, the signal decomposition module is specifically configured to:
acquiring an image signal; respectively up-sampling each color component of the visible light signal and the near infrared light signal in the image signal to obtain an image signal of each color component and an image signal of near infrared light; combining the image signals of the color components to obtain a first decomposed image signal for output, and outputting the image signal of the near infrared light as a second decomposed image signal;
or,
acquiring an image signal, a first current exposure gain corresponding to the first type channel and a second current exposure gain corresponding to the second type channel; if the second current exposure gain is smaller than the first current exposure gain, performing edge judgment interpolation on the image data of the first type channel according to the image data of the second type channel in the image signal, and if the second current exposure gain is larger than the first current exposure gain, performing edge judgment interpolation on the image data of the second type channel according to the image data of the first type channel in the image signal; obtaining image signals of various color components of the interpolated visible light signal and image signals of near infrared light; and combining the image signals of the color components to obtain a first decomposed image signal for output, and outputting the image signal of the near infrared light as a second decomposed image signal.
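As an illustration of the second decomposition option, the gain comparison that selects the interpolation direction can be sketched as follows; the function name and return labels are illustrative, and the edge-judged interpolation itself is not reproduced here:

    def choose_interpolation_direction(gain_color_current, gain_nir_current):
        # The channel type exposed with the lower current gain is assumed to be less
        # noisy, so its data guides the edge-judged interpolation of the other type.
        if gain_nir_current < gain_color_current:
            return "interpolate first-type (color) data guided by second-type (NIR) data"
        if gain_nir_current > gain_color_current:
            return "interpolate second-type (NIR) data guided by first-type (color) data"
        return "gains equal: plain per-component up-sampling may be used"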
Optionally, the post-processing module includes: the device comprises a first processing sub-module, a second processing sub-module, a color recovery sub-module and a third processing sub-module;
the first processing sub-module is used for acquiring the first decomposed image signal, preprocessing the first decomposed image signal and obtaining a first sub-processed image signal;
the second processing sub-module is configured to obtain the second decomposed image signal, and perform preprocessing on the second decomposed image signal to obtain a second sub-processed image signal, where the preprocessing includes at least one processing mode of dead pixel correction, black level correction, digital gain, and noise reduction;
the color recovery sub-module is used for acquiring a first current exposure parameter corresponding to the first type channel, a second current exposure parameter corresponding to the second type channel and associated information between the first type channel and the second type channel; normalizing the first sub-processed image signal and the second sub-processed image signal to the same exposure parameter according to the first current exposure parameter and the second current exposure parameter; determining the weight of the first type channel and the weight of the second type channel based on the association information; obtaining a first recovery image signal according to the weights of the first type channels and the second type channels and the normalized first sub-processing image signal and the normalized second sub-processing image signal;
And the third processing sub-module is used for processing the first recovery image signal to obtain a first output image signal.
Optionally, the post-processing module includes: the device comprises a first processing sub-module, a second processing sub-module, a color recovery sub-module and a fourth processing sub-module;
the first processing sub-module is used for acquiring the first decomposed image signal, preprocessing the first decomposed image signal and obtaining a first sub-processed image signal;
the second processing sub-module is configured to obtain the second decomposed image signal, and perform preprocessing on the second decomposed image signal to obtain a second sub-processed image signal, where the preprocessing includes at least one processing mode of dead pixel correction, black level correction, digital gain, and noise reduction;
the color recovery sub-module is used for acquiring a first current exposure parameter corresponding to the first type channel, a second current exposure parameter corresponding to the second type channel and associated information between the first type channel and the second type channel; normalizing the first sub-processed image signal and the second sub-processed image signal to the same exposure parameter according to the first current exposure parameter and the second current exposure parameter; determining the weight of the first type channel and the weight of the second type channel based on the association information; obtaining a second recovery image signal according to the weights of the first type channels and the second type channels and the normalized first sub-processing image signal and the normalized second sub-processing image signal;
And the fourth processing submodule is used for processing the second restored image signal to obtain a second output image signal.
Optionally, the post-processing module includes: the color restoration device comprises a first processing sub-module, a second processing sub-module, a color restoration sub-module, a third processing sub-module, a fourth processing sub-module and a fifth processing sub-module;
the first processing sub-module is used for acquiring the first decomposed image signal, preprocessing the first decomposed image signal and obtaining a first sub-processed image signal;
the second processing sub-module is configured to obtain the second decomposed image signal, and perform preprocessing on the second decomposed image signal to obtain a second sub-processed image signal, where the preprocessing includes at least one processing mode of dead pixel correction, black level correction, digital gain, and noise reduction;
the color recovery sub-module is used for acquiring a first current exposure parameter corresponding to the first type channel, a second current exposure parameter corresponding to the second type channel and associated information between the first type channel and the second type channel; normalizing the first sub-processed image signal and the second sub-processed image signal to the same exposure parameter according to the first current exposure parameter and the second current exposure parameter; determining the weight of the first type channel and the weight of the second type channel based on the association information; obtaining a first recovery image signal and a second recovery image signal according to the weights of the first type channels and the second type channels and the normalized first sub-processing image signal and the normalized second sub-processing image signal;
The third processing sub-module is configured to process the first restored image signal to obtain a third sub-processed image signal;
the fourth processing sub-module is configured to process the second restored image signal to obtain a fourth sub-processed image signal;
the fifth processing sub-module is configured to process the third sub-processed image signal and the fourth sub-processed image signal to obtain a first output image signal.
Optionally, the post-processing module includes: the color restoration device comprises a first processing sub-module, a second processing sub-module, a color restoration sub-module, a third processing sub-module, a fourth processing sub-module and a fifth processing sub-module;
the first processing sub-module is used for acquiring the first decomposed image signal, preprocessing the first decomposed image signal and obtaining a first sub-processed image signal;
the second processing sub-module is configured to obtain the second decomposed image signal, and perform preprocessing on the second decomposed image signal to obtain a second sub-processed image signal, where the preprocessing includes at least one processing mode of dead pixel correction, black level correction, digital gain, and noise reduction;
the color recovery sub-module is used for acquiring a first current exposure parameter corresponding to the first type channel, a second current exposure parameter corresponding to the second type channel and associated information between the first type channel and the second type channel; normalizing the first sub-processed image signal and the second sub-processed image signal to the same exposure parameter according to the first current exposure parameter and the second current exposure parameter; determining the weight of the first type channel and the weight of the second type channel based on the association information; obtaining a first recovery image signal and a second recovery image signal according to the weights of the first type channels and the second type channels and the normalized first sub-processing image signal and the normalized second sub-processing image signal;
The third processing sub-module is configured to process the first restored image signal to obtain a third sub-processed image signal;
the fourth processing sub-module is configured to process the second restored image signal to obtain a fourth sub-processed image signal;
the fifth processing sub-module is configured to process the third sub-processed image signal and the fourth sub-processed image signal to obtain a first output image signal and a second output image signal.
Optionally, the processing unit further comprises a preprocessing module;
the preprocessing module is used for acquiring the image signal output by the image sensor, preprocessing the image signal and sending the preprocessed image signal to the signal decomposition module.
Optionally, the processing unit is further configured to fuse the image data of each type of channel from which the light component of the other type of channel has been removed, to obtain a fused image signal.
In a second aspect, an embodiment of the present application provides an image processing method, applied to an imaging system, where the system includes: an image sensor, a statistics unit and an exposure control unit; the image sensor includes a plurality of types of channels; the method comprises the following steps:
The image sensor converts an optical signal into an image signal, the optical signal including light components in a plurality of wavelength bands;
the statistics unit acquires the image signal, extracts image data of various channels in the image signal, respectively carries out statistics on the image data of the various channels to obtain statistics data of the various channels, and sends the statistics data of the various channels to the exposure control unit;
the exposure control unit receives the statistic data of the various channels sent by the statistic unit, calculates exposure parameters corresponding to the channels according to the statistic data of the channels, and controls brightness adjustment of the image data of the channels based on the exposure parameters.
Optionally, the image sensor includes: a first type of channel responsive to light components in the visible light band, the first type of channel comprising a plurality of color channels, and a second type of channel responsive to light components in the near infrared light band, the second type of channel comprising a near infrared channel;
the statistics unit respectively performs statistics on the image data of the various channels to obtain statistics data of the various channels, and the statistics unit comprises:
Calculating the image data statistic of the first type of channel as the statistic of the first type of channel according to the image data of at least one color channel in the plurality of color channels;
and calculating the image data statistic value of the second type of channels as the statistic data of the second type of channels according to the image data of the near infrared channels.
Optionally, the system further comprises: a light supplementing unit; the image sensor includes a second type of channel responsive to light components in the near infrared band;
the method further comprises the steps of:
if the statistical data of the second type of channel is larger than a first preset threshold value, controlling the light supplementing unit to reduce the intensity of the emitted near infrared light;
and if the statistical data of the second type of channel is smaller than a second preset threshold value, controlling the light supplementing unit to increase the intensity of the emitted near infrared light.
Optionally, the system further comprises: a processing unit;
after the image sensor converts the optical signal into an image signal, the method further comprises:
the processing unit acquires the image signals output by the image sensor, the current exposure parameters corresponding to the various channels and the association information among the various channels, determines the correlation between every two types of channels according to the current exposure parameters corresponding to the various channels and the association information among the various channels, and removes the light components of the other type of channels contained in one type of channels according to the correlation between every two types of channels.
The embodiments of the present application provide an imaging system and an image processing method. The imaging system includes an image sensor, a statistics unit and an exposure control unit. The statistics unit acquires the image signal obtained by the image sensor converting the optical signal, extracts the image data of the various channels from the image signal, counts the image data of each type of channel separately to obtain the statistical data of each type of channel, and sends the statistical data of the various channels to the exposure control unit. The exposure control unit receives the statistical data of the various channels sent by the statistics unit and, for any type of channel, calculates the exposure parameters corresponding to that type of channel according to its statistical data and controls the brightness adjustment of the image data of that type of channel based on the exposure parameters.
In the embodiments of the present application, a statistics unit and an exposure control unit are added to the imaging system. The statistics unit counts the image data of each type of channel separately, and the exposure control unit calculates the exposure parameters corresponding to each channel type from that channel type's statistical data and controls the brightness adjustment of its image data based on the calculated exposure parameters. Because the light components to which the different channel types respond have different energies, the image data of each channel type is exposed independently, so that its brightness is controlled within a suitable brightness range and the final imaging effect is improved.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings that are required in the embodiments or the description of the prior art will be briefly described below, it being obvious that the drawings in the following description are only some embodiments of the present application, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a schematic diagram of an imaging system according to an embodiment of the present application;
FIG. 2a is a schematic diagram illustrating an arrangement of image sensors according to an embodiment of the present application;
FIG. 2b is a schematic diagram illustrating an arrangement of image sensors according to another embodiment of the present application;
FIG. 2c is a schematic diagram illustrating an arrangement of image sensors according to another embodiment of the present application;
FIG. 3 is a schematic diagram of spectral response curves of RGB channels and NIR channels according to embodiments of the present application;
FIG. 4 is a schematic diagram of the spectral response curve of the W channel according to the embodiment of the present application;
FIG. 5 is a schematic diagram of an imaging system according to another embodiment of the present application;
FIG. 6 is a graph illustrating a spectral transmittance curve of a filter unit according to an embodiment of the present disclosure;
FIG. 7 is a schematic diagram of an imaging system according to yet another embodiment of the present application;
FIG. 8 is a schematic structural view of an imaging system according to yet another embodiment of the present application;
FIG. 9 is a schematic diagram of an 850 nm near infrared energy distribution curve according to an embodiment of the present application;
FIG. 10 is a flow chart of exposure gain adjustment according to an embodiment of the present application;
FIG. 11 is a schematic structural view of an imaging system according to yet another embodiment of the present application;
FIG. 12 is a flow chart of an implementation of a processing unit according to an embodiment of the present application;
FIG. 13 is a schematic flow chart of an implementation of a post-processing module according to an embodiment of the present application;
FIG. 14 is a schematic flow chart of an implementation of a post-processing module according to another embodiment of the present application;
FIG. 15 is a schematic flow chart of an implementation of a post-processing module according to another embodiment of the present application;
FIG. 16 is a flow chart illustrating an implementation of a processing unit according to another embodiment of the present application;
FIG. 17 is a schematic flow chart of an implementation of a post-processing module according to another embodiment of the present application;
FIG. 18 is a flow chart illustrating an implementation of a processing unit according to another embodiment of the present disclosure;
FIG. 19 is a flow chart illustrating an implementation of a processing unit according to another embodiment of the present application;
fig. 20 is a flowchart of an image processing method according to an embodiment of the present application.
Detailed Description
The following description of the embodiments of the present application will be made clearly and fully with reference to the accompanying drawings, in which it is evident that the embodiments described are only some, but not all, of the embodiments of the present application. All other embodiments, which can be made by one of ordinary skill in the art without undue burden from the present disclosure, are within the scope of the present disclosure.
In order to improve an imaging effect of an imaging system, an embodiment of the application provides an imaging system and an image processing method.
An embodiment of the present application provides an imaging system, as shown in fig. 1, which includes an image sensor 11, a statistical unit 12, and an exposure control unit 13. The image sensor 11 comprises a plurality of types of channels for obtaining a pixel in the image signal in response to the passing light component.
Wherein the image sensor 11 is configured to convert an optical signal into an image signal, the optical signal including light components in a plurality of wavelength bands; a statistics unit 12, configured to acquire an image signal, extract image data of each channel in the image signal, respectively count the image data of each channel, obtain statistics data of each channel, and send the statistics data of each channel to an exposure control unit 13; the exposure control unit 13 is configured to receive the statistics data of the various channels sent by the statistics unit 12, calculate, for any channel, an exposure parameter corresponding to the channel according to the statistics data of the channel, and control brightness adjustment of image data of the channel based on the exposure parameter.
By applying this embodiment, the statistics unit counts the image data of each type of channel separately, and the exposure control unit calculates the exposure parameters corresponding to each channel type from that channel type's statistical data and controls the brightness adjustment of its image data based on the calculated exposure parameters. Each channel type is thus exposed independently, according to the actual energies of the light components to which the different channel types respond, so that the brightness of its image data is controlled within a suitable brightness range and the final imaging effect is improved.
The imaging system provided by the embodiment of the application can be an image shooting system (such as a digital camera, a portable video camera, a monitoring camera and the like), and the imaging system can also be an imaging module installed on a computer, a multimedia player, a mobile phone and the like.
The image sensor 11 comprises a plurality of types of channels, each for responding to light components in a different band of wavelengths. In one implementation, the image sensor 11 may include: a first type of channel that is responsive to light components in the visible band and a second type of channel that is responsive to light components in the near infrared band.
In this implementation, the image sensor 11 is an RGB-IR sensor and specifically includes two types of channels: the first type of channel is the RGB channels and the second type of channel is the NIR channel. The various types of channels form the image sensor arrangements shown in fig. 2a, 2b or 2c, in which a plurality of pixels form a pixel array. In fig. 2a, the R (red) channel and the B (blue) channel each account for 1/8 of the total number of pixels, the NIR channel accounts for 1/4, and the G (green) channel accounts for 1/2; in fig. 2b, the R and B channels each account for 1/8 of the total number of pixels, the G channel accounts for 1/4, and the NIR channel accounts for 1/2; in fig. 2c, the R and B channels each account for 3/16 of the total number of pixels, the G channel accounts for 3/8, and the NIR channel accounts for 1/4.
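For reference, one 4x4 unit cell that matches the pixel proportions stated for fig. 2a (R and B 1/8 each, NIR 1/4, G 1/2) is shown below; it is only an illustrative arrangement and is not taken from the figure itself:

    import numpy as np

    # Hypothetical 4x4 RGB-IR unit cell consistent with the fig. 2a proportions.
    RGBIR_UNIT_CELL = np.array([
        ["R",   "G", "B",   "G"],
        ["G", "NIR", "G", "NIR"],
        ["B",   "G", "R",   "G"],
        ["G", "NIR", "G", "NIR"],
    ])

    counts = {ch: int((RGBIR_UNIT_CELL == ch).sum()) for ch in ("R", "G", "B", "NIR")}
    # counts == {"R": 2, "G": 8, "B": 2, "NIR": 4}, i.e. 1/8, 1/2, 1/8 and 1/4 of the 16 pixels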
The spectral responses of the channels are shown in fig. 3. To ensure that the first type of channel responds to the visible light component and the second type of channel responds to the near infrared light component, the relative response of the first type of channel in the 400 nm to 700 nm band is not lower than its relative response in the 700 nm to 1000 nm band, and in the 800 nm to 1000 nm band the relative response of the second type of channel is not lower than that of the first type of channel.
In addition to the NIR channel, the second type of channel may also be a W (White) channel, which responds both to light components in the visible light band and to light components in the near infrared light band. The spectral response of the W channel is shown in fig. 4: the W channel responds across the entire 400 nm to 1000 nm band.
A lens (not shown in fig. 1) is provided at the input side of the image sensor 11 for receiving the optical signal reflected by the target object. Wherein the optical signal comprises a visible light component, a near infrared light component and the like, and the lens can enable the visible light component and the near infrared light component to meet confocal requirements.
In this embodiment, the statistics unit 12 performs statistics on the image data of the sensor, for example an RGBIR sensor, carrying out brightness statistics when the image data are counted. The image data of each type of channel are the pixel values at the positions of that channel type in the pixel array, and a pixel value represents pixel brightness, so during the statistics the pixel values of that channel type can be counted directly to obtain the statistical brightness, that is, the statistical data of that type of channel.
In one implementation, the first type of channel includes a plurality of color channels and the second type of channel includes a near infrared channel.
Accordingly, the statistics unit 12 may be specifically configured to: calculating image data statistics of a first type of channel as statistics of the first type of channel according to image data of at least one color channel in the plurality of color channels; and calculating the image data statistic value of the second type of channels as the statistic data of the second type of channels according to the image data of the near infrared channels.
The first type of channel consists of color channels, specifically a red channel, a green channel and a blue channel, or a red channel, a yellow channel and a blue channel. The image data of each color channel are counted, and the calculated image data statistic of the first type of channel (such as an image data mean value or another image data statistic) is used as the statistical data of the first type of channel; alternatively, the image data mean of only the red channel or the green channel may be calculated and used as the statistical data of the first type of channel. The second type of channel is a near infrared channel: statistics are performed on the image data of the near infrared channel, and the calculated image data statistic of the second type of channel is used as the statistical data of the second type of channel.
Optionally, the statistics unit 12 may be specifically configured to:
extracting image data of each color channel and image data of a near infrared channel from the image signal; respectively calculating the image data average value of each color channel and the image data average value of the near infrared channel according to the image data of each color channel and the image data of the near infrared channel; carrying out weighted summation on the image data average value of each color channel; taking the weighted sum result as the statistical data of the first type channel, and taking the image data mean value of the near infrared channel as the statistical data of the second type channel;
or,
partitioning the image signal to obtain a plurality of image signal blocks; extracting image data of each color channel and image data of a near infrared channel from any image signal block; respectively calculating the image data average value of each color channel and the image data average value of the near infrared channel according to the image data of each color channel and the image data of the near infrared channel in each image signal block; carrying out weighted summation on the image data average value of each color channel; taking the weighted sum result as the statistical data of the first type channel, and taking the image data mean value of the near infrared channel as the statistical data of the second type channel;
Or,
extracting image data of each color channel and image data of a near infrared channel from the image signal; obtaining a histogram of each color channel and a histogram of the near infrared channel according to the image data of each color channel and the image data of the near infrared channel respectively; respectively carrying out weighted average calculation on gray scale numbers in the histogram of each color channel and the histogram of the near infrared channel to obtain an image data average value of each color channel and an image data average value of the near infrared channel; carrying out weighted summation on the image data average value of each color channel; and taking the weighted sum result as the statistical data of the first type channel, and taking the image data mean value of the near infrared channel as the statistical data of the second type channel.
The data statistics include three statistical modes: global statistics, block statistics and histogram statistics. The three modes are described below taking an RGBIR sensor as an example.
Global statistics: and respectively calculating image data average values of the R channel, the G channel, the B channel and the NIR channel to respectively obtain an image data average value Rave of the R channel, an image data average value Gave of the G channel, an image data average value Bave of the B channel and an image data average value NIRave of the NIR channel. And weighting Rave, gave and Bave according to a certain weight to obtain Y=kr, rave+kg, gave+kb, bave, wherein kr is R channel weight, kg is G channel weight, kb is B channel weight, and kr+kg+kb=1. Y is the statistical data of the first type channel, and NIRave is the statistical data of the second type channel.
Block statistics: divide the image signal into blocks and calculate, for each block, the image data means of the R, G, B and NIR channels, obtaining R_ij_ave, G_ij_ave, B_ij_ave and NIR_ij_ave (i, j are the coordinates of the block, 0 < i < M, 0 < j < N, and M, N are the numbers of partitions in the vertical and horizontal directions). For each channel, take the weighted average over all blocks to obtain the channel means Rave, Gave, Bave and NIRave. Weight Rave, Gave and Bave with certain weights to obtain Y = kr*Rave + kg*Gave + kb*Bave, where kr is the R channel weight, kg is the G channel weight, kb is the B channel weight, and kr + kg + kb = 1. Y is the statistical data of the first type of channel, and NIRave is the statistical data of the second type of channel.
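The block statistics mode for a single channel plane can be sketched as follows; uniform block weights are assumed here because the disclosure does not fix them. Applying this to the R, G, B and NIR planes yields Rave, Gave, Bave and NIRave, which are then combined as in the global mode:

    import numpy as np

    def block_statistics(plane, m_blocks, n_blocks, block_weights=None):
        # Mean of each of the M x N blocks, then a weighted average over all blocks.
        h, w = plane.shape
        means = np.empty((m_blocks, n_blocks))
        for i in range(m_blocks):
            for j in range(n_blocks):
                block = plane[i * h // m_blocks:(i + 1) * h // m_blocks,
                              j * w // n_blocks:(j + 1) * w // n_blocks]
                means[i, j] = block.mean()
        if block_weights is None:
            block_weights = np.full((m_blocks, n_blocks), 1.0 / (m_blocks * n_blocks))
        return float((means * block_weights).sum())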
Histogram statistics: first consider the bit depth of the input data of the R, G and B channels; if it is more or less than 8 bits, the input data is converted to 8 bits. Histograms are then calculated for the R, G, B and NIR channels respectively, yielding Rhist, Ghist, Bhist and NIRhist, each with 256 gray levels. The histogram of each channel is weighted-averaged over the gray levels, where w(n) is the weight of gray level n, to obtain the image data means Rave, Gave, Bave and NIRave. Weight Rave, Gave and Bave with certain weights to obtain Y = kr*Rave + kg*Gave + kb*Bave, where kr is the R channel weight, kg is the G channel weight, kb is the B channel weight, and kr + kg + kb = 1. Y is the statistical data of the first type of channel, and NIRave is the statistical data of the second type of channel.
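A sketch of the histogram statistics mode for one channel plane is given below. Because the weighted-average equation itself is not reproduced in this text, the way w(n) is combined with the bin counts is an assumed interpretation, w(n) defaults to uniform weights, and the 12-bit input depth is an example:

    import numpy as np

    def histogram_statistics(plane, weights=None, bit_depth=12):
        data = np.asarray(plane, dtype=np.float64)
        if bit_depth != 8:
            # Convert the input data to the 8-bit range before building the histogram.
            data = data * (255.0 / (2 ** bit_depth - 1))
        hist, _ = np.histogram(data, bins=256, range=(0, 256))
        levels = np.arange(256)
        w = np.ones(256) if weights is None else np.asarray(weights, dtype=np.float64)
        # Weighted average over the 256 gray levels, weighted by w(n) and the bin counts.
        return float((w * levels * hist).sum() / max((w * hist).sum(), 1e-9))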
In this embodiment of the present application, the exposure control unit 13 receives the statistics data of the various channels sent by the statistics unit 12, and calculates, according to the statistics data of the various channels, exposure parameters corresponding to the various channels, where the exposure parameters may include exposure time, analog gain, digital gain, and the like, and the exposure parameters may further include an aperture of a lens, where in this embodiment of the present application, the aperture may be considered to be fixed in size under a certain condition. And after the exposure parameters corresponding to the various channels are obtained, controlling the brightness adjustment of the image data of the corresponding channels based on the exposure parameters. Specifically, when controlling brightness adjustment of image data of a corresponding channel, the exposure control unit 13 may send exposure parameters to the image sensor, where the image sensor applies the exposure parameters to various channels, and adjusts brightness of the various channels, so that brightness of the image data of the various channels is within a preset brightness range; the exposure parameters can also be sent to the image processing unit, and the image processing unit can directly adjust the image data of the corresponding channels output by the image sensor based on the exposure parameters, so that the brightness of the image data of various channels is within a preset brightness range. Different preset luminance ranges may be set for the image data of different channels, or the same preset luminance range may be set, which is not particularly limited herein. After brightness adjustment is performed on the image data of each type of channel, the finally output image signal can be made to conform to the brightness range.
After the exposure parameters are calculated, the brightness adjustment mode is mainly implemented by sending the exposure parameters to the image sensor or the image processing unit and controlling the image sensor or the image processing unit to adjust the exposure time, the exposure gain and the like based on exposure parameter control, and the specific adjustment mode is a conventional exposure adjustment mode, which is not particularly limited herein.
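As a simple illustration of the brightness adjustment itself, applying a digital gain to one channel's image data could look like the sketch below; the 12-bit output range is an assumption:

    import numpy as np

    def apply_digital_gain(channel_data, digital_gain, max_value=4095):
        # Scale the channel's image data by the calculated digital gain and clip
        # to the assumed 12-bit output range.
        return np.clip(channel_data * digital_gain, 0, max_value)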
Based on the embodiment shown in fig. 1, the embodiment of the present application further provides an imaging system, which includes an image sensor 11, a statistics unit 12, an exposure control unit 13, and a filtering unit 14, as shown in fig. 5. The image sensor 11 comprises a plurality of types of channels for obtaining a pixel in the image signal in response to the passing light component.
A filter unit 14 for filtering out other light components except for the light components in the specified wavelength band range in the input light signal, and transmitting the filtered light signal to the image sensor 11; an image sensor 11 for converting an optical signal into an image signal, the optical signal including light components in a plurality of wavelength bands; a statistics unit 12, configured to acquire an image signal, extract image data of each channel in the image signal, respectively count the image data of each channel, obtain statistics data of each channel, and send the statistics data of each channel to an exposure control unit 13; the exposure control unit 13 is configured to receive the statistics data of the various channels sent by the statistics unit 12, calculate, for any channel, an exposure parameter corresponding to the channel according to the statistics data of the channel, and control brightness adjustment of image data of the channel based on the exposure parameter.
By applying this embodiment, the statistics unit counts the image data of each type of channel separately, and the exposure control unit calculates the exposure parameters corresponding to each channel type from that channel type's statistical data and controls the brightness adjustment of its image data based on the calculated exposure parameters. Each channel type is thus exposed independently, according to the actual energies of the light components to which the different channel types respond, so that the brightness of its image data is controlled within a suitable brightness range and the final imaging effect is improved. By adding a filtering unit at the front end of the image sensor and filtering the input optical signal, the light components within the specified band range are passed on to the image sensor while light components in other band ranges are filtered out.
The filter unit 14 passes the light components of near infrared light within a specified wavelength band and of visible light, while filtering out light components in other wavelength bands. The filter unit 14 may be a monolithic filter, that is, an optical device that filters a certain band of the light wave by means of a film coating technology; for example, as shown by the spectral transmittance curve of the filter unit in fig. 6, it passes visible light and near infrared light within the specified band and filters out the remaining near infrared light. In order for the filter unit to pass the near infrared and visible light components within the specified band, it may be configured so that, within the near infrared range of 650 nm to 1000 nm, the passed band width is smaller than the sum of the filtered band widths, and the band width with a transmittance greater than 30% is less than 100 nm. The relative response of the light components of the RGB channels of the image sensor is below 0.3 within the near infrared pass band of the filter unit 14.
Based on fig. 6, a first band width corresponding to near infrared light at a certain passing rate (for example, 20%) can be identified, and based on fig. 3, a second band width of color light at a response intensity corresponding to the near infrared light can be identified, where the first band width should be smaller than or equal to the second band width, and if the condition is not satisfied, the purpose that the first band width should be smaller than or equal to the second band width can be achieved by reselecting the filter unit of a different coating technology. By the selection, near infrared light components in the color light are fewer, and the imaging effect is improved.
Alternatively, the filter unit may comprise a switching device. Wherein the switching device is used for switching the filtering state of the filtering unit 14.
A filtering unit 14, which is configured to filter out other light components except for the light component within the specified wavelength band in the input light signal when the filtering state is on, and transmit the filtered light signal to the image sensor 11; when the filtering state is off, all light components in the light signal are transmitted to the image sensor 11.
The filter unit 14 may have a switching device for switching its filtering state. When the switching device switches the filtering state of the filter unit 14 to on, the filter unit 14 filters out all light components of the input light signal other than those within the specified wavelength band, so that only the light components within the specified band are transmitted to the image sensor 11; when the switching device switches the filtering state to off, all light components of the light signal are transmitted to the image sensor 11. In one case, when the filtering state is switched to on, the filter unit 14 may filter out all light components other than visible light and part of the near infrared light, transmitting only the visible light and that part of the near infrared light to the image sensor 11; alternatively, it may filter out all light components other than visible light, transmitting only the visible light to the image sensor 11.
Based on the embodiment shown in fig. 1, the embodiment of the present application further provides an imaging system, which includes an image sensor 11, a statistics unit 12, an exposure control unit 13, and a light supplementing unit 15, as shown in fig. 7. The image sensor 11 comprises a plurality of types of channels, each of which produces pixels of the image signal in response to the light components passing through it.
Wherein, the light supplementing unit 15 is configured to perform near infrared light supplementing on the scene, so that the input light signal includes near infrared light; an image sensor 11 for converting an optical signal into an image signal, the optical signal including light components in a plurality of wavelength bands; a statistics unit 12, configured to acquire an image signal, extract image data of each channel in the image signal, respectively count the image data of each channel, obtain statistics data of each channel, and send the statistics data of each channel to an exposure control unit 13; the exposure control unit 13 is configured to receive the statistics data of the various channels sent by the statistics unit 12, calculate, for any channel, an exposure parameter corresponding to the channel according to the statistics data of the channel, and control brightness adjustment of image data of the channel based on the exposure parameter.
By applying this embodiment of the application, the statistics unit separately counts the image data of each type of channel, and the exposure control unit calculates the exposure parameters corresponding to each type of channel from that channel type's statistical data and controls brightness adjustment of its image data based on the calculated exposure parameters. Each type of channel is thus exposed independently, according to the fact that different types of channels respond to light components of different energy, so that the brightness of each channel type's image data is kept within a suitable range and the final imaging effect is improved. In addition, the light supplementing unit increases the near infrared component of the optical signal, raising the brightness of the near infrared channel and helping to improve near infrared imaging.
In another embodiment, as shown in fig. 8, the imaging system includes an image sensor 11, a statistics unit 12, an exposure control unit 13, a filtering unit 14, and a light supplementing unit 15. The image sensor 11 comprises a plurality of types of channels, each of which produces pixels of the image signal in response to the light components passing through it.
Wherein, the light supplementing unit 15 is configured to perform near infrared light supplementing on the scene, so that the input light signal includes near infrared light; a filter unit 14 for filtering out other light components except for the light components in the specified wavelength band range in the input light signal, including the near infrared light emitted by the light supplementing unit, and transmitting the filtered light signal to the image sensor 11; an image sensor 11 for converting an optical signal into an image signal, the optical signal including light components in a plurality of wavelength bands; a statistics unit 12, configured to acquire an image signal, extract image data of each channel in the image signal, respectively count the image data of each channel, obtain statistics data of each channel, and send the statistics data of each channel to an exposure control unit 13; the exposure control unit 13 is configured to receive the statistics data of the various channels sent by the statistics unit 12, calculate, for any channel, an exposure parameter corresponding to the channel according to the statistics data of the channel, and control brightness adjustment of image data of the channel based on the exposure parameter.
The light supplementing unit 15 is used for near infrared light supplementing; visible light supplementing may of course be generated at the same time. The near infrared supplementary light generated by the light supplementing unit 15 has its energy distributed in the range of 650 nm to 1000 nm, for example concentrated in the range of 750 nm to 900 nm or in the range of 900 nm to 1000 nm. To ensure that the filter unit 14 can pass the near infrared supplementary light emitted by the light supplementing unit 15, the energy distribution range of that supplementary light is required to be not less than the near infrared pass band preset in the filter unit 14.
In the embodiment of the application, a light supplementing device operating in the near infrared band is used for the light supplementing. Specifically, an 850 nm infrared lamp may be used as the light supplementing unit 15, and infrared lamps of 750 nm, 780 nm, 860 nm or 940 nm may also be used. Taking an 850 nm infrared lamp as an example, its energy distribution curve is shown in fig. 9, with the energy mainly concentrated in the range of 830 nm to 880 nm.
Alternatively, the image sensor 11 may comprise a second type of channel responsive to light components in the near infrared band. The exposure control unit 13 may be further configured to control the light supplementing unit 15 to adjust the light supplementing intensity according to the statistics data of the second type of channels.
In addition to controlling brightness adjustment of the image data of the various channels, the exposure control unit 13 may control the light supplementing unit 15 to adjust the supplementary light intensity, and through this adjustment the image data of the second type channel can be made to conform to a predetermined brightness range.
Alternatively, the exposure control unit 13 may be specifically configured to: if the statistical data of the second type of channel is greater than the first preset threshold, controlling the light supplementing unit 15 to reduce the intensity of the emitted near infrared light; if the statistical data of the second type of channel is smaller than the second preset threshold, the light supplementing unit 15 is controlled to improve the intensity of the emitted near infrared light.
The exposure control unit 13 controls how the light supplementing unit 15 adjusts the supplementary light intensity as follows. It first determines whether the statistical data of the second type channel, which may be a luminance statistic of that channel's image data, is greater than a first preset threshold (a maximum luminance threshold). If so, the luminance of the second type channel is too high and must be reduced, so the light supplementing unit 15 is controlled to reduce the intensity of the emitted near infrared light. If the statistical data is instead smaller than a second preset threshold, the luminance of the second type channel is too low and must be raised, so the light supplementing unit 15 is controlled to increase the intensity of the emitted near infrared light.
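For illustration only, the threshold comparison described above can be sketched as follows; the function, object, and step size are assumptions introduced for this example and do not appear in the patent text.

```python
def adjust_fill_light(nir_stat, fill_light, max_luma, min_luma, step=0.1):
    """Threshold logic sketched from the text above (names and step size are assumed).

    nir_stat:   luminance statistic of the second-type (NIR) channel.
    fill_light: object exposing get_intensity()/set_intensity() on a 0..1 scale.
    max_luma:   first preset threshold (upper brightness limit).
    min_luma:   second preset threshold (lower brightness limit).
    """
    intensity = fill_light.get_intensity()
    if nir_stat > max_luma:
        # Second-type channel too bright: reduce the emitted near-infrared intensity.
        fill_light.set_intensity(max(0.0, intensity - step))
    elif nir_stat < min_luma:
        # Second-type channel too dark: increase the emitted near-infrared intensity.
        fill_light.set_intensity(min(1.0, intensity + step))
    # Otherwise the brightness already lies between the two thresholds; leave it unchanged.
```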
In one implementation of this embodiment, the exposure control unit 13 adjusts the supplementary light intensity after controlling brightness adjustment of the image data of the various channels: that is, if the imaging effect is still poor after the brightness adjustment (the image is too dark or too bright), the exposure control unit 13 may control the light supplementing unit 15 to adjust the supplementary light intensity so that the image brightness falls within the predetermined brightness range.
Optionally, the image sensor may further include: a first type of channel responsive to light components in the visible band; the exposure control unit 13 may specifically be configured to:
acquiring a first exposure time of a first type channel, first target data corresponding to the first type channel, a second exposure time of a second type channel and second target data corresponding to the second type channel; calculating a first data offset of the first type channel according to the statistical data and the first target data of the first type channel, and if the first data offset is not in a first preset range, calculating a first exposure gain according to the statistical data and the first target data of the first type channel; calculating a second data offset of the second type channel according to the statistical data and the second target data of the second type channel, and if the second data offset is not in a second preset range, calculating a second exposure gain according to the statistical data and the second target data of the second type channel; if the first exposure time is equal to the second exposure time, controlling the light supplementing unit to reduce the intensity of the emitted near infrared light when the second exposure gain is smaller than a first preset gain threshold value, and controlling the light supplementing unit to improve the intensity of the emitted near infrared light when the second exposure gain is larger than a second preset gain threshold value; if the first exposure time is not equal to the second exposure time, the second exposure time is reduced when the second exposure gain is smaller than the first preset gain threshold value, and the second exposure time is increased when the second exposure gain is larger than the second preset gain threshold value.
As shown in fig. 10, the exposure time is set to a fixed value, for example 40 ms or 10 ms; let the exposure time of the first type channel be T1 with gain Gain1, and the exposure time of the second type channel be T2 with gain Gain2. Gain1 and Gain2 are calculated as follows: first, the data offset is calculated from the channel's statistical data and target data, Ydelta = |Ycurrent - Ytarget|; second, it is judged whether the data offset is within the preset range, and if not, the third step is executed; third, an updated exposure gain is calculated from the channel's statistical data and target data, Gain = Ytarget / Ycurrent. The updated exposure gains and the set exposure times are then sent to the image sensor. If T1 and T2 are recognized to be equal, the light supplementing unit is controlled to reduce the intensity of the emitted near infrared light when Gain2 is smaller than the first preset gain threshold, and to increase that intensity when Gain2 is larger than the second preset gain threshold, thereby adjusting the supplementary light intensity. If T1 and T2 are not equal, brightness adjustment of the second type channel's image data is achieved by reducing T2 when Gain2 is smaller than the first preset gain threshold and increasing T2 when Gain2 is larger than the second preset gain threshold.
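A minimal sketch of the calculation steps and decision logic above, assuming simple multiplicative step sizes for the exposure-time change (the patent does not specify how much T2 is adjusted) and an abstract fill-light interface:

```python
def update_channel_exposure(y_current, y_target, deadband):
    """Steps 1-3 above: data offset, range check, updated exposure gain.

    Returns None when the offset is already within the preset range.
    """
    y_delta = abs(y_current - y_target)      # step 1: Ydelta = |Ycurrent - Ytarget|
    if y_delta <= deadband:                  # step 2: offset within the preset range?
        return None
    return y_target / y_current              # step 3: Gain = Ytarget / Ycurrent


def apply_second_channel_control(t1, t2, gain2, gain_low, gain_high, fill_light):
    """Decision logic above: adjust the fill light when T1 == T2, otherwise adjust T2.

    gain_low / gain_high are the first / second preset gain thresholds.
    The multiplicative step sizes and the fill_light interface are assumptions.
    """
    if t1 == t2:
        if gain2 < gain_low:
            fill_light.decrease()            # NIR channel too bright: weaken the fill light
        elif gain2 > gain_high:
            fill_light.increase()            # NIR channel too dark: strengthen the fill light
    else:
        if gain2 < gain_low:
            t2 *= 0.8                        # shorten the NIR exposure time
        elif gain2 > gain_high:
            t2 *= 1.25                       # lengthen the NIR exposure time
    return t2
```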
Based on the embodiment shown in fig. 1, the embodiment of the present application further provides an imaging system, which includes an image sensor 11, a statistics unit 12, an exposure control unit 13, and a processing unit 16, as shown in fig. 11. The image sensor 11 comprises a plurality of types of channels, each of which produces pixels of the image signal in response to the light components passing through it.
Wherein the image sensor 11 is configured to convert an optical signal into an image signal, the optical signal including light components in a plurality of wavelength bands; the processing unit 16 is configured to obtain an image signal output by the image sensor 11, current exposure parameters corresponding to various channels, and association information between the various channels, determine a correlation between every two types of channels according to the current exposure parameters corresponding to the various channels and the association information between the various channels, and remove light components of another type of channels included in one type of channels according to the correlation between every two types of channels; a statistics unit 12, configured to acquire an image signal, extract image data of each channel in the image signal, respectively count the image data of each channel, obtain statistics data of each channel, and send the statistics data of each channel to an exposure control unit 13; the exposure control unit 13 is configured to receive the statistics data of the various channels sent by the statistics unit 12, calculate, for any channel, an exposure parameter corresponding to the channel according to the statistics data of the channel, and control brightness adjustment of image data of the channel based on the exposure parameter.
By applying this embodiment of the application, the statistics unit separately counts the image data of each type of channel, and the exposure control unit calculates the exposure parameters corresponding to each type of channel from that channel type's statistical data and controls brightness adjustment of its image data based on the calculated exposure parameters. Each type of channel is thus exposed independently, according to the fact that different types of channels respond to light components of different energy, so that the brightness of each channel type's image data is kept within a suitable range and the final imaging effect is improved. In addition, the processing unit removes the light component of another type of channel contained in one type of channel by analyzing the correlation between every two types of channels; since this correlation depends on the exposure parameters, more accurate color information can be obtained.
The processing unit 16 is a logic platform containing image processing algorithms or programs. It may be a CPU (Central Processing Unit) or an NP (Network Processor); it may also be a DSP (Digital Signal Processor), an ASIC (Application Specific Integrated Circuit), an FPGA (Field-Programmable Gate Array) or another programmable logic device, a discrete gate or transistor logic device, or discrete hardware components.
After obtaining the image signal from the image sensor 11, the processing unit 16 processes it to obtain the image data of the various channels, and it can also obtain the current exposure parameters corresponding to the various channels (which may include exposure time, exposure gain, and so on). From these current exposure parameters and the association information between the various channels, it can determine the correlation between every two types of channels. Here the association information refers to the influence of a certain image attribute of one type of channel on a certain image attribute of another type of channel, for example the influence of the brightness of the NIR channel on the color of the RGB channels, and the correlation characterizes the degree and magnitude of that influence. The association information may be obtained by analysis in the processing unit after the image signal is obtained, or it may be analyzed in advance from image signals and provided to the processing unit. Because the correlation between every two types of channels states unambiguously how much the image data of the two channels affect each other, the light component of another type of channel contained in one type of channel can be removed according to that correlation, thereby recovering the original signal of that channel. Specifically, this can be realized by weighting the image data of the multiple channels with a coefficient matrix, which may be a pre-calibrated matrix.
Alternatively, the image sensor 11 may include: a first type of channel that is responsive to light components in the visible band and a second type of channel that is responsive to light components in the near infrared band.
The processing unit 16 may be specifically configured to: acquiring image signals output by an image sensor, current exposure parameters corresponding to a first type channel and a second type channel and associated information between the color of the first type channel and the brightness of the second type channel, or acquiring the image signals output by the image sensor, the current exposure parameters corresponding to the first type channel and the second type channel and associated information between the brightness of the first type channel and the brightness of the second type channel; normalizing the image data of the first type channel and the image data of the second type channel in the image signal to the same exposure parameter according to the current exposure parameters corresponding to the first type channel and the second type channel; determining the weight of the first type channel and the weight of the second type channel based on the association information; and removing the light component of the second type channel contained in the first type channel according to the weight of the first type channel, the weight of the second type channel, the normalized image data of the first type channel and the normalized image data of the second type channel.
For a scene in which the image sensor 11 includes a first type channel and a second type channel, the image data of the two channel types in the image signal can be normalized to the same exposure parameter according to their current exposure parameters, which logically yields image data of the first type channel and of the second type channel under identical exposure. The weights of the first type channel and of the second type channel are then determined from the association information between the color of the first type channel and the brightness of the second type channel; these weights may form a coefficient matrix, for example a 3×4 matrix calibrated according to the association information. Using these weights together with the normalized image data of the first and second type channels, the light component of the second type channel contained in the first type channel can be removed. The normalization may equivalently be absorbed into the coefficient matrix, which is not described further here. The association information between the color of the first type channel and the brightness of the second type channel may be derived from the pre-analyzed association information between the brightness of each color channel in the first type channel and the brightness of the second type channel, or that brightness association information may be used directly.
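As a sketch of the normalization and matrix weighting just described, assuming numpy arrays and an illustrative 3×4 matrix (a real coefficient matrix would be pre-calibrated from the association information):

```python
import numpy as np

def remove_nir_from_rgb(rgb, nir, g1, t1, g2, t2, A):
    """Normalization plus coefficient-matrix weighting as described above.

    rgb: (H, W, 3) image data of the first-type channels.
    nir: (H, W)    image data of the second-type channel.
    g1, t1 / g2, t2: current exposure gain and time of the first / second-type channels.
    A:   pre-calibrated 3x4 coefficient matrix derived from the association information.
    """
    # Normalize the NIR data to the exposure parameters of the RGB channels
    # (one possible normalization; it could equally be absorbed into A).
    nir_norm = nir * (g1 * t1) / (g2 * t2)
    stacked = np.concatenate([rgb, nir_norm[..., None]], axis=-1)   # (H, W, 4)
    corrected = stacked @ A.T                                       # per-pixel weighted combination
    return np.clip(corrected, 0.0, None)

# Illustrative matrix that subtracts an assumed NIR leakage from each color plane.
A_example = np.array([[1.0, 0.0, 0.0, -0.3],
                      [0.0, 1.0, 0.0, -0.3],
                      [0.0, 0.0, 1.0, -0.3]])
```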
Alternatively, the image sensor 11 may include: a first type of channel that is responsive to light components in the visible band and a second type of channel that is responsive to light components in the near infrared band.
The processing unit 16 may be specifically configured to: acquiring image signals output by an image sensor, current exposure parameters corresponding to a first type channel and a second type channel and associated information between the color of the first type channel and the brightness of the second type channel, or acquiring the image signals output by the image sensor, the current exposure parameters corresponding to the first type channel and the second type channel and associated information between the brightness of the first type channel and the brightness of the second type channel; determining the weight of the first type channel and the weight of the second type channel according to the current exposure parameters corresponding to the first type channel and the second type channel and the associated information; and removing the light component of the second type channel contained in the first type channel according to the weight of the first type channel, the weight of the second type channel, the image data of the first type channel and the image data of the second type channel.
For a scene in which the image sensor 11 includes a first type channel and a second type channel, the weights of the first type channel and of the second type channel may be determined from the current exposure parameters of the two channel types and from the pre-analyzed association information between the color of the first type channel and the brightness of the second type channel; these weights may form a coefficient matrix, for example a 3×4 matrix calibrated according to the association information. Using these weights together with the image data of the first and second type channels, the light component of the second type channel contained in the first type channel can be removed. Because the determined weights depend on the current exposure parameters, more accurate color information is obtained by using them.
In subsequent applications, the color component in the image signal is generally of greater concern, and therefore the above-described embodiments only give an implementation of removing the light component of the second type of channels contained in the first type of channels. In order to achieve better imaging effect, the light components of the first type channels contained in the second type channels can be removed in the above manner, and then channel fusion is performed to obtain a fused image signal.
Optionally, the processing unit 16 may be further configured to fuse the image data of the various channels from which the light components of the other channels are removed, to obtain a fused image signal.
After the light components of other channels have been removed from the various channels, the processing unit 16 can fuse the image data of the various channels, which enhances the color signal and improves the quality of the fused image data under low illumination. The fused image signal may then be sent to the statistics unit 12, which performs statistics on the image data and thereby provides the basis for exposure control by the exposure control unit 13, or it may be output directly to the user.
In this embodiment of the application, joint noise reduction may further be applied to the image data of the first type channel and the second type channel. One specific joint noise reduction approach is to use the image data of the second type channel to guide the denoising of the image data of the first type channel, reducing noise while limiting the loss of useful information.
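The patent does not specify the guiding algorithm; one possible realization of "use the second type channel to guide the first type channel" is a guided filter, sketched below with assumed parameter values.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def nir_guided_denoise(rgb, nir, radius=4, eps=1e-3):
    """One possible joint noise reduction: a guided filter in which the NIR image
    steers the smoothing of each color plane (parameter values are illustrative).

    rgb: (H, W, 3) float array of first-type (color) channel data.
    nir: (H, W) float array of second-type (NIR) channel data used as the guide.
    """
    size = 2 * radius + 1
    mean_g = uniform_filter(nir, size)
    var_g = uniform_filter(nir * nir, size) - mean_g ** 2
    out = np.empty_like(rgb)
    for c in range(rgb.shape[-1]):
        p = rgb[..., c]
        mean_p = uniform_filter(p, size)
        cov_gp = uniform_filter(nir * p, size) - mean_g * mean_p
        a = cov_gp / (var_g + eps)          # strong edges in the NIR guide are preserved
        b = mean_p - a * mean_g
        out[..., c] = uniform_filter(a, size) * nir + uniform_filter(b, size)
    return out
```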
Optionally, the processing unit includes a signal decomposition module and a post-processing module;
the signal decomposition module is used for acquiring an image signal, decomposing the image signal into a visible light signal and a near infrared light signal, and outputting a first decomposed image signal and a second decomposed image signal after decomposition, wherein the first decomposed image signal is the visible light image signal and the second decomposed image signal is the near infrared light image signal;
the post-processing module is used for acquiring a first decomposed image signal, a second decomposed image signal, a first current exposure parameter corresponding to a first type channel, a second current exposure parameter corresponding to a second type channel and associated information between the first type channel and the second type channel; determining the correlation between the first type channel and the second type channel according to the first current exposure parameter, the second current exposure parameter and the correlation information; and determining a first output image signal and/or a second output image signal according to the correlation, wherein the first output image signal is the first decomposed image signal with the near infrared light component removed. The second output image signal may be a second decomposed image signal from which the visible light component is removed, or may be a second decomposed image signal from which the visible light component is not removed.
Optionally, the signal decomposition module may specifically be configured to:
acquiring an image signal; respectively up-sampling each color component of the visible light signal and the near infrared light signal in the image signal to obtain an image signal of each color component and an image signal of near infrared light; combining the image signals of the color components to obtain a first decomposed image signal for output, and outputting the image signal of the near infrared light as a second decomposed image signal;
or,
acquiring an image signal, a first current exposure gain corresponding to a first type channel and a second current exposure gain corresponding to a second type channel; if the second current exposure gain is smaller than the first current exposure gain, performing edge judgment interpolation on the image data of the first type channel according to the image data of the second type channel in the image signal, and if the second current exposure gain is larger than the first current exposure gain, performing edge judgment interpolation on the image data of the second type channel according to the image data of the first type channel in the image signal; obtaining image signals of various color components of the interpolated visible light signal and image signals of near infrared light; the image signals of the respective color components are combined to obtain a first decomposed image signal, which is output, and the image signal of the near infrared light is output as a second decomposed image signal.
Optionally, the post-processing module may include: the device comprises a first processing sub-module, a second processing sub-module, a color recovery sub-module and a third processing sub-module;
the first processing sub-module is used for acquiring a first decomposed image signal, preprocessing the first decomposed image signal and obtaining a first sub-processed image signal;
the second processing sub-module is used for acquiring a second decomposed image signal, preprocessing the second decomposed image signal to obtain a second sub-processed image signal, wherein the preprocessing comprises at least one processing mode of dead pixel correction, black level correction, digital gain and noise reduction;
the color recovery sub-module is used for acquiring a first current exposure parameter corresponding to the first type channel, a second current exposure parameter corresponding to the second type channel and association information between the first type channel and the second type channel; normalizing the first sub-processing image signal and the second sub-processing image signal to the same exposure parameter according to the first current exposure parameter and the second current exposure parameter; determining the weight of the first type channel and the weight of the second type channel based on the association information; obtaining a first recovery image signal according to the weight of the first type channel, the weight of the second type channel, the normalized first sub-processing image signal and the normalized second sub-processing image signal;
And the third processing sub-module is used for processing the first recovered image signal to obtain a first output image signal.
Optionally, the post-processing module may include: the device comprises a first processing sub-module, a second processing sub-module, a color recovery sub-module and a fourth processing sub-module;
the first processing sub-module is used for acquiring a first decomposed image signal, preprocessing the first decomposed image signal and obtaining a first sub-processed image signal;
the second processing sub-module is used for acquiring a second decomposed image signal, preprocessing the second decomposed image signal to obtain a second sub-processed image signal, wherein the preprocessing comprises at least one processing mode of dead pixel correction, black level correction, digital gain and noise reduction;
the color recovery sub-module is used for acquiring a first current exposure parameter corresponding to the first type channel, a second current exposure parameter corresponding to the second type channel and association information between the first type channel and the second type channel; normalizing the first sub-processing image signal and the second sub-processing image signal to the same exposure parameter according to the first current exposure parameter and the second current exposure parameter; determining the weight of the first type channel and the weight of the second type channel based on the association information; obtaining a second recovery image signal according to the weights of the first type channel and the second type channel and the normalized first sub-processing image signal and the normalized second sub-processing image signal;
And the fourth processing submodule is used for processing the second restored image signal to obtain a second output image signal.
Optionally, the post-processing module may include: the color restoration device comprises a first processing sub-module, a second processing sub-module, a color restoration sub-module, a third processing sub-module, a fourth processing sub-module and a fifth processing sub-module;
the first processing sub-module is used for acquiring a first decomposed image signal, preprocessing the first decomposed image signal and obtaining a first sub-processed image signal;
the second processing sub-module is used for acquiring a second decomposed image signal, preprocessing the second decomposed image signal to obtain a second sub-processed image signal, wherein the preprocessing comprises at least one processing mode of dead pixel correction, black level correction, digital gain and noise reduction;
the color recovery sub-module is used for acquiring a first current exposure parameter corresponding to the first type channel, a second current exposure parameter corresponding to the second type channel and association information between the first type channel and the second type channel; normalizing the first sub-processing image signal and the second sub-processing image signal to the same exposure parameter according to the first current exposure parameter and the second current exposure parameter; determining the weight of the first type channel and the weight of the second type channel based on the association information; obtaining a first recovery image signal and a second recovery image signal according to the weights of the first type channel and the second type channel and the normalized first sub-processing image signal and the normalized second sub-processing image signal;
The third processing sub-module is used for processing the first recovery image signal to obtain a third sub-processing image signal;
a fourth processing sub-module, configured to process the second restored image signal to obtain a fourth sub-processed image signal;
and the fifth processing sub-module is used for processing the third sub-processing image signal and the fourth sub-processing image signal to obtain a first output image signal.
Optionally, the post-processing module may include: the color restoration device comprises a first processing sub-module, a second processing sub-module, a color restoration sub-module, a third processing sub-module, a fourth processing sub-module and a fifth processing sub-module;
the first processing sub-module is used for acquiring a first decomposed image signal, preprocessing the first decomposed image signal and obtaining a first sub-processed image signal;
the second processing sub-module is used for acquiring a second decomposed image signal, preprocessing the second decomposed image signal to obtain a second sub-processed image signal, wherein the preprocessing comprises at least one processing mode of dead pixel correction, black level correction, digital gain and noise reduction;
the color recovery sub-module is used for acquiring a first current exposure parameter corresponding to the first type channel, a second current exposure parameter corresponding to the second type channel and association information between the first type channel and the second type channel; normalizing the first sub-processing image signal and the second sub-processing image signal to the same exposure parameter according to the first current exposure parameter and the second current exposure parameter; determining the weight of the first type channel and the weight of the second type channel based on the association information; obtaining a first recovery image signal and a second recovery image signal according to the weights of the first type channel and the second type channel and the normalized first sub-processing image signal and the normalized second sub-processing image signal;
The third processing sub-module is used for processing the first recovery image signal to obtain a third sub-processing image signal;
a fourth processing sub-module, configured to process the second restored image signal to obtain a fourth sub-processed image signal;
and the fifth processing sub-module is used for processing the third sub-processing image signal and the fourth sub-processing image signal to obtain a first output image signal and a second output image signal.
Optionally, the processing unit may further include a preprocessing module;
the preprocessing module is used for acquiring the image signals output by the image sensor, preprocessing the image signals and sending the preprocessed image signals to the signal decomposition module.
In summary, the processing unit 16 may be implemented in a variety of ways, as described below.
First embodiment:
as shown in fig. 12, the processing unit 16 includes a signal decomposition module and a post-processing module. The signal decomposition module logically decomposes the visible light signal and the near infrared light signal of the input image signal and decomposes the image signal into a first decomposed image signal and a second decomposed image signal; the post-processing module processes the first decomposed image signal and the second decomposed image signal and outputs a first output image signal.
The image sensor transmits one path of image signals to the processing unit and simultaneously comprises visible light signals and near infrared light signals, so that the signal decomposition module logically decomposes the two image signals and outputs decomposed first decomposed image signals and second decomposed image signals.
For the signal decomposition module, one processing method may be to up-sample the R, G, B signal of visible light and the NIR signal of near infrared (which may be bilinear up-sampling or other up-sampling methods) respectively, to obtain R, G, B, NIR image signals, where the four image signals are full resolution image signals, and then combine R, G, B image signals into a visible light image signal to output as a first decomposed image signal and output a near infrared NIR image signal as a second decomposed image signal.
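A sketch of this up-sampling decomposition, assuming a 2x2 RGBNIR mosaic layout (the specific pixel arrangement is an assumption for the example) and bilinear up-sampling via OpenCV:

```python
import numpy as np
import cv2

def decompose_rgbnir(mosaic, layout):
    """Up-sample each sub-plane of an RGBNIR mosaic to full resolution.

    mosaic: (H, W) raw image signal from the sensor.
    layout: channel name -> (row, col) offset in a 2x2 repeating pattern
            (the specific arrangement is an assumption for this example).
    """
    h, w = mosaic.shape
    planes = {}
    for name, (r, c) in layout.items():
        sub = mosaic[r::2, c::2].astype(np.float32)             # quarter-resolution sub-plane
        planes[name] = cv2.resize(sub, (w, h), interpolation=cv2.INTER_LINEAR)
    first = np.dstack([planes["R"], planes["G"], planes["B"]])  # first decomposed (visible) image signal
    second = planes["NIR"]                                      # second decomposed (NIR) image signal
    return first, second

# Example layout assumption: R at (0, 0), G at (0, 1), B at (1, 0), NIR at (1, 1).
example_layout = {"R": (0, 0), "G": (0, 1), "B": (1, 0), "NIR": (1, 1)}
```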
For the signal decomposition module, another processing mode is to perform an interpolation operation on the RGBNIR image signal, where the interpolation may be based on edge judgment. During interpolation, a channel with better imaging quality can be used as a guide for edge-decision interpolation of a channel with poorer imaging quality. Imaging quality can be judged by the gain: for example, when the exposure gain of the NIR channel is less than that of the R, G, B channels, the NIR channel guides the edge-decision interpolation of the R, G, B channels; when the exposure gain of the R, G, B channels is less than that of the NIR channel, the R, G, B channels guide the edge-decision interpolation of the NIR channel. R, G, B and NIR image signals are obtained by the interpolation; the R, G, B image signals are combined into a visible light image signal and output as the first decomposed image signal, and the near infrared NIR image signal is output as the second decomposed image signal.
The post-processing module is used for carrying out joint processing on the first decomposed image signal and the second decomposed image signal to obtain a first output image signal. The post-processing module may be implemented in a variety of ways.
As shown in fig. 13, the first processing sub-module may perform one or more of dead pixel correction, black level correction, digital gain, and noise reduction on the first decomposed image signal to obtain a first sub-processed image signal; the second processing sub-module may perform one or more of dead pixel correction, black level correction, digital gain, and noise reduction on the second decomposed image signal to obtain a second sub-processed image signal. One way to normalize the first sub-processed image signal and the second sub-processed image signal to the same gain and exposure time is to adjust the second sub-processed image signal according to the gain g1 and exposure time t1 of the RGB channels and the gain g2 and exposure time t2 of the NIR channel as follows:
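The adjustment expression itself is not reproduced in this text; a form consistent with scaling the second sub-processed image signal to the gain and exposure time of the RGB channels would be NIR' = NIR × (g1 × t1) / (g2 × t2), where NIR' denotes the adjusted second sub-processed image signal.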
Joint processing is then carried out on the first sub-processed image signal (the RGB image signal) and the adjusted second sub-processed image signal (the NIR' image signal) using a pre-calibrated coefficient matrix A, to obtain a first recovered image signal with restored color. The processing is of the following form:
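This expression is likewise not reproduced here; with a pre-calibrated 3×4 coefficient matrix A acting on the three color components and the adjusted NIR component, a consistent form would be [R', G', B']^T = A · [R, G, B, NIR']^T, the resulting R', G', B' forming the first recovered image signal.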
Furthermore, it is not excluded to use other means for normalizing the first sub-processed image signal and the second sub-processed image signal to the same exposure time and gain, such as scaling the first sub-processed image signal, or scaling the coefficient matrix a, etc.
The third processing sub-module performs further processing on the first recovered image signal including, but not limited to, digital gain, white balance, color correction, curve mapping, noise reduction, enhancement, etc., to finally obtain a colored first output image signal.
A second implementation of the post-processing module is shown in fig. 14. The first processing sub-module and the second processing sub-module may be the same as those in the embodiment shown in fig. 13 and are not described again here. In one processing method of the color recovery sub-module, the second sub-processed image signal is output directly as the second recovered image signal; alternatively, the first and second sub-processed image signals are weighted and the result is output as the second recovered image signal. The fourth processing sub-module performs further processing on the second recovered image signal, including but not limited to digital gain, white balance, color correction, curve mapping, noise reduction, and enhancement, finally obtaining a black-and-white second output image signal.
A third implementation of the post-processing module is shown in fig. 15. The first and second processing sub-modules may be implemented in the same way as in the embodiment shown in fig. 13. The color recovery sub-module may output the first recovered image signal in the same way as the color recovery sub-module in the embodiment shown in fig. 13, and output the second recovered image signal in the same way as the color recovery sub-module in the embodiment shown in fig. 14. The third processing sub-module may obtain the third sub-processed image signal in the same way as the third processing sub-module in the embodiment shown in fig. 13, and the fourth processing sub-module may obtain the fourth sub-processed image signal in the same way as the fourth processing sub-module in the embodiment shown in fig. 14. The fifth processing sub-module processes the third and fourth sub-processed image signals to obtain the first output image signal, in a manner that includes but is not limited to noise reduction, fusion, and enhancement.
Second embodiment:
as shown in fig. 16, the processing unit 16 includes a signal decomposition module and a post-processing module. The signal decomposition module performs logic decomposition on an input image signal and decomposes the image signal into a first decomposed image signal and a second decomposed image signal; the post-processing module processes the first and second decomposed image signals and outputs first and second output image signals.
The signal decomposition module may adopt the same implementation manner as the signal decomposition module in the first embodiment, and will not be described herein.
An implementation of the post-processing module is shown in fig. 17. The first processing sub-module, the second processing sub-module, the color recovery sub-module, the third processing sub-module, and the fourth processing sub-module may adopt the same implementation as the corresponding sub-modules in the embodiment shown in fig. 15. The fifth processing sub-module processes the third and fourth sub-processed image signals in a manner that includes but is not limited to noise reduction, fusion, and enhancement, obtaining a first output image signal in color and a second output image signal in black and white.
Third embodiment:
as shown in fig. 18, the processing unit 16 includes a preprocessing module, a signal decomposition module, and a post-processing module. The preprocessing module is used for preprocessing an input image signal and outputting a preprocessed image signal; the signal decomposition module carries out logic decomposition on the preprocessed image signal and decomposes the preprocessed image signal into a first decomposed image signal and a second decomposed image signal; the post-processing module processes the first decomposed image signal and the second decomposed image signal and outputs a first output image signal.
The preprocessing module performs preprocessing on the input image signal to obtain a preprocessed image, wherein the preprocessing comprises, but is not limited to, black level correction, dead pixel correction, digital gain, noise reduction and the like.
The signal decomposition module may adopt the same implementation manner as the signal decomposition module in the first embodiment, and will not be described herein.
The post-processing module may adopt the same implementation manner as the post-processing module in the first embodiment, and will not be described herein.
Fourth embodiment:
as shown in fig. 19, the processing unit 16 includes a preprocessing module, a signal decomposition module, and a post-processing module. The preprocessing module is used for preprocessing an input image signal and outputting a preprocessed image signal; the signal decomposition module carries out logic decomposition on the preprocessed image signal and decomposes the preprocessed image signal into a first decomposed image signal and a second decomposed image signal; the post-processing module processes the first and second decomposed image signals and outputs first and second output image signals.
The preprocessing module may adopt the same implementation manner as the preprocessing module in the third embodiment, and will not be described herein.
The signal decomposition module may adopt the same implementation manner as the signal decomposition module in the first embodiment, and will not be described herein.
The post-processing module may be implemented in the same manner as the post-processing module in the second embodiment, which is not described herein.
The embodiment of the application provides an image processing method, which is applied to an imaging system, and comprises the following steps: an image sensor, a statistics unit and an exposure control unit; the image sensor includes multiple classes of channels; as shown in fig. 20, the method includes:
s201, the image sensor converts an optical signal into an image signal, wherein the optical signal includes optical components in various wavelength band ranges.
S202, the statistical unit acquires image signals, extracts image data of various channels in the image signals, respectively performs statistics on the image data of the various channels to obtain statistical data of the various channels, and sends the statistical data of the various channels to the exposure control unit.
S203, the exposure control unit receives the statistic data of various channels sent by the statistic unit, calculates exposure parameters corresponding to the channels according to the statistic data of the channels, and controls brightness adjustment of the image data of the channels based on the exposure parameters.
Optionally, the image sensor includes: a first type of channel responsive to light components in the visible light band and a second type of channel responsive to light components in the near infrared light band, wherein the first type of channel comprises a plurality of color channels and the second type of channel comprises a near infrared channel;
in S202, the step of the statistical unit performing statistics on the image data of each channel to obtain statistical data of each channel may be implemented specifically by the following steps:
calculating image data statistics of a first type of channel as statistics of the first type of channel according to image data of at least one color channel in the plurality of color channels;
and calculating the image data statistic value of the second type of channels as the statistic data of the second type of channels according to the image data of the near infrared channels.
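A minimal sketch of these two statistics steps, assuming the statistic is a mean value over the channel's image data (the patent leaves the exact statistic open and only requires that at least one color channel contributes):

```python
import numpy as np

def channel_statistics(r, g, b, nir):
    """One statistic per channel type; using the mean here is an assumption."""
    first_type_stat = float(np.mean([r.mean(), g.mean(), b.mean()]))  # first-type (color) channels
    second_type_stat = float(nir.mean())                              # second-type (NIR) channel
    return first_type_stat, second_type_stat
```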
Optionally, the system may further include: a light supplementing unit; an image sensor comprising a second type of channel responsive to light components in the near infrared band;
the method may further comprise the steps of:
if the statistical data of the second type of channels is larger than a first preset threshold value, controlling the light supplementing unit to reduce the intensity of the emitted near infrared light;
and if the statistical data of the second type of channels is smaller than a second preset threshold value, controlling the light supplementing unit to improve the intensity of the emitted near infrared light.
Optionally, the system may further include: a processing unit;
after performing S201, the method may further include the steps of:
the processing unit acquires image signals output by the image sensor, current exposure parameters corresponding to various channels and association information among the various channels, determines the correlation between every two types of channels according to the current exposure parameters corresponding to the various channels and the association information among the various channels, and removes light components of the other type of channels contained in one type of channels according to the correlation between every two types of channels.
Alternatively, the image sensor may include: a first type of channel responsive to light components in the visible band and a second type of channel responsive to light components in the near infrared band;
the processing unit obtains image signals output by the image sensor, current exposure parameters corresponding to various channels and association information among the various channels, determines the correlation between every two types of channels according to the current exposure parameters corresponding to the various channels and the association information among the various channels, and removes light components of the other type of channels contained in one type of channels according to the correlation between every two types of channels, wherein the step of removing the light components of the other type of channels can be realized by the following steps:
Acquiring an image signal output by an image sensor, current exposure parameters corresponding to a first type channel and a second type channel, and correlation information between the color of the first type channel and the brightness of the second type channel;
normalizing the image data of the first type channel and the image data of the second type channel in the image signal to the same exposure parameter according to the current exposure parameters corresponding to the first type channel and the second type channel;
determining the weight of the first type channel and the weight of the second type channel based on the association information;
and removing the light component of the second type channel contained in the first type channel according to the weight of the first type channel, the weight of the second type channel, the normalized image data of the first type channel and the normalized image data of the second type channel.
Alternatively, the image sensor may include: a first type of channel responsive to light components in the visible band and a second type of channel responsive to light components in the near infrared band;
the processing unit obtains image signals output by the image sensor, current exposure parameters corresponding to various channels and association information among the various channels, determines the correlation between every two types of channels according to the current exposure parameters corresponding to the various channels and the association information among the various channels, and removes light components of the other type of channels contained in one type of channels according to the correlation between every two types of channels, wherein the step of removing the light components of the other type of channels can be realized by the following steps:
Acquiring an image signal output by an image sensor, current exposure parameters corresponding to a first type channel and a second type channel, and correlation information between the color of the first type channel and the brightness of the second type channel;
determining the weight of the first type channel and the weight of the second type channel according to the current exposure parameters corresponding to the first type channel and the second type channel and the associated information;
and removing the light component of the second type channel contained in the first type channel according to the weight of the first type channel, the weight of the second type channel, the image data of the first type channel and the image data of the second type channel.
Optionally, after the step of obtaining the image signal output by the image sensor, the current exposure parameters corresponding to the various channels and the association information between the various channels by the processing unit, determining the correlation between every two types of channels according to the current exposure parameters corresponding to the various channels and the association information between the various channels, and removing the light component of another type of channel included in one type of channel according to the correlation between every two types of channels, the method may further include the steps of:
the processing unit fuses the image data of various channels from which the light components of other channels are removed, and a fused image signal is obtained.
By applying this embodiment of the application, the statistics unit separately counts the image data of each type of channel, and the exposure control unit calculates the exposure parameters corresponding to each type of channel from that channel type's statistical data and controls brightness adjustment of its image data based on the calculated exposure parameters. Each type of channel is thus exposed independently, according to the fact that different types of channels respond to light components of different energy, so that the brightness of each channel type's image data is kept within a suitable range and the final imaging effect is improved.
It is noted that relational terms such as first and second, and the like are used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Moreover, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
In this specification, each embodiment is described in a related manner, and identical and similar parts of each embodiment are all referred to each other, and each embodiment mainly describes differences from other embodiments. In particular, for the image processing method embodiment, since it is substantially similar to the imaging system embodiment, the description is relatively simple, and the relevant points are referred to in the description of the imaging system embodiment.
The foregoing description is only of the preferred embodiments of the present application and is not intended to limit the scope of the present application. Any modifications, equivalent substitutions, improvements, etc. that are within the spirit and principles of the present application are intended to be included within the scope of the present application.

Claims (17)

1. An imaging system, the system comprising: an image sensor, a statistics unit and an exposure control unit; the image sensor includes a plurality of types of channels;
the image sensor is used for converting an optical signal into an image signal, wherein the optical signal comprises light components in various wave band ranges;
the statistics unit is used for acquiring the image signal; extracting image data of the various channels from the image signal; performing statistics on the image data of the various channels respectively to obtain statistical data of the various channels; and sending the statistical data of the various channels to the exposure control unit;
the exposure control unit is used for receiving the statistical data of the various channels sent by the statistics unit; and, for any type of channel, calculating the exposure parameters corresponding to that channel according to its statistical data and controlling the brightness adjustment of the image data of that channel based on the exposure parameters;
the system further comprises: a processing unit;
the processing unit is used for acquiring the image signal output by the image sensor, the current exposure parameters corresponding to the various channels and the association information among the various channels; determining the correlation between every two types of channels according to the current exposure parameters corresponding to the various channels and the association information between the various channels; and removing the light component of another type of channel contained in one type of channel according to the correlation between every two types of channels, wherein the association information is the influence of an image attribute of one channel on an image attribute of the other channel, and the correlation is used to represent the degree and magnitude of that association between the image attributes;
the image sensor includes: a first type of channel responsive to light components in the visible band and a second type of channel responsive to light components in the near infrared band;
The exposure control unit is specifically configured to:
acquiring a first exposure time of the first type channel, first target data corresponding to the first type channel, a second exposure time of the second type channel and second target data corresponding to the second type channel;
calculating a first data offset of the first type channel according to the statistical data of the first type channel and the first target data, and, if the first data offset is not within a first preset range, calculating a first exposure gain according to the statistical data of the first type channel and the first target data;
calculating a second data offset of the second type channel according to the statistical data of the second type channel and the second target data, and, if the second data offset is not within a second preset range, calculating a second exposure gain according to the statistical data of the second type channel and the second target data;
if the first exposure time is not equal to the second exposure time, the second exposure time is reduced when the second exposure gain is smaller than a first preset gain threshold value, and the second exposure time is increased when the second exposure gain is larger than a second preset gain threshold value.
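For illustration only, not part of the claim: the per-channel exposure control just described might be sketched as below, assuming a scalar brightness statistic, a gain modelled as target/measured and a fixed exposure-time step, none of which the claim fixes.

```python
def control_channel_exposure(stat, target, exposure_time, preset_range,
                             gain_low, gain_high, time_step=0.5):
    """Per-channel exposure control sketched from claim 1.

    stat          : brightness statistic of the channel (from the statistics unit)
    target        : target data (desired brightness) for the channel
    preset_range  : (lo, hi) data-offset range in which nothing is adjusted
    gain_low/high : preset gain thresholds that trigger exposure-time changes
    """
    offset = stat - target
    if preset_range[0] <= offset <= preset_range[1]:
        return exposure_time, None          # offset within range: keep settings

    gain = target / max(stat, 1e-6)         # assumed gain model: target / measured
    if gain < gain_low:                     # channel too bright -> shorten exposure
        exposure_time = max(exposure_time - time_step, 0.0)
    elif gain > gain_high:                  # channel too dark -> lengthen exposure
        exposure_time += time_step
    return exposure_time, gain
```

In the claim this branch applies to the second type channel only when the two exposure times differ; claim 6 below adjusts the fill-light intensity instead when they are equal.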
2. The system of claim 1, wherein the first type of channel comprises a plurality of color channels and the second type of channel comprises a near infrared channel;
the statistics unit is specifically configured to:
calculating the image data statistic of the first type of channel as the statistic of the first type of channel according to the image data of at least one color channel in the plurality of color channels;
and calculating the image data statistic value of the second type of channels as the statistic data of the second type of channels according to the image data of the near infrared channels.
3. The system according to claim 2, characterized in that said statistics unit is specifically configured to:
extracting image data of each color channel and image data of the near infrared channel from the image signal; calculating the image data average value of each color channel and the image data average value of the near infrared channel according to the image data of each color channel and the image data of the near infrared channel respectively; carrying out weighted summation on the image data average value of each color channel; taking the result of the weighted summation as the statistical data of the first type channel, and taking the image data average value of the near infrared channel as the statistical data of the second type channel;
Or,
partitioning the image signal to obtain a plurality of image signal blocks; extracting image data of each color channel and the image data of the near infrared channel from any image signal block; calculating the image data average value of each color channel and the image data average value of the near infrared channel according to the image data of each color channel and the image data of the near infrared channel in each image signal block; carrying out weighted summation on the image data average value of each color channel; taking the result of the weighted summation as the statistical data of the first type channel, and taking the image data average value of the near infrared channel as the statistical data of the second type channel;
or,
extracting image data of each color channel and image data of the near infrared channel from the image signal; obtaining a histogram of each color channel and a histogram of the near infrared channel according to the image data of each color channel and the image data of the near infrared channel respectively; respectively carrying out weighted average calculation on the gray scale numbers in the histogram of each color channel and the histogram of the near infrared channel to obtain an image data average value of each color channel and an image data average value of the near infrared channel; carrying out weighted summation on the image data average value of each color channel; and taking the result of the weighted summation as the statistical data of the first type channel, and taking the image data average value of the near infrared channel as the statistical data of the second type channel.
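For illustration only: under the assumption that the image data statistic is a per-plane mean (either a global mean or a histogram-weighted mean over grey levels) and that the weighted summation uses fixed per-color weights, the statistics unit of claim 3 might be sketched as follows.

```python
import numpy as np

def channel_statistics(color_planes, nir_plane, color_weights, mode="mean"):
    """Sketch of the statistics unit in claim 3.

    color_planes  : dict {name: 2-D array} for the color channels
    nir_plane     : 2-D array for the near-infrared channel
    color_weights : dict of per-channel weights used in the weighted summation
    mode          : "mean" averages whole planes; "hist" averages grey levels
                    weighted by their histogram counts (both concrete formulas
                    are assumptions the claim leaves open).
    """
    def plane_mean(plane):
        if mode == "hist":
            hist, edges = np.histogram(plane, bins=256, range=(0, 256))
            levels = edges[:-1]
            return float(np.average(levels, weights=np.maximum(hist, 1e-9)))
        return float(plane.mean())

    color_means = {name: plane_mean(p) for name, p in color_planes.items()}
    first_type_stat = sum(color_weights[name] * m for name, m in color_means.items())
    second_type_stat = plane_mean(nir_plane)
    return first_type_stat, second_type_stat
```

The block-based variant of the claim would apply the same computation per image signal block before combining the block results.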
4. The system of claim 1, wherein the system further comprises: a light supplementing unit;
the light supplementing unit is used for carrying out near infrared light supplementing on a scene so that an input light signal comprises near infrared light;
the exposure control unit is further configured to control the light supplementing unit to adjust the light supplementing intensity according to the statistical data of the second type channel.
5. The system according to claim 4, wherein the exposure control unit is specifically configured to:
if the statistical data of the second type channel is larger than a first preset threshold, controlling the light supplementing unit to reduce the intensity of the emitted near infrared light;
and if the statistical data of the second type channel is smaller than a second preset threshold, controlling the light supplementing unit to increase the intensity of the emitted near infrared light.
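For illustration only, a minimal sketch of this fill-light rule; the step size and the normalised [0, 1] intensity range are assumptions.

```python
def adjust_fill_light(nir_stat, thr_high, thr_low, intensity, step=0.1):
    """Lower the near-infrared emission when the NIR statistic is too high,
    raise it when too low (claim 5); intensity is kept in [0, 1]."""
    if nir_stat > thr_high:
        intensity = max(intensity - step, 0.0)
    elif nir_stat < thr_low:
        intensity = min(intensity + step, 1.0)
    return intensity
```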
6. The system according to claim 4, wherein the exposure control unit is specifically configured to:
and if the first exposure time is equal to the second exposure time, controlling the light supplementing unit to reduce the intensity of the emitted near infrared light when the second exposure gain is smaller than the first preset gain threshold, and controlling the light supplementing unit to increase the intensity of the emitted near infrared light when the second exposure gain is larger than the second preset gain threshold.
7. The system according to claim 1, characterized in that the processing unit is specifically configured to:
acquiring an image signal output by the image sensor, current exposure parameters corresponding to the first type channel and the second type channel, and associated information between the color of the first type channel and the brightness of the second type channel; or, acquiring an image signal output by the image sensor, current exposure parameters corresponding to the first type channel and the second type channel, and associated information between the brightness of the first type channel and the brightness of the second type channel;
normalizing the image data of the first type channel and the image data of the second type channel in the image signal to the same exposure parameter according to the current exposure parameters corresponding to the first type channel and the second type channel;
determining the weight of the first type channel and the weight of the second type channel based on the association information;
and removing light components of the second type channels contained in the first type channels according to the weights of the first type channels and the weights of the second type channels, and the normalized image data of the first type channels and the normalized image data of the second type channels.
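For illustration only: the normalisation step of this claim can be sketched as a simple rescaling before the weighted removal shown earlier, assuming each exposure parameter collapses to a scalar exposure-time x gain product (an assumption, since the claim does not fix the form of the parameters).

```python
def normalize_to_same_exposure(first_data, second_data, exp_first, exp_second):
    """Scale the second-type (NIR) channel onto the first-type channel's exposure
    so the two planes are directly comparable before the weighted removal.
    exp_first / exp_second are assumed scalar exposure-time x gain products."""
    scale = exp_first / exp_second
    return first_data.astype(float), second_data.astype(float) * scale
```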
8. The system according to claim 1, characterized in that the processing unit is specifically configured to:
acquiring an image signal output by the image sensor, current exposure parameters corresponding to the first type channel and the second type channel, and associated information between the color of the first type channel and the brightness of the second type channel; or, acquiring an image signal output by the image sensor, current exposure parameters corresponding to the first type channel and the second type channel, and associated information between the brightness of the first type channel and the brightness of the second type channel;
determining the weight of the first type channel and the weight of the second type channel according to the current exposure parameters corresponding to the first type channel and the second type channel and the association information;
and removing light components of the second type channels contained in the first type channels according to the weights of the first type channels, the weights of the second type channels, the image data of the first type channels and the image data of the second type channels.
9. The system of claim 1, wherein the processing unit comprises a signal decomposition module and a post-processing module;
The signal decomposition module is used for acquiring an image signal, decomposing the image signal into a visible light signal and a near infrared light signal, and outputting a first decomposed image signal and a second decomposed image signal after decomposition, wherein the first decomposed image signal is the visible light image signal, and the second decomposed image signal is the near infrared light image signal;
the post-processing module is configured to obtain the first decomposed image signal, the second decomposed image signal, a first current exposure parameter corresponding to the first type channel, a second current exposure parameter corresponding to the second type channel, and association information between the first type channel and the second type channel; determining the correlation between the first type channel and the second type channel according to the first current exposure parameter, the second current exposure parameter and the correlation information; and determining a first output image signal and/or a second output image signal according to the correlation, wherein the first output image signal is the first decomposed image signal with the near infrared light component removed.
10. The system according to claim 9, wherein the signal decomposition module is specifically configured to:
Acquiring an image signal; respectively up-sampling each color component of the visible light signal and the near infrared light signal in the image signal to obtain an image signal of each color component and an image signal of near infrared light; combining the image signals of the color components to obtain a first decomposed image signal for output, and outputting the image signal of the near infrared light as a second decomposed image signal;
or,
acquiring an image signal, a first current exposure gain corresponding to the first type channel and a second current exposure gain corresponding to the second type channel; if the second current exposure gain is smaller than the first current exposure gain, performing edge judgment interpolation on the image data of the first type channel according to the image data of the second type channel in the image signal, and if the second current exposure gain is larger than the first current exposure gain, performing edge judgment interpolation on the image data of the second type channel according to the image data of the first type channel in the image signal; obtaining image signals of various color components of the interpolated visible light signal and image signals of near infrared light; and combining the image signals of the color components to obtain a first decomposed image signal for output, and outputting the image signal of the near infrared light as a second decomposed image signal.
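For illustration only, a sketch of the first branch of this claim, assuming a 2x2 R-G-B-IR mosaic, an even-sized frame and nearest-neighbour up-sampling in place of the edge-judged interpolation of the second branch; real sensors may use other patterns.

```python
import numpy as np

def decompose_rgbir(mosaic):
    """Split an RGB-IR mosaic into a visible-light image and a NIR image."""
    planes = {
        "R":  mosaic[0::2, 0::2],
        "G":  mosaic[0::2, 1::2],
        "B":  mosaic[1::2, 0::2],
        "IR": mosaic[1::2, 1::2],
    }
    # Up-sample each sub-sampled plane back to full resolution (nearest neighbour).
    up = {k: np.repeat(np.repeat(p, 2, axis=0), 2, axis=1) for k, p in planes.items()}
    first_decomposed = np.stack([up["R"], up["G"], up["B"]], axis=-1)  # visible-light image signal
    second_decomposed = up["IR"]                                       # near-infrared image signal
    return first_decomposed, second_decomposed
```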
11. The system of claim 9, wherein the post-processing module comprises: the device comprises a first processing sub-module, a second processing sub-module, a color recovery sub-module and a third processing sub-module;
the first processing sub-module is used for acquiring the first decomposed image signal, preprocessing the first decomposed image signal and obtaining a first sub-processed image signal;
the second processing sub-module is configured to obtain the second decomposed image signal, and perform preprocessing on the second decomposed image signal to obtain a second sub-processed image signal, where the preprocessing includes at least one processing mode of dead pixel correction, black level correction, digital gain, and noise reduction;
the color recovery sub-module is used for acquiring a first current exposure parameter corresponding to the first type channel, a second current exposure parameter corresponding to the second type channel and associated information between the first type channel and the second type channel; normalizing the first sub-processed image signal and the second sub-processed image signal to the same exposure parameter according to the first current exposure parameter and the second current exposure parameter; determining the weight of the first type channel and the weight of the second type channel based on the association information; obtaining a first recovery image signal according to the weights of the first type channels and the second type channels and the normalized first sub-processing image signal and the normalized second sub-processing image signal;
And the third processing sub-module is used for processing the first recovery image signal to obtain a first output image signal.
12. The system of claim 9, wherein the post-processing module comprises: the device comprises a first processing sub-module, a second processing sub-module, a color recovery sub-module and a fourth processing sub-module;
the first processing sub-module is used for acquiring the first decomposed image signal, preprocessing the first decomposed image signal and obtaining a first sub-processed image signal;
the second processing sub-module is configured to obtain the second decomposed image signal, and perform preprocessing on the second decomposed image signal to obtain a second sub-processed image signal, where the preprocessing includes at least one processing mode of dead pixel correction, black level correction, digital gain, and noise reduction;
the color recovery sub-module is used for acquiring a first current exposure parameter corresponding to the first type channel, a second current exposure parameter corresponding to the second type channel and associated information between the first type channel and the second type channel; normalizing the first sub-processed image signal and the second sub-processed image signal to the same exposure parameter according to the first current exposure parameter and the second current exposure parameter; determining the weight of the first type channel and the weight of the second type channel based on the association information; obtaining a second recovery image signal according to the weights of the first type channels and the second type channels and the normalized first sub-processing image signal and the normalized second sub-processing image signal;
And the fourth processing submodule is used for processing the second restored image signal to obtain a second output image signal.
13. The system of claim 9, wherein the post-processing module comprises: the color restoration device comprises a first processing sub-module, a second processing sub-module, a color restoration sub-module, a third processing sub-module, a fourth processing sub-module and a fifth processing sub-module;
the first processing sub-module is used for acquiring the first decomposed image signal, preprocessing the first decomposed image signal and obtaining a first sub-processed image signal;
the second processing sub-module is configured to obtain the second decomposed image signal, and perform preprocessing on the second decomposed image signal to obtain a second sub-processed image signal, where the preprocessing includes at least one processing mode of dead pixel correction, black level correction, digital gain, and noise reduction;
the color recovery sub-module is used for acquiring a first current exposure parameter corresponding to the first type channel, a second current exposure parameter corresponding to the second type channel and associated information between the first type channel and the second type channel; normalizing the first sub-processed image signal and the second sub-processed image signal to the same exposure parameter according to the first current exposure parameter and the second current exposure parameter; determining the weight of the first type channel and the weight of the second type channel based on the association information; obtaining a first recovery image signal and a second recovery image signal according to the weights of the first type channels and the second type channels and the normalized first sub-processing image signal and the normalized second sub-processing image signal;
The third processing sub-module is configured to process the first restored image signal to obtain a third sub-processed image signal;
the fourth processing sub-module is configured to process the second restored image signal to obtain a fourth sub-processed image signal;
the fifth processing sub-module is configured to process the third sub-processed image signal and the fourth sub-processed image signal to obtain a first output image signal.
14. The system of claim 9, wherein the post-processing module comprises: the color restoration device comprises a first processing sub-module, a second processing sub-module, a color restoration sub-module, a third processing sub-module, a fourth processing sub-module and a fifth processing sub-module;
the first processing sub-module is used for acquiring the first decomposed image signal, preprocessing the first decomposed image signal and obtaining a first sub-processed image signal;
the second processing sub-module is configured to obtain the second decomposed image signal, and perform preprocessing on the second decomposed image signal to obtain a second sub-processed image signal, where the preprocessing includes at least one processing mode of dead pixel correction, black level correction, digital gain, and noise reduction;
The color recovery sub-module is used for acquiring a first current exposure parameter corresponding to the first type channel, a second current exposure parameter corresponding to the second type channel and associated information between the first type channel and the second type channel; normalizing the first sub-processed image signal and the second sub-processed image signal to the same exposure parameter according to the first current exposure parameter and the second current exposure parameter; determining the weight of the first type channel and the weight of the second type channel based on the association information; obtaining a first recovery image signal and a second recovery image signal according to the weights of the first type channels and the second type channels and the normalized first sub-processing image signal and the normalized second sub-processing image signal;
the third processing sub-module is configured to process the first restored image signal to obtain a third sub-processed image signal;
the fourth processing sub-module is configured to process the second restored image signal to obtain a fourth sub-processed image signal;
the fifth processing sub-module is configured to process the third sub-processed image signal and the fourth sub-processed image signal to obtain a first output image signal and a second output image signal.
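The claims leave the fifth processing sub-module's operation open; one plausible reading, sketched here for illustration only under the assumption of a luma/chroma blend with weight alpha, is:

```python
import numpy as np

def fuse_outputs(vis_img, nir_img, alpha=0.5):
    """Blend the visible-light image's luma with the NIR image while keeping the
    visible-light chroma; the luma/chroma split and the weight alpha are
    illustrative assumptions, not the claimed processing."""
    vis = vis_img.astype(np.float32)
    luma_vis = vis.mean(axis=-1, keepdims=True)          # crude luma estimate
    luma_fused = (1 - alpha) * luma_vis + alpha * nir_img.astype(np.float32)[..., None]
    chroma = vis - luma_vis                              # keep visible-light chroma
    fused = np.clip(chroma + luma_fused, 0, 255).astype(np.uint8)
    return fused, nir_img                                # first and second output signals
```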
15. An image processing method, characterized by being applied to an imaging system, the system comprising: an image sensor, a statistics unit and an exposure control unit; the image sensor includes a plurality of types of channels; the method comprises the following steps:
the image sensor converts an optical signal into an image signal, the optical signal including light components in a plurality of wavelength bands;
the statistics unit acquires the image signal, extracts image data of various channels in the image signal, respectively carries out statistics on the image data of the various channels to obtain statistics data of the various channels, and sends the statistics data of the various channels to the exposure control unit;
the exposure control unit receives the statistical data of the various channels sent by the statistics unit, and, for any type of channel, calculates the exposure parameters corresponding to that channel according to its statistical data and controls the brightness adjustment of the image data of that channel based on the exposure parameters;
the system further comprises: a processing unit;
after the image sensor converts the optical signal into an image signal, the method further comprises:
the processing unit acquires the image signal output by the image sensor, the current exposure parameters corresponding to the various channels and the association information among the various channels, determines the correlation between every two types of channels according to the current exposure parameters corresponding to the various channels and the association information among the various channels, and removes the light component of another type of channel contained in one type of channel according to the correlation between every two types of channels, wherein the association information is the influence of an image attribute of one channel on an image attribute of the other channel, and the correlation is used to represent the degree and magnitude of that association between the image attributes;
The image sensor includes: a first type of channel responsive to light components in the visible band and a second type of channel responsive to light components in the near infrared band;
the exposure control unit acquires a first exposure time of the first type channel, first target data corresponding to the first type channel, a second exposure time of the second type channel and second target data corresponding to the second type channel; calculates a first data offset of the first type channel according to the statistical data of the first type channel and the first target data, and, if the first data offset is not within a first preset range, calculates a first exposure gain according to the statistical data of the first type channel and the first target data; calculates a second data offset of the second type channel according to the statistical data of the second type channel and the second target data, and, if the second data offset is not within a second preset range, calculates a second exposure gain according to the statistical data of the second type channel and the second target data; and, if the first exposure time is not equal to the second exposure time, reduces the second exposure time when the second exposure gain is smaller than a first preset gain threshold, and increases the second exposure time when the second exposure gain is larger than a second preset gain threshold.
16. The method of claim 15, wherein the first type of channel comprises a plurality of color channels and the second type of channel comprises a near infrared channel;
the step in which the statistics unit respectively performs statistics on the image data of the various channels to obtain the statistical data of the various channels comprises:
calculating the image data statistic of the first type of channel as the statistic of the first type of channel according to the image data of at least one color channel in the plurality of color channels;
and calculating the image data statistic value of the second type of channels as the statistic data of the second type of channels according to the image data of the near infrared channels.
17. The method of claim 15, wherein the system further comprises: a light supplementing unit;
the method further comprises the steps of:
if the statistical data of the second type channel is larger than a first preset threshold, controlling the light supplementing unit to reduce the intensity of the emitted near infrared light;
and if the statistical data of the second type channel is smaller than a second preset threshold, controlling the light supplementing unit to increase the intensity of the emitted near infrared light.
CN202210824473.8A 2020-01-22 2020-01-22 Imaging system and image processing method Active CN115297268B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210824473.8A CN115297268B (en) 2020-01-22 2020-01-22 Imaging system and image processing method

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202010073691.3A CN113163124B (en) 2020-01-22 2020-01-22 Imaging system and image processing method
CN202210824473.8A CN115297268B (en) 2020-01-22 2020-01-22 Imaging system and image processing method

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
CN202010073691.3A Division CN113163124B (en) 2020-01-22 2020-01-22 Imaging system and image processing method

Publications (2)

Publication Number Publication Date
CN115297268A CN115297268A (en) 2022-11-04
CN115297268B true CN115297268B (en) 2024-01-05

Family

ID=76881954

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202210824473.8A Active CN115297268B (en) 2020-01-22 2020-01-22 Imaging system and image processing method
CN202010073691.3A Active CN113163124B (en) 2020-01-22 2020-01-22 Imaging system and image processing method

Family Applications After (1)

Application Number Title Priority Date Filing Date
CN202010073691.3A Active CN113163124B (en) 2020-01-22 2020-01-22 Imaging system and image processing method

Country Status (2)

Country Link
CN (2) CN115297268B (en)
WO (1) WO2021147804A1 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114071026B (en) * 2022-01-18 2022-04-05 睿视(天津)科技有限公司 Automatic exposure control method and device based on red component characteristic detection
CN115361494B (en) * 2022-10-09 2023-03-17 浙江双元科技股份有限公司 High-speed stroboscopic image processing system and method of multi-line scanning image sensor

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1850584A1 (en) * 2006-04-30 2007-10-31 Huawei Technologies Co., Ltd. Method for acquiring and controlling automatic exposure control parameters and imaging device
CN103686111A (en) * 2013-12-31 2014-03-26 上海富瀚微电子有限公司 Method and device for correcting color based on RGBIR (red, green and blue, infra red) image sensor
CN104113743A (en) * 2013-04-18 2014-10-22 深圳中兴力维技术有限公司 Colour camera automatic white balance processing method and device under low illumination
CN107438170A (en) * 2016-05-25 2017-12-05 杭州海康威视数字技术股份有限公司 A kind of image Penetrating Fog method and the image capture device for realizing image Penetrating Fog
CN108353134A (en) * 2015-10-30 2018-07-31 三星电子株式会社 Use the filming apparatus and its image pickup method of multiple-exposure sensor
WO2018145575A1 (en) * 2017-02-10 2018-08-16 杭州海康威视数字技术股份有限公司 Image fusion apparatus and image fusion method
CN108600725A (en) * 2018-05-10 2018-09-28 杭州雄迈集成电路技术有限公司 A kind of white balance correction device and method based on RGB-IR image datas
EP3514600A1 (en) * 2018-01-19 2019-07-24 Leica Instruments (Singapore) Pte. Ltd. Method for fluorescence intensity normalization
CN110493506A (en) * 2018-12-12 2019-11-22 杭州海康威视数字技术股份有限公司 A kind of image processing method and system
CN110493531A (en) * 2018-12-12 2019-11-22 杭州海康威视数字技术股份有限公司 A kind of image processing method and system
CN110602420A (en) * 2019-09-30 2019-12-20 杭州海康威视数字技术股份有限公司 Camera, black level adjusting method and device

Family Cites Families (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8436914B2 (en) * 2008-11-07 2013-05-07 Cisco Technology, Inc. Method for automatic exposure control within a video capture device
JP5471550B2 (en) * 2010-02-10 2014-04-16 ソニー株式会社 IMAGING DEVICE, IMAGING DEVICE CONTROL METHOD, AND PROGRAM
JP6183238B2 (en) * 2014-02-06 2017-08-23 株式会社Jvcケンウッド IMAGING DEVICE AND IMAGING DEVICE CONTROL METHOD
CN104980628B (en) * 2014-04-13 2018-09-11 比亚迪股份有限公司 Imaging sensor and monitoring system
CN106375645B (en) * 2015-07-21 2019-08-30 杭州海康威视数字技术股份有限公司 A kind of adaptive control system based on infrared eye
CN205666883U (en) * 2016-03-23 2016-10-26 徐鹤菲 Support compound imaging system and mobile terminal of formation of image of near infrared and visible light
CN108024106B (en) * 2016-11-04 2019-08-23 上海富瀚微电子股份有限公司 Support the color correction device and method of RGBIR and RGBW format
CN107798652A (en) * 2017-10-31 2018-03-13 广东欧珀移动通信有限公司 Image processing method, device, readable storage medium storing program for executing and electronic equipment
CN107948521B (en) * 2017-12-01 2019-12-27 深圳市同为数码科技股份有限公司 Camera day and night mode switching system based on AE and AWB statistical information
CN110493533B (en) * 2019-05-31 2021-09-07 杭州海康威视数字技术股份有限公司 Image acquisition device and image acquisition method
CN110493495B (en) * 2019-05-31 2022-03-08 杭州海康威视数字技术股份有限公司 Image acquisition device and image acquisition method
CN110519489B (en) * 2019-06-20 2021-04-06 杭州海康威视数字技术股份有限公司 Image acquisition method and device

Also Published As

Publication number Publication date
CN113163124B (en) 2022-06-03
CN115297268A (en) 2022-11-04
WO2021147804A1 (en) 2021-07-29
CN113163124A (en) 2021-07-23

Similar Documents

Publication Publication Date Title
EP3582494B1 (en) Multi-spectrum-based image fusion apparatus and method, and image sensor
US11049232B2 (en) Image fusion apparatus and image fusion method
JP4346634B2 (en) Target detection device
US8666153B2 (en) Image input apparatus
CN108600725B (en) White balance correction device and method based on RGB-IR image data
US20180192011A1 (en) Imaging systems with clear filter pixels
EP2471258B1 (en) Reducing noise in a color image
CN101193314B (en) Image processing device and method for image sensor
JP5397788B2 (en) Image input device
US8385641B2 (en) Method and apparatus for eliminating chromatic aberration
US10110825B2 (en) Imaging apparatus, imaging method, and program
US20120287286A1 (en) Image processing device, image processing method, and program
CN111698434A (en) Image processing apparatus, control method thereof, and computer-readable storage medium
EP2721828A1 (en) High resolution multispectral image capture
CN115297268B (en) Imaging system and image processing method
CN104363434A (en) Image processing apparatus
CN107835351B (en) Two camera modules and terminal
CN115802183B (en) Image processing method and related device
US10395347B2 (en) Image processing device, imaging device, image processing method, and image processing program
US20130266220A1 (en) Color signal processing circuit, color signal processing method, color reproduction evaluating method, imaging apparatus, electronic apparatus and testing device
JP4993275B2 (en) Image processing device
US20240056690A1 (en) Imaging apparatus
JP3880276B2 (en) Imaging device
JP2023013549A (en) Imaging apparatus, imaging method, and program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant