CN116600212B - White balance calibration method based on atmospheric external environment spectrum - Google Patents

White balance calibration method based on atmospheric external environment spectrum Download PDF

Info

Publication number
CN116600212B
CN116600212B CN202310870425.7A
Authority
CN
China
Prior art keywords
camera
response
light source
value
standard
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202310870425.7A
Other languages
Chinese (zh)
Other versions
CN116600212A (en)
Inventor
赵汝进
刘赛男
曾骏哲
乔丽
马跃博
余国彬
刘恩海
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Institute of Optics and Electronics of CAS
Original Assignee
Institute of Optics and Electronics of CAS
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Institute of Optics and Electronics of CAS filed Critical Institute of Optics and Electronics of CAS
Priority to CN202310870425.7A priority Critical patent/CN116600212B/en
Publication of CN116600212A publication Critical patent/CN116600212A/en
Application granted granted Critical
Publication of CN116600212B publication Critical patent/CN116600212B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01M TESTING STATIC OR DYNAMIC BALANCE OF MACHINES OR STRUCTURES; TESTING OF STRUCTURES OR APPARATUS, NOT OTHERWISE PROVIDED FOR
    • G01M11/00 Testing of optical apparatus; Testing structures by optical methods not otherwise provided for
    • G01M11/02 Testing optical properties
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/70 Denoising; Smoothing
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/90 Dynamic range modification of images or parts thereof
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/90 Determination of colour characteristics
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/80 Camera processing pipelines; Components thereof
    • H04N23/84 Camera processing pipelines; Components thereof for processing colour signals
    • H04N23/88 Camera processing pipelines; Components thereof for processing colour signals for colour balance, e.g. white-balance circuits or colour temperature control
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/95 Computational photography systems, e.g. light-field imaging systems
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00 Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/60 Noise processing, e.g. detecting, correcting, reducing or removing noise
    • H04N25/63 Noise processing, e.g. detecting, correcting, reducing or removing noise applied to dark current
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10032 Satellite or aerial image; Remote sensing

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Chemical & Material Sciences (AREA)
  • Analytical Chemistry (AREA)
  • Computing Systems (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Color Television Image Signal Generators (AREA)
  • Processing Of Color Television Signals (AREA)

Abstract

The invention discloses a white balance calibration method based on the atmospheric external environment spectrum, and relates to the technical field of color calibration. In the method, the spectral distribution of the target environment illumination is combined with the overall quantum efficiency of the camera to calculate the camera's response color values for a gray plate under that illumination environment, and the white balance scaling coefficients are calculated from the linear relation between the target values and the standard values. The color difference after scaling is then evaluated with the CIE DE 2000 color difference formula, and the scaling coefficients are iteratively optimized to obtain an optimal solution. Compared with the traditional color-chart calibration method, this method maintains color constancy under different illumination environments and solves the color cast problem of deep space exploration camera images caused by the inconsistent spectral distributions of the environments inside and outside the atmosphere.

Description

White balance calibration method based on atmospheric external environment spectrum
Technical Field
The invention relates to the technical field of color calibration, in particular to a white balance calibration method based on an atmospheric external environment spectrum.
Background
Deep space exploration is an inevitable stage in the development of a nation's aerospace technology, and promotes research on major scientific problems such as the formation and evolution of the solar system and the universe and the origin and evolution of life. To faithfully reflect actual deep-space conditions and acquire true-state images of the terrain, the deep space exploration camera must be color calibrated. The color response of the camera is related to the ambient light source characteristics, the target reflection characteristics and the camera spectral response characteristics. When the detection target is unchanged and the camera's own spectral response has been corrected, the ambient light source characteristics become the main factor influencing the camera's color response. What most distinguishes a deep space exploration color camera from an ordinary camera is the difference between the illumination conditions inside and outside the atmosphere, which makes white balance calibration of the deep space exploration camera necessary.
Deep space exploration targets involve unknowns and uncertainties, and there are few prior results, which makes calibration difficult. The literature by Zhao Rujin, Liu Enhai, Wang Jin, et al., "Color correction method for topographic images from the Chang'e-3 satellite camera" [J]. Journal of Astronautics, 2016, 37(03): 341-347, designed a two-step color correction method combining color calibration with white balance and achieved satisfactory results in ground verification. The literature by Wang Zi, Yang Jianfeng, Xue Bin, et al., "Color truth acquisition method based on the relative spectral power distribution of the light source" [J]. Spectroscopy and Spectral Analysis, 2018, 38(03): 877-882, redefines the white point coordinates in the conversion matrix using the relative spectral power distribution of the light source and then corrects the conversion matrix between color spaces, improving the accuracy of sample truth value acquisition. However, the above methods only perform ground white balance calibration; the complex illumination environment outside the atmosphere can bias the calibration result, so their on-orbit adaptability needs to be improved.
Disclosure of Invention
In order to solve the technical problem of color cast in deep space exploration camera images caused by the inconsistent spectral distributions of the environments inside and outside the atmosphere, the invention provides a white balance calibration method based on the atmospheric external environment spectrum, which combines the spectral distribution of the target environment illumination with the overall quantum efficiency of the camera to calculate the camera's white balance calibration coefficients under that illumination environment.
In order to achieve the above purpose, the invention adopts the following technical scheme:
a white balance calibration method based on an atmospheric external environment spectrum comprises the following steps:
s1, dark current correction is carried out, the influence of background thermal noise is removed, and an accurate spectrum response function of the camera is obtained;
s2, calculating the overall quantum efficiency of the camera to obtain a spectral response function of the camera;
s3, constructing a camera response characteristic model based on an external space illumination environment;
s4, carrying out ground verification on the camera response characteristic model constructed in the S3:
s5, determining the camera response under the illumination of the standard light source as a standard value of white balance calibration, determining the camera response under the illumination of the environment light source as a correction value, and calculating a white balance calibration coefficient according to a linear relation between the camera response and the correction value;
s6, solving the color difference after calibration by utilizing a CIE DE 2000 color difference formula, and performing iterative optimization on the calibration coefficient to obtain an optimal white balance calibration coefficient solution.
Further, the step S2 includes:
s21, calculating the average quantum efficiency of each channel of the camera R, G, B under each wavelength, and changing the wavelength to repeat the testing and calculating steps so as to obtain the overall quantum efficiency of the camera; wherein the G channel comprisesChannel and->A channel;
s22, fitting a quantum efficiency curve to obtain a spectral response function QR (x), QG (x) and QB (x) of each channel of R, G, B.
Further, the step S3 includes:
s31, calculating the photon numbers of the reflection environment light source and the standard light source on the surface of the target;
s32, calculating the response value of the camera to the gray plate under the target light source environment and the standard light source by combining the overall quantum efficiency of the camera obtained in the step S2.
Further, the step S4 includes: photographing the standard gray target plate under different ambient illuminations, acquiring the actually captured rgb values, and comparing them with the values calculated by the camera response characteristic model to verify the accuracy of the model.
The beneficial effects are that:
The invention provides a white balance calibration method based on the atmospheric external environment spectrum, which combines the spectral distribution of the target environment illumination with the overall quantum efficiency of the camera to calculate the camera's white balance calibration coefficients under that illumination environment. The method maintains color constancy under different illumination environments, solves the technical problem of color cast in deep space exploration camera images caused by the inconsistent spectral distributions of the environments inside and outside the atmosphere, and improves the accuracy and on-orbit adaptability of white balance calibration.
Drawings
FIG. 1 is a flow chart of the white balance calibration method based on the atmospheric external environment spectrum.
Detailed Description
The present invention will be described in further detail with reference to the drawings and examples, in order to make the objects, technical solutions and advantages of the present invention more apparent. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the invention. In addition, the technical features of the embodiments of the present invention described below may be combined with each other as long as they do not collide with each other.
As shown in FIG. 1, the white balance calibration method based on the atmospheric external environment spectrum of the invention comprises the following steps:
s1, dark current correction is carried out, the influence of background thermal noise is removed, and an accurate camera spectrum response function is obtained:
in the spectral response process, after photons are incident on a photosensitive region, electrons in a channel are excited from a valence band top to a conduction band bottom, and the main process is a process of generating signal charges by a photoelectric effect. Due to the limitation of forbidden bandwidth and the high absorption coefficient of Si to short-wave light, the wavelength range of light which can be effectively detected by CMOS APS (active pixel sensor) is approximately 400-1100 nm. For color cameras, the spectral response characteristics are divided into R,、/>The outputs of the four channels B (red, green, blue) (i.e., bayer color filter array) are recorded separately. The output responses of the four channels were tested using an integrating sphere and filters to scan a fixed visible band. The wavelength range and the scanned wavelength point are: 443.2 nm.+ -. 5nm, 489.5 nm.+ -. 5nm, 515.5 nm.+ -. 5nm, 531.0 nm.+ -. 5nm, 633.6 nm.+ -. 5nm, 648.8 nm.+ -. 5nm, 668.3 nm.+ -. 5nm.
In the spectral response process, the carriers consist mainly of two parts: photogenerated carriers that transition across the forbidden band after photon absorption, and electrons that transition due to thermal motion; that is, the signal is the superposition of the incident light response and the thermal noise floor. Therefore, to obtain an accurate incident light response curve, the effect of the thermal noise floor must be removed.
The camera is placed in front of the integrating sphere, an optical filter is mounted over the camera lens, and a suitable exposure time is set; the integrating sphere output is then switched on, and in each waveband the irradiance of the integrating sphere, the light response gray value of each camera pixel and the dark field gray value are recorded.
The R, G1, G2 and B channels are calculated independently. Taking the R channel as an example, let the DN value of each pixel of the j-th dark field image be $y_{d}^{j}(u,v)$, where $(u,v)$ are the pixel coordinates; the dark field spatial average response value of the j-th image is

$$\mu_{yd}^{j}=\frac{1}{UV}\sum_{u=1}^{U}\sum_{v=1}^{V}y_{d}^{j}(u,v)$$

where $U \times V$ is the number of pixels. Let $\mu_{yd}^{1},\mu_{yd}^{2},\dots,\mu_{yd}^{J}$ denote the dark field spatial average response values of J images; the time domain average value of the dark field spatial response, $\mu_{yd}(\lambda)$, is then:

$$\mu_{yd}(\lambda)=\frac{1}{J}\sum_{j=1}^{J}\mu_{yd}^{j}$$

where λ is the wavelength;
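For illustration only, the following Python sketch computes the dark field statistics described above for one channel; the array shapes, the frame count J = 5 and the simulated data are assumptions for demonstration and are not part of the patent.

```python
import numpy as np

def dark_field_statistics(dark_frames):
    """dark_frames: array of shape (J, U, V) holding J dark field images
    of one Bayer channel (e.g. the R channel), in DN.
    Returns the per-frame spatial averages mu_yd_j and their temporal
    average mu_yd, i.e. the dark offset used in the correction."""
    dark_frames = np.asarray(dark_frames, dtype=np.float64)
    mu_yd_j = dark_frames.mean(axis=(1, 2))   # spatial average of each frame
    mu_yd = mu_yd_j.mean()                    # temporal average over the J frames
    return mu_yd_j, mu_yd

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Simulated stack of J = 5 dark frames, 100 x 100 pixels (placeholder data).
    frames = rng.normal(loc=64.0, scale=2.0, size=(5, 100, 100))
    mu_yd_j, mu_yd = dark_field_statistics(frames)
    print("per-frame dark averages:", np.round(mu_yd_j, 2))
    print("temporal dark average  :", round(mu_yd, 2))
```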
s2, calculating the overall quantum efficiency of the camera to obtain a R, G, B quantum efficiency function of each channel, wherein the steps are as follows:
s21, calculating the average quantum efficiency of each channel of the camera under each wavelength lambda, and repeating the testing and calculating steps by changing the wavelength, thereby obtaining the spectral response function of the camera:
recording the photo response of individual pixels in each image for j=5 imagesAnd the spatial average of the light response is determined>Let->The time domain average of the optical response of J images:
then the quantum efficiency of the incident light is calculated, the formula is:
wherein, K is the conversion gain,for the light response DN time domain variance, < >>For dark field DN temporal variance, +.>For the average number of incident photons +.>Average value of light response>For the mean value of the dark field spatial response, A is the unit pixel area,>for the exposure time, E is the light irradiation amount, c is the light velocity, h is the planck constant, and then the following expression is given:
thereby obtaining the wavelengthAverage amount of lower camera R channelSub-efficiency, wavelength is derived in turn according to this method>Down->Average quantum efficiency of B channel. Wavelength change->The testing and calculating steps are repeated, thereby obtaining the spectral response characteristics of the camera.
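As a hedged illustration of the quantum efficiency calculation, the sketch below follows the photon-transfer relations reconstructed above; the function signature, units and variable names are assumptions for demonstration, and the patent itself specifies only the quantities involved, not an implementation.

```python
import numpy as np

H = 6.62607015e-34  # Planck constant, J*s
C = 2.99792458e8    # speed of light, m/s

def quantum_efficiency(light_frames, dark_frames, wavelength_m,
                       pixel_area_m2, irradiance_w_m2, exposure_s):
    """Estimate the quantum efficiency of one Bayer channel at one wavelength
    from stacks of light and dark frames of shape (J, U, V), in DN."""
    light = np.asarray(light_frames, dtype=np.float64)
    dark = np.asarray(dark_frames, dtype=np.float64)

    mu_y = light.mean()                    # mean light response (DN)
    mu_yd = dark.mean()                    # mean dark field response (DN)
    var_y = light.var(axis=0).mean()       # temporal variance of the light DN
    var_yd = dark.var(axis=0).mean()       # temporal variance of the dark DN

    K = (var_y - var_yd) / (mu_y - mu_yd)  # conversion gain (DN per electron)

    # Average number of photons collected by one pixel during the exposure.
    mu_p = wavelength_m * pixel_area_m2 * irradiance_w_m2 * exposure_s / (H * C)

    return (mu_y - mu_yd) / (K * mu_p)
```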
The quantum efficiency values of the R, G1, G2 and B channels of the whole camera in the visible band (7 wavelength points) obtained from the test are shown in Table 1.
Table 1 overall quantum efficiency of camera
S22, fitting the quantum efficiency curves to obtain the spectral response functions QR(x), QG(x) and QB(x) of the R, G and B channels, where QG(x) is obtained as the average of QG1(x) and QG2(x), i.e. $QG(x)=\frac{QG1(x)+QG2(x)}{2}$; QG1(x) and QG2(x) are the spectral response functions of the G1 and G2 channels;
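A brief sketch of the curve-fitting step follows; polynomial fitting is used purely as an illustrative choice, since the patent does not prescribe a fitting method, and the quantum efficiency sample values below are placeholders rather than the values of Table 1.

```python
import numpy as np

# Measured wavelengths (nm) from the 7 scanned bands and illustrative QE samples.
wavelengths = np.array([443.2, 489.5, 515.5, 531.0, 633.6, 648.8, 668.3])
qe_g1 = np.array([0.30, 0.42, 0.48, 0.50, 0.22, 0.18, 0.14])  # placeholder values
qe_g2 = np.array([0.31, 0.43, 0.47, 0.49, 0.23, 0.19, 0.15])  # placeholder values

# Fit a low-order polynomial to each green sub-channel.
QG1 = np.poly1d(np.polyfit(wavelengths, qe_g1, deg=3))
QG2 = np.poly1d(np.polyfit(wavelengths, qe_g2, deg=3))

def QG(x):
    """G-channel spectral response as the mean of the two green sub-channel fits."""
    return 0.5 * (QG1(x) + QG2(x))

print(round(QG(550.0), 3))  # interpolated G-channel response at 550 nm
```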
s3, constructing a camera response characteristic model based on an outdoor space illumination environment, wherein the method comprises the following steps:
s31, calculating the number of photons with the wavelength x of the target surface reflection environment light source and the standard light source:
when the camera is aimed at a detection target to take a picture, the ambient light is reflected by the surface of the target and then enters the camera, namely, the reflected light of the surface of the target is used as a target light source of the camera. The standard gray plate is taken as a detection target, x is taken as the wavelength of a light source, and the radiation spectrum of ambient illumination isThe radiation spectrum of the standard light source is +.>The reflectance spectrum of the standard gray plate isThe spectrum of the reflected light after the standard gray plate reflects the ambient light is as follows:
the spectrum of the reflected light after the standard gray plate reflects the standard illumination is as follows:
then the radiation spectrum of the ambient light source light received by the light receiving surface of the camera lensThe method comprises the following steps:
wherein k is the ratio of the total light power reflected by the standard gray plate to the light power received by the light receiving surface of the camera lens;
radiation spectrum of standard light source light received by light receiving surface of camera lensThe method comprises the following steps:
the energy E of a single photon with standard light source wavelength x is:
the number of photons of the ambient light source wavelength x is reflected by the target surface:
the number of photons of standard light source wavelength x is reflected by the target surface:
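The photon-number computation of S31 might be sketched as follows; the synthetic spectra, the gray plate reflectance of 18% and the geometric factor k = 1 are illustrative assumptions, and the symbol names follow the reconstruction above rather than the original drawings.

```python
import numpy as np

H = 6.62607015e-34  # Planck constant, J*s
C = 2.99792458e8    # speed of light, m/s

def reflected_photon_spectrum(wavelength_nm, source_spectrum, reflectance, k=1.0):
    """Photon count spectrum N(x) of source light reflected by the gray plate
    and received by the lens: N(x) = k * P(x) * rho(x) * x / (h * c)."""
    x_m = wavelength_nm * 1e-9                       # wavelength in metres
    received_power = k * source_spectrum * reflectance
    photon_energy = H * C / x_m                      # energy of one photon at x
    return received_power / photon_energy

# Illustrative inputs: a flat gray plate reflectance and two light source spectra.
x_nm = np.linspace(400.0, 1000.0, 601)
rho = np.full_like(x_nm, 0.18)                       # 18% gray plate (assumption)
P_H1 = np.exp(-((x_nm - 550.0) / 120.0) ** 2)        # placeholder ambient spectrum
P_H0 = np.ones_like(x_nm)                            # placeholder standard source

N_H1 = reflected_photon_spectrum(x_nm, P_H1, rho)
N_H0 = reflected_photon_spectrum(x_nm, P_H0, rho)
print(f"total reflected photons (ambient): {np.trapz(N_H1, x_nm):.3e}")
```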
s32, calculating the response value of the camera to the standard gray plate under the target light source environment and the standard light source by combining the spectral response function of the camera obtained in the step S2:
in order to study the influence of ambient light, the quantum efficiency of the camera is a spectrum response curve which is calibrated by color, the influence caused by the color cast of the camera is removed, and only the unique variable influence of the ambient light is considered.
Let the number of pixels occupied by the imaged object in the imaging array be n, the detector conversion gain be f, the spectral response function QR (x), QB (x) and QG (x) of each channel be obtained by S2, then:
average R channel response value under target reflection environment light source
Target objectAverage G-channel response value under reflective ambient light source
Average B-channel response value under target reflection environment light source
And (3) the same principle:
average R, G, B channel response value under target reflection standard light source、/>、/>The method comprises the following steps:
the calculation method for solving the color quality values r, g and b according to the chromaticity value R, G, B is as follows:
camera to standard gray plate under ambient lightResponse color value,/>,/>The method comprises the following steps:
,/>
response color quality value of camera to standard gray plate under standard light source illumination,/>,/>The method comprises the following steps:
,/>
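Continuing the sketch, the channel response values and normalized color values of S32 can be obtained by integrating the photon spectra against the fitted channel response functions; the trapezoidal integration and the parameters n_pixels and gain_f are illustrative assumptions consistent with the reconstruction above.

```python
import numpy as np

def channel_responses(x_nm, N_photons, QR, QG, QB, n_pixels, gain_f):
    """Average per-pixel R, G, B responses: (f/n) * integral of N(x)*Q(x) dx."""
    scale = gain_f / n_pixels
    R = scale * np.trapz(N_photons * QR(x_nm), x_nm)
    G = scale * np.trapz(N_photons * QG(x_nm), x_nm)
    B = scale * np.trapz(N_photons * QB(x_nm), x_nm)
    return R, G, B

def color_values(R, G, B):
    """Normalized response color values r, g, b of the gray plate."""
    s = R + G + B
    return R / s, G / s, B / s
```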
s4, performing ground verification on the camera response characteristic model:
the standard gray plate was photographed under different ambient light, the actual photographed rgb values were obtained and compared with the calculated values of the model, and the results are shown in table 2. The calculation of the visible model is basically consistent with the chromaticity value of the actual shooting, and the accuracy of the calculation of the model is proved.
Table 2 Comparison of ground color calibration and on-orbit color calibration results
S5, taking the on-orbit real-time solar spectrum as the ambient light input to the camera response characteristic model, the on-orbit white balance scaling coefficients are calculated; the white balance coefficients under different light source environments are compared in Table 3.
Table 3 Comparison of white balance coefficients under different light source environments
Therefore, the camera response under standard light source illumination is taken as the standard value for white balance calibration, the camera response under ambient light source illumination is taken as the value to be corrected, and the white balance calibration coefficients of the three channels are calculated from the linear relation between the two:

$$K_{r}=\frac{r_{H0}}{r_{H1}},\quad K_{g}=\frac{g_{H0}}{g_{H1}},\quad K_{b}=\frac{b_{H0}}{b_{H1}}$$
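A minimal sketch of computing and applying the white balance gains implied by this linear relation is given below; the ratio form, the example numbers and the image-application step are assumptions consistent with the reconstruction above, not a verbatim transcription of the patent.

```python
import numpy as np

def white_balance_gains(rgb_standard, rgb_ambient):
    """Per-channel calibration coefficients K = standard response / ambient response."""
    return np.asarray(rgb_standard, dtype=float) / np.asarray(rgb_ambient, dtype=float)

def apply_white_balance(image, gains):
    """Apply per-channel gains to an H x W x 3 image with values in [0, 1]."""
    return np.clip(image * gains.reshape(1, 1, 3), 0.0, 1.0)

# Example: response color values of the gray plate under the two illuminants.
gains = white_balance_gains([0.333, 0.334, 0.333], [0.382, 0.330, 0.288])
print(np.round(gains, 3))   # e.g. [0.872 1.012 1.156]
```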
S6, finally, the color difference after calibration is computed with the CIE DE 2000 color difference formula, and the calibration coefficients are iteratively optimized to obtain the optimal white balance calibration coefficient solution.
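The iterative refinement of S6 could be sketched as follows, using the CIEDE2000 implementation from scikit-image as a stand-in for the CIE DE 2000 color difference formula; the one-dimensional search over a global scale factor, the step count and the target values are illustrative assumptions and not the patent's prescribed optimization procedure.

```python
import numpy as np
from skimage.color import rgb2lab, deltaE_ciede2000

def refine_gains(rgb_measured, rgb_reference, gains, steps=200):
    """Scale the initial white balance gains so that the corrected gray-plate
    color minimizes the CIEDE2000 difference to the reference color."""
    lab_ref = rgb2lab(np.asarray(rgb_reference, dtype=float).reshape(1, 1, 3))
    best_gains, best_de = np.asarray(gains, dtype=float), np.inf
    for scale in np.linspace(0.8, 1.2, steps):   # simple 1-D search over a global scale
        candidate = np.asarray(gains, dtype=float) * scale
        corrected = np.clip(np.asarray(rgb_measured, dtype=float) * candidate, 0.0, 1.0)
        de = deltaE_ciede2000(lab_ref, rgb2lab(corrected.reshape(1, 1, 3))).item()
        if de < best_de:
            best_de, best_gains = de, candidate
    return best_gains, best_de

gains0 = np.array([1.15, 1.00, 0.87])            # initial coefficients (placeholder values)
gains, de = refine_gains([0.42, 0.40, 0.35], [0.42, 0.42, 0.42], gains0)
print(np.round(gains, 3), round(de, 3))
```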
Parts of the invention not described in detail herein are well known in the art. The foregoing is only illustrative of the present invention and is not to be construed as limiting thereof; various modifications, equivalent substitutions and improvements made within the spirit and principles of the present invention are intended to be included within its scope.

Claims (2)

1. The white balance calibration method based on the atmospheric external environment spectrum is characterized by comprising the following steps of:
s1, a deep space detection camera working in the atmosphere environment is used as an action object, and in order to obtain an accurate spectral response function of the deep space detection camera, the influence of background thermal noise is removed, and dark current correction is carried out:
placing a deep space detection camera in front of an integrating sphere, covering an optical filter at a camera lens, setting proper exposure time, then outputting by using the integrating sphere, and recording the irradiation brightness of the integrating sphere, the light response gray value and the dark field gray value of each pixel of the deep space detection camera in each wave band;
calculating R, G, G2 and B channels independently, and setting DN values of pixels of a j-th image in a dark field as DN values of R channelsWherein->For pixel coordinates, the dark field spatial average response value of the jth image is +.>
Order theRepresenting the mean response value of the dark field space of J pictures, the mean value of the dark field space response in the time domain +.>The method comprises the following steps:
the G1, G2 and B channels are calculated according to the calculation mode of the R channel;
s2, calculating the overall quantum efficiency of the camera to obtain a spectral response function of the camera, wherein the method comprises the following steps:
s21, calculating the average quantum efficiency of each channel of the camera R, G, B under each wavelength, changing the wavelength, and repeating the test and calculation to obtain the overall quantum efficiency of the camera;
recording each of j=5 imagesPhoto response of individual picture elements in a sheet of imageAnd the spatial average of the light response is determined>Let->The time domain average of the optical response of J images:
then the quantum efficiency of the incident light is calculated, the formula is:
where K is the conversion gain, delta y For the light response DN time domain variance, delta yd For the dark field DN time domain variance,for the average number of incident photons +.>Mean value of light response>For the mean value of the dark field spatial response, A is the unit pixel area,>for the exposure time, E is the light irradiation amount, c is the light velocity, h is the planck constant, and then the following expression is given:
thereby obtaining the wavelengthThe average quantum efficiency of the R channel of the lower deep space probe camera is then sequentially derived for the wavelength +.>Average quantum efficiency of the lower G1, G2, B channels; wavelength change->Repeating the testing and calculating steps, thereby obtaining the spectral response characteristics of the deep space exploration camera;
s22, fitting a quantum efficiency curve to obtain a R, G, B spectral response function QR (x), QG (x) and QB (x) of each channel;
s3, constructing a response characteristic model of the camera under the illumination environment based on the extraterrestrial space, which comprises the following steps:
s31, calculating the photon numbers of the reflection environment light source and the standard light source on the surface of the target;
when the camera aims at a detection target to take a picture, the ambient light source enters the camera after being reflected by the surface of the target, namely, the reflected light of the surface of the target is used as a target light source of the camera; the standard gray plate is taken as a detection target, x is taken as the wavelength of a light source, and the radiation spectrum of the standard light source isThe radiation spectrum of ambient light is +.>The reflectivity spectrum of the standard gray plate is +.>The spectrum of the reflected light after the standard gray plate reflects the ambient light is as follows:
the spectrum of the reflected light after the standard gray plate reflects the standard illumination is as follows:
then the radiation spectrum of the ambient light source light received by the light receiving surface of the camera lensThe method comprises the following steps:
wherein k is the ratio of the total light power reflected by the standard gray plate to the light power received by the light receiving surface of the camera lens;
radiation spectrum of standard light source light received by light receiving surface of camera lensThe method comprises the following steps:
the light irradiation amount E of a single photon with the standard light source wavelength x is as follows:
the number of photons of the ambient light source wavelength x is reflected by the target surface:
the number of photons of standard light source wavelength x is reflected by the target surface:
s32, constructing a response characteristic model of the camera to the standard gray plate under the target light source environment and the standard light source by combining the spectral response function of the camera obtained in the S2;
let the number of pixels occupied by the imaged object in the imaging array be n, the detector conversion gain be f, the spectral response function QR (x), QB (x) and QG (x) of each channel be obtained by S2, then:
average R channel response value R under target reflection environment light source H1
Average G channel response value G under target reflection ambient light source H1
Average B-channel response value B under target reflection environment light source H1
Average R, G, B channel response value R under target reflection standard light source H0 、G H0 、B H0 The method comprises the following steps:
obtaining response color quality value r of camera to standard gray plate under ambient light according to chromaticity value R, G, B H1 ,g H1 ,b H1 Response color value r of camera to standard gray plate under standard light source illumination H0 ,g H0 ,b H0
S4, carrying out ground verification on the response characteristic model of the camera constructed in the S3:
s5, taking the on-orbit real-time solar spectrum as an ambient light source, calculating response values of the standard light source and the camera under the ambient light source by using the response characteristic model of the camera constructed in the S3, determining the response value of the camera under the illumination of the standard light source as a standard value of white balance calibration, determining the response value of the camera under the illumination of the ambient light source as a correction value to be corrected, and calculating a white balance calibration coefficient according to a linear relation between the two values
Wherein rH0, gH0, bH0 are response color values of the camera to the standard gray plate under the illumination of the standard light source, and rH1, gH1, bH1 are response color values of the camera to the standard gray plate under the illumination of the environment light source;
s6, solving the color difference after calibration by utilizing a CIE DE 2000 color difference formula, and performing iterative optimization on the white balance calibration coefficient to obtain an optimal white balance calibration coefficient solution.
2. The white balance calibration method based on the atmospheric external environment spectrum according to claim 1, wherein S4 comprises: photographing the standard gray plate under different ambient illuminations, acquiring the actually captured rgb values, and comparing them with the values calculated by the response characteristic model of the camera to verify the accuracy of the model.
CN202310870425.7A 2023-07-17 2023-07-17 White balance calibration method based on atmospheric external environment spectrum Active CN116600212B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310870425.7A CN116600212B (en) 2023-07-17 2023-07-17 White balance calibration method based on atmospheric external environment spectrum

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310870425.7A CN116600212B (en) 2023-07-17 2023-07-17 White balance calibration method based on atmospheric external environment spectrum

Publications (2)

Publication Number Publication Date
CN116600212A CN116600212A (en) 2023-08-15
CN116600212B true CN116600212B (en) 2023-11-17

Family

ID=87604810

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310870425.7A Active CN116600212B (en) 2023-07-17 2023-07-17 White balance calibration method based on atmospheric external environment spectrum

Country Status (1)

Country Link
CN (1) CN116600212B (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102622739A (en) * 2012-03-30 2012-08-01 中国科学院光电技术研究所 Method for correcting non-uniformity of image of Bayer filter array color camera
CN103686144A (en) * 2012-08-30 2014-03-26 苹果公司 Correction factor for color response calibration
CN105187819A (en) * 2015-07-29 2015-12-23 合肥埃科光电科技有限公司 Color response testing and correcting device and method for industrial color cameras
CN105933687A (en) * 2016-07-04 2016-09-07 凌云光技术集团有限责任公司 Automatic white balance processing method and device for images
CN113487681A (en) * 2021-07-01 2021-10-08 浙江大学 Camera color calibration method based on spectral sensitivity curve and light source spectrum optimization

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7728845B2 (en) * 1996-02-26 2010-06-01 Rah Color Technologies Llc Color calibration of color image rendering devices
US8300933B2 (en) * 2009-07-27 2012-10-30 Himax Imaging, Inc. System and method of generating color correction matrix for an image sensor

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102622739A (en) * 2012-03-30 2012-08-01 中国科学院光电技术研究所 Method for correcting non-uniformity of image of Bayer filter array color camera
CN103686144A (en) * 2012-08-30 2014-03-26 苹果公司 Correction factor for color response calibration
CN105187819A (en) * 2015-07-29 2015-12-23 合肥埃科光电科技有限公司 Color response testing and correcting device and method for industrial color cameras
CN105933687A (en) * 2016-07-04 2016-09-07 凌云光技术集团有限责任公司 Automatic white balance processing method and device for images
CN113487681A (en) * 2021-07-01 2021-10-08 浙江大学 Camera color calibration method based on spectral sensitivity curve and light source spectrum optimization

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
A color correction method for color cameras based on a LASSO regression model; Guo Yue; Gao Kun; Zhu Jun; Dou Zeyang; Huang Yadong; Feng Yunpeng; Imaging Science and Photochemistry (Issue 02); Section 1 *
Color correction method for topographic images from the Chang'e-3 satellite camera; Zhao Rujin; Liu Enhai; Wang Jin; Yu Guobin; Journal of Astronautics (Issue 03); full text *
Spectral response calibration testing of a space astronomy CCD in the EUV band; Zeng Zhirong; Li Baoquan; Peng Jilong; Zhang Xin; Zhou Hongjun; Huo Tonglin; Science Technology and Engineering (Issue 08); Section 3 *

Also Published As

Publication number Publication date
CN116600212A (en) 2023-08-15

Similar Documents

Publication Publication Date Title
US9332197B2 (en) Infrared sensor control architecture
US6924841B2 (en) System and method for capturing color images that extends the dynamic range of an image sensor using first and second groups of pixels
CN102466520A (en) Multispectral imaging color measurement system and imaging signal processing method thereof
CN102403326A (en) Imaging device and imaging apparatus
US8243162B2 (en) Automatic white balancing using meter sensors
CN110944126B (en) Imaging system and method for performing black level correction on image sensor
Fiorentin et al. Calibration of digital compact cameras for sky quality measures
CN116600212B (en) White balance calibration method based on atmospheric external environment spectrum
CN114125431A (en) Non-uniformity calibration correction method for static track optical large-area array camera
Catrysse et al. Comparative analysis of color architectures for image sensors
CN114577446B (en) CCD/CMOS extreme ultraviolet band quantum efficiency detection device and method
Cattini et al. Low-cost imaging photometer and calibration method for road tunnel lighting
Mayya Photometric calibration of the CCD camera of 1-m telescope at VBO
Vunckx et al. Accurate video-rate multi-spectral imaging using imec snapshot sensors
RU93977U1 (en) MULTI-COLOR COLORIMETER
Ribés et al. Color and multispectral imaging with the CRISATEL multispectral system
Skorka et al. Tradeoffs With RGB-IR Image Sensors
LaBelle et al. Introduction to high performance CCD cameras
CN105590941B (en) A method of improving photoelectric sensor and photosensitive material Dim light measurement ability
CN115830146B (en) On-orbit relative radiation calibration and correction method for aerospace optical remote sensing camera
CN113820009B (en) On-orbit radiation calibration method for space extreme ultraviolet solar telescope
CN117714905B (en) Method for correcting radiation response characteristic of CMOS image sensor
Slemer et al. SIMBIO-SYS STC radiometric cailbration
Mizoguchi Evaluation of image sensors
Liu et al. Research on the nonuniformity correction of linear TDI CCD remote camera

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant