CN116132784A - Image processing method, device and storage medium

Image processing method, device and storage medium

Info

Publication number: CN116132784A
Application number: CN202111335246.0A
Authority: CN (China)
Prior art keywords: camera, pixel, response, determining, pixels
Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis)
Other languages: Chinese (zh)
Inventor: 庄宏伟 (Zhuang Hongwei)
Current Assignee: Beijing Xiaomi Mobile Software Co Ltd (the listed assignees may be inaccurate)
Original Assignee: Beijing Xiaomi Mobile Software Co Ltd
Application filed by Beijing Xiaomi Mobile Software Co Ltd
Priority to CN202111335246.0A
Publication of CN116132784A

Classifications

    • Y - GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 - TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02E - REDUCTION OF GREENHOUSE GAS [GHG] EMISSIONS, RELATED TO ENERGY GENERATION, TRANSMISSION OR DISTRIBUTION
    • Y02E 10/00 - Energy generation through renewable energy sources
    • Y02E 10/50 - Photovoltaic [PV] energy

Abstract

The present disclosure relates to an image processing method, apparatus and storage medium. The method includes: acquiring first spectral response data of a first camera and second spectral response data of a second camera; acquiring radiation illuminance data of an optical signal to be detected; determining a first pixel response output by the first camera and a second pixel response output by the second camera based on the radiation illuminance data of the optical signal to be detected, the first spectral response data and the second spectral response data; and determining, based on the first pixel response and the second pixel response, pixel deviation amounts between the images output by the first camera and the second camera after receiving the optical signals to be detected in different spectral bands, wherein the pixel deviation amounts are used for correcting images acquired by the second camera when the first camera is switched to the second camera.

Description

Image processing method, device and storage medium
Technical Field
The disclosure relates to the field of image technology, and in particular, to an image processing method, an image processing device and a storage medium.
Background
Because the photographing distance and photographing range of a single camera are limited, a terminal device is configured with a plurality of cameras in order to meet the requirements of multi-view, multi-scene and multi-distance application scenarios and to enhance its photographing capability; the plurality of cameras are used for zoom photographing, so as to satisfy the shooting requirements of multiple application scenarios.
However, because different cameras have different sensitivities to optical signals of different colors, when the camera used for displaying the preview image on the terminal device is switched, the preview image may exhibit color jumps and flicker, and smooth, natural switching between different cameras cannot be achieved.
Disclosure of Invention
The present disclosure provides an image processing method, apparatus, and storage medium.
According to a first aspect of an embodiment of the present disclosure, there is provided an image processing method including:
acquiring first spectral response data of a first camera and second spectral response data of a second camera;
acquiring radiation illuminance data of an optical signal to be detected;
determining a first pixel response output by the first camera and a second pixel response output by the second camera based on the radiation illuminance data of the optical signal to be detected, the first spectral response data and the second spectral response data, wherein the first pixel response is used for describing the intensity of the optical signal to be detected received by pixels of different color channels in the first camera, and the second pixel response is used for describing the intensity of the optical signal to be detected received by pixels of different color channels in the second camera;
and determining, based on the first pixel response and the second pixel response, pixel deviation amounts between the images output by the first camera and the second camera after receiving the optical signals to be detected in different spectral bands, wherein the pixel deviation amounts are used for correcting images acquired by the second camera when the first camera is switched to the second camera.
Optionally, the determining, based on the first pixel response and the second pixel response, a pixel deviation amount between output images of the first camera and the second camera after receiving the optical signals to be measured in different spectral bands includes:
obtaining a first deviation amount according to the difference value of the first pixel response and the second pixel response;
and determining the pixel deviation amount between the first camera and the second camera according to the first deviation amount.
Optionally, the determining the pixel deviation amount between the first camera and the second camera according to the first deviation amount includes:
determining the first amount of deviation as the amount of pixel deviation between the first camera and a second camera;
or,
determining, as the pixel deviation amount between the first camera and the second camera, the product of the first deviation amount and a brightness response gradient of the second camera at the current brightness, the brightness response gradient being determined based on the second pixel response and the brightness response of the second camera.
Optionally, the first spectral response data includes: a plurality of first sub-response curves corresponding to pixels of a plurality of different color channels in the first camera; the second spectral response data includes: a plurality of second sub-response curves corresponding to pixels of a plurality of different color channels in the second camera;
the determining, based on the radiation illuminance data of the optical signal to be detected, the first spectral response data and the second spectral response data, the first pixel response output by the first camera and the second pixel response output by the second camera includes:
determining gray level distribution curves corresponding to first signals output by the first camera and the second camera after receiving the radiation of the optical signal to be detected; the gray level distribution curve is used for describing the intensity of the optical signals to be detected in different spectral bands received by the camera;
Integrating products of gray level distribution curves corresponding to the first camera and a plurality of first sub-response curves corresponding to pixels of a plurality of different color channels in the first camera to determine a plurality of first pixel responses corresponding to the pixels of the plurality of different color channels in the first camera;
and carrying out integral processing on products of gray level distribution curves corresponding to the second camera and a plurality of second sub-response curves corresponding to pixels of a plurality of different color channels in the second camera, and determining a plurality of second pixel responses corresponding to the pixels of the plurality of different color channels in the second camera.
Optionally, the determining the gray distribution curve corresponding to the first signal output by the first camera and the second camera after receiving the radiation of the optical signal to be detected includes:
determining first light energy of incident light signals of the first camera and the second camera based on the radiation illuminance data of the light signals to be detected;
determining a plurality of second light energies respectively generated after the first camera and the second camera receive photons of a plurality of different frequencies in the optical signal to be detected;
determining photon numbers of photons with different frequencies received by the first camera and the second camera according to the ratio of the first light energy to the plurality of second light energy;
And determining gray level distribution curves corresponding to the first camera and the second camera based on photon numbers of photons with different frequencies received by the first camera and the second camera.
Optionally, the first pixel response includes: a plurality of first pixel responses corresponding to the pixels of the plurality of different color channels in the first camera; the second pixel response includes: a plurality of second pixel responses corresponding to the pixels of the plurality of different color channels in the second camera;
the obtaining a first deviation according to the difference between the first pixel response and the second pixel response includes:
and determining first deviation amounts corresponding to the pixels of the different color channels according to differences between a plurality of first pixel responses corresponding to the pixels of the different color channels in the first camera and a plurality of second pixel responses corresponding to the pixels of the different color channels in the second camera.
Optionally, the acquiring the radiation illuminance data of the optical signal to be measured includes:
measuring the optical signals to be measured to obtain the radiation illuminance of a plurality of channels with different wavelengths;
and estimating the radiation illuminance data of the optical signal to be detected based on the radiation illuminance of the plurality of different wavelength channels.
According to a second aspect of the embodiments of the present disclosure, there is provided an image processing apparatus including:
the acquisition module is used for acquiring first spectral response data of the first camera and second spectral response data of the second camera; acquiring radiation illuminance data of an optical signal to be detected;
the first determining module is used for determining a first pixel response output by the first camera and a second pixel response output by the second camera based on the radiation illuminance data of the optical signal to be detected, the first spectral response data and the second spectral response data, wherein the first pixel response is used for describing the intensity of the optical signal to be detected received by pixels of different color channels in the first camera, and the second pixel response is used for describing the intensity of the optical signal to be detected received by pixels of different color channels in the second camera;
the second determining module is configured to determine, based on the first pixel response and the second pixel response, a pixel deviation amount between output images of the first camera and the second camera after receiving the optical signals to be detected in different spectral bands, where the pixel deviation amount is used to correct an image acquired by the second camera when the first camera is switched to the second camera.
Optionally, the second determining module is configured to:
obtaining a first deviation amount according to the difference value of the first pixel response and the second pixel response;
and determining the pixel deviation amount between the first camera and the second camera according to the first deviation amount.
Optionally, the second determining module is configured to:
determining the first amount of deviation as the amount of pixel deviation between the first camera and a second camera;
or,
determining, as the pixel deviation amount between the first camera and the second camera, the product of the first deviation amount and a brightness response gradient of the second camera at the current brightness, the brightness response gradient being determined based on the second pixel response and the brightness response of the second camera.
Optionally, the first spectral response data includes: a plurality of first sub-response curves corresponding to pixels of a plurality of different color channels in the first camera; the second spectral response data includes: a plurality of second sub-response curves corresponding to pixels of a plurality of different color channels in the second camera;
the first determining module is configured to:
determining gray level distribution curves corresponding to first signals output by the first camera and the second camera after receiving the radiation of the optical signal to be detected; the gray level distribution curve is used for describing the intensity of the optical signals to be detected in different spectral bands received by the camera;
Integrating products of gray level distribution curves corresponding to the first camera and a plurality of first sub-response curves corresponding to pixels of a plurality of different color channels in the first camera to determine a plurality of first pixel responses corresponding to the pixels of the plurality of different color channels in the first camera;
and carrying out integral processing on products of gray level distribution curves corresponding to the second camera and a plurality of second sub-response curves corresponding to pixels of a plurality of different color channels in the second camera, and determining a plurality of second pixel responses corresponding to the pixels of the plurality of different color channels in the second camera.
Optionally, the first determining module is further configured to:
determining first light energy of incident light signals of the first camera and the second camera based on the radiation illuminance data of the light signals to be detected;
determining a plurality of second light energies respectively generated after the first camera and the second camera receive photons of a plurality of different frequencies in the optical signal to be detected;
determining photon numbers of photons with different frequencies received by the first camera and the second camera according to the ratio of the first light energy to the plurality of second light energy;
And determining gray level distribution curves corresponding to the first camera and the second camera based on photon numbers of photons with different frequencies received by the first camera and the second camera.
Optionally, the first pixel response includes: a plurality of first pixel responses corresponding to the pixels of the plurality of different color channels in the first camera; the second pixel response includes: a plurality of second pixel responses corresponding to the pixels of the plurality of different color channels in the second camera;
the second determining module is configured to:
and determining first deviation amounts corresponding to the pixels of the different color channels according to differences between a plurality of first pixel responses corresponding to the pixels of the different color channels in the first camera and a plurality of second pixel responses corresponding to the pixels of the different color channels in the second camera.
Optionally, the acquiring module is configured to:
measuring the optical signals to be measured to obtain the radiation illuminance of a plurality of channels with different wavelengths;
and estimating the radiation illuminance data of the optical signal to be detected based on the radiation illuminance of the plurality of different wavelength channels.
According to a third aspect of the embodiments of the present disclosure, there is provided an image processing apparatus including:
A processor;
a memory for storing executable instructions;
wherein the processor is configured to: when the executable instructions stored in the memory are executed, the steps in the image processing method according to the first aspect of the embodiments of the present disclosure are implemented.
According to a fourth aspect of embodiments of the present disclosure, there is provided a non-transitory computer-readable storage medium having stored thereon instructions which, when executed by a processor of an image processing apparatus, cause the image processing apparatus to perform the steps in the image processing method according to the first aspect of embodiments of the present disclosure.
The technical scheme provided by the embodiment of the disclosure can comprise the following beneficial effects:
According to the embodiment of the disclosure, the first pixel response of the first camera and the second pixel response of the second camera are determined by acquiring the first spectral response data of the first camera, the second spectral response data of the second camera and the radiation illuminance data of the optical signal to be detected. The first pixel response and the second pixel response reflect the intensities of the optical signal to be detected received by the pixels of the different color channels in the first camera and the second camera, so the pixel deviation amounts between the output images of the first camera and the second camera can be determined for those intensities.
The pixel deviation amounts reflect the color difference of each pixel point between the output images of the first camera and the second camera, so the output image of the second camera can be corrected using the pixel deviation amounts. This reduces the color difference between the pixel points of the output images of the two cameras, reduces color jumps and flicker of the preview picture during camera switching, and achieves smooth, natural switching between different cameras.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the disclosure and together with the description, serve to explain the principles of the disclosure.
Fig. 1 is a flow chart illustrating an image processing method according to an exemplary embodiment.
Fig. 2 is a second flowchart of an image processing method according to an exemplary embodiment.
Fig. 3 is a schematic diagram showing a spectral response curve of a camera according to an exemplary embodiment.
Fig. 4 is a schematic diagram showing an illuminance curve of an optical signal according to an exemplary embodiment.
Fig. 5 is a schematic diagram showing a brightness response curve of a camera according to an exemplary embodiment.
Fig. 6 is a flowchart illustrating a method of image processing according to an exemplary embodiment.
Fig. 7 is a schematic diagram illustrating a first image acquired by a primary camera according to an exemplary embodiment.
Fig. 8 is a schematic diagram illustrating a second image captured by a secondary camera according to an exemplary embodiment.
Fig. 9 is a schematic diagram showing spectral response curves of the primary and secondary cameras according to an exemplary embodiment.
Fig. 10 is a schematic diagram illustrating a spectral response curve of an ambient light signal according to an exemplary embodiment.
Fig. 11 is a schematic diagram illustrating a corrected third image according to an exemplary embodiment.
Fig. 12 is a schematic structural view of an image processing apparatus according to an exemplary embodiment.
Fig. 13 is a block diagram of an image processing apparatus according to an exemplary embodiment.
Detailed Description
Reference will now be made in detail to exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, the same numbers in different drawings refer to the same or similar elements, unless otherwise indicated. The implementations described in the following exemplary embodiments do not represent all implementations consistent with the present disclosure. Rather, they are merely examples of apparatus consistent with some aspects of the disclosure as detailed in the appended claims.
In order to meet shooting requirements of multiple application scenes, a terminal device is generally configured with a plurality of cameras; however, the spectral response curves of different cameras are different, i.e. the sensitivity of different cameras to optical signals of different wavelengths is different; the pixel values of the images output by different cameras under the radiation of the optical signals with the same color are different, so that the white balance parameters calculated by the different cameras are also different.
When the camera used for displaying the preview image is switched in the automatic white balance mode, the jump in white balance parameters between different cameras causes color jumps and flicker in the preview image, which degrades the user experience.
The following two methods are generally adopted in the related art to solve the above problems:
First, a pixel mapping model based on deep learning, which requires a large amount of data acquisition and training for each camera. On the one hand, because cameras differ in many respects, such as field of view, acquiring training data for the cameras is difficult; on the other hand, the resulting pixel mapping model is not universal, and data must be re-acquired and the model retrained every time a camera is replaced.
Second, white balance mapping based on semantic segmentation; however, this method depends excessively on the model, so the colors in the photographed image differ greatly from the colors seen by the naked eye.
Based on this, the embodiment of the present disclosure provides an image processing method. Fig. 1 is a schematic flow diagram of an image processing method according to an exemplary embodiment, and as shown in fig. 1, the method includes:
step S101, acquiring first spectral response data of a first camera and second spectral response data of a second camera;
step S102, obtaining radiation illuminance data of an optical signal to be detected;
step S103, determining a first pixel response output by the first camera and a second pixel response output by the second camera based on the radiation illuminance data of the optical signal to be detected, the first spectral response data and the second spectral response data, wherein the first pixel response is used for describing the intensity of the optical signal to be detected received by pixels of different color channels in the first camera, and the second pixel response is used for describing the intensity of the optical signal to be detected received by pixels of different color channels in the second camera;
Step S104, determining, based on the first pixel response and the second pixel response, a pixel deviation amount between the images output by the first camera and the second camera after receiving the optical signals to be detected in different spectral bands, where the pixel deviation amount is used to correct the image acquired by the second camera when the first camera is switched to the second camera.
In the embodiment of the disclosure, the image processing method may be performed by an image processing apparatus, which may be configured in a terminal device, where the terminal device has at least two cameras or is simultaneously connected to two or more cameras. Here, the terminal device may be a smartphone, a tablet computer, a wearable electronic device, or the like.
In step S101, the first spectral response data of the first camera includes at least a first spectral response curve of the first camera, and the second spectral response data of the second camera includes at least a second spectral response curve of the second camera.
The first camera and the second camera can be detected through the detection equipment, so that a first spectral response curve corresponding to the first camera and a second spectral response curve corresponding to the second camera are obtained.
Here, the detection device may be a monochromator. For example, to improve measurement accuracy, the monochromator may measure the first camera and the second camera multiple times to obtain multiple sets of measurement data; the averages of these sets are then determined as the first spectral response curve of the first camera and the second spectral response curve of the second camera.
In this embodiment, the spectral response curve refers to the curve relating the photocurrent generated when light of different wavelengths irradiates the image sensor to the wavelength, with the intensity of the incident light held constant; the spectral response curve can be used to describe the sensitivity of the image sensor within the camera to optical signals of different wavelengths.
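As a rough illustration of the multi-measurement averaging described above, the following Python sketch averages several monochromator sweeps into one spectral response curve per color channel; the function name and array layout are illustrative assumptions, not part of the disclosure:

```python
import numpy as np

def average_spectral_response(sweeps):
    """Average repeated monochromator sweeps of one camera.

    `sweeps` is a list of arrays of shape (channels, wavelengths), one array
    per measurement run, all sampled on the same wavelength grid. Returns the
    per-channel mean response, shape (channels, wavelengths).
    """
    return np.stack(sweeps, axis=0).mean(axis=0)
```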
The first camera and/or the second camera may be a wide-angle camera, a telephoto camera or a main camera; for example, when the first camera is the main camera, the second camera may be a telephoto camera.
It should be noted that a camera includes a lens and an image sensor; light is focused by the lens and projected onto the image sensor, and each photosensitive element in the image sensor produces a photoelectric effect and outputs an electrical signal. Because the parameters of the image sensors in different cameras are different, the spectral response curves corresponding to different cameras are also different.
In step S102, the radiation illuminance data includes at least a radiation illuminance curve; the optical signal to be detected can be an ambient light signal of an environment where the terminal equipment is located, and the radiation illuminance curve of the ambient light signal of the environment where the terminal equipment is located can be determined by performing scene analysis on the environment where the terminal equipment is located.
Here, the radiation illuminance of the optical signal to be detected refers to the radiant energy received per unit area per unit time on the surface of the object irradiated by the optical signal to be detected.
For example, the spectral power distribution (i.e. the irradiance curve) of the ambient light signal of the environment in which the terminal device is located may be obtained by analyzing the ambient light signal based on a pre-stored CIE standard illuminant spectral distribution.
The CIE standard illuminant spectral distributions are the spectral power distributions of a number of illuminants defined by the International Commission on Illumination (CIE) to describe the color of non-self-luminous objects and to provide reference spectra for colorimetric analysis.
In step S103, the first pixel response may be used to describe the intensities of the light signals to be measured received by the pixels of the different color channels in the first camera; the second pixel response may be used to describe the intensity of the light signal under test received by the pixels of the different color channels within the second camera.
The light energy of the light signals to be detected with different wavelengths received by the first camera and the second camera can be determined according to the radiation illuminance data of the light signals to be detected.
Because the wavelengths of the light signals to be detected which can be received by the pixels of the different color channels in the image sensor are different, the first pixel response of the first camera can be determined according to the first spectral response curve of the first camera and the light energy of the light signals to be detected with different wavelengths which are received by the first camera;
and determining a second pixel response of the second camera according to the second spectral response curve of the second camera and the light energy of the light signals to be detected with different wavelengths received by the second camera.
It should be noted that, in order to enable the image sensor to sense the intensities of optical signals of different colors (wavelengths), a mosaic filter containing only the three colors red, green and blue may be overlaid on the pixel surface of the image sensor. The filter blocks the components of the optical signal to be detected whose wavelengths differ from the color of the filter over that pixel, so that a single pixel receives only light of one color and senses its intensity.
The electrical signals output by the pixels of the different color channels in the first camera and the second camera when the photoelectric effect occurs can be determined from the light energy of the optical signals to be detected of different wavelengths received by the two cameras; the intensities of the optical signals to be detected of different colors received by the first camera and the second camera are then determined from these electrical signals.
In step S104, after the first pixel response of the first camera and the second pixel response of the second camera are determined, a first pixel value corresponding to the first pixel response and a second pixel value corresponding to the second pixel response may be determined according to the brightness responses of the first camera and the second camera; the pixel deviation amount is then determined according to the difference between the first pixel value and the second pixel value.
Here, the brightness response is a correspondence between the intensity of the optical signal to be detected perceived by the image sensor in the camera and the pixel value of the output image of the camera.
The first pixel value corresponding to the intensity of the light signal to be detected, which is received by the pixels of the different color channels in the first camera, can be determined according to the first pixel response and the brightness response of the first camera; and determining a second pixel value corresponding to the intensity of the light signal to be detected received by the pixels of the different color channels in the second camera according to the second pixel response and the brightness response of the second camera.
To ensure that switching between the two cameras is completed smoothly and naturally, the pixel deviation amount between the output images of the first camera and the second camera can be determined, and the output image of the second camera corrected by this pixel deviation amount, so that the corrected image keeps color consistency with the output image of the first camera. This reduces color jumps and flicker between the output images of the first camera and the second camera during switching, allowing the user to switch between cameras imperceptibly.
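To make the role of the brightness response concrete, here is a minimal sketch, assuming the brightness response is available as sampled (sensed intensity, pixel value) pairs; the use of `np.interp` and all names are illustrative choices, not prescribed by the disclosure:

```python
import numpy as np

def response_to_pixel_value(pixel_response, intensity_samples, pixel_value_samples):
    """Convert a sensed light intensity (pixel response) into an output pixel
    value by interpolating the camera's brightness response curve.

    `intensity_samples` must be increasing; `pixel_value_samples` holds the
    pixel value the camera outputs at each sampled intensity.
    """
    return np.interp(pixel_response, intensity_samples, pixel_value_samples)
```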
Optionally, the determining, in step S104, the pixel deviation amount between the output images after the first camera and the second camera receive the optical signals to be measured in different spectral bands based on the first pixel response and the second pixel response includes:
obtaining a first deviation amount according to the difference value of the first pixel response and the second pixel response;
and determining the pixel deviation amount between the first camera and the second camera according to the first deviation amount.
In the embodiment of the disclosure, the first deviation amount may be used to describe the difference in intensity of the light signals to be measured perceived by the pixels of the different color channels in the first camera and the second camera.
For the first camera and the second camera, the light energy of the optical signal to be detected incident on each is the same. However, because the first spectral response curve of the first camera differs from the second spectral response curve of the second camera, the two cameras have different sensitivities to optical signals to be detected of different wavelengths, and the pixels of different color channels in the two cameras perceive different intensities; as a result, even when the first camera and the second camera receive the same optical signal to be detected, the pixel values of their output images differ.
The first deviation amount may be determined based on the difference between the first pixel response of the first camera and the second pixel response of the second camera; the pixel deviation amount corresponding to the first deviation amount is then determined using the first deviation amount between the first camera and the second camera and the brightness responses of the first camera and the second camera.
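A minimal sketch of the per-channel first deviation amount, assuming the pixel responses have already been computed as one scalar per color channel (the dictionary layout is an assumption for illustration):

```python
def first_deviation(first_pixel_responses, second_pixel_responses):
    """First deviation amount per color channel: the difference between the
    first camera's and the second camera's pixel responses."""
    return {ch: first_pixel_responses[ch] - second_pixel_responses[ch]
            for ch in first_pixel_responses}

# e.g. first_deviation({"R": 0.82, "G": 0.91, "B": 0.64},
#                      {"R": 0.78, "G": 0.95, "B": 0.70})
# -> approximately {"R": 0.04, "G": -0.04, "B": -0.06}
```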
Optionally, the determining the pixel deviation amount between the first camera and the second camera according to the first deviation amount includes:
determining the first amount of deviation as the amount of pixel deviation between the first camera and a second camera;
or,
determining, as the pixel deviation amount between the first camera and the second camera, the product of the first deviation amount and a brightness response gradient of the second camera at the current brightness, the brightness response gradient being determined based on the second pixel response and the brightness response of the second camera.
In the embodiment of the disclosure, the first deviation amount may be used to describe the difference in intensity of the optical signal to be detected perceived by the pixels of the different color channels in the first camera and the second camera.
The pixel difference between the images acquired by the first camera and the second camera arises precisely because the two cameras have different sensitivities to optical signals to be detected of different wavelengths, so the pixels of the different color channels in the two cameras perceive the intensity of the optical signal to be detected differently; for an optical signal to be detected of the same color (same wavelength), the pixel value converted from the electrical signal output by the first camera therefore differs from that output by the second camera.
Therefore, in order to reduce the pixel deviation between the output images of the first camera and the second camera, the intensity differences (i.e., the first deviation amounts) of the optical signal to be detected perceived by the pixels of the different color channels in the two cameras can be used directly as the pixel deviation amounts of the corresponding color channels, to correct the pixel values of the different color channels of the RGB image acquired by the second camera.
It should be noted that RGB is a standard for representing colors in the digital domain, also referred to as a color space; each pixel value in an RGB image is represented by three components, R, G and B, and a particular color is represented by a combination of different luminance values of the three primary colors. If each component is represented by 8 bits, one pixel is represented by 3 × 8 = 24 bits in total.
Illustratively, the pixel deviation amount corresponding to the first color channel between the first camera and the second camera can be determined according to the first deviation amount corresponding to the pixels of the first color channel (such as R channel) in the first camera and the second camera;
determining a pixel deviation amount corresponding to a second color channel (such as a G channel) between the first camera and the second camera according to a first deviation amount corresponding to a pixel of the second color channel in the first camera and the second camera;
Determining a pixel deviation amount corresponding to a third color channel (such as a B channel) between the first camera and the second camera according to a first deviation amount corresponding to a pixel of the third color channel in the first camera and the second camera;
when the terminal equipment is switched from the first camera to the second camera, the pixel values corresponding to the first color channel, the second color channel and the third color channel of each pixel point in the acquired image of the second camera can be corrected based on the pixel deviation amount corresponding to the first color channel, the pixel deviation amount corresponding to the second color channel and the pixel deviation amount corresponding to the third color channel, so that a corrected image is obtained and output.
Considering that the parameters of the image sensors in the first camera and the second camera may have differences, the brightness responses of the first camera and the second camera may also have differences, that is, the image sensors in the first camera and the second camera sense that the intensities of the optical signals to be detected are the same, but the pixel values of the output images of the first camera and the second camera are different.
In this case, even if compensation for the difference in intensity of the light signal to be measured perceived by the pixels of the different color channels in the first camera and the second camera is achieved based on the amount of the first deviation, there may still be a difference in pixel between the output image of the second camera after correction and the output image of the first camera due to the difference in luminance response between the cameras.
In order to reduce pixel differences between output images of different cameras due to brightness response differences between cameras, the embodiment of the disclosure determines a brightness response gradient of the second camera in a current brightness environment according to brightness response of the second camera; the product of the luminance response gradient and the first amount of deviation is determined as the amount of pixel deviation between the first camera and the second camera output image.
It should be noted that, because the brightness response and the brightness response gradient of the second camera are the device parameter information of the second camera, the brightness response and the brightness response gradient of the second camera can be obtained by obtaining the device parameter information of the second camera.
The brightness of the current environment where the second camera is positioned can be determined according to the pixel value of the current acquired image of the second camera; determining a brightness response gradient corresponding to the brightness based on the brightness of the current environment where the second camera is positioned; and determining the product of the brightness response gradient corresponding to the brightness and the first deviation amount between the first camera and the second camera as the pixel deviation amount between the first camera and the second camera.
For example, a brightness response gradient corresponding to a first color channel (such as an R channel) of a second camera in a current brightness environment may be obtained, and a product of the brightness response gradient corresponding to the first color channel and a first deviation amount corresponding to the first color channel is determined as a pixel deviation amount corresponding to the first color channel between the first camera and the second camera;
Acquiring a brightness response gradient corresponding to a second color channel (such as a G channel) of a second camera in a current brightness environment, and determining the product of the brightness response gradient corresponding to the second color channel and a first deviation amount corresponding to the second color channel as a pixel deviation amount corresponding to the second color channel between a first camera and the second camera;
acquiring a brightness response gradient corresponding to a third color channel (such as a B channel) of a second camera in a current brightness environment, and determining the product of the brightness response gradient corresponding to the third color channel and a first deviation amount corresponding to the third color channel as a pixel deviation amount corresponding to the third color channel between the first camera and the second camera;
when the terminal equipment is switched from the first camera to the second camera, the pixel values corresponding to the first color channel, the second color channel and the third color channel of each pixel point in the acquired image of the second camera can be corrected based on the pixel deviation amount corresponding to the first color channel, the pixel deviation amount corresponding to the second color channel and the pixel deviation amount corresponding to the third color channel, so that a corrected image is obtained and output.
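Putting the two variants together, the following sketch corrects the second camera's RGB image channel by channel; the caller either passes the per-channel brightness response gradient (second variant) or omits it to use the first deviation amount directly (first variant). All names and the uint8 / 0-255 range are illustrative assumptions:

```python
import numpy as np

def correct_second_camera_image(image, first_dev, grad=None):
    """Apply the per-channel pixel deviation amount when switching from the
    first camera to the second camera.

    image     : H x W x 3 uint8 RGB image acquired by the second camera.
    first_dev : {"R": ..., "G": ..., "B": ...} first deviation amounts.
    grad      : optional per-channel brightness response gradient of the
                second camera at the current brightness.
    """
    grad = grad or {"R": 1.0, "G": 1.0, "B": 1.0}
    out = image.astype(np.float32)
    for idx, ch in enumerate(("R", "G", "B")):
        out[..., idx] += grad[ch] * first_dev[ch]  # pixel deviation amount
    return np.clip(out, 0, 255).astype(np.uint8)
```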
Optionally, the first spectral response data includes: a plurality of first sub-response curves corresponding to pixels of a plurality of different color channels in the first camera; the second spectral response data includes: a plurality of second sub-response curves corresponding to pixels of a plurality of different color channels in the second camera;
the determining, in step S103, the first pixel response output by the first camera and the second pixel response output by the second camera based on the radiation illuminance data of the optical signal to be detected, the first spectral response data and the second spectral response data, includes:
determining gray level distribution curves corresponding to first signals output by the first camera and the second camera after receiving the radiation of the optical signal to be detected; the gray level distribution curve is used for describing the intensity of the optical signals to be detected in different spectral bands received by the camera;
integrating products of gray level distribution curves corresponding to the first camera and a plurality of first sub-response curves corresponding to pixels of a plurality of different color channels in the first camera to determine a plurality of first pixel responses corresponding to the pixels of the plurality of different color channels in the first camera;
And carrying out integral processing on products of gray level distribution curves corresponding to the second camera and a plurality of second sub-response curves corresponding to pixels of a plurality of different color channels in the second camera, and determining a plurality of second pixel responses corresponding to the pixels of the plurality of different color channels in the second camera.
In the embodiment of the disclosure, the first signals output by the pixels of the channels of different colors of the first camera and the second camera under the radiation of the optical signals to be detected with different wavelengths are related to the optical energy of the optical signals to be detected with different wavelengths received by the first camera and the second camera.
The light energy of the light signals to be detected with different wavelengths received by the first camera and the second camera can be determined according to the radiation illuminance data of the light signals to be detected, and the signal values of the first signals output by the first camera and the second camera under the irradiation of the light signals to be detected with different wavelengths are respectively determined based on the light energy of the light signals to be detected with different wavelengths received by the first camera and the second camera; and determining a gray value corresponding to the signal value of the first signal based on the signal values of the first signals output by the first camera and the second camera.
Here, the first signal may be an electrical signal. It should be noted that, because each pixel in the image sensors in the first camera and the second camera generates photoelectric effect under the radiation of the optical signal to be detected, an electric signal capable of reflecting the intensity of the optical signal to be detected received by the pixel is output; and the gray data reflecting the intensity of the optical signal to be measured from the image angle is formed by carrying out analog-digital conversion on the electric signal output by the pixel.
It can be appreciated that the gray level distribution curve can be used to describe the intensity of the optical signal to be detected received by the camera in different spectral bands; if the gray value corresponding to a certain pixel point is larger, the signal value of the first signal output by the pixel point is larger, and the intensity of the optical signal to be detected received by the pixel point is also larger.
Because the sensitivity of the pixels of the plurality of different color channels in the camera to the optical signals to be detected with different wavelengths is different, the pixels of the single color channel can only receive the optical signals with the wavelength range corresponding to the color channel in the optical signals to be detected, and output the first signals for describing the optical signal intensity with the wavelength range corresponding to the color channel.
Therefore, when determining the first pixel response of the first camera, the product of each of the plurality of first sub-response curves corresponding to the pixels of the different color channels in the first camera and the gray level distribution curve reflecting the intensity of the optical signal to be detected received by the first camera is integrated, so as to obtain the first pixel responses describing the intensities of the optical signal to be detected received by the pixels of the different color channels in the first camera.
Likewise, when determining the second pixel response of the second camera, the product of each of the plurality of second sub-response curves corresponding to the pixels of the different color channels in the second camera and the gray level distribution curve reflecting the intensity of the optical signal to be detected received by the second camera is integrated, so as to obtain the second pixel responses describing the intensities of the optical signal to be detected received by the pixels of the different color channels in the second camera.
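For each color channel, this integration step reduces to integrating the product of two sampled curves over wavelength. A sketch using the trapezoidal rule, where the common sampling grid and all names are assumptions:

```python
import numpy as np

def pixel_response(gray_curve, sub_response_curve, wavelengths):
    """Integrate (gray level distribution curve x one channel's sub-response
    curve) over wavelength to obtain that channel's pixel response."""
    return np.trapz(gray_curve * sub_response_curve, wavelengths)

# one pixel response per color channel, e.g.:
# responses = {ch: pixel_response(gray, sub[ch], wl) for ch in ("R", "G", "B")}
```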
Optionally, the determining the gray distribution curve corresponding to the first signal output by the first camera and the second camera after receiving the radiation of the optical signal to be detected includes:
determining first light energy of incident light signals of the first camera and the second camera based on the radiation illuminance data of the light signals to be detected;
determining a plurality of second light energies respectively generated after the first camera and the second camera receive photons of a plurality of different frequencies in the optical signal to be detected;
determining photon numbers of photons with different frequencies received by the first camera and the second camera according to the ratio of the first light energy to the plurality of second light energy;
And determining gray level distribution curves corresponding to the first camera and the second camera based on photon numbers of photons with different frequencies received by the first camera and the second camera.
In the embodiment of the disclosure, the gray level distribution curves of the first camera and the second camera can be used to describe the signal intensity of the first signals output by the two cameras under the radiation of optical signals to be detected of different wavelengths. The signal intensity of the first signal is related to the number of photoelectrons generated when the photoelectric effect occurs in the first camera and the second camera under this radiation, and hence to the number of photons contained in the optical signals to be detected of different wavelengths.
It can be understood that the more photons the optical signals of different wavelengths contain, the more photoelectrons the camera generates under their radiation, and the greater the signal intensity of the first signal output by the camera. Therefore, the gray level distribution curves corresponding to the first camera and the second camera can be determined by obtaining the number of photons in the optical signals to be detected of different wavelengths.
The first light energy of the optical signals to be detected of different wavelengths received by the first camera and the second camera is determined from the radiation illuminance curve of the optical signal to be detected. Here, the first light energy of the optical signal to be detected of a given wavelength is the sum of the energies of the individual photons in that optical signal.
According to the frequency of each photon in the optical signal to be detected, respectively determining second light energy generated by the first camera and the second camera when receiving photons with different frequencies; here, the second light energy is energy generated by a single photon.
It should be noted that the energy generated by a single photon is related to the frequency of the photon, and the energy generated by photons of the same frequency is the same.
The number of photons of each wavelength received by the first camera and the second camera can then be determined from the ratio between the first light energy corresponding to the optical signals to be detected of different wavelengths and the second light energy of photons of the corresponding wavelengths.
In some embodiments, the gray values corresponding to the first signals output by the first camera and the second camera under the radiation of the optical signals to be detected with different wavelengths are in positive correlation with the photon numbers of photons corresponding to the different wavelengths received by the first camera and the second camera.
Here, the positive correlation coefficient between the gray value corresponding to the first signal output by the first camera and the second camera under the radiation of the optical signal to be measured with different wavelengths and the photon number of the photons corresponding to the different wavelengths received by the first camera and the second camera may be determined by the parameters of the image sensor in the first camera and the second camera (for example, the size, the limiting frequency, etc. of the photosensitive element in the image sensor).
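Numerically, the photon-counting argument above amounts to dividing the first light energy at each wavelength by the energy of a single photon of that wavelength (E = hc/λ), then scaling by a sensor-dependent coefficient. A sketch under those assumptions; the constant `k` is a stand-in for the unstated sensor parameters:

```python
import numpy as np

H = 6.626e-34  # Planck constant, J*s
C = 2.998e8    # speed of light, m/s

def photon_counts(first_light_energy, wavelengths_m):
    """Photon number per wavelength sample: the first light energy divided by
    the second light energy, i.e. the single-photon energy E = h*c/lambda."""
    second_light_energy = H * C / wavelengths_m
    return first_light_energy / second_light_energy

def gray_distribution_curve(first_light_energy, wavelengths_m, k=1.0):
    """Gray value taken as positively correlated with photon count; `k` stands
    in for the sensor-dependent correlation coefficient (an assumption)."""
    return k * photon_counts(first_light_energy, wavelengths_m)
```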
Optionally, the first pixel response includes: a plurality of first pixel responses corresponding to the pixels of the plurality of different color channels in the first camera; the second pixel response includes: a plurality of second pixel responses corresponding to the pixels of the plurality of different color channels in the second camera;
the obtaining a first deviation according to the difference between the first pixel response and the second pixel response includes:
and determining first deviation amounts corresponding to the pixels of the different color channels according to differences between a plurality of first pixel responses corresponding to the pixels of the different color channels in the first camera and a plurality of second pixel responses corresponding to the pixels of the different color channels in the second camera.
In the embodiment of the disclosure, according to the plurality of first pixel responses corresponding to the pixels of the different color channels in the first camera and the plurality of second pixel responses corresponding to the pixels of the different color channels in the second camera, the difference between the first pixel response and the second pixel response of the same color channel is determined, channel by channel, as the first deviation amount corresponding to the pixels of that color channel.
Illustratively, the first deviation amount corresponding to the first color channel may be determined according to a difference between a first pixel response corresponding to a pixel of the first color channel (e.g., R-channel) in the first camera and a second pixel response corresponding to a pixel of the first color channel in the second camera;
determining a first deviation amount corresponding to the second color channel according to a difference value between a first pixel response corresponding to a pixel of the second color channel (such as a G channel) in the first camera and a second pixel response corresponding to a pixel of the second color channel in the second camera;
determining a first deviation amount corresponding to a third color channel (such as a B channel) according to a difference value between a first pixel response corresponding to a pixel of the third color channel in the first camera and a second pixel response corresponding to a pixel of the third color channel in the second camera;
so as to respectively determine the pixel deviation amounts corresponding to the first color channel, the second color channel and the third color channel by using the first deviation amounts corresponding to the first color channel, the second color channel and the third color channel.
Optionally, the acquiring the radiation illuminance data of the optical signal to be measured includes:
measuring the optical signals to be measured to obtain the radiation illuminance of a plurality of channels with different wavelengths;
And estimating the radiation illuminance data of the optical signal to be detected based on the radiation illuminance of the plurality of different wavelength channels.
In the embodiment of the disclosure, an illuminometer or a color temperature sensor may be used to measure the optical signal to be detected, obtaining its spectral distribution over a plurality of different wavelength channels, and the radiation illuminance of the optical signal in each of the wavelength channels is determined from that spectral distribution.
The radiation illuminance of the optical signal may be obtained by superimposing the spectral distribution of the optical signal with its reflection coefficient.
From the radiation illuminance of the optical signal to be detected in the plurality of different wavelength channels, the radiation illuminance curve of the optical signal to be detected can be estimated by interpolation, as sketched below.
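For illustration, a sketch of this interpolation step, assuming hypothetical ten-channel sensor readings (the wavelengths and values below are invented for the example):

```python
import numpy as np

# Hypothetical ten-channel measurements: center wavelength (nm) of each
# channel and the radiation illuminance measured in that channel.
channel_wl = np.array([400, 440, 480, 520, 560, 600, 640, 680, 720, 760])
channel_p  = np.array([0.21, 0.35, 0.52, 0.71, 0.88, 0.95, 0.90, 0.74, 0.55, 0.38])

# Estimate a dense radiation illuminance curve over the visible band by
# linear interpolation between the measured channels.
wavelengths = np.arange(400, 761, 1)  # 1 nm grid
irradiance = np.interp(wavelengths, channel_wl, channel_p)
```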
The present disclosure also provides the following embodiments:
Fig. 2 is a second flowchart of an image processing method according to an exemplary embodiment, where the method includes:
step S201, acquiring first spectral response data of a first camera and second spectral response data of a second camera;
in this example, a monochromator may be utilized to measure the spectral response curves of the first camera and the second camera within the terminal device, and the results of multiple measurements are averaged to obtain a relatively accurate first spectral response curve of the first camera and second spectral response curve of the second camera, as sketched below.
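A small sketch of the averaging step (the sweep values are hypothetical placeholders):

```python
import numpy as np

# Hypothetical repeated monochromator sweeps of one camera's spectral
# response, shape (num_sweeps, num_wavelength_samples).
sweeps = np.array([
    [0.10, 0.42, 0.80, 0.51, 0.12],
    [0.11, 0.40, 0.82, 0.50, 0.13],
    [0.09, 0.41, 0.79, 0.52, 0.11],
])

# Averaging repeated sweeps suppresses measurement noise and yields a
# more accurate spectral response curve.
response_curve = sweeps.mean(axis=0)
```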
Here, the first spectral response data includes: a plurality of first sub-response curves corresponding to pixels of a plurality of different color channels in the first camera; the second spectral response data includes: and a plurality of second sub-response curves corresponding to the pixels of the different color channels in the second camera.
The camera comprises a lens and an image sensor; the incident light signal of the camera is focused into the image sensor through the lens, and each photosensitive element (namely, pixel) in the image sensor produces a photoelectric effect according to the intensity of the received light signal and outputs an electric signal that reflects the intensity, but not the color, of the received light signal.
In this regard, the red, green, and blue (R, Gr, Gb, B) components in the incident light signal may be separated by a Bayer image sensor (i.e., an image sensor covered by a color filter array), so that the photosensitive elements of the different color channels in the image sensor output electrical signals corresponding to the different color light signals according to the intensities of the received light of each color. As shown in fig. 3, fig. 3 is a schematic diagram illustrating a spectral response curve of a camera according to an exemplary embodiment.
Step S202, obtaining radiation illuminance data of an optical signal to be detected;
in this example, the optical signal to be measured may be an ambient light signal of the environment where the terminal device is located; by performing scene analysis on the ambient light signal, the spectral distribution of the ambient light signal may be determined from the prestored CIE standard illuminant spectra, and the radiation illuminance curve of the ambient light signal determined from it.
It should be noted that, the irradiance curve of the optical signal may be formed by overlapping the spectral distribution and the reflection coefficient of the optical signal, and the irradiance curve should be the same for the same scene. As shown in fig. 4, fig. 4 is a schematic diagram illustrating an irradiance profile of an optical signal according to an exemplary embodiment.
Step S203, determining first light energy of the incident light signals of the first camera and the second camera based on the radiation illuminance data of the optical signal to be detected; and determining a plurality of second light energies respectively generated by the first camera and the second camera after receiving photons of a plurality of different frequencies in the optical signal to be detected;
in this example, the photoelectric effect may be used to determine the light energy corresponding to each spectral band based on the irradiance profile of the ambient light signal.
It should be noted that the camera includes a lens and a photoelectric sensor. For the photoelectric sensor, the light energy received by each pixel point from the incident light signal is proportional to the radiation illuminance of the incident light signal, with proportionality coefficient k₁; the light energy is:
E = k₁P;
wherein E is the light energy received by each pixel point of the photoelectric sensor; P is the radiation illuminance of the ambient light signal, and the radiation illuminance corresponding to ambient light signals of different spectral bands can differ; and the proportionality coefficient k₁ is related to the size and process of the pixel sites within the photosensor.
The second light energy is energy generated by the first camera and the second camera receiving one photon.
It should be noted that, as known from the photoelectric effect, a photon of frequency γ striking a photosensitive element with cutoff frequency γ₀ transfers to the electrons the energy:
W = h(γ − γ₀);
wherein W is the energy contributed by a single photon; h is the Planck constant; γ is the photon frequency; and γ₀ is the cutoff frequency of the photosensitive element within the image sensor.
Step S204, determining the numbers of photons of different frequencies received by the first camera and the second camera according to the ratio of the first light energy to the plurality of second light energies; and determining gray level distribution curves corresponding to the first camera and the second camera based on the numbers of photons of different frequencies received by each camera;
It should be noted that light is focused and projected into the image sensor through the lens; each photosensitive element in the image sensor generates a photoelectric effect according to the intensity of the light it receives and outputs an electrical signal, which is converted into a digital quantity through analog-to-digital conversion; this digital quantity is called the gray scale or gray level.
For an image sensor, the gray scale converted from the stimulated electrical signal is proportional, within the measurement range, to the number of photons it receives:
Q(λ) = k₂n = kP(λ)λ;
wherein Q(λ) is the gray scale corresponding to the optical signal of wavelength λ; k₂ is a proportionality coefficient; n is the number of photons contained in the optical signal of wavelength λ; and k = k₁k₂/(hc), where c is the speed of light.
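Steps S203 and S204 can be combined into one illustrative sketch; k₁, k₂ and the cutoff wavelength below are hypothetical placeholder values, and the function name is invented for the example:

```python
import numpy as np

H = 6.62607015e-34  # Planck constant (J*s)
C = 2.99792458e8    # speed of light (m/s)

def gray_distribution(irradiance, wavelengths_nm, k1=1.0, k2=1.0,
                      cutoff_nm=1100.0):
    """Gray level Q(lambda): first light energy E = k1*P per pixel point,
    second light energy W = h*(gamma - gamma0) per photon, photon number
    n = E / W from the energy ratio, and gray scale Q = k2 * n."""
    wl = wavelengths_nm * 1e-9          # wavelengths in meters
    gamma = C / wl                      # photon frequency of each band
    gamma0 = C / (cutoff_nm * 1e-9)     # cutoff frequency of the photosite
    energy = k1 * irradiance            # E = k1 * P
    work = H * (gamma - gamma0)         # W = h * (gamma - gamma0)
    photons = energy / work             # n = E / W
    return k2 * photons                 # gray scale proportional to n
```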
Step S205, performing integral processing on products of gray distribution curves corresponding to the first camera and a plurality of first sub-response curves corresponding to pixels of a plurality of different color channels in the first camera, and determining a plurality of first pixel responses corresponding to the pixels of the plurality of different color channels in the first camera; and carrying out integral processing on products of gray level distribution curves corresponding to the second camera and a plurality of second sub-response curves corresponding to pixels of a plurality of different color channels in the second camera, and determining a plurality of second pixel responses corresponding to the pixels of the plurality of different color channels in the second camera.
In this example, since the pixels of the different color channels in the image sensors of the first camera and the second camera do not receive all photons, the pixel responses of the pixels of the different color channels in the first camera and the second camera may be determined according to the spectral response curves corresponding to the pixels of the different color channels in the first camera and the second camera.
Here, the pixel response may be used to describe the intensity of the incident light signal received by the pixels of each of the different color channels within the camera.
The pixel response of a pixel x of each color channel is:
X = ∫Q(λ)Rₓ(λ)dλ;
wherein X is the pixel response corresponding to pixel x; Q(λ) is the gray scale distribution converted from the electrical signal stimulated in the image sensor; and Rₓ(λ) is the spectral response corresponding to pixel x.
Because the spectral response curve and the gray scale distribution are both sampled numerically, the pixel response corresponding to pixel x of each color channel can be rewritten as the numerical integration formula (see the sketch below):
X = KΔλ∑P(λ)Rₓ(λ)λ;
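A sketch of this numerical integration, reusing the irradiance and response arrays from the earlier sketches (K stands for the lumped constant and defaults to a hypothetical 1.0):

```python
import numpy as np

def pixel_response(irradiance, response_curve, wavelengths_nm, K=1.0):
    """Numerical form of X = K * dlambda * sum(P(l) * R_x(l) * l)."""
    dlam = wavelengths_nm[1] - wavelengths_nm[0]  # uniform wavelength step
    return K * dlam * np.sum(irradiance * response_curve * wavelengths_nm)
```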
step S206, obtaining a first deviation according to the difference between the first pixel response and the second pixel response; and determining a product of a brightness response gradient of the second camera at the current brightness, which is determined based on the second pixel response and the brightness response of the second camera, and the first deviation amount as the pixel deviation amount between the first camera and the second camera.
Here, the pixel deviation amount is used for correcting an image acquired by the second camera when switching from the first camera to the second camera.
For a camera, a nonlinear brightness mapping relationship exists between the brightness of an incident light signal measured by an image sensor in the camera and the pixel value (namely RGB value) of a finally output image due to the compression mapping of the image; as shown in fig. 5, fig. 5 is a schematic diagram showing a brightness response curve of a camera according to an exemplary embodiment.
Here, the brightness response of the camera may be expressed as:
Iₓ = f(X);
wherein Iₓ is the pixel value; and f(·) is the brightness response of the camera.
Thus, the pixel value of the image that the camera ultimately outputs can be expressed as:
Iₓ = f(KΔλ∑P(λ)Rₓ(λ)λ);
the amount of pixel shift between the image captured by the first camera and the image captured by the second camera:
△I X =I X -I X ′=f(X)-f(X′);
wherein the DeltaI X Is the pixel deviation amount; the I is X Outputting pixel values of an image for a first camera, the I X ' is the pixel value of the second camera output image; and X is the pixel response of the first camera, and X' is the pixel response of the second camera.
If the terminal device is switched from the first camera to the second camera, the influence of the difference between the photosensors under the current brightness environment can be accounted for as follows: the brightness response gradient of the second camera (the camera being switched to) at the current brightness is determined from the brightness response curve of the second camera, and the pixel deviation amount between the two cameras is then determined from it.
That is, the pixel deviation amount between the image captured by the first camera and the image captured by the second camera is approximately:
ΔIₓ ≈ a(Iₓ′)(X − X′) = a(Iₓ′)KΔλ∑P(λ)[Rₓ(λ) − Rₓ′(λ)]λ;
wherein a(Iₓ′) is the brightness response gradient of the second camera under the current brightness environment; since an a priori estimate of a(Iₓ′) exists, it can be obtained by table lookup. P(λ) is the radiation illuminance of the incident light signal, which can be obtained by a color temperature sensor and does not differ between cameras. Rₓ(λ) is the first spectral response corresponding to pixel x of the first camera, Rₓ′(λ) is the second spectral response corresponding to pixel x of the second camera, and Rₓ(λ) − Rₓ′(λ) can be obtained a priori by measurement.
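A sketch of this deviation estimate; the 256-entry gradient lookup table standing in for the a priori estimate of a(Iₓ′) is a hypothetical construct:

```python
import numpy as np

def pixel_deviation(irradiance, r_first, r_second, wavelengths_nm,
                    gradient_lut, pixel_value_second, K=1.0):
    """Delta I_x ~= a(I_x') * K * dlambda * sum(P * (R_x - R_x') * l),
    with a(I_x') obtained by table lookup on the second camera's
    pixel value."""
    dlam = wavelengths_nm[1] - wavelengths_nm[0]
    first_deviation = K * dlam * np.sum(
        irradiance * (r_first - r_second) * wavelengths_nm)
    a = gradient_lut[int(pixel_value_second)]  # a priori gradient a(I_x')
    return a * first_deviation
```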
Therefore, a more accurate pixel deviation amount can be determined by selecting a suitable color temperature sensor to measure the radiation illuminance of the incident light signals of the first camera and the second camera; when the terminal device switches from the first camera to the second camera, the image acquired by the second camera is corrected using the pixel deviation amount, and the corrected image is output.
The white balance of the image after pixel deviation correction matches that of the image acquired by the first camera, so the flicker caused by differing white balance parameters between cameras can be effectively reduced, and the transition between different cameras becomes smoother and more natural, as in the sketch below.
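A final sketch of applying per-channel pixel deviation amounts (for example, those produced by the per-channel sketch above) to the second camera's image at the moment of switching; the RGB channel order and the 8-bit range are assumptions:

```python
import numpy as np

def correct_on_switch(second_image, deviations):
    """Add the per-channel pixel deviation amount to the image acquired by
    the second camera, clamping the result to the 8-bit range."""
    corrected = second_image.astype(np.float32)
    for idx, ch in enumerate(("R", "G", "B")):  # assumed channel order
        corrected[..., idx] += deviations[ch]
    return np.clip(corrected, 0, 255).astype(np.uint8)
```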
Illustratively, as shown in fig. 6, fig. 6 is a third flowchart of an image processing method according to an exemplary embodiment.
Step S301, determining whether an illuminometer exists; if there is no illuminometer, execute step S302; if there is an illuminometer, execute step S303;
step S302, utilizing scene analysis, determining the spectral distribution of an ambient light signal based on a prestored CIE standard light source spectrum;
step S303, measuring the spectrum of ten wavelength channels in real time by using an illuminometer, and estimating the global spectrum of the ambient light signal by interpolation (see the sketch after this step list);
step S304, collecting a spectral response curve of the main camera;
step S305, collecting a spectrum response curve of the auxiliary camera;
step S306, according to the spectral response curves of the main camera and the auxiliary camera and the spectral distribution of the ambient light signals, determining the brightness of the ambient light signals received by the pixels of the channels of different colors of the main camera and the auxiliary camera;
step S307, calculating white balance reference points of the main camera and the auxiliary camera;
step S308, estimating the brightness response gradient of the secondary camera in the current brightness environment according to the brightness response curve of the secondary camera;
step S309, determining the pixel offset of the secondary camera in the current brightness environment.
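The branch in steps S301-S303 can be sketched as follows; the inputs are hypothetical stand-ins for the ten-channel illuminometer readings and the prestored CIE standard illuminant curve:

```python
import numpy as np

def estimate_ambient_spectrum(channel_data=None, cie_spectrum=None):
    """If ten-channel illuminometer readings are available, interpolate
    them into a global spectrum (step S303); otherwise fall back to a
    prestored CIE standard illuminant spectrum selected by scene
    analysis (step S302)."""
    grid = np.arange(400, 761, 1.0)  # 1 nm wavelength grid
    if channel_data is not None:
        wl, p = channel_data
        return grid, np.interp(grid, wl, p)
    return grid, cie_spectrum        # curve assumed sampled on the grid
```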
In order to better present the optimization effect of the image processing method on the image, the main camera and the auxiliary camera of the terminal device may acquire images of the same scene, as shown in fig. 7 and fig. 8; fig. 7 is a schematic diagram of a first image acquired by the main camera according to an exemplary embodiment, and fig. 8 is a schematic diagram of a second image acquired by the auxiliary camera according to an exemplary embodiment. The spectral response curves of the pixels of the different color channels of the main camera and the auxiliary camera are acquired; as shown in fig. 9, fig. 9 is a schematic diagram of the spectral response curves of the primary and secondary cameras according to an exemplary embodiment. The spectral distribution of the ambient light signal of the environment where the terminal device is located is acquired through a color temperature sensor; as shown in fig. 10, fig. 10 is a schematic diagram of the spectral distribution of an ambient light signal according to an exemplary embodiment.
The pixel offset of the auxiliary camera under the current brightness environment is determined according to the spectral responses of the pixels of the different color channels of the main camera and the auxiliary camera and the spectral distribution of the ambient light signal of the environment where the terminal device is located, and the second image acquired by the auxiliary camera is corrected using the pixel offset to obtain a third image. As shown in fig. 11, fig. 11 is a schematic diagram of the corrected third image according to an exemplary embodiment.
The embodiment of the disclosure also provides an image processing device. Fig. 12 is a schematic structural view of an image processing apparatus according to an exemplary embodiment, and as shown in fig. 12, the image processing apparatus 100 includes:
an acquiring module 101, configured to acquire first spectral response data of a first camera and second spectral response data of a second camera; acquiring radiation illuminance data of an optical signal to be detected;
a first determining module 102, configured to determine, based on the radiation illuminance data of the optical signal to be detected, the first spectral response data, and the second spectral response data, a first pixel response output by the first camera and a second pixel response output by the second camera, where the first pixel response is used to describe the intensities of the optical signal to be detected received by pixels of different color channels in the first camera, and the second pixel response is used to describe the intensities of the optical signal to be detected received by pixels of different color channels in the second camera;
the second determining module 103 is configured to determine, based on the first pixel response and the second pixel response, a pixel deviation amount between output images of the first camera and the second camera after receiving the optical signals to be tested in different spectral bands, where the pixel deviation amount is used to correct an image acquired by the second camera when the first camera is switched to the second camera.
Optionally, the second determining module 103 is configured to:
obtaining a first deviation amount according to the difference value of the first pixel response and the second pixel response;
and determining the pixel deviation amount between the first camera and the second camera according to the first deviation amount.
Optionally, the second determining module 103 is configured to:
determining the first deviation amount as the pixel deviation amount between the first camera and the second camera;
or,
determining, as the pixel deviation amount between the first camera and the second camera, the product of the first deviation amount and the brightness response gradient of the second camera at the current brightness, the gradient being determined based on the second pixel response and the brightness response of the second camera.
Optionally, the first spectral response data includes: a plurality of first sub-response curves corresponding to pixels of a plurality of different color channels in the first camera; the second spectral response data includes: a plurality of second sub-response curves corresponding to pixels of a plurality of different color channels in the second camera;
the first determining module 102 is configured to:
determining gray level distribution curves corresponding to first signals output by the first camera and the second camera after the first camera and the second camera receive the optical signal to be detected; the gray level distribution curve is used for describing the intensities of the optical signal to be detected in different spectral bands received by the camera;
integrating products of gray level distribution curves corresponding to the first camera and a plurality of first sub-response curves corresponding to pixels of a plurality of different color channels in the first camera to determine a plurality of first pixel responses corresponding to the pixels of the plurality of different color channels in the first camera;
and carrying out integral processing on products of gray level distribution curves corresponding to the second camera and a plurality of second sub-response curves corresponding to pixels of a plurality of different color channels in the second camera, and determining a plurality of second pixel responses corresponding to the pixels of the plurality of different color channels in the second camera.
Optionally, the first determining module 102 is further configured to:
determining first light energy of incident light signals of the first camera and the second camera based on the radiation illuminance data of the light signals to be detected;
determining a plurality of second light energies respectively generated by the first camera and the second camera after receiving photons of a plurality of different frequencies in the optical signal to be detected;
determining the numbers of photons of different frequencies received by the first camera and the second camera according to the ratio of the first light energy to the plurality of second light energies;
and determining gray level distribution curves corresponding to the first camera and the second camera based on the numbers of photons of different frequencies received by each camera.
Optionally, the first pixel response includes: a plurality of first pixel responses corresponding to the pixels of the plurality of different color channels in the first camera; the second pixel response includes: a plurality of second pixel responses corresponding to the pixels of the plurality of different color channels in the second camera;
the second determining module 103 is configured to:
and determining first deviation amounts corresponding to the pixels of the different color channels according to differences between a plurality of first pixel responses corresponding to the pixels of the different color channels in the first camera and a plurality of second pixel responses corresponding to the pixels of the different color channels in the second camera.
Optionally, the acquiring module 101 is configured to:
measuring the optical signals to be measured to obtain the radiation illuminance of a plurality of channels with different wavelengths;
and estimating the radiation illuminance data of the optical signal to be detected based on the radiation illuminance of the plurality of different wavelength channels.
Fig. 13 is a block diagram of an image processing apparatus according to an exemplary embodiment. For example, the device 800 may be a mobile phone, mobile computer, or the like.
Referring to fig. 13, apparatus 800 may include one or more of the following components: a processing component 802, a memory 804, a power component 806, a multimedia component 808, an audio component 810, an input/output (I/O) interface 812, a sensor component 814, and a communication component 816.
The processing component 802 generally controls overall operation of the apparatus 800, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations. The processing component 802 may include one or more processors 820 to execute instructions to perform all or part of the steps of the methods described above. Further, the processing component 802 can include one or more modules that facilitate interactions between the processing component 802 and other components. For example, the processing component 802 can include a multimedia module to facilitate interaction between the multimedia component 808 and the processing component 802.
The memory 804 is configured to store various types of data to support operations at the device 800. Examples of such data include instructions for any application or method operating on the device 800, contact data, phonebook data, messages, pictures, videos, and the like. The memory 804 may be implemented by any type or combination of volatile or nonvolatile memory devices such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disk.
The power supply component 806 provides power to the various components of the device 800. The power components 806 may include a power management system, one or more power sources, and other components associated with generating, managing, and distributing power for the device 800.
The multimedia component 808 includes a screen between the device 800 and the user that provides an output interface. In some embodiments, the screen may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive input signals from a user. The touch panel includes one or more touch sensors to sense touches, swipes, and gestures on the touch panel. The touch sensor may sense not only the boundary of a touch or slide action, but also the duration and pressure associated with the touch or slide operation. In some embodiments, the multimedia component 808 includes a front camera and/or a rear camera. The front camera and/or the rear camera may receive external multimedia data when the device 800 is in an operational mode, such as a shooting mode or a video mode. Each front camera and rear camera may be a fixed optical lens system or have focal length and optical zoom capabilities.
The audio component 810 is configured to output and/or input audio signals. For example, the audio component 810 includes a Microphone (MIC) configured to receive external audio signals when the device 800 is in an operational mode, such as a call mode, a recording mode, and a voice recognition mode. The received audio signals may be further stored in the memory 804 or transmitted via the communication component 816. In some embodiments, audio component 810 further includes a speaker for outputting audio signals.
The I/O interface 812 provides an interface between the processing component 802 and peripheral interface modules, which may be a keyboard, click wheel, buttons, etc. These buttons may include, but are not limited to: homepage button, volume button, start button, and lock button.
The sensor assembly 814 includes one or more sensors for providing status assessments of various aspects of the apparatus 800. For example, the sensor assembly 814 may detect an on/off state of the apparatus 800 and the relative positioning of components, such as the display and keypad of the apparatus 800; it may also detect a change in position of the apparatus 800 or of one of its components, the presence or absence of user contact with the apparatus 800, the orientation or acceleration/deceleration of the apparatus 800, and a change in its temperature. The sensor assembly 814 may include a proximity sensor configured to detect the presence of nearby objects without any physical contact. The sensor assembly 814 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor assembly 814 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 816 is configured to facilitate communication between the apparatus 800 and other devices, either wired or wireless. The apparatus 800 may access a wireless network based on a communication standard, such as Wi-Fi, 2G, or 3G, or a combination thereof. In one exemplary embodiment, the communication component 816 receives broadcast signals or broadcast-related information from an external broadcast management system via a broadcast channel. In one exemplary embodiment, the communication component 816 further includes a Near Field Communication (NFC) module to facilitate short-range communications. For example, the NFC module may be implemented based on Radio Frequency Identification (RFID) technology, Infrared Data Association (IrDA) technology, Ultra Wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the apparatus 800 may be implemented by one or more Application Specific Integrated Circuits (ASICs), digital Signal Processors (DSPs), digital Signal Processing Devices (DSPDs), programmable Logic Devices (PLDs), field Programmable Gate Arrays (FPGAs), controllers, microcontrollers, microprocessors, or other electronic elements for executing the methods described above.
In an exemplary embodiment, a non-transitory computer readable storage medium is also provided, such as memory 804 including instructions executable by processor 820 of apparatus 800 to perform the above-described method. For example, the non-transitory computer readable storage medium may be ROM, random Access Memory (RAM), CD-ROM, magnetic tape, floppy disk, optical data storage device, etc.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This disclosure is intended to cover any variations, uses, or adaptations of the disclosure following the general principles thereof and including such departures from the present disclosure as come within known or customary practice in the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It is to be understood that the present disclosure is not limited to the precise arrangements and instrumentalities shown in the drawings, and that various modifications and changes may be effected without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.

Claims (16)

1. An image processing method, the method comprising:
acquiring first spectral response data of a first camera and second spectral response data of a second camera;
acquiring radiation illuminance data of an optical signal to be detected;
determining a first pixel response output by the first camera and a second pixel response output by the second camera based on the radiation illuminance data of the optical signal to be detected, the first spectral response data and the second spectral response data, wherein the first pixel response is used for describing the intensities of the optical signal to be detected received by pixels of different color channels in the first camera, and the second pixel response is used for describing the intensities of the optical signal to be detected received by pixels of different color channels in the second camera;
and determining pixel deviation amounts between output images after the first camera and the second camera receive the optical signals to be detected in different spectral frequency bands based on the first pixel response and the second pixel response, wherein the pixel deviation amounts are used for correcting images acquired by the second camera when the first camera is switched to the second camera.
2. The method of claim 1, wherein determining the amount of pixel deviation between the output images of the first and second cameras after receiving the light signals under test in different spectral bands based on the first and second pixel responses comprises:
obtaining a first deviation amount according to the difference value of the first pixel response and the second pixel response;
and determining the pixel deviation amount between the first camera and the second camera according to the first deviation amount.
3. The method of claim 2, wherein the determining the amount of pixel offset between the first camera and the second camera from the first amount of offset comprises:
determining the first deviation amount as the pixel deviation amount between the first camera and the second camera;
or,
determining, as the pixel deviation amount between the first camera and the second camera, the product of the first deviation amount and the brightness response gradient of the second camera at the current brightness, the gradient being determined based on the second pixel response and the brightness response of the second camera.
4. The method of claim 1, wherein the first spectral response data comprises: a plurality of first sub-response curves corresponding to pixels of a plurality of different color channels in the first camera; the second spectral response data includes: a plurality of second sub-response curves corresponding to pixels of a plurality of different color channels in the second camera;
the determining, based on the illuminance data of the optical signal to be measured and the first spectral response data and the second spectral response data, the first pixel response output by the first camera and the second pixel response output by the second camera includes:
determining gray level distribution curves corresponding to first signals output by the first camera and the second camera after the first camera and the second camera receive the optical signal to be detected; the gray level distribution curve is used for describing the intensities of the optical signal to be detected in different spectral bands received by the camera;
integrating products of gray level distribution curves corresponding to the first camera and a plurality of first sub-response curves corresponding to pixels of a plurality of different color channels in the first camera to determine a plurality of first pixel responses corresponding to the pixels of the plurality of different color channels in the first camera;
and carrying out integral processing on products of gray level distribution curves corresponding to the second camera and a plurality of second sub-response curves corresponding to pixels of a plurality of different color channels in the second camera, and determining a plurality of second pixel responses corresponding to the pixels of the plurality of different color channels in the second camera.
5. The method according to claim 4, wherein determining the gray distribution curve corresponding to the first signal output by the first camera and the second camera after receiving the radiation of the optical signal to be measured includes:
determining first light energy of incident light signals of the first camera and the second camera based on the radiation illuminance data of the light signals to be detected;
determining a plurality of second light energies respectively generated by the first camera and the second camera after receiving photons of a plurality of different frequencies in the optical signal to be detected;
determining the numbers of photons of different frequencies received by the first camera and the second camera according to the ratio of the first light energy to the plurality of second light energies;
and determining gray level distribution curves corresponding to the first camera and the second camera based on the numbers of photons of different frequencies received by each camera.
6. The method of claim 2, wherein the first pixel response comprises: a plurality of first pixel responses corresponding to the pixels of the plurality of different color channels in the first camera; the second pixel response includes: a plurality of second pixel responses corresponding to the pixels of the plurality of different color channels in the second camera;
the obtaining a first deviation according to the difference between the first pixel response and the second pixel response includes:
and determining first deviation amounts corresponding to the pixels of the different color channels according to differences between a plurality of first pixel responses corresponding to the pixels of the different color channels in the first camera and a plurality of second pixel responses corresponding to the pixels of the different color channels in the second camera.
7. The method of claim 1, wherein the obtaining irradiance data for the optical signal under test comprises:
measuring the optical signals to be measured to obtain the radiation illuminance of a plurality of channels with different wavelengths;
and estimating the radiation illuminance data of the optical signal to be detected based on the radiation illuminance of the plurality of different wavelength channels.
8. An image processing apparatus, comprising:
the acquisition module is used for acquiring first spectral response data of the first camera and second spectral response data of the second camera; acquiring radiation illuminance data of an optical signal to be detected;
the first determining module is used for determining a first pixel response output by the first camera and a second pixel response output by the second camera based on the radiation illuminance data of the optical signal to be detected, the first spectral response data and the second spectral response data, wherein the first pixel response is used for describing the intensities of the optical signal to be detected received by pixels of different color channels in the first camera, and the second pixel response is used for describing the intensities of the optical signal to be detected received by pixels of different color channels in the second camera;
the second determining module is used for determining, based on the first pixel response and the second pixel response, pixel deviation amounts between output images of the first camera and the second camera after receiving the optical signals to be detected in different spectral bands, wherein the pixel deviation amounts are used for correcting images acquired by the second camera when the first camera is switched to the second camera.
9. The apparatus of claim 8, wherein the second determining module is configured to:
obtaining a first deviation amount according to the difference value of the first pixel response and the second pixel response;
and determining the pixel deviation amount between the first camera and the second camera according to the first deviation amount.
10. The apparatus of claim 9, wherein the second determining module is configured to:
determining the first deviation amount as the pixel deviation amount between the first camera and the second camera;
or,
determining, as the pixel deviation amount between the first camera and the second camera, the product of the first deviation amount and the brightness response gradient of the second camera at the current brightness, the gradient being determined based on the second pixel response and the brightness response of the second camera.
11. The apparatus of claim 8, wherein the first spectral response data comprises: a plurality of first sub-response curves corresponding to pixels of a plurality of different color channels in the first camera; the second spectral response data includes: a plurality of second sub-response curves corresponding to pixels of a plurality of different color channels in the second camera;
the first determining module is configured to:
determining gray level distribution curves corresponding to first signals output by the first camera and the second camera after the first camera and the second camera receive the optical signal to be detected; the gray level distribution curve is used for describing the intensities of the optical signal to be detected in different spectral bands received by the camera;
integrating products of gray level distribution curves corresponding to the first camera and a plurality of first sub-response curves corresponding to pixels of a plurality of different color channels in the first camera to determine a plurality of first pixel responses corresponding to the pixels of the plurality of different color channels in the first camera;
and carrying out integral processing on products of gray level distribution curves corresponding to the second camera and a plurality of second sub-response curves corresponding to pixels of a plurality of different color channels in the second camera, and determining a plurality of second pixel responses corresponding to the pixels of the plurality of different color channels in the second camera.
12. The apparatus of claim 11, wherein the first determining module is further configured to:
determining first light energy of incident light signals of the first camera and the second camera based on the radiation illuminance data of the light signals to be detected;
determining a plurality of second light energies respectively generated by the first camera and the second camera after receiving photons of a plurality of different frequencies in the optical signal to be detected;
determining the numbers of photons of different frequencies received by the first camera and the second camera according to the ratio of the first light energy to the plurality of second light energies;
and determining gray level distribution curves corresponding to the first camera and the second camera based on the numbers of photons of different frequencies received by each camera.
13. The apparatus of claim 9, wherein the first pixel response comprises: a plurality of first pixel responses corresponding to the pixels of the plurality of different color channels in the first camera; the second pixel response includes: a plurality of second pixel responses corresponding to the pixels of the plurality of different color channels in the second camera;
The second determining module is configured to:
and determining first deviation amounts corresponding to the pixels of the different color channels according to differences between a plurality of first pixel responses corresponding to the pixels of the different color channels in the first camera and a plurality of second pixel responses corresponding to the pixels of the different color channels in the second camera.
14. The apparatus of claim 8, wherein the acquisition module is configured to:
measuring the optical signals to be measured to obtain the radiation illuminance of a plurality of channels with different wavelengths;
and estimating the radiation illuminance data of the optical signal to be detected based on the radiation illuminance of the plurality of different wavelength channels.
15. An image processing apparatus, comprising:
A processor;
a memory for storing executable instructions;
wherein the processor is configured to execute the executable instructions stored in the memory to implement the image processing method of any one of claims 1-7.
16. A non-transitory computer readable storage medium having stored thereon instructions that, when executed by a processor of an image processing apparatus, cause the image processing apparatus to perform the image processing method of any one of claims 1-7.