CN114554169A - Image processing method, image processing apparatus, electronic device, and storage medium - Google Patents


Publication number
CN114554169A
Authority
CN
China
Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis)
Application number
CN202210173532.XA
Other languages
Chinese (zh)
Inventor
王琳 (Wang Lin)
Current Assignee: Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee: Guangdong Oppo Mobile Telecommunications Corp Ltd
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN202210173532.XA
Publication of CN114554169A

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/80Camera processing pipelines; Components thereof
    • H04N23/84Camera processing pipelines; Components thereof for processing colour signals
    • H04N23/88Camera processing pipelines; Components thereof for processing colour signals for colour balance, e.g. white-balance circuits or colour temperature control


Abstract

The application discloses an image processing method, an image processing apparatus, an electronic device, and a computer-readable storage medium. The image processing method comprises the following steps: determining RGB information of a first camera and a second camera according to XYZ information of ambient light collected by a multispectral sensor; and determining a white balance gain value of the second camera according to the RGB information of the first camera, the white balance gain value of the first camera, and the RGB information of the second camera. Because the XYZ information of the ambient light collected by the multispectral sensor is accurate, the RGB information of the first and second cameras determined from it is also accurate, and the white balance gain value of the second camera can follow changes in the white balance gain value of the first camera, so that the second camera and the first camera present a consistent white balance effect.

Description

Image processing method, image processing apparatus, electronic device, and storage medium
Technical Field
The present disclosure relates to image processing technologies, and in particular, to an image processing method, an image processing apparatus, an electronic device, and a computer-readable storage medium.
Background
In the related art, an electronic apparatus may include a plurality of cameras, such as a main camera, a wide-angle camera, and a telephoto camera. Color jumps can occur while zooming across the cameras, so it is difficult for the colors acquired by the cameras to remain consistent.
Disclosure of Invention
The embodiment of the application provides an image processing method, an image processing device, an electronic device and a computer readable storage medium.
The image processing method of the embodiment of the application comprises the following steps: acquiring XYZ information of ambient light collected by a multispectral sensor; determining RGB information of a first camera according to the XYZ information; determining RGB information of a second camera according to the XYZ information; and determining the white balance gain value of the second camera according to the RGB information of the first camera, the white balance gain value of the first camera, and the RGB information of the second camera.
The image processing device of the embodiment of the application comprises a first processing module, a second processing module, a third processing module and a fourth processing module. The first processing module is used for acquiring XYZ information of the ambient light acquired by the multispectral sensor. The second processing module is used for determining the RGB information of the first camera according to the XYZ information. And the third processing module is used for determining the RGB information of the second camera according to the XYZ information. The fourth processing module is configured to determine a white balance gain value of the second camera according to the RGB information of the first camera, the white balance gain value of the first camera, and the RGB information of the second camera.
The electronic device of embodiments of the present application includes one or more processors and memory. The memory stores a computer program. The steps of the image processing method according to the above-described embodiment are implemented when the computer program is executed by the processor.
The computer-readable storage medium of the present embodiment stores thereon a computer program that, when executed by a processor, implements the steps of the image processing method described in the above embodiment.
In the image processing method, the image processing device, the electronic device and the computer readable storage medium, the XYZ information of the ambient light collected by the multispectral sensor is accurate, so that the RGB information of the first camera and the second camera determined according to the XYZ information is accurate, and the white balance gain value of the second camera can be determined according to the RGB information of the first camera, the white balance gain value of the first camera and the RGB information of the second camera, so that the white balance gain value of the second camera can follow the white balance gain value of the first camera, and the second camera and the first camera can present a consistent white balance effect.
Additional aspects and advantages of the present application will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the present application.
Drawings
The above and/or additional aspects and advantages of the present application will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings of which:
FIG. 1 is a schematic flow chart diagram of an image processing method according to some embodiments of the present application;
FIG. 2 is a schematic view of an electronic device of some embodiments of the present application;
FIG. 3 is a schematic diagram of an image processing apparatus according to some embodiments of the present application;
FIG. 4 is a schematic view of a scene of an image processing method according to some embodiments of the present application;
FIG. 5 is a schematic diagram of a spectral response curve of a multispectral sensor according to some embodiments of the present application;
FIG. 6 is a schematic flow chart diagram of an image processing method according to some embodiments of the present application;
FIG. 7 is a schematic diagram of a second processing module of certain embodiments of the present application;
FIG. 8 is a schematic flow chart diagram of an image processing method according to some embodiments of the present application;
FIG. 9 is a schematic diagram of an image processing apparatus according to some embodiments of the present application;
FIG. 10 is a schematic flow chart diagram of an image processing method according to some embodiments of the present application;
FIG. 11 is a schematic diagram of a third processing module of certain embodiments of the present application;
FIG. 12 is a schematic flow chart diagram of an image processing method according to some embodiments of the present application;
FIG. 13 is a schematic diagram of an image processing apparatus according to some embodiments of the present application;
FIG. 14 is a schematic flow chart diagram of an image processing method according to some embodiments of the present application;
FIG. 15 is a schematic diagram of a third processing module of certain embodiments of the present application;
FIG. 16 is a schematic flow chart diagram of an image processing method according to some embodiments of the present application;
FIG. 17 is a schematic diagram of an image processing apparatus according to some embodiments of the present application.
Detailed Description
Reference will now be made in detail to embodiments of the present application, examples of which are illustrated in the accompanying drawings, wherein like reference numerals refer to the same or similar elements or elements having the same or similar function throughout. The embodiments described below with reference to the accompanying drawings are illustrative and are only for the purpose of explaining the present application and are not to be construed as limiting the present application.
In the description of the embodiments of the present application, the terms "first", "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, features defined as "first", "second", may explicitly or implicitly include one or more of the described features. In the description of the embodiments of the present application, "a plurality" means two or more unless specifically defined otherwise.
In the related art, multi-camera designs have become widespread in the mobile phone industry, with optical zoom as an important driver. Due to volume constraints, zooming is actually completed by a plurality of cameras working in relay, which brings several problems; one of them is that color jumps occur while zooming across the cameras, so the colors acquired by the cameras are difficult to keep consistent.
Specifically, the color gamut spaces of different cameras are not the same, and cameras are switched during zooming. Because human eyes are very sensitive to gray, differences in white balance between cameras greatly amplify the perceived color difference and degrade the user's zoom experience. How to achieve effective white balance consistency between cameras is therefore a technical problem that urgently needs to be solved in this field.
To achieve white balance consistency between cameras, one camera may be used as a reference target, with the other cameras performing white balance relative to it. However, when the reference camera's own white balance restoration does not fully restore white, or shows an obvious color cast (for example, because the color information collected by the reference camera is inaccurate), the other cameras cannot be synchronized to the corresponding effect.
Referring to fig. 1 and fig. 2, an image processing method according to an embodiment of the present application includes:
01: acquiring XYZ information of ambient light collected by a multispectral sensor;
02: determining RGB information of the first camera 10 from the XYZ information;
03: determining RGB information of the second camera 20 from the XYZ information;
04: the white balance gain value of the second camera 20 is determined according to the RGB information of the first camera 10, the white balance gain value of the first camera 10, and the RGB information of the second camera 20.
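The four steps above can be sketched as a minimal Python illustration. This is an assumption-laden sketch, not the patent's implementation; `xyz_to_rgb1` and `xyz_to_rgb2` are hypothetical placeholders for the calibrated XYZ-to-RGB mappings described later in the description.

```python
def white_balance_follow(xyz, gain1, xyz_to_rgb1, xyz_to_rgb2):
    """Sketch of steps 01-04: derive the second camera's white balance gains.

    xyz:    (X, Y, Z) ambient-light reading from the multispectral sensor
    gain1:  (r, g, b) white balance gain values of the first camera
    xyz_to_rgb1, xyz_to_rgb2: hypothetical calibrated XYZ-to-RGB mappings
    """
    rgb1 = xyz_to_rgb1(xyz)  # step 02: RGB information of the first camera
    rgb2 = xyz_to_rgb2(xyz)  # step 03: RGB information of the second camera
    # step 04: per channel, rgb1 * gain1 = rgb2 * gain2, so gain2 = rgb1 * gain1 / rgb2
    return tuple(c1 * g / c2 for c1, g, c2 in zip(rgb1, gain1, rgb2))
```

With identical mappings for both cameras, gain2 simply equals gain1, which is the degenerate case of a single camera following itself.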
Referring to fig. 2 and 3, an image processing apparatus 30 according to an embodiment of the present disclosure includes a first processing module 31, a second processing module 32, a third processing module 33, and a fourth processing module 34.
The image processing method of the present application can be implemented by the image processing apparatus 30 of the present embodiment, wherein step 01 can be implemented by the first processing module 31, step 02 can be implemented by the second processing module 32, step 03 can be implemented by the third processing module 33, and step 04 can be implemented by the fourth processing module 34, that is, the first processing module 31 can be used to obtain XYZ information of the ambient light collected by the multispectral sensor. The second processing module 32 may be configured to determine RGB information for the first camera 10 from the XYZ information. The third processing module 33 may be configured to determine RGB information of the second camera 20 from the XYZ information. The fourth processing module 34 may be configured to determine a white balance gain value of the second camera 20 according to the RGB information of the first camera 10, the white balance gain value of the first camera 10, and the RGB information of the second camera 20.
In the image processing method and the image processing apparatus 30, the XYZ information of the ambient light collected by the multispectral sensor is accurate, so the RGB information of the first camera 10 and the second camera 20 determined from it is also accurate. The white balance gain value of the second camera 20 is then determined according to the RGB information of the first camera 10, the white balance gain value of the first camera 10, and the RGB information of the second camera 20, so that the white balance gain value of the second camera 20 follows that of the first camera 10 and the two cameras present a consistent white balance effect. In essence, both the first camera 10 and the second camera 20 use the XYZ information of ambient light collected by the multispectral sensor as their reference; because the reference target is unified, color differences between the cameras can be avoided entirely.
In some embodiments, the electronic device 100 may include a smart phone, a tablet computer, a smart watch, a smart bracelet, and the like, which are not limited herein. The electronic device 100 according to the embodiment of the present application is illustrated as a smart phone, and is not to be construed as a limitation to the present application.
The multispectral sensor (not shown) may be disposed in the electronic device 100 or externally connected to it. It detects XYZ information of the ambient light, from which the real RGB information of the first camera 10 under the corresponding light source can be obtained; together with the white balance gain value of the first camera 10, this yields the real color representation of the first camera 10, so that the second camera 20 can completely follow the white balance effect of the first camera 10, even if a color cast is present.
In practical products, white balance tuning engineers often deliberately apply, for a white object in the scene (r:g:b = 1:1:1), a gain value that does not fully restore white, that is, the corrected result is not 1:1:1. This is reasonable and accords with human visual common sense, especially under high color temperatures (7000-10000 K) or low color temperatures (2000-4500 K). This application covers such scenarios entirely: the first camera 10 may capture an image that is white balanced yet deliberately "color-shifted", and the second camera 20 then captures a correspondingly "color-shifted" image. For example, referring to fig. 4, take the second image as captured by the first camera 10 and the first or third image as captured by the second camera 20; the white balance information of the second image is mapped onto the first or third image, so that when the second image is restored with a "warmer" style, the first or third image follows the same "warmer" style.
The multispectral sensor can sense the intensity of multiple channels of visible and non-visible light in the environment and thereby obtain XYZ information of the ambient light. When describing scene color, the XYZ information acquired by the multispectral sensor is far more accurate than the RGB result acquired by a camera. Fig. 5 shows an example spectral response curve of a multispectral sensor.
Different multispectral sensors acquire XYZ information with different accuracy, owing to differences in light-sensing accuracy across the visible and non-visible bands. For a multispectral sensor with few photosensitive channels, accuracy can be further improved with a preset algorithm, which in turn improves white balance consistency across multiple cameras. The preset algorithm may be any algorithm preset for improving accuracy; in one embodiment, it may be a deep learning algorithm.
In some embodiments, the angle of view of the multispectral sensor is greater than 160 degrees, so the multispectral sensor sees more content and the acquired XYZ information is more accurate. In one embodiment, the angle of view of the multispectral sensor is larger than those of the first camera 10 and the second camera 20, so that the multispectral sensor sees more than either camera; its XYZ information is therefore more accurate, and the RGB information of the first camera 10 and of the second camera 20 can be obtained accurately.
The electronic device 100 may include a first camera 10 and a second camera 20, which capture scene information to obtain a first image and a second image. In one embodiment, the electronic device 100 may include a plurality of cameras, for example a main camera, a wide-angle camera, and a telephoto camera. The first camera 10 may be any one of these, and the second camera 20 may be another. When switching cameras (for example, during optical zoom), the camera before the switch may serve as the first camera 10 and the camera after the switch as the second camera 20; after the switch, the first camera 10 may be turned off to reduce power consumption.
The white balance gain value of the first camera 10 may be determined in any manner: for example, from the RGB information of the first camera 10, from the color part of the first image captured by the first camera 10, or from user input information. Regardless of how it is determined, the embodiments of the present application make the second camera 20 completely follow the white balance effect of the first camera 10, even if a color cast is present.
In order for the second camera 20 to completely follow the white balance effect of the first camera 10, the white balance gains of the first camera 10 and the second camera 20 satisfy the following relations: r1 × rgain1 = r2 × rgain2 and b1 × bgain1 = b2 × bgain2. Here r1 is the R-channel value of the RGB information of the first camera 10, rgain1 is the R-channel white balance gain value of the first camera 10, r2 is the R-channel value of the RGB information of the second camera 20, and rgain2 is the R-channel white balance gain value of the second camera 20; b1 is the B-channel value of the RGB information of the first camera 10, bgain1 is the B-channel white balance gain value of the first camera 10, b2 is the B-channel value of the RGB information of the second camera 20, and bgain2 is the B-channel white balance gain value of the second camera 20. The relations can be written compactly as rgb1 × gain1 = rgb2 × gain2, where rgb1 and rgb2 are the channel values of the three channels of the RGB information of the first camera 10 and the second camera 20, and gain1 and gain2 are the white balance gain values of the first camera 10 and the second camera 20. Therefore, after the RGB information of the first camera 10 and of the second camera 20 is obtained, the white balance gain value of the second camera 20 can be determined from that of the first camera 10, so that the gain value of the second camera 20 follows that of the first camera 10 and the two cameras present a consistent white balance effect.
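Solving the per-channel relation rgb1 × gain1 = rgb2 × gain2 for gain2 can be sketched as follows; the function name and tuple layout are illustrative assumptions, not from the patent.

```python
def follow_white_balance_gain(rgb1, gain1, rgb2):
    """Solve rgb1 * gain1 = rgb2 * gain2 per channel for gain2.

    rgb1, rgb2: (r, g, b) channel values of each camera's RGB information
    gain1:      (rgain1, ggain1, bgain1) white balance gains of the first camera
    """
    return tuple(c1 * g1 / c2 for c1, g1, c2 in zip(rgb1, gain1, rgb2))
```

For example, if both cameras report identical RGB information, the returned gain2 equals gain1, so the second camera reproduces the first camera's white balance exactly.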
Referring to fig. 6, in some embodiments, step 02 (determining RGB information for the first camera 10 from XYZ information) includes:
022: determining a first preset mapping relation according to the XYZ information, wherein the first preset mapping relation is a mapping relation between the XYZ calibration information of the multispectral sensor and the RGB calibration information of the first camera 10;
024: and determining the RGB information of the first camera 10 according to the XYZ information and the first preset mapping relation.
Referring to fig. 7, in some embodiments, the second processing module 32 includes a first processing unit 322 and a second processing unit 324. Step 022 may be implemented by the first processing unit 322, and step 024 may be implemented by the second processing unit 324, that is, the first processing unit 322 may be configured to determine a first preset mapping relationship according to the XYZ information, where the first preset mapping relationship is a mapping relationship between XYZ calibration information of the multispectral sensor and RGB calibration information of the first camera 10. The second processing unit 324 may be configured to determine RGB information of the first camera 10 according to the XYZ information and the first preset mapping relationship.
In this way, the RGB information of the first camera 10 can be accurately determined according to the XYZ information and the first preset mapping relationship.
Specifically, the first preset mapping relationship is a mapping relationship between the XYZ calibration information of the multispectral sensor and the RGB calibration information of the first camera 10. There may be a plurality of first preset mapping relationships, which form a first preset mapping relationship table; the table can be searched according to the XYZ information to determine the corresponding first preset mapping relationship, so that the RGB information of the first camera 10 can be determined from the XYZ information and that mapping relationship. In one embodiment, rgb1 = x1 × xyz, where rgb1 is the channel values of the three channels of the RGB information of the first camera 10, x1 is the first preset mapping relationship, and xyz is the channel values of the three channels of the XYZ information.
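A minimal sketch of this table lookup follows. The two-entry calibration table, its values, and all names are illustrative assumptions (the patent does not give calibration data); each entry pairs a calibrated XYZ reading with a fitted 3x3 mapping matrix x1.

```python
import numpy as np

# Hypothetical calibration table (illustrative values only): for each preset
# light source, the XYZ reading measured at calibration time and the fitted
# 3x3 mapping matrix x1 from XYZ to the first camera's RGB.
CALIBRATION_TABLE = [
    {"xyz": np.array([95.0, 100.0, 108.9]), "x1": np.eye(3) * 0.01},   # ~D65
    {"xyz": np.array([109.8, 100.0, 35.6]), "x1": np.eye(3) * 0.012},  # ~A
]

def rgb_of_first_camera(xyz):
    """Look up the entry whose calibrated XYZ is closest to the measured
    XYZ, then apply rgb1 = x1 * xyz."""
    entry = min(CALIBRATION_TABLE,
                key=lambda e: float(np.linalg.norm(e["xyz"] - xyz)))
    return entry["x1"] @ xyz
```

Nearest-neighbor selection is only one plausible search strategy; interpolating between neighboring entries would also fit the description.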
Referring to fig. 8, in some embodiments, an image processing method includes:
05: and selecting a plurality of preset light sources and calibrating the XYZ calibration information of the multispectral sensor and the RGB calibration information of the first camera 10 under the various preset light sources to obtain a first preset mapping relation.
Referring to fig. 9, in some embodiments, the image processing apparatus 30 includes a first calibration module 35. Step 05 may be implemented by the first calibration module 35, that is, the first calibration module 35 may be configured to select a plurality of preset light sources and calibrate the XYZ calibration information of the multispectral sensor and the RGB calibration information of the first camera 10 under the various preset light sources to obtain the first preset mapping relationship.
Thus, the first preset mapping relationship can be obtained in advance through calibration.
Specifically, the preset light sources may include the A light source, the D50 light source, the D65 light source, other standard light sources, intermediate states of standard light sources, other light sources with typical spectral response curve shapes (such as typical commercially available light sources), and the like. The preset light sources may number, for example, 50 to 100: the more preset light sources, the more scenes the calibrated first preset mapping relationship can cover; the fewer preset light sources, the easier the calibration. By calibrating the XYZ calibration information of the multispectral sensor and the RGB calibration information of the first camera 10 under the various preset light sources, the first preset mapping relationships under those light sources are obtained, forming the first preset mapping relationship table.
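The patent does not specify how each mapping is fitted during calibration; one plausible sketch, assuming a color chart is measured by both devices under each preset light source, is a least-squares fit of a 3x3 matrix per light source.

```python
import numpy as np

def calibrate_mapping(xyz_samples, rgb_samples):
    """Least-squares fit of a 3x3 matrix x1 such that rgb ~= x1 @ xyz,
    from paired measurements under one preset light source.

    xyz_samples: (n, 3) XYZ readings of the multispectral sensor
    rgb_samples: (n, 3) RGB readings of the first camera, same patches
    """
    X = np.asarray(xyz_samples, dtype=float)
    R = np.asarray(rgb_samples, dtype=float)
    A, *_ = np.linalg.lstsq(X, R, rcond=None)  # solves X @ A ~= R
    return A.T  # x1, laid out so that rgb = x1 @ xyz
```

Repeating this fit for each of the 50 to 100 preset light sources would yield the first preset mapping relationship table.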
Referring to fig. 10, in some embodiments, step 03 (determining RGB information for the second camera 20 from XYZ information) includes:
02: determining RGB information of the first camera 10 from the XYZ information;
032: determining a conversion relationship according to the RGB information of the first camera 10, where the conversion relationship is a mapping relationship between the RGB calibration information of the first camera 10 and the RGB calibration information of the second camera 20;
034: the RGB information of the second camera 20 is determined from the RGB information of the first camera 10 and the conversion relation.
Referring to fig. 11, in some embodiments, the third processing module 33 includes a third processing unit 332 and a fourth processing unit 334. Wherein step 02 can be implemented by the second processing module 32, step 032 can be implemented by the third processing unit 332, and step 034 can be implemented by the fourth processing unit 334, that is, the second processing module 32 can be used to determine the RGB information of the first camera 10 according to the XYZ information. The third processing unit 332 may be configured to determine a conversion relationship according to the RGB information of the first camera 10, where the conversion relationship is a mapping relationship between the RGB calibration information of the first camera 10 and the RGB calibration information of the second camera 20. The fourth processing unit 334 may be configured to determine RGB information of the second camera 20 according to the RGB information of the first camera 10 and the conversion relation.
In this way, the RGB information of the second camera 20 can be indirectly obtained through the RGB information of the first camera 10.
Specifically, the conversion relationship is a mapping relationship between the RGB calibration information of the first camera 10 and the RGB calibration information of the second camera 20. There may be a plurality of conversion relationships, which form a conversion relationship table; the table can be searched according to the RGB information of the first camera 10 to determine the corresponding conversion relationship, so that the RGB information of the second camera 20 can be determined from the RGB information of the first camera 10 and that conversion relationship. In one embodiment, rgb1 = t × rgb2, where rgb1 is the channel values of the three channels of the RGB information of the first camera 10, t is the conversion relationship, and rgb2 is the channel values of the three channels of the RGB information of the second camera 20. In this way, the RGB information of the second camera 20 is obtained indirectly from the RGB information of the first camera 10, and the second camera 20 does not need to be directly associated or communicate with the multispectral sensor, which reduces power consumption.
In certain embodiments, from rgb1 = t × rgb2 and rgb1 × gain1 = rgb2 × gain2, substituting the former into the latter gives t × rgb2 × gain1 = rgb2 × gain2; eliminating rgb2 yields t × gain1 = gain2. Therefore, in one embodiment, after the conversion relationship is determined from the RGB information of the first camera 10, steps 03 and 04 can be simplified as: determining the white balance gain value of the second camera 20 according to the white balance gain value of the first camera 10 and the conversion relationship.
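The simplification t × gain1 = gain2 collapses to a per-channel multiplication; a minimal sketch (function name assumed, not from the patent):

```python
def gain2_from_gain1(gain1, t):
    """Per-channel simplification gain2 = t * gain1, where t is the
    conversion relationship satisfying rgb1 = t * rgb2."""
    return tuple(ti * gi for ti, gi in zip(t, gain1))
```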
Referring to fig. 12, in some embodiments, an image processing method includes:
06: selecting a plurality of preset light sources and calibrating the RGB calibration information of the first camera 10 and the RGB calibration information of the second camera 20 under the various preset light sources to obtain a conversion relationship.
Referring to FIG. 13, in some embodiments, the image processing apparatus 30 includes a second calibration module 36. Step 06 may be implemented by the second calibration module 36, that is, the second calibration module 36 may be configured to select a plurality of preset light sources and calibrate the RGB calibration information of the first camera 10 and the RGB calibration information of the second camera 20 under the various preset light sources to obtain the conversion relationship.
Thus, the conversion relationship can be obtained by calibration in advance.
Specifically, the preset light sources may include the A light source, the D50 light source, the D65 light source, other standard light sources, intermediate states of standard light sources, other light sources with typical spectral response curve shapes (such as typical commercially available light sources), and the like. The preset light sources may number, for example, 50 to 100: the more preset light sources, the more scenes the calibrated conversion relationship can cover; the fewer preset light sources, the easier the calibration. By calibrating the RGB calibration information of the first camera 10 and the RGB calibration information of the second camera 20 under the various preset light sources, the conversion relationships under those light sources are obtained, forming the conversion relationship table.
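The patent does not spell out how t is computed at calibration time; one minimal assumption is a channel-wise ratio taken from the same neutral patch measured by both cameras under a given preset light source.

```python
def calibrate_conversion(rgb1_measured, rgb2_measured):
    """Channel-wise conversion t satisfying rgb1 = t * rgb2, from one
    neutral patch measured by both cameras under the same light source."""
    return tuple(a / b for a, b in zip(rgb1_measured, rgb2_measured))
```

Running this once per preset light source would populate the conversion relationship table described above.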
Referring to fig. 14, in some embodiments, step 03 (determining RGB information for the second camera 20 from XYZ information) includes:
036: determining a second preset mapping relationship according to the XYZ information, where the second preset mapping relationship is a mapping relationship between the XYZ calibration information of the multispectral sensor and the RGB calibration information of the second camera 20;
038: and determining the RGB information of the second camera 20 according to the XYZ information and a second preset mapping relation.
Referring to fig. 15, in some embodiments, the third processing module includes a fifth processing unit 336 and a sixth processing unit 338. The step 036 may be implemented by the fifth processing unit 336 and the step 038 may be implemented by the sixth processing unit 338, that is, the fifth processing unit 336 may be configured to determine a second preset mapping relationship according to the XYZ information, where the second preset mapping relationship is a mapping relationship between the XYZ calibration information of the multispectral sensor and the RGB calibration information of the second camera 20. The sixth processing unit 338 may be configured to determine the RGB information of the second camera 20 according to the XYZ information and the second preset mapping relationship.
In this way, the RGB information of the second camera 20 can be accurately determined according to the XYZ information and the second preset mapping relationship.
Specifically, the second preset mapping relationship is a mapping relationship between the XYZ calibration information of the multispectral sensor and the RGB calibration information of the second camera 20. There may be a plurality of second preset mapping relationships, which form a second preset mapping relationship table; the table may be searched according to the XYZ information to determine the corresponding second preset mapping relationship, so that the RGB information of the second camera 20 can be determined according to the XYZ information and the corresponding second preset mapping relationship. In one embodiment, rgb2 = x2 × xyz, where rgb2 denotes the channel values of the three channels of the RGB information of the second camera 20, x2 is the second preset mapping relationship, and xyz denotes the channel values of the three channels of the XYZ information. In this way, the RGB information of the second camera 20 is obtained directly by mapping from the XYZ information of the multispectral sensor, which reduces the number of mapping steps and can improve the accuracy of the RGB information of the second camera 20.
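The table lookup and the mapping rgb2 = x2 × xyz can be sketched as follows. The nearest-entry search is an assumed lookup strategy (the patent does not specify how the table is searched), and all table entries are invented:

```python
# Illustrative sketch: search the second preset mapping relationship table
# by nearest calibrated XYZ entry, then apply rgb2 = x2 * xyz per channel.
# Table entries, values, and the nearest-neighbour search are hypothetical.

mapping_table = {
    # calibrated XYZ -> per-channel second preset mapping x2
    (0.95, 1.00, 1.09): (0.52, 0.98, 0.44),   # roughly D65-like entry
    (0.96, 1.00, 0.82): (0.55, 0.97, 0.40),   # roughly D50-like entry
}

def nearest_mapping(xyz):
    """Pick the x2 whose calibrated XYZ is closest to the measured XYZ."""
    return min(mapping_table.items(),
               key=lambda kv: sum((a - b) ** 2
                                  for a, b in zip(kv[0], xyz)))[1]

def rgb2_from_xyz(xyz):
    """RGB information of the second camera: rgb2 = x2 * xyz per channel."""
    x2 = nearest_mapping(xyz)
    return tuple(m * c for m, c in zip(x2, xyz))
```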
Referring to fig. 16, in some embodiments, an image processing method includes:
07: and selecting a plurality of preset light sources and calibrating the XYZ calibration information of the multispectral sensor and the RGB calibration information of the second camera 20 under the various preset light sources to obtain a second preset mapping relationship.
Referring to fig. 17, in some embodiments, the image processing apparatus 30 includes a third calibration module 37. Step 07 can be implemented by the third calibration module 37, that is, the third calibration module 37 can be configured to select multiple preset light sources and calibrate the XYZ calibration information of the multispectral sensor and the RGB calibration information of the second camera 20 under the multiple preset light sources to obtain the second preset mapping relationship.
Specifically, the preset light sources may include an A light source, a D50 light source, a D65 light source, other standard light sources, intermediate states of standard light sources, other light sources with typical spectral response curve shapes (such as typical light sources on the market), and the like. The number of preset light sources may be, for example, 50 to 100; a larger number allows the calibrated second preset mapping relationship to cover more scenes, while a smaller number makes the second preset mapping relationship easier to calibrate. By calibrating the XYZ calibration information of the multispectral sensor and the RGB calibration information of the second camera 20 under the various preset light sources, second preset mapping relationships under the various preset light sources can be obtained, thereby obtaining a second preset mapping relationship table.
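The calibration of the second preset mapping relationship can be sketched analogously to the conversion-relation calibration. Again, the per-channel ratio is an assumed form of the mapping, and every value below is invented:

```python
# Hypothetical sketch: calibrating the second preset mapping x2 so that
# rgb2 ≈ x2 * xyz per channel, under each preset light source.
# All numeric values are invented.

def calibrate_x2(xyz_cal, rgb2_cal):
    """Per-channel mapping x2 with rgb2 ≈ x2 * xyz."""
    return tuple(r / c for r, c in zip(rgb2_cal, xyz_cal))

# (multispectral-sensor XYZ, camera-2 RGB) under each preset light source
cal = {
    "D65": ((0.95, 1.00, 1.09), (0.50, 0.98, 0.48)),
    "D50": ((0.96, 1.00, 0.82), (0.53, 0.97, 0.33)),
}
x2_table = {src: calibrate_x2(xyz, rgb2) for src, (xyz, rgb2) in cal.items()}
```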
The types and/or the number of the preset light sources used for calibrating the first preset mapping relationship, the conversion relationship, and the second preset mapping relationship may be the same or different, and are not specifically limited herein.
It should be noted that the specific numerical values mentioned above are only for illustrating the implementation of the present application in detail and should not be construed as limiting the present application. In other embodiments or examples, other values may be selected according to the application and are not specifically limited herein.
Referring to fig. 2, the image processing method according to the embodiment of the present application can be implemented by the electronic device 100 according to the embodiment of the present application. In particular, the electronic device 100 includes one or more processors 50 and memory 40. The memory 40 stores a computer program. The steps of the image processing method according to any of the above embodiments are implemented when the computer program is executed by the processor 50.
For example, in the case where the computer program is executed by the processor 50, the steps of the following image processing method are implemented:
01: acquiring XYZ information of ambient light acquired by a multispectral sensor;
02: determining RGB information of the first camera 10 from the XYZ information;
03: determining RGB information of the second camera 20 from the XYZ information;
04: the white balance gain value of the second camera 20 is determined according to the RGB information of the first camera 10, the white balance gain value of the first camera 10, and the RGB information of the second camera 20.
The computer-readable storage medium of the embodiments of the present application stores thereon a computer program that, when executed by a processor, implements the steps of the image processing method of any of the embodiments described above.
For example, in the case where the program is executed by a processor, the steps of the following image processing method are implemented:
01: acquiring XYZ information of ambient light acquired by a multispectral sensor;
02: determining RGB information of the first camera 10 from the XYZ information;
03: determining RGB information of the second camera 20 from the XYZ information;
04: the white balance gain value of the second camera 20 is determined according to the RGB information of the first camera 10, the white balance gain value of the first camera 10, and the RGB information of the second camera 20.
It will be appreciated that the computer program comprises computer program code. The computer program code may be in the form of source code, object code, an executable file, some intermediate form, or the like. The computer-readable storage medium may include: any entity or device capable of carrying computer program code, a recording medium, a USB flash drive, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a Read-Only Memory (ROM), a Random Access Memory (RAM), a software distribution medium, and the like. The processor may be a central processing unit, or may be another general-purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like.
In the description herein, reference to the description of the term "one embodiment," "some embodiments," "an example," "a specific example," or "some examples," etc., means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the application. In this specification, the schematic representations of the terms used above are not necessarily intended to refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. Moreover, various embodiments or examples and features of various embodiments or examples described in this specification can be combined and combined by one skilled in the art without being mutually inconsistent.
Any process or method descriptions in flow charts or otherwise described herein may be understood as representing modules, segments, or portions of code which include one or more executable instructions for implementing specific logical functions or steps of the process, and the scope of the preferred embodiments of the present application includes other implementations in which functions may be executed out of order from that shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved, as would be understood by those reasonably skilled in the art of the present application.
Although embodiments of the present application have been shown and described above, it is understood that the above embodiments are exemplary and should not be construed as limiting the present application, and that variations, modifications, substitutions and alterations may be made to the above embodiments by those of ordinary skill in the art within the scope of the present application.

Claims (10)

1. An image processing method, characterized in that the image processing method comprises:
acquiring XYZ information of ambient light acquired by a multispectral sensor;
determining RGB information of the first camera according to the XYZ information;
determining RGB information of a second camera according to the XYZ information;
and determining the white balance gain value of the second camera according to the RGB information of the first camera, the white balance gain value of the first camera and the RGB information of the second camera.
2. The image processing method of claim 1, wherein determining RGB information for the first camera from the XYZ information comprises:
determining a first preset mapping relation according to the XYZ information, wherein the first preset mapping relation is the mapping relation between the XYZ calibration information of the multispectral sensor and the RGB calibration information of the first camera;
and determining the RGB information of the first camera according to the XYZ information and the first preset mapping relation.
3. The image processing method according to claim 2, characterized in that the image processing method comprises:
selecting a plurality of preset light sources and calibrating the XYZ calibration information of the multispectral sensor and the RGB calibration information of the first camera under the various preset light sources to obtain the first preset mapping relation.
4. The image processing method of claim 1, wherein determining RGB information for the second camera from the XYZ information comprises:
determining RGB information of the first camera according to the XYZ information;
determining a conversion relation according to the RGB information of the first camera, wherein the conversion relation is a mapping relation between the RGB calibration information of the first camera and the RGB calibration information of the second camera;
and determining the RGB information of the second camera according to the RGB information of the first camera and the conversion relation.
5. The image processing method according to claim 4, characterized in that the image processing method comprises:
selecting a plurality of preset light sources and calibrating the RGB calibration information of the first camera and the RGB calibration information of the second camera under the various preset light sources to obtain the conversion relation.
6. The image processing method of claim 1, wherein determining RGB information for the second camera from the XYZ information comprises:
determining a second preset mapping relation according to the XYZ information, wherein the second preset mapping relation is the mapping relation between the XYZ calibration information of the multispectral sensor and the RGB calibration information of the second camera;
and determining the RGB information of the second camera according to the XYZ information and the second preset mapping relation.
7. The image processing method according to claim 6, characterized in that the image processing method comprises:
and selecting a plurality of preset light sources and calibrating the XYZ calibration information of the multispectral sensor and the RGB calibration information of the second camera under the various preset light sources to obtain the second preset mapping relation.
8. An image processing apparatus characterized by comprising:
the first processing module is used for acquiring XYZ information of the ambient light collected by the multispectral sensor;
the second processing module is used for determining the RGB information of the first camera according to the XYZ information;
a third processing module, configured to determine RGB information of the second camera according to the XYZ information;
and the fourth processing module is used for determining the white balance gain value of the second camera according to the RGB information of the first camera, the white balance gain value of the first camera and the RGB information of the second camera.
9. An electronic device, characterized in that the electronic device comprises one or more processors and a memory, the memory storing a computer program which, when executed by the processors, implements the steps of the image processing method of any one of claims 1-7.
10. A computer-readable storage medium, on which a computer program is stored, characterized in that the program, when being executed by a processor, carries out the steps of the image processing method according to any one of claims 1 to 7.
CN202210173532.XA 2022-02-24 2022-02-24 Image processing method, image processing apparatus, electronic device, and storage medium Pending CN114554169A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210173532.XA CN114554169A (en) 2022-02-24 2022-02-24 Image processing method, image processing apparatus, electronic device, and storage medium


Publications (1)

Publication Number Publication Date
CN114554169A true CN114554169A (en) 2022-05-27

Family

ID=81677071

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210173532.XA Pending CN114554169A (en) 2022-02-24 2022-02-24 Image processing method, image processing apparatus, electronic device, and storage medium

Country Status (1)

Country Link
CN (1) CN114554169A (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100039528A1 (en) * 2008-08-12 2010-02-18 Takayuki Ogasahara Image processing device correcting image data, image sensor and image processing method
CN103268618A (en) * 2013-05-10 2013-08-28 中国科学院光电研究院 Method for calibrating multispectral remote sensing data true colors
CN111314683A (en) * 2020-03-17 2020-06-19 Oppo广东移动通信有限公司 White balance adjusting method and related equipment
CN113676713A (en) * 2021-08-11 2021-11-19 维沃移动通信(杭州)有限公司 Image processing method, apparatus, device and medium


Similar Documents

Publication Publication Date Title
US7920146B2 (en) User interface providing device
US11321830B2 (en) Image detection method and apparatus and terminal
JP2002027491A (en) Image input unit, white balance adjusting method, and computer readable recording medium storing program for executing the method
JP2005210526A (en) Image processing apparatus, method, and program, image pickup device, and image data outputting method and program
CN113810641B (en) Video processing method and device, electronic equipment and storage medium
CN113810642B (en) Video processing method and device, electronic equipment and storage medium
JP2023538781A (en) White balance correction method, device and electronic equipment
CN108307098A (en) Fisheye camera shadow correction parameter determination method, bearing calibration and device, storage medium, fisheye camera
JP2005210495A (en) Image processing apparatus, method, and program
US20160065925A1 (en) Color-mixture-ratio calculation device and method, and imaging device
CN112532960B (en) White balance synchronization method and device, electronic equipment and storage medium
CN102209202B (en) Gamma correction curve extraction method for camera
CN113315956A (en) Image processing apparatus, image capturing apparatus, image processing method, and machine-readable medium
EP4262192A1 (en) Video processing method and apparatus, electronic device, and storage medium
CN114554169A (en) Image processing method, image processing apparatus, electronic device, and storage medium
WO2023016044A1 (en) Video processing method and apparatus, electronic device, and storage medium
US11451719B2 (en) Image processing apparatus, image capture apparatus, and image processing method
EP4055811B1 (en) A system for performing image motion compensation
JP2007312294A (en) Imaging apparatus, image processor, method for processing image, and image processing program
JP2007221678A (en) Imaging apparatus, image processor, image processing method and image processing program
JP2007293686A (en) Imaging apparatus, image processing apparatus, image processing method and image processing program
CN114697629B (en) White balance processing method and device, storage medium and terminal equipment
EP4270936A1 (en) Video processing method and apparatus, electronic device, and storage medium
WO2023016042A1 (en) Video processing method and apparatus, electronic device, and storage medium
JP2007243556A (en) Imaging apparatus, image processing method, and image processing program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination