CN116134808A - White balance processing method and device - Google Patents
- Publication number
- CN116134808A (application number CN202080104052.7A)
- Authority
- CN
- China
- Prior art keywords
- image
- solid
- color
- frame
- image frame
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/80—Camera processing pipelines; Components thereof
- H04N23/84—Camera processing pipelines; Components thereof for processing colour signals
- H04N23/88—Camera processing pipelines; Components thereof for processing colour signals for colour balance, e.g. white-balance circuits or colour temperature control
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Processing Of Color Television Signals (AREA)
- Color Television Image Signal Generators (AREA)
Abstract
The application provides a white balance processing method and device. The method includes: acquiring image features of a solid-color image frame; obtaining a non-solid-color matching frame from one or more non-solid-color image frames, where the image features of the non-solid-color matching frame match the image features of the solid-color image frame; and performing white balance processing on the solid-color image frame according to the white balance information of the non-solid-color matching frame. Because the image features of the non-solid-color matching frame match those of the solid-color image frame, it is reasonable to regard the two frames as belonging to the same scene, and therefore reasonable to apply the white balance information of the non-solid-color matching frame to the solid-color image frame. As a result, when the photographed scene is a solid-color scene, the white balance processing effect of the image can be improved.
Description
The present disclosure relates to the field of image processing, and in particular, to a method and apparatus for white balance processing.
The camera function is widely used at present. However, owing to the characteristics of the camera and the reflection characteristics of objects, the imaging result under different color-temperature environments is inconsistent with what the human eye sees; that is, the captured image exhibits color cast. In the prior art, cameras apply automatic white balance (AWB) technology to images to avoid color cast.
When the shooting scene is a solid-color scene (i.e., contains a large solid-color area), the white balance information calculated by the camera is inaccurate, so the captured image exhibits color cast.
Disclosure of Invention
The application provides a white balance processing method and device, which can improve the white balance processing effect of an image when the shooting scene is a solid-color scene.
In a first aspect, a white balance processing method is provided. The method includes: acquiring image features of a solid-color image frame; obtaining a non-solid-color matching frame from one or more non-solid-color image frames, where the image features of the non-solid-color matching frame match the image features of the solid-color image frame; and performing white balance processing on the solid-color image frame according to the white balance information of the non-solid-color matching frame.
Here, a solid-color image frame is an image frame whose color is solid, for example an image frame of a single color; a non-solid-color image frame is an image frame whose color is not solid, for example an image frame containing multiple colors.
The image features here are the features of the solid-color image blocks (solid-color blocks for short) in the image frame.
Optionally, the image features include color attributes of the solid-color blocks and/or geometric attributes of the solid-color blocks.
That the image features of the non-solid-color matching frame match those of the solid-color image frame means that the non-solid-color matching frame contains solid-color blocks whose color attributes match the color attributes of the solid-color blocks in the solid-color image frame, and/or solid-color blocks whose geometric attributes match the geometric attributes of the solid-color blocks in the solid-color image frame.
Optionally, in addition to the color attributes and/or geometric attributes of the solid-color blocks, the image features may include any one or more of the following attributes: the brightness of the solid-color blocks, the color saturation of the solid-color blocks, the contrast of the solid-color blocks, and other image attributes.
Since the non-solid-color matching frame has image features that match those of the solid-color image frame, it is reasonable to regard the two frames as belonging to the same scene, and therefore reasonable to apply the white balance information of the non-solid-color matching frame to the solid-color image frame. In other words, the white balance information of the non-solid-color matching frame accurately represents the white balance information the solid-color image frame should have. Moreover, because the non-solid-color matching frame is a non-solid-color image frame, its own white balance information can be considered accurate, or at least more accurate. Performing white balance processing on the solid-color image frame according to this information therefore improves the processing effect for the solid-color image frame.
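The three claimed steps can be sketched as follows. This is a minimal illustration, not the patent's implementation: the solid-color test, the feature-extraction and matching callables, and the gain representation are all assumptions introduced for the example.

```python
import numpy as np

def is_solid_color(frame: np.ndarray, std_thresh: float = 8.0) -> bool:
    """Heuristic stand-in: treat a frame as solid-color when per-channel
    colour variation is very low (the threshold is illustrative)."""
    return float(frame.reshape(-1, 3).std(axis=0).mean()) < std_thresh

def white_balance_solid_frame(solid_frame, history, extract_features, match):
    """Apply the three claimed steps to one solid-color frame.

    history: list of (features, (r_gain, b_gain)) for earlier non-solid
    frames. extract_features / match: assumed callables standing in for
    the patent's feature-extraction and feature-matching steps.
    """
    feats = extract_features(solid_frame)                   # step 1: image features
    candidates = [(f, g) for f, g in history if match(f, feats)]
    if not candidates:
        return solid_frame                                  # no matching frame: leave as-is
    _, (r_gain, b_gain) = candidates[-1]                    # step 2: most recent match
    out = solid_frame.astype(np.float32).copy()             # step 3: apply its WB gains
    out[..., 0] *= r_gain
    out[..., 2] *= b_gain
    return np.clip(out, 0, 255).astype(np.uint8)
```

A trivially matching `match` callable reduces this to "reuse the most recent non-solid frame's gains"; real feature matching narrows the candidates to same-scene frames.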
In the prior art, the following solution has been proposed for the case where the shooting scene is a solid-color scene: record first white balance information at a first shooting point during shooting, where the accuracy of the first white balance information meets a preset condition; receive a shutter signal at a second shooting point; and when the difference between the first shooting point and the second shooting point is smaller than a preset threshold, shoot the framing image corresponding to the second shooting point according to the first white balance information to obtain the target image, the shooting points being expressed in time and/or position. In other words, when the difference between the two shooting points is smaller than the preset threshold, the second shooting point is regarded as being in the same scene as the first shooting point, so the framing image of the second shooting point is processed with the white balance information of the first shooting point. However, this scheme does not consider the image content captured at the second shooting point when judging whether the two shooting points belong to the same scene, so the accuracy and reasonableness of the same-scene judgment cannot be guaranteed.
In this application, image feature matching is performed between non-solid-color image frames and the solid-color image frame, and a non-solid-color image frame whose image features match (i.e., the non-solid-color matching frame mentioned in this application) is regarded as belonging to the same scene as the solid-color image frame. Judging the same scene through image feature matching amounts to taking the content of the image frames into account, so the accuracy and reasonableness of the judgment are well guaranteed. Because the non-solid-color matching frame and the solid-color image frame are thus reliably in the same scene, performing white balance processing on the solid-color image frame according to the white balance information of the non-solid-color matching frame effectively improves the processing effect.
Based on the above, compared with the prior art, the technical solution provided by this application can effectively improve the white balance processing effect of an image when the shooting scene is a solid-color scene.
With reference to the first aspect, in one possible implementation, acquiring the image features of the solid-color image frame includes: acquiring a first image feature from the R/G channel of the solid-color image frame; acquiring a second image feature from the B/G channel of the solid-color image frame; and fusing the first image feature and the second image feature to obtain the image features of the solid-color image frame.
It should be appreciated that, for a given image frame, the image features of the R/G channel and the B/G channel are largely similar; fusing them to obtain the image features of the solid-color image frame avoids redundant features and improves the robustness of the algorithm.
With reference to the first aspect, in another possible implementation, acquiring the image features of the solid-color image frame includes: extracting image features from the solid-color image frame itself (i.e., from the original RGB image of the frame) to obtain original image features; acquiring a first image feature from the R/G channel of the solid-color image frame; acquiring a second image feature from the B/G channel; and fusing the first image feature, the second image feature, and the original image features to obtain the image features of the solid-color image frame.
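Computing the R/G and B/G channels and fusing their features might look like the sketch below. The patent does not fix a concrete feature descriptor or fusion operator; the per-channel histograms and concatenation here are one plausible, assumed choice.

```python
import numpy as np

def chroma_channels(rgb: np.ndarray, eps: float = 1e-6):
    """Compute the R/G and B/G ratio channels used for feature extraction."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    g = g.astype(np.float32) + eps  # avoid division by zero
    return r / g, b / g

def fused_features(rgb: np.ndarray) -> np.ndarray:
    """Illustrative fusion: per-channel histograms, concatenated.
    Real feature extraction (e.g. learned descriptors) would differ."""
    rg, bg = chroma_channels(rgb)
    f1, _ = np.histogram(rg, bins=16, range=(0.0, 4.0), density=True)
    f2, _ = np.histogram(bg, bins=16, range=(0.0, 4.0), density=True)
    return np.concatenate([f1, f2])
```

The second implementation above would simply concatenate a third descriptor computed on the original RGB image before fusing.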
With reference to the first aspect, in one possible implementation, the solid-color image frame is the current solid-color image frame, and obtaining the non-solid-color matching frame from the one or more non-solid-color image frames includes: obtaining the non-solid-color matching frame from first history information. The first history information includes the image features of a plurality of historical image frames, including the one or more non-solid-color image frames, and each image frame recorded in the first history information shares a common-view feature with at least one other recorded image frame, where a common-view feature is an image feature common to at least two image frames.
Taking one image frame recorded in the first history information (denoted the first image frame) as an example: since the first history information includes the image features of the historical image frames, it includes the image features of at least one solid-color block in the first image frame.
Optionally, in this implementation, the first history information may further include the image features of historical solid-color image frames. That is, the first history information records not only non-solid-color image frames but also solid-color image frames.
Optionally, in this implementation, the first history information includes a common-view feature set containing the common-view features of the image frames recorded in the first history information. Obtaining the non-solid-color matching frame from the first history information then includes: acquiring, from the common-view feature set, matching features that match the image features of the current solid-color image frame; and acquiring the non-solid-color matching frame from the image frames, recorded in the first history information, that correspond to the matching features.
Including the common-view feature set in the first history information amounts to storing (or recording) the common-view features of the historical image frames centrally. During image feature matching, the features matching the current solid-color image frame (the matching features mentioned in this application) are first obtained from the common-view feature set, and the non-solid-color matching frame is then obtained from the historical image frames corresponding to those matching features. Only the common-view features recorded in the set need to be matched, rather than traversing every image frame recorded in the first history information, which effectively improves the efficiency of feature matching, and thus of obtaining the non-solid-color matching frame and of the white balance processing as a whole.
Further, because the common-view features are stored (or recorded) centrally rather than once per historical image frame, designing the first history information to include the common-view feature set also reduces its storage footprint.
Optionally, when the first history information includes a common-view feature set, it may further include, for each recorded image frame, a frame identifier and identification information indicating whether the frame is a non-solid-color frame, and may further include the white balance information of the recorded non-solid-color image frames.
For example, suppose the first history information records a first image frame that is a non-solid-color image frame. The following information of the first image frame is then recorded: the frame identifier of the first image frame (uniquely identifying it); the identification information indicating that it is a non-solid-color image frame (assuming the symbol "0" denotes a solid-color frame and "1" a non-solid-color frame, this is "1"); the white balance information of the first image frame; and the common-view features of the first image frame recorded in the common-view feature set.
With reference to the first aspect, in the implementation in which the non-solid-color matching frame is obtained from the first history information, the method further includes: when a first image frame, i.e., an image frame processed in real time, shares a common-view feature with at least one image frame recorded in the first history information, adding the first image frame to the first history information to update it.
Adding the first image frame to the first history information means adding the relevant information of the first image frame according to the storage format of the first history information.
It should be appreciated that by dynamically updating the first history information, timeliness of the first history information may be maintained, thereby helping to improve the accuracy of non-solid color matching frames of solid color image frames acquired based on the first history information.
With reference to the first aspect, in the implementation in which the non-solid-color matching frame is obtained from the first history information, the first history information has a first window threshold representing the maximum number of image frames it can record, and the method further includes: deleting one or more image frames recorded in the first history information when the number of recorded image frames exceeds the first window threshold.
Optionally, deleting one or more recorded image frames includes: deleting the one or more image frames recorded earliest in the first history information.
It should be understood that giving the first history information a first window threshold amounts to managing it with a sliding window, which prevents the first history information from consuming excessive memory and prevents the image feature matching process from becoming excessively time-consuming.
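The sliding-window management described above (evict the earliest records once the window threshold is exceeded) maps directly onto a bounded deque; this is a minimal sketch, not the patent's data structure.

```python
from collections import deque

class SlidingHistory:
    """History bounded by a window threshold: once the number of
    recorded frames would exceed it, the earliest entry is dropped."""
    def __init__(self, window_threshold: int):
        self.frames = deque(maxlen=window_threshold)

    def add(self, frame_record):
        # deque with maxlen evicts the oldest element automatically.
        self.frames.append(frame_record)
```

For example, with a threshold of 3, adding frames 0..4 leaves only frames 2, 3, 4 recorded.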
With reference to the first aspect, in one possible implementation, the solid-color image frame is the current solid-color image frame, and acquiring its image features includes: acquiring the image features of the current solid-color image frame when the current shooting scene is determined to be a solid-color scene.
In other words, in this implementation, when the current shooting scene is a solid-color scene, the current image frame is treated as a solid-color image frame.
Because single-frame feature extraction is unstable, an image frame that is objectively a solid-color frame may fail to be classified as one. By treating the current frame as a solid-color frame whenever the current shooting scene is a solid-color scene, this implementation stabilizes the solid-color-scene judgment and improves the robustness of the algorithm.
Conversely, the same instability of single-frame feature extraction may cause a non-solid-color image frame to be misjudged as a solid-color frame.
It should be appreciated that non-solid-color image frames can be processed with conventional automatic white balance (AWB) techniques, whose general flow is, by way of example: obtain the white point information of the image frame, derive white balance information from the white point information, and process the image frame using that white balance information.
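The conventional flow just described (white point statistics → white balance gains → apply) can be sketched with the classic gray-world assumption. This is only one common realisation; the patent does not mandate a particular conventional AWB algorithm.

```python
import numpy as np

def gray_world_awb(rgb: np.ndarray) -> np.ndarray:
    """Gray-world AWB: estimate 'white point' statistics as the channel
    means, derive per-channel gains from them, and apply the gains."""
    img = rgb.astype(np.float32)
    means = img.reshape(-1, 3).mean(axis=0)          # white point information
    gains = means.mean() / np.maximum(means, 1e-6)   # white balance information
    return np.clip(img * gains, 0, 255).astype(np.uint8)
```

On a solid-color frame the channel means reflect the surface colour rather than the illuminant, which is exactly why this conventional flow breaks down in solid-color scenes.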
If a non-solid-color image frame is misjudged as a solid-color frame and processed with the method provided herein, the computational overhead is clearly larger than processing it with conventional AWB.
In this implementation, the current frame is treated as a solid-color frame only when the current shooting scene is a solid-color scene, which avoids mistakenly processing non-solid-color image frames as solid-color ones and thus avoids unnecessary computational overhead.
Optionally, in this implementation, the method further includes: determining that the current shooting scene is a solid-color scene when the number of solid-color image frames recorded in second history information reaches a threshold, the second history information recording a plurality of historical image frames.
Determining the scene as solid-color only when the number of historical solid-color image frames reaches the threshold improves the accuracy of the judgment. The threshold may be an empirical value or may be determined by application requirements; this application does not limit it.
Optionally, the plurality of historical image frames recorded in the second historical information are consecutive.
The continuous history image frames recorded in the second history information can further improve the accuracy of judging the solid-color scene.
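The scene-level decision above (declare a solid-color scene once enough recent frames were individually classified as solid-color) can be sketched as follows; window size and threshold are illustrative parameters, as the text notes.

```python
from collections import deque

class SolidSceneDetector:
    """Second-history-information logic: keep the per-frame
    solid/non-solid decisions of the last N frames and declare a
    solid-color scene once the solid-frame count reaches a threshold."""
    def __init__(self, window: int, threshold: int):
        self.flags = deque(maxlen=window)   # True = frame classified solid-color
        self.threshold = threshold

    def update(self, frame_is_solid: bool) -> bool:
        self.flags.append(frame_is_solid)
        return sum(self.flags) >= self.threshold
```

A single misclassified frame then cannot flip the scene decision, which is the stability benefit described above.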
Alternatively, the first history information may also serve as the second history information, i.e. the second history information may be directly the first history information. Therefore, only the first historical information is required to be maintained, and the maintenance cost of the historical information can be reduced.
Similar to the first history information, the second history information may also be dynamically updated or may be configured with a window threshold.
Optionally, in an implementation manner of determining a solid-color scene according to the second history information, the method further includes: and adding the image frames processed in real time to the second history information to update the second history information.
Optionally, the second history information has a second window threshold, the second window threshold representing a maximum number of image frames that can be recorded by the second history information, wherein the method further comprises: and deleting one or more image frames recorded earliest in the second historical information if the number of the image frames recorded in the second historical information exceeds the second window threshold.
Based on the above description, the white balance processing method provided in the first aspect considers the content of the solid-color image frame when acquiring a same-scene non-solid-color image frame, which improves the accuracy of the non-solid-color matching frame; performing white balance processing on the solid-color image frame according to the white balance information of that frame therefore improves the processing effect.
In a second aspect, there is provided an apparatus for image processing, the apparatus being operable to perform the method provided by the first aspect.
Optionally, the apparatus may include units for performing the method provided in the first aspect.
In a third aspect, there is provided an apparatus for image processing, the apparatus comprising a processor coupled to a memory for storing a computer program or instructions, the processor for executing the computer program or instructions stored by the memory such that the method of the first aspect is performed.
For example, the processor is configured to execute a computer program or instructions stored by the memory, to cause the apparatus to perform the method in the first aspect.
Optionally, the apparatus includes one or more processors.
Optionally, a memory coupled to the processor may also be included in the apparatus.
Alternatively, the apparatus may comprise one or more memories.
Alternatively, the memory may be integrated with the processor or provided separately.
Optionally, a transceiver may also be included in the apparatus.
In a fourth aspect, a chip is provided, the chip comprising a processing module and a communication interface, the processing module being configured to control the communication interface to communicate with the outside, the processing module being further configured to implement the method in the first aspect.
In a fifth aspect, a computer-readable storage medium is provided, on which a computer program (which may also be referred to as instructions or code) for implementing the method in the first aspect is stored.
For example, the computer program, when executed by a computer, causes the computer to perform the method of the first aspect. The computer may be an image processing device.
In a sixth aspect, there is provided a computer program product comprising a computer program (which may also be referred to as instructions or code) which, when executed by a computer, causes the computer to carry out the method of the first aspect. The computer may be an image processing device.
Based on the above description, the content of the solid-color image frame is considered when acquiring a same-scene non-solid-color image frame, which improves the accuracy of the non-solid-color matching frame; performing white balance processing on the solid-color image frame according to the white balance information of the non-solid-color matching frame therefore improves the processing effect. Compared with the prior art, the technical solution provided by this application can thus effectively improve the white balance processing effect of an image when the shooting scene is a solid-color scene.
Fig. 1 is a schematic diagram of a system architecture to which embodiments of the present application may be applied.
Fig. 2 is a schematic flowchart of a method of white balance processing provided in an embodiment of the present application.
Fig. 3 is a schematic diagram of a transition of an imaging scene from a non-solid color scene to a solid color scene.
Fig. 4 is another schematic flowchart of a method of white balance processing provided in an embodiment of the present application.
Fig. 5 is a further schematic flow chart of a method of white balance processing provided in an embodiment of the present application.
Fig. 6 is a schematic flowchart of a method of white balance processing according to another embodiment of the present application.
Fig. 7 is a schematic flowchart of a method of white balance processing according to still another embodiment of the present application.
Fig. 8 is a schematic diagram of image feature extraction in an embodiment of the present application.
Fig. 9 is a schematic diagram of a local feature map and a global feature map in an embodiment of the present application.
Fig. 10 is a schematic block diagram of an apparatus for white balance processing provided in an embodiment of the present application.
Fig. 11 is a schematic block diagram of an apparatus for white balance processing according to another embodiment of the present application.
Fig. 12 is a schematic block diagram of an apparatus for white balance processing according to still another embodiment of the present application.
When a person views the natural world, the perception of a given color is essentially the same under different illumination. This ability to eliminate or mitigate the influence of the light source and to "see" the true surface color of objects is known as color constancy.
The image capturing function is widely used at present, but owing to the characteristics of the camera and the reflection characteristics of objects, the imaging result under different color-temperature environments is inconsistent with what the human eye sees; that is, the captured image exhibits color cast. To solve this problem, automatic white balance (AWB) technology was proposed: the camera automatically estimates the ambient colored light and removes the influence of the radiation spectra of different light sources, giving the camera the same color constancy as the human visual system.
When the shooting scene is a solid-color scene (i.e., contains a large solid-color area), the white balance information calculated by the camera is inaccurate, so the captured image exhibits color cast.
In view of the above problems, the present application provides a method and an apparatus for white balance processing, which can improve the white balance processing effect of an image when a shooting scene is a solid-color scene.
Referring to fig. 1, a system architecture to which embodiments of the present application may be applied is described. As shown in fig. 1, the system architecture is an image signal processing (ISP) chip platform. The present application mainly improves the algorithm of the automatic white balance (AWB) module in the ISP chip.
The main purpose of the ISP is to post-process the image data of the camera, for example to restore scene details under different environmental conditions. The shooting flow of the camera and the processing of the ISP are shown in fig. 1. The optical image of a scene formed through the lens is projected onto the sensor surface, where photoelectric conversion turns it into an analog electrical signal. After noise elimination, analog-to-digital (A/D) conversion turns the analog signal into a digital image signal, which is transmitted to the ISP for a series of processing steps. The ISP output is passed to a processing module (e.g., CPU/GPU/DSP) for further processing, and finally to the camera application (APP) for preview, photo, video, and other operations, yielding the captured picture.
The main function of the ISP is to perform a series of processing steps on the camera's output image to restore real scene details and thereby improve image quality. As shown in fig. 1, the ISP includes modules such as black level compensation (BLC), lens shading correction, color interpolation, denoising, white balance (AWB) correction, color correction, gamma correction, sharpening, and color/contrast (saturation) enhancement; it also performs automatic exposure control (automatic exposure control, AEC) and similar tasks. The processed data are output in YUV (or RGB) format through a format conversion module and transmitted to the CPU/GPU/DSP through an I/O interface for further processing.
The present application mainly improves the AWB algorithm in the ISP to solve the AWB color cast problem in solid-color scenes.
Fig. 2 is a schematic flowchart of a method 200 for white balance processing according to an embodiment of the present application. For example, the method 200 may be performed by an apparatus for image processing. As shown in fig. 2, the method 200 includes steps S210, S220, and S230.
S210: acquire image features of a solid-color image frame.
S220: acquire a non-solid-color matching frame from one or more non-solid-color image frames, where the image features of the non-solid-color matching frame match the image features of the solid-color image frame.
S230: perform white balance processing on the solid-color image frame according to the white balance information of the non-solid-color matching frame.
A solid-color image frame is an image frame whose content is a solid color, for example an image frame having a single color. A non-solid-color image frame is an image frame whose content is not a solid color, for example an image frame containing multiple colors.
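As a rough illustration of how a frame might be classified as solid-color or non-solid-color (the application does not fix any particular test), one simple heuristic is to quantize the pixel colors and check whether a single color bin dominates; the bin count and dominance threshold below are assumptions:

```python
import numpy as np

def is_solid_color_frame(rgb, bins=8, dominance=0.85):
    """Treat a frame as solid-color when one quantized color bin
    holds at least `dominance` of the pixels (illustrative threshold)."""
    q = (rgb.astype(np.uint32) * bins) // 256               # per-channel level 0..bins-1
    idx = (q[..., 0] * bins + q[..., 1]) * bins + q[..., 2] # joint color-bin index
    counts = np.bincount(idx.ravel(), minlength=bins ** 3)
    return bool(counts.max() / idx.size >= dominance)
```

A uniformly colored frame passes this test, while a frame containing many distinct colors does not.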
The image features mentioned in the embodiments of the present application refer to the image features of solid-color image blocks (referred to simply as solid-color blocks) in an image frame; these may also be called solid-color block features. For a solid-color image frame, the image features are those of the solid-color blocks in the solid-color image frame; for a non-solid-color image frame, the image features are those of the solid-color blocks in the non-solid-color image frame.
Optionally, the image features include color attributes of the solid color patch, and/or geometric attributes of the solid color patch.
Color attributes of solid color blocks include, but are not limited to: information related to color attributes such as histograms, color moments, or color sets.
The geometrical properties of the solid color block include, but are not limited to: information about geometrical properties such as edge contour, area, position or circumscribed rectangle.
For example, if the image features include the color attributes of a solid-color block, then "the image features of the non-solid-color matching frame match the image features of the solid-color image frame" means that the non-solid-color matching frame includes solid-color blocks whose color attributes match those of the solid-color blocks in the solid-color image frame.
For another example, if the image features include the geometric attributes of a solid-color block, it means that the non-solid-color matching frame includes solid-color blocks whose geometric attributes match those of the solid-color blocks in the solid-color image frame.
For another example, if the image features include both the color attributes and the geometric attributes of a solid-color block, it means that the non-solid-color matching frame includes solid-color blocks whose color attributes match those of the solid-color blocks in the solid-color image frame, and solid-color blocks whose geometric attributes match those of the solid-color blocks in the solid-color image frame.
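By way of a hedged sketch (the exact attributes and thresholds are not fixed by the application), matching a solid-color block's color attribute via a normalized histogram and its geometric attributes via area and circumscribed rectangle could look like:

```python
import numpy as np

def block_features(mask, rgb):
    """Color and geometric attributes of one solid-color block:
    a normalized histogram (color attribute) plus area and
    circumscribed rectangle (geometric attributes)."""
    ys, xs = np.nonzero(mask)
    hist, _ = np.histogram(rgb[mask], bins=16, range=(0, 256))
    return {
        "hist": hist / max(hist.sum(), 1),
        "area": int(mask.sum()),
        "bbox": (int(ys.min()), int(xs.min()), int(ys.max()), int(xs.max())),
    }

def blocks_match(a, b, hist_thr=0.8, area_tol=0.2):
    """Two blocks match when their histograms intersect strongly and
    their areas are close; both thresholds are illustrative."""
    color_ok = float(np.minimum(a["hist"], b["hist"]).sum()) >= hist_thr
    geom_ok = abs(a["area"] - b["area"]) <= area_tol * max(a["area"], b["area"])
    return color_ok and geom_ok
```

In practice a richer color descriptor (color moments, a color set) or tighter geometric checks could be substituted without changing the overall matching logic.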
Optionally, in addition to the color properties of the solid color patch, and/or the geometric properties of the solid color patch, the image features mentioned in embodiments of the present application may further include any one or more of the following image properties:
brightness attribute of the solid color block, color saturation of the solid color block, contrast of the solid color block, and other image attributes.
For example, the image features of the non-solid color matching frame match the image features of the solid color image frame, and may also indicate that the non-solid color matching frame includes solid color blocks whose brightness attributes (or color saturation, or contrast) match the brightness attributes (or color saturation, or contrast) of the solid color blocks in the solid color image frame.
It will be appreciated that because a non-solid color matching frame has image features that match those of a solid color image frame, it is reasonable to consider the non-solid color matching frame as being of the same scene as the solid color image frame, and it is reasonable to apply the white balance information of the non-solid color matching frame of the same scene to the solid color image frame.
In other words, the white balance information of the non-solid color matching frame can represent the white balance information of the solid color image frame more accurately. Because the non-solid color matching frame is a non-solid color image frame, its white balance information may be considered to be accurate or more accurate. Therefore, the white balance processing of the solid-color image frame is performed according to the white balance information of the non-solid-color matching frame, so that the white balance processing effect of the solid-color image frame can be improved.
In the prior art, a solution has been proposed for the case where the shooting scene is a solid-color scene: record first white balance information at a first shooting point during shooting, where the accuracy of the first white balance information satisfies a preset condition; receive a shutter signal at a second shooting point during shooting; and when the difference between the first shooting point and the second shooting point is smaller than a preset threshold, shoot the viewfinder image corresponding to the second shooting point according to the first white balance information to obtain a target image, where the first shooting point and the second shooting point are points expressed in time and/or position. In other words, in this solution, when the difference between the first shooting point and the second shooting point is smaller than the preset threshold, the second shooting point is regarded as being in the same scene as the first shooting point, so the viewfinder image corresponding to the second shooting point is processed according to the white balance information of the first shooting point. However, when judging whether the two shooting points belong to the same scene, this scheme does not consider the content of the image captured at the second shooting point, so the accuracy and reasonableness of the same-scene judgment cannot be guaranteed.
In the embodiment of the present application, by performing image feature matching on a non-solid color image frame and a solid color image frame, one non-solid color image frame with matched image features (i.e., the non-solid color matching frame mentioned in the embodiment of the present application) is regarded as the same scene as the solid color image frame. The same scene is judged through image feature matching, which is equivalent to considering the content of the image frames in the same scene judgment process, so that the accuracy and rationality of the same scene judgment can be well ensured. Because the non-solid color matching frame and the solid color image frame are well guaranteed to be in the same scene, the solid color image frame is subjected to white balance processing according to the white balance information of the non-solid color matching frame, and the white balance processing effect of the solid color image frame can be effectively improved.
Therefore, compared with the prior art, the embodiment of the application can effectively improve the white balance processing effect of the image under the condition that the shooting scene is a solid-color scene.
It should be appreciated that the embodiments of the present application may be applied to single-camera or multi-camera solid-color scenes during preview, photographing, or video recording.
In embodiments of the present application, the camera may transition from a non-solid color scene to a solid color scene.
As an example, fig. 3 is a schematic diagram of a transition of an imaging scene from a non-solid color scene to a solid color scene. In fig. 3, a rectangle filled with small dots represents a shooting scene, and the contents enclosed by a bold line box represent a solid-color region included in the shooting scene, as indicated by an arrow in fig. 3, the shooting scene gradually transitions from a non-solid-color scene to a solid-color scene.
It should also be appreciated that the image frames referred to in embodiments of the present application may come directly from the output image of the camera without additional sensor input.
It should also be appreciated that embodiments of the present application use a computer vision (computer vision, CV) method in acquiring the non-solid-color matching frame of a solid-color image frame, i.e., acquiring the non-solid-color matching frame from non-solid-color image frames by image feature matching (also referred to as image feature comparison).
Note that the white balance information of a non-solid-color image frame may be obtained by an existing method, for example an automatic white balance (automatic white balance, AWB) algorithm; this is not limited in this application.
It should also be noted that, given white balance information, using it to perform white balance processing on an image frame is known in the prior art. Accordingly, in the embodiments of the present application, the white balance processing of the solid-color image frame according to the white balance information of the non-solid-color matching frame may adopt an existing method, which is not limited in this application.
In step S210, the image features of the solid-color image frame may be acquired in a variety of ways.
Optionally, in step S210, the image features of the solid-color image frame are acquired as follows: acquire a first image feature of the red/green (R/G) channel of the solid-color image frame; acquire a second image feature of the blue/green (B/G) channel of the solid-color image frame; and fuse the first image feature and the second image feature to obtain the image features of the solid-color image frame.
For example, solid-color block feature extraction is performed on the R/G-channel image and the B/G-channel image of the solid-color image frame to obtain the first image feature and the second image feature, respectively; the two are then fused according to their geometric similarity and/or color similarity to obtain the image features of the solid-color image frame.
Optionally, in step S210, the image features of the solid-color image frame are acquired as follows: extract image features from the solid-color image frame itself (that is, from the original RGB image of the frame) to obtain original image features; acquire a first image feature of the R/G channel and a second image feature of the B/G channel of the solid-color image frame; and fuse the first image feature, the second image feature, and the original image features to obtain the image features of the solid-color image frame.
It should be understood that, for a frame of image, the manner in which the image of the R/G channel and the image of the B/G channel of the frame of image are separated is known in the art, and this will not be described in detail herein.
In step S210, image feature extraction may be performed on the image using any one of, or a combination of, the following processing methods: edge/region detection, clustering, or classification/segmentation in machine/deep learning.
For example, in acquiring the first image feature of the R/G channel of the solid-color image frame, solid-color block feature extraction is performed on the R/G-channel image using any one or more of the above processing methods to obtain the first image feature.
It should be appreciated that for an image frame, the image of the R/G channel and the image of the B/G channel may differ in brightness attribute (or contrast attribute), but be similar in geometric attribute. By extracting the image features of the R/G channel and the B/G channel, respectively, the influence of the difference of the R/G channel and the B/G channel in the brightness attribute (or contrast attribute) on the feature extraction can be avoided or reduced. By fusing the image features extracted by the R/G channels with the image features extracted by the B/G channels, the repeatability of the image features caused by the similarity of the R/G channels and the B/G channels in geometric properties can be avoided.
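A minimal sketch of this per-channel extraction and fusion follows, assuming that a channel's "solid block" pixels are simply those close to the channel's median value (the application permits edge/region detection, clustering, or learned segmentation instead):

```python
import numpy as np

def chroma_channels(rgb):
    """Split a frame into its R/G and B/G ratio images; these are
    largely insensitive to brightness differences between channels."""
    r, g, b = (rgb[..., i].astype(np.float64) for i in range(3))
    g = np.maximum(g, 1.0)                      # guard against division by zero
    return r / g, b / g

def flat_mask(chan, tol=0.05):
    """Crude stand-in for solid-block extraction on one channel:
    pixels near the channel's median value (tolerance is an assumption)."""
    return np.abs(chan - np.median(chan)) <= tol

def fused_solid_mask(rgb):
    """Fuse the per-channel extractions: a pixel belongs to the fused
    solid block only where the R/G and B/G extractions agree, so the
    geometrically similar features the two channels would each report
    collapse into a single, non-repeated feature."""
    rg, bg = chroma_channels(rgb)
    return flat_mask(rg) & flat_mask(bg)
```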
It should be noted that, in practical application, an appropriate manner may be selected to obtain the image features of the pure color image frame according to the application requirement. For example, the image features of the R/G channel and the image features of the B/G channel of the solid-color image frame may be each regarded as the image features of the solid-color image frame.
In step S230, white balance processing is performed on the solid-color image frame according to the white balance information of the non-solid-color matching frame; in other words, white balance processing is performed on the solid-color image frame with reference to that white balance information. Specific implementations include, but are not limited to, the following.
Alternatively, as an alternative implementation manner, the white balance information of the non-solid color matching frame obtained in step S220 is directly used to perform white balance processing on the solid color image frame.
Optionally, as another implementation, the white balance information of the non-solid-color matching frame obtained in step S220 (denoted first white balance information) and the white balance information of the solid-color image frame obtained with a conventional AWB algorithm (denoted second white balance information) are processed comprehensively to obtain corrected white balance information, and the corrected white balance information is used to perform white balance processing on the solid-color image frame. The comprehensive processing is, for example, averaging or weighting of the first white balance information and the second white balance information.
Optionally, as a further implementation, the first white balance information, the second white balance information, and the white balance information of other non-solid-color image frames (denoted third white balance information) are processed comprehensively to obtain corrected white balance information, and the corrected white balance information is used to perform white balance processing on the solid-color image frame. The comprehensive processing is, for example, averaging or weighting of the first, second, and third white balance information.
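Representing white balance information as per-channel (R, G, B) gains — one common representation, though not one mandated by the application — the weighted comprehensive processing could be sketched as follows; the weight value is an assumption:

```python
import numpy as np

def corrected_wb(first, second, weight=0.7):
    """Weighted combination of the matching frame's gains (first white
    balance information) and the conventional AWB estimate (second
    white balance information); `weight` is illustrative."""
    return tuple(weight * f + (1 - weight) * s for f, s in zip(first, second))

def apply_wb(rgb, gains):
    """Apply per-channel (R, G, B) gains to an image."""
    out = rgb.astype(np.float64) * np.asarray(gains)
    return np.clip(out, 0, 255).astype(np.uint8)
```

Setting `weight=0.5` corresponds to the averaging variant of the comprehensive processing.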
For example, the solid-color image frame acquired in step S210 may be the current solid-color image frame. In step S220, a non-solid color matching frame may be acquired from one or more historical non-solid color image frames.
The "history" in the historical non-solid image frames is relative to the "current" in the current solid image frame. The history of non-solid color image frames represents non-solid color image frames whose shooting time is earlier than the shooting time of the current solid color image frame. Assuming that the time corresponding to the current solid-color image frame is T1, the time corresponding to the historical non-solid-color image frame is T2, and T2 is earlier than T1.
The one or more historical non-solid color image frames may be one or more images automatically selected from an album of the terminal device; alternatively, the user may manually select one or more images from the album; alternatively, the history preview image may be internally cached in the camera preview state; alternatively, when receiving a photographing instruction from the user, the terminal device may automatically photograph one or more images at the same time. The present application is not limited to how the one or more historical non-solid image frames are obtained.
It should be appreciated that the historical non-solid color image frames represent non-solid color image frames that were captured earlier than the current image frame and for which white balance information was acquired.
Optionally, among the methods for acquiring the one or more historical non-solid-color image frames, the following may be selected: when receiving a shooting instruction from the user, the terminal device automatically shoots a plurality of image frames at the same time. In this way, it is ensured as far as possible that the acquired image frames belong to the same scene, i.e., that the one or more historical non-solid-color image frames belong to the same scene as the currently processed image frame.
The terminal mentioned in the embodiment of the present application means a device having a photographing (or image capturing) function. Optionally, the terminal is a mobile terminal, such as a mobile phone, a tablet computer, etc. Optionally, the terminal is a digital camera or an industrial camera.
Optionally, as shown in fig. 4, in some embodiments, step S210 includes: acquiring image features of the current solid-color image frame; and step S220 includes: acquiring a non-solid-color matching frame of the current solid-color image frame from first history information, where the first history information includes image features of a plurality of historical image frames including the one or more non-solid-color image frames, and each image frame recorded in the first history information has a common-view feature with at least one other image frame recorded in the first history information, a common-view feature being an image feature common to at least two image frames.
For example, image frame 1 and image frame 2 have common view characteristics, meaning that image frame 1 and image frame 2 have common image characteristics.
Taking one of the plurality of historical image frames recorded in the first history information (denoted the first image frame) as an example, that the first history information includes its image features means that the first history information includes the image features of at least one solid-color block in the first image frame.
Assume the first image frame includes a plurality of solid-color blocks that differ from one another, i.e., their image features are not exactly the same. For example, the solid-color blocks may differ in color attribute; or they may have the same color attribute but different geometric or brightness attributes. The first history information may include the image features of some of the solid-color blocks in the first image frame, or of all of them. For example, the first history information includes the image features of one solid-color block in the first image frame.
In this embodiment, a non-solid color matching frame of a current solid color image frame is found based on first history information, wherein each image frame recorded in the first history information has a common view feature with at least one other image frame recorded in the first history information. In other words, any one of the image frames recorded in the first history information has at least a common view characteristic with another one of the image frames. It will be appreciated that features common to at least two image frames (i.e., common view features) have the reference value of feature matching. Therefore, any one of the image frames recorded in the first history information has a common view feature with at least another one of the image frames, which can improve the reference value of the first history information.
For example, an image feature common to at least two image frames may be noted as a stable feature, and the image frames recorded in the first history information are all image frames having the stable feature.
Optionally, in the embodiment shown in fig. 4, the first history information may further include image features of the history solid-color image frames. That is, not only the historical non-solid color image frame but also the historical solid color image frame are recorded in the first history information.
Optionally, in the embodiment shown in fig. 4, the first history information includes a set of common view features, wherein the set of common view features includes common view features of the image frames recorded in the first history information. As shown in fig. 5, step S220 further includes steps S221 and S222.
S221, obtaining matching features matched with the image features of the current pure-color image frame from the common-view features included in the common-view feature set.
The operation of feature matching is referred to in the foregoing description and will not be described in detail here.
S222, acquiring the non-solid color matching frame from the image frames corresponding to the matching features recorded by the first history information.
Because the first history information includes the common-view feature set, the common-view features of the historical image frames are stored (or recorded) centrally in the first history information. During image feature matching, image features matching those of the current solid-color image frame (i.e., the matching features mentioned in the embodiments of the present application) are first acquired from the common-view feature set, and the non-solid-color matching frame of the current solid-color image frame is then acquired from the historical image frames corresponding to the matching features. It can be understood that only the common-view features recorded in the set need to be matched, without traversing all image frames recorded in the first history information, which effectively improves the efficiency of feature matching, and hence the efficiency of obtaining the non-solid-color matching frame of the current solid-color image frame and of the image white balance processing.
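Steps S221 and S222 can be sketched with hashable placeholder features; real image features would be compared with similarity measures rather than exact set intersection:

```python
def find_candidates(common_view_set, frame_features, query_features):
    """S221: intersect the current frame's features with the shared
    common-view set; S222: collect only the recorded frames carrying a
    matched feature, so the full feature lists of every stored frame
    never need to be traversed during matching.
    `frame_features` maps frame id -> that frame's feature set."""
    matched = set(query_features) & common_view_set
    candidates = [fid for fid, feats in frame_features.items() if feats & matched]
    return matched, candidates
```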
Further, by designing the first history information to include the set of common view features, which is equivalent to centrally storing (or recording) the common view features of the history image frames in the first history information, without storing the common view features once for each history image frame, it is possible to reduce the storage space of the first history information.
Optionally, as an embodiment, the matching feature acquired in step S221 corresponds to M non-solid color historical image frames, where M is an integer greater than 1, that is, the matching feature is a common view feature of the M non-solid color historical image frames; step S222 includes: and selecting one image frame with highest reference value from the M non-solid color historical image frames as a non-solid color matching frame.
The image frame with the highest reference value is the image frame, among the M non-solid-color historical image frames, with the largest number of matching features and/or the highest matching similarity.
For example, the matching feature numbers of the M non-solid color historical image frames are respectively scored, and the image frame with the highest score is taken as the non-solid color matching frame.
For another example, matching similarity scores of M non-solid color historical image frames are respectively scored, and the image frame with the highest score is taken as the non-solid color matching frame.
For another example, the matching feature numbers and matching similarity of the M non-solid color historical image frames are respectively scored, and the image frame with the highest score is taken as the non-solid color matching frame.
The number of matching features is defined as follows. Assume the image features include a color attribute, a geometric attribute, and a brightness attribute. If one attribute of an image frame (for example its color attribute, geometric attribute, or brightness attribute) matches the corresponding attribute of the current solid-color image frame, the number of matching features of that image frame is 1; if two attributes (for example color and geometry, geometry and brightness, or color and brightness) match the corresponding attributes of the current solid-color image frame, the number of matching features is 2; and if the color, geometric, and brightness attributes all match, the number of matching features is 3.
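Counting matched attributes in this way and picking the highest-scoring candidate can be sketched as follows; the candidate records and attribute values are illustrative stand-ins:

```python
def pick_matching_frame(candidates, query):
    """Score each candidate historical frame by its number of matching
    features — how many of the color / geometric / brightness
    attributes agree with the query frame — and return the best one."""
    def score(frame):
        return sum(frame.get(k) == query.get(k)
                   for k in ("color", "geometry", "brightness"))
    return max(candidates, key=score)
```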
It should be appreciated that by designing the first history information to include a set of co-view features, during feature matching, the matching features of higher reference value may be determined according to the number of image frames corresponding to the co-view features, so as to obtain non-solid color matching image frames of higher reference value.
Alternatively, in the embodiment shown in fig. 5, the first history information may further include a frame identifier of each image frame recorded in the first history information and identifier information of whether the image frame is a non-solid color frame, and may further include white balance information of the recorded non-solid color image frame.
As an example, assume the first history information records a first image frame and the first image frame is a non-solid-color image frame. The first history information then records the following information for the first image frame: the frame identifier of the first image frame (which uniquely identifies it); identification information indicating that it is a non-solid-color image frame (assuming the symbol "0" denotes a solid-color frame and "1" a non-solid-color frame, this identification information is "1"); the white balance information of the first image frame; and the common-view features of the first image frame recorded in the common-view feature set.
Optionally, in some embodiments, the information storage format of the first history information is that information of each history image frame is recorded with an image frame granularity. For example, the information recorded for each image frame in the first history information includes: frame identification, identification information of whether the frame is a solid-color image frame, image characteristics and white balance information.
Alternatively, in some embodiments, the first history information may be obtained by: relevant information (e.g., including image characteristics, frame information, identification information of whether a solid color frame is, white balance information) of an image frame processed in real time is dynamically stored (or referred to as recording), thereby forming the first history information.
Optionally, in the embodiment shown in fig. 4 or fig. 5, the method 200 further includes: in the case that a first image frame has a common view feature with at least one image frame recorded in the first history information, the first image frame is added to the first history information to update the first history information.
Wherein the first image frame represents a real-time processed image frame. Alternatively, the first image frame may represent an image frame processed prior to processing the current solid color image frame. Alternatively, the first image frame may represent the current pure color image frame.
Wherein the first image frame is added to the first history information, meaning that the related information of the first image frame is added to the first history information according to an information storage format of the first history information.
For example, assuming that the information storage format of the first history information is such that the information stored for each image frame includes, at the granularity of the image frame: frame identification, identification information of whether the frame is a solid-color image frame, image characteristics and white balance information. In this case, adding the first image frame to the first history information means adding information of one image frame, that is, adding information of the first image frame including a frame identification of the first image frame, identification information of whether or not it is a solid-color image frame, image characteristics, white balance information, to the first history information.
For another example, the information storage format of the first history information includes a common-view feature set and frame information, where the common-view feature set includes the common-view features of the image frames recorded in the first history information, and the frame information includes, for each recorded image frame, its frame identifier, identification information on whether it is a non-solid-color frame, and, for non-solid-color image frames, white balance information. In this case, adding the first image frame to the first history information means adding the frame information of the first image frame to the first history information, and updating the common-view feature set as needed: if the common-view features of the first image frame already exist in the common-view feature set, the set need not be updated; if a common-view feature of the first image frame does not appear in the set, that common-view feature is added to the set.
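The update rule can be sketched with hashable placeholder features: a frame is admitted only if it shares a feature with some already recorded frame, and only those shared (common-view) features enter the shared set. The record layout below is an illustrative assumption:

```python
class HistoryInfo:
    """Sketch of the first history information: per-frame records
    (frame id, solid-color flag, white balance info, features) plus
    one shared common-view feature set."""

    def __init__(self):
        self.common_view = set()
        self.frames = []          # (frame_id, is_solid, wb_info, features)

    def add_frame(self, frame_id, is_solid, features, wb_info=None):
        features = set(features)
        # Common-view features of the new frame: those it shares with
        # at least one already recorded frame.
        shared = set()
        for *_, feats in self.frames:
            shared |= features & feats
        if self.frames and not shared:
            return False          # no common-view feature: not recorded
        self.frames.append((frame_id, is_solid, wb_info, features))
        self.common_view |= shared  # only newly shared features are added
        return True
```

The very first frame is admitted unconditionally so that later frames have something to share features with.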
Two examples are described below.
Example 1: assume that the first image frame is an image frame processed before the current solid-color image frame. If the first image frame shares a common-view feature with at least one image frame recorded in the first history information, the first image frame is added to the first history information, as follows.
If the common-view feature possessed by the first image frame already exists in the common-view feature set, adding the first image frame to the first history information includes: operation 1) adding the frame identification of the first image frame and the identification information of whether it is a solid-color frame to the first history information; and, if the first image frame is a non-solid-color frame, operation 2) adding its white balance information to the first history information. If the common-view feature of the first image frame does not appear in the common-view feature set, adding the first image frame includes, in addition to operations 1) and 2) above, operation 3) adding that common-view feature to the common-view feature set.
Example 2: assume that the first image frame is the current solid-color image frame. If the current solid-color image frame shares a common-view feature with at least one image frame recorded in the first history information, it is added to the first history information, as follows.
If the common-view feature possessed by the current solid-color image frame already exists in the common-view feature set, adding the current solid-color image frame to the first history information includes: operation 1) adding the frame identification of the current solid-color image frame and the identification information that it is a solid-color frame. If its common-view feature does not appear in the common-view feature set, adding the current solid-color image frame includes, in addition to operation 1), operation 3) adding that common-view feature to the common-view feature set.
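A minimal sketch of the update logic in Examples 1 and 2; the dictionary-based history structure and function name are assumptions for illustration:

```python
def add_frame(history, frame_id, is_solid, features, wb=None):
    """Add one image frame's related information to the history (operations 1-3)."""
    # Operation 1): record the frame identification and the solid/non-solid flag.
    entry = {"is_solid": is_solid}
    # Operation 2): only non-solid frames contribute white balance information.
    if not is_solid:
        entry["wb"] = wb
    history["frames"][frame_id] = entry
    # Operation 3): extend the common-view feature set only with unseen features.
    for f in features:
        if f not in history["common_view_features"]:
            history["common_view_features"].add(f)

history = {"frames": {}, "common_view_features": set()}
add_frame(history, 1, is_solid=False, features=["f0"], wb=(1.8, 1.5))
add_frame(history, 2, is_solid=True, features=["f0", "f1"])
```

Note that frame 2, being solid-color, is recorded without white balance information, exactly as in Example 2.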
The operation of adding the first image frame to the first history information may be performed before the white balance processing is performed on the first image frame, or may be performed after the white balance processing is performed on the first image frame.
It should be appreciated that by dynamically updating the first history information, timeliness of the first history information may be maintained, thereby helping to improve the accuracy of non-solid color matching frames of a current solid color image frame acquired based on the first history information.
Optionally, in an embodiment in which the non-solid-color matching frame is acquired from the first history information, the first history information has a first window threshold representing the maximum number of image frames that the first history information can record, and the method 200 further includes: deleting one or more image frames recorded in the first history information if the number of recorded image frames exceeds the first window threshold.
Optionally, the deleting one or more image frames recorded in the first history information includes: one or more image frames recorded earliest in the first history information are deleted.
Optionally, deleting one or more image frames recorded in the first history information includes: deleting one or more recorded image frames according to a preset rule. For example, the preset rule may be to preferentially delete the solid-color image frames recorded in the first history information.
Deleting one image frame (denoted image frame Y) from the first history information means deleting the related information of image frame Y recorded there. Assume that the information recorded for image frame Y includes: its frame identification, its white balance information, identification information marking it as a non-solid-color frame, and its common-view feature. Under this assumption, deleting image frame Y from the first history information includes deleting its frame identification, its white balance information, and the identification information marking it as a non-solid-color frame. If the common-view feature of image frame Y corresponds to only two image frames, that is, image frame Y shares the common-view feature with only one other image frame, then deleting image frame Y further includes deleting that common-view feature.
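The sliding-window eviction described above, including the cleanup of common-view features that no remaining frame possesses, can be sketched like this (data layout is an assumption):

```python
def evict(history, window_threshold):
    """Drop the earliest-recorded frames while the window threshold is exceeded."""
    while len(history["frames"]) > window_threshold:
        oldest_id = min(history["frames"])        # earliest frame identification
        removed = history["frames"].pop(oldest_id)
        # Delete a common-view feature only if no remaining frame still has it.
        for f in removed["features"]:
            still_used = any(f in rec["features"]
                             for rec in history["frames"].values())
            if not still_used:
                history["common_view_features"].discard(f)

history = {
    "frames": {
        1: {"features": {"f0"}},
        2: {"features": {"f0", "f1"}},
        3: {"features": {"f1"}},
    },
    "common_view_features": {"f0", "f1"},
}
evict(history, window_threshold=2)   # frame 1 is evicted; "f0" survives via frame 2
```

This implements the "delete earliest" policy; the preset-rule variant (e.g., evict solid-color frames first) would only change how `oldest_id` is chosen.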
It should be understood that giving the first history information a first window threshold amounts to managing the first history information with a sliding window, which prevents the first history information from consuming excessive memory and keeps the image feature matching process from becoming overly time-consuming. This is especially valuable in embodiments that dynamically update the first history information.
Based on the above description, in the embodiments of the present application, the content of the solid-color image frame is taken into account when acquiring a non-solid-color image frame of the same scene, which improves the accuracy of the non-solid-color matching frame; performing white balance processing on the solid-color image frame according to the white balance information of that matching frame therefore improves the white balance processing effect. Compared with the prior art, the technical solution provided herein can effectively improve the white balance processing effect of an image when the shooting scene is a solid-color scene.
Alternatively, as another embodiment, in any one or more of the embodiments provided above, step S210 includes: acquiring the image features of the current solid-color image frame when the current shooting scene is determined to be a solid-color scene.
In other words, in the present embodiment, in the case where the current shooting scene is a solid-color scene, the current image frame is treated as a solid-color image frame.
Due to the instability of single-frame image feature extraction, an image frame that is objectively a solid-color frame may fail to be identified as one. In this method, when the current shooting scene is a solid-color scene, the current frame is treated as a solid-color frame, which ensures the stability of solid-color scene judgment and improves the robustness of the algorithm.
In addition, instability in single-frame feature extraction may also cause a non-solid-color image frame to be misjudged as a solid-color frame.
It should be appreciated that non-solid-color image frames may be processed with conventional automatic white balance (AWB) techniques, whose general flow is, by way of example: obtain the white point information of the image frame, derive the white balance information from the white point information, and process the image frame using that white balance information.
If a non-solid-color image frame is misjudged as a solid-color frame and processed using the methods provided herein, the computational overhead is clearly greater than processing it with conventional AWB techniques. To avoid this unnecessary overhead, in the present embodiment the current frame is treated as a solid-color frame for white balance processing only when the current shooting scene is a solid-color scene.
Optionally, in this embodiment, the method 200 further includes: determining that the current shooting scene is a solid-color scene when the number of solid-color image frames recorded in second history information reaches a threshold, where the second history information records a plurality of historical image frames.
The threshold may be an empirical value, or may be determined according to application requirements, which is not limited in this application. For example, the threshold value is 3, 5, 7, or the like.
The second history information represents information of the maintained (or stored) history image frames. For example, frame identification of a plurality of history image frames, and identification information of whether or not the history image frames are solid-color image frames are recorded in the second history information.
It is understood that determining the current shooting scene to be a solid-color scene when the number of historical solid-color image frames reaches the threshold improves the accuracy of solid-color scene judgment.
Determining that the current shooting scene is a solid-color scene when the number of solid-color image frames recorded in the second history information reaches the threshold, and then treating the current image frame as a solid-color frame, solves the instability of solid-color scene judgment caused by unstable single-frame feature extraction, thereby improving the stability of the judgment.
Determining that the current shooting scene is a solid-color scene when the number of solid-color image frames recorded in the second history information reaches the threshold may equivalently be described as determining that the current shooting scene is a solid-color scene when the proportion of solid-color image frames in the second history information reaches a threshold.
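This scene judgment amounts to counting solid-color flags in a short sliding window; a minimal sketch under an assumed flag representation:

```python
from collections import deque

def is_solid_scene(recent_flags, count_threshold):
    """Judge the shooting scene as solid-color when enough recent frames are solid."""
    return sum(recent_flags) >= count_threshold

# Second history information as a short sliding window over consecutive frames;
# True marks a frame judged solid-color, and maxlen plays the window-threshold role.
second_history = deque(maxlen=8)
for flag in [False, True, True, True]:
    second_history.append(flag)

solid_scene = is_solid_scene(second_history, count_threshold=3)  # threshold, e.g., 3
```

Using `deque(maxlen=...)` automatically discards the earliest frame when the window is full, matching the second-window-threshold behavior described later.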
Optionally, the plurality of historical image frames recorded in the second historical information are consecutive.
Here, the plurality of historical image frames being consecutive means that their shooting moments are consecutive, or that they were shot consecutively.
It should be appreciated that information from a plurality of consecutively shot image frames better represents the current shooting scene. Judging whether the current shooting scene is a solid-color scene from the number of solid-color frames among the consecutive historical frames recorded in the second history information therefore further improves the accuracy of the judgment.
Alternatively, the first history information mentioned in the foregoing embodiment may also serve as the second history information, i.e., the second history information may be the first history information directly. Therefore, only the first historical information is required to be maintained, and the maintenance cost of the historical information can be reduced.
Optionally, the second history information is different from the first history information.
For example, the second history information is constructed from each image frame processed in real time, i.e., the plurality of historical image frames recorded in it are consecutive, while the first history information is constructed by filtering the image frames recorded in the second history information, such that any image frame recorded in the first history information shares a common-view feature with at least one other recorded image frame, as described above. In other words, when an image frame (denoted image frame X) recorded in the second history information shares a common-view feature with at least one image frame recorded in the first history information, image frame X is added to the first history information.
For example, the information storage format of the second history information may be such that information of each history image frame is recorded with the image frame as granularity. For example, the information recorded for each image frame in the second history information includes: frame identification, identification information of whether the frame is a solid-color image frame, image characteristics and white balance information.
For another example, the information storage format of the second history information may include a common view feature set and frame information, where the common view feature set includes common view features of image frames recorded in the second history information, and the frame information includes frame identifiers of respective image frames recorded in the second history information, identifier information of whether the image frames are non-solid color frames, and white balance information of the non-solid color image frames.
The second history information may also be updated dynamically. The operation of dynamically updating the second history information may be similar to the operation of dynamically updating the first history information, e.g. the second history information may also be configured with a window threshold.
Optionally, in an embodiment of determining a solid color scene according to the second history information, the method 200 further includes: the image frames processed in real time are added to the second history information to update the second history information.
For example, the information of the current solid-color image frame may be added to the second history information either before or after the operation of judging the shooting scene based on the second history information.
It should be appreciated that, in order to further improve the accuracy of the judgment of the photographed scene, it is possible to select to add the information of the current solid-color image frame to the second history information before the operation of judging the photographed scene based on the second history information.
Optionally, the second history information has a second window threshold representing the maximum number of image frames that the second history information can record, and the method 200 further includes: deleting the one or more image frames recorded earliest in the second history information if the number of recorded image frames exceeds the second window threshold.
Therefore, in the present embodiment, by performing white balance processing with the current frame regarded as a solid-color frame in the case where the current shooting scene is a solid-color scene, it is possible to avoid unnecessary computation overhead caused by mistaking a non-solid-color image frame for processing as a solid-color image frame.
It should also be appreciated that the technical solution provided by the embodiments of the present application enables Automatic White Balance (AWB) to have a memory function.
As shown in fig. 6, the embodiment of the present application further provides a method 600 for white balance processing. The method 600 may be performed by an apparatus for image processing. As shown in fig. 6, the method 600 includes steps S610 and S620.
S610: acquire a non-solid-color matching frame of the current image frame from one or more historical non-solid-color image frames when the current shooting scene is a solid-color scene.
S620: perform white balance processing on the current image frame according to the white balance information of the non-solid-color matching frame.
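The two steps of method 600 can be sketched at the top level as follows; the matcher and the white balance application shown here are toy stand-ins, not the patent's actual implementations:

```python
def white_balance_solid_frame(current, history_non_solid, scene_is_solid):
    """S610/S620: when the scene is solid-color, borrow white balance info
    from a matching historical non-solid-color frame."""
    if not scene_is_solid:
        return None                                  # conventional AWB handles this case
    matching = match(current, history_non_solid)     # S610
    if matching is None:
        return None
    return apply_wb(current, matching["wb"])         # S620

def match(frame, history):
    # Toy matcher: pick the first historical frame sharing an image feature.
    for rec in history:
        if frame["features"] & rec["features"]:
            return rec
    return None

def apply_wb(frame, wb):
    # Apply R and B gains to the frame's mean RGB (illustrative only).
    r_gain, b_gain = wb
    r, g, b = frame["mean_rgb"]
    return (r * r_gain, g, b * b_gain)

history = [{"features": {"wall", "lamp"}, "wb": (2.0, 1.5)}]
current = {"features": {"wall"}, "mean_rgb": (0.3, 0.5, 0.4)}
out = white_balance_solid_frame(current, history, scene_is_solid=True)
```

The key point is visible in the control flow: the solid-color branch never estimates white balance from the current frame itself.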
The current image frame itself may or may not have been determined to be a solid-color frame.
Due to the instability of single-frame image feature extraction, an image frame that is objectively a solid-color frame may fail to be identified as one. In this method, when the current shooting scene is a solid-color scene, the current frame is treated as a solid-color frame, which ensures the stability of solid-color scene judgment and improves the robustness of the algorithm.
In addition, in the case where the current shooting scene is a solid-color scene, white balance processing is performed with respect to the current frame as a solid-color frame, so that erroneous processing of a non-solid-color image frame as a solid-color image frame can be avoided, and thus unnecessary calculation overhead can be avoided.
In the embodiment shown in fig. 6, the method of the previous method embodiment may be used, or an existing method may be used, with respect to acquiring a non-solid color matching frame for the current image frame from one or more historical non-solid color image frames.
Optionally, in the embodiment shown in fig. 6, in step S610, the operation of obtaining the non-solid color matching frame of the current image frame from the one or more historical non-solid color image frames may include steps S210 and S220 in the above embodiments. The specific description is referred to above, and will not be repeated here.
Alternatively, in the embodiment shown in fig. 6, in step S610, an existing method of performing white balance processing on a pure color image frame may be used to obtain a non-pure color matching frame of the current pure color image frame. The present application is not limited in this regard.
Optionally, in the embodiment shown in fig. 6, the method 600 further includes: determining that the current shooting scene is a solid-color scene when the number of solid-color image frames recorded in history information reaches a threshold, where the history information records a plurality of historical image frames.
Determining that the current shooting scene is a solid-color scene when the number of solid-color image frames recorded in the history information reaches the threshold improves the stability of solid-color scene judgment. Treating the current image frame as a solid-color frame once the scene is determined to be solid-color solves the instability of scene judgment caused by unstable single-frame feature extraction, thereby improving the stability of the judgment.
Optionally, the plurality of historical image frames recorded in the historical information are consecutive.
Optionally, the method 600 further includes: image frames processed in real time are added to the history information to update the history information.
Optionally, the history information has a second window threshold representing the maximum number of image frames that the history information can record, and the method further includes: deleting the one or more image frames recorded earliest in the history information if the number of recorded image frames exceeds the second window threshold.
The history information involved in the method 600 may correspond to the second history information mentioned in the method 200 provided in the above embodiment, and the detailed description is described above, which is not repeated here.
It will be appreciated that in the embodiment shown in fig. 6, there is no strict requirement that the camera transition from a non-solid color scene to a solid color scene.
As one example, the first history information referred to in the above-described respective embodiments may be referred to as a global feature map, and the second history information may be referred to as a local feature map.
For a better understanding of the embodiments of the present application, a specific example of the present application is described below with reference to fig. 7. In the description of fig. 7, the global feature map may correspond to the first history information in the above embodiment, and the local feature map may correspond to the second history information in the above embodiment.
Fig. 7 is a schematic flowchart of a method 700 of white balance processing provided in an embodiment of the present application. The method 700 includes steps S710 to S760.
S710: obtain the white point information and the R/G, B/G statistical maps of the current solid-color image frame.
For example, the current solid-color image frame is processed with a conventional AWB algorithm to obtain its white point information (denoted first white point information) and its R/G, B/G statistical maps. It should be appreciated that the white point information is a set of R/G and B/G statistics, the most important statistics in AWB, and can be used to correct the color cast of an image and restore true color.
The R/G, B/G statistical maps of the current solid-color image frame are the images of its R/G channel and B/G channel. Methods for acquiring the white point information and R/G, B/G statistical maps of an image frame are known in the art and are not described in detail here.
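A sketch of how the R/G and B/G statistical maps, and a simple white point estimate from them, might be computed; the gray-world style averaging used here is an illustrative assumption, not the patent's AWB algorithm:

```python
import numpy as np

def rg_bg_maps(rgb, eps=1e-6):
    """Compute the per-pixel R/G and B/G statistical maps of an RGB image."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    g = np.maximum(g, eps)            # guard against division by zero
    return r / g, b / g

# A tiny 2x2 image; the white point statistics are summaries of these maps.
img = np.array([[[0.4, 0.2, 0.1], [0.4, 0.2, 0.1]],
                [[0.2, 0.4, 0.2], [0.2, 0.4, 0.2]]])
rg, bg = rg_bg_maps(img)
white_point = (rg.mean(), bg.mean())  # a simple gray-world style estimate
```

On a solid-color frame this estimate is dominated by the surface color rather than the illuminant, which is precisely the first-white-point error the method corrects.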
The white point information mentioned in the embodiment shown in fig. 7 may correspond to the white balance information in the above embodiment.
It will be appreciated that, because the current image frame is a solid-color frame, the first white point information may be erroneous and require correction, whereas the white point information acquired in step S760 described later (denoted second white point information) is considered more reliable. The first white point information may be corrected using the second white point information, and the current solid-color image frame then white-balanced using the corrected first white point information; alternatively, the second white point information may be used directly to white-balance the current solid-color image frame.
S720: acquire the image features of the current solid-color image frame.
For example, solid-color patch feature extraction is performed on the R/G, B/G statistical maps obtained in step S710.
As an example, step S720 includes the following substeps.
Substep 1), feature extraction of the R/G, B/G channel.
For example, solid-color patch feature extraction is performed on the image of the R/G channel and the image of the B/G channel separately, using methods such as edge/region detection, or clustering, classification, or segmentation in machine/deep learning. As shown in fig. 8, the extracted solid-color patch features are the areas enclosed by the envelopes.
Substep 2): fuse the features extracted from the image of the R/G channel with those extracted from the image of the B/G channel to obtain the image features of the current solid-color image frame.
For example, the features of the R/G and B/G channels are fused according to the similarity of image attributes such as geometric similarity and/or color similarity.
Substep 3): determine a feature description of the image features of the current solid-color image frame, the description including color attributes of the solid-color patches and/or geometric attributes of the solid-color patches.
For example, color attributes of a solid-color patch include, but are not limited to, information such as histograms, color moments, or color sets; geometric attributes include, but are not limited to, information such as edge contour, area, position, or circumscribed rectangle.
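A minimal feature descriptor combining one color attribute (a histogram) with several geometric attributes (area, circumscribed rectangle, position); the exact attribute set is an illustrative choice:

```python
import numpy as np

def describe_patch(mask, values, bins=4):
    """Describe a solid-color patch: color histogram plus geometric attributes."""
    ys, xs = np.nonzero(mask)
    hist, _ = np.histogram(values[mask], bins=bins, range=(0.0, 1.0))
    return {
        "histogram": hist,                                 # color attribute
        "area": int(mask.sum()),                           # geometric attributes
        "bbox": (xs.min(), ys.min(), xs.max(), ys.max()),  # circumscribed rectangle
        "centroid": (float(xs.mean()), float(ys.mean())),  # position
    }

# A 2x3 channel image with a uniform patch occupying the right 2x2 block.
mask = np.array([[0, 1, 1],
                 [0, 1, 1]], dtype=bool)
values = np.full((2, 3), 0.6)   # e.g., an R/G channel image
desc = describe_patch(mask, values)
```

Descriptors like this make the later feature matching a comparison of small vectors rather than of full images.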
S730: perceive that the current shooting scene is a solid-color scene.
For example, when the number of historical solid-color image frames in the local feature map is greater than a threshold, the current shooting scene is determined to be a solid-color scene. The local feature map is described below.
For another example, in a multi-camera scenario, when the images output by all cameras are solid-color image frames, the current scene is determined to be a solid-color scene.
In step S710, the current solid-color image frame is processed; that is, the current image frame is a solid-color image frame.
For example, whether the current image frame is a solid-color image frame may be judged from the geometric attributes of its features (edge contour, area, position, circumscribed rectangle, etc.) and the proportion of color distribution information (histogram, color moments, color set, etc.) in the image.
It should be noted that, in the embodiment of the present application, when the current image frame is a solid-color image frame and the current shooting scene is a solid-color scene, the current image frame is treated as a solid-color image frame.
S740, constructing a feature map comprising a global feature map and a local feature map.
The feature map records the image features and AWB white point information of historical image frames. As shown in fig. 9, the feature map includes a local feature map and a global feature map, whose sizes are managed by continuously adding the latest frame and removing the image features and white point information of the oldest frame.
For example, step S740 includes the following sub-steps.
And 1) constructing a local feature map.
The local feature map is a short-time sliding window. For example, the images, features, and AWB white point information of historical frames are added to the local feature map; when multiple cameras are on, the historical frame images, features, and AWB white point information of all cameras may be added. As shown in fig. 9, the consecutively shot image frames k, k+1, k+2, and k+3 are stored in the local feature map.
And 2) constructing a global feature map.
The global feature map is a larger window. For example, the features, images, and AWB white point information of the image frames in the local feature map that have stable features may be added to the global feature map. For a description of image frames with stable features, see the foregoing description; it is not repeated here.
In the example of fig. 9, the image frames with stable features in the local feature map are image frames k, k+1, and k+3, because image frames k and k+1 share common-view feature 0 and image frames k+1 and k+3 share common-view feature 1. Accordingly, the related information of image frames k, k+1, and k+3 recorded in the local feature map is added to the global feature map.
It should be understood that fig. 9 is only an example. For example, image frames (not shown in fig. 9) having common view features other than image frame k, image frame k+1, and image frame k+3 may also be included in the global feature map.
Substep 3): manage the local feature map and the global feature map.
According to a preset window threshold (i.e., the maximum number of image frames the map is allowed to record), the latest frames are continuously added and the oldest frames removed, thereby managing and updating the local and global feature maps.
S750, feature matching.
Based on the co-view relationship of the features (i.e., which image frames, possibly from a single camera or multiple cameras, observe the same feature), a historical non-solid-color matching frame of the current solid-color frame can be found in the global feature map. The co-view relationship records which historical frames contain each feature (including the historical frame information of multiple cameras, if enabled), as well as the features and feature descriptions contained in each historical image frame.
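A sketch of the co-view lookup: among the non-solid-color frames in the global feature map, pick the one sharing the most features with the current solid-color frame. The overlap-count scoring is an illustrative matching criterion, not necessarily the patent's:

```python
def find_matching_frame(solid_features, global_map):
    """Find the historical non-solid frame sharing the most common-view features."""
    best_id, best_overlap = None, 0
    for frame_id, rec in global_map.items():
        if rec["is_solid"]:
            continue                     # only non-solid frames can lend AWB info
        overlap = len(solid_features & rec["features"])
        if overlap > best_overlap:
            best_id, best_overlap = frame_id, overlap
    return best_id

global_map = {
    10: {"is_solid": False, "features": {"f0", "f1"}},
    11: {"is_solid": True,  "features": {"f1", "f2"}},
    12: {"is_solid": False, "features": {"f2"}},
}
match_id = find_matching_frame({"f1", "f2"}, global_map)  # frame 11 is skipped
```

Frame 11 shares the most features but is itself solid-color, so it is excluded; matching falls to a non-solid frame whose white point information is trustworthy.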
S760: obtain white point information (denoted second white point information).
The AWB white point information of a non-solid-color matching frame found through feature matching is generally accurate and free of color cast. Therefore, the AWB white point information of the non-solid-color matching frame is used as a reference; in addition, to improve the robustness of the algorithm, the reference white point information is filtered before being output. In this way the correct AWB white point information can be inherited, the color cast problem of solid-color scenes under a single camera or multiple cameras can be solved, and color restoration of solid-color scenes achieved.
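One common way to realize the filtering step is a simple weighted blend of the reference white point with the current (possibly color-cast) estimate; the blend form and the weight value below are assumptions for illustration:

```python
def filter_white_point(reference_wp, current_wp, alpha=0.7):
    """Blend the matching frame's reference white point with the current
    estimate; alpha weights the (trusted) reference for robustness."""
    return tuple(alpha * r + (1.0 - alpha) * c
                 for r, c in zip(reference_wp, current_wp))

# Reference from the matched non-solid frame vs. the biased solid-frame estimate.
filtered = filter_white_point((2.0, 1.0), (1.0, 2.0), alpha=0.7)
```

A temporal filter over several recent reference white points would serve the same robustness goal and also smooth transitions between frames.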
For example, in the embodiment shown in fig. 7, steps S710 and S720 may be regarded as front-end steps, and steps S730 to S760 may be regarded as back-end steps.
The non-solid-color matching frame of the current solid-color image frame (the AWB information of a non-solid-color frame is generally accurate and free of color cast) is found in the global feature map; since feature matching indicates the same scene, the white point information of the non-solid-color matching frame is filtered and used as output, which solves the color cast problem of solid-color scenes and achieves color restoration.
As can be seen from the foregoing description, in the embodiments of the present application, the content of the current solid-color image frame is taken into account when acquiring a non-solid-color image frame of the same scene, which improves the accuracy of the non-solid-color matching frame; performing white balance processing on the current solid-color image frame according to the white balance information of that matching frame therefore improves the white balance processing effect. Compared with the prior art, the technical solution provided herein can effectively improve the white balance processing effect of an image when the shooting scene is a solid-color scene.
This method is the first to use computer vision techniques to record history information in order to solve the AWB color cast problem of solid-color scenes. It extracts patch feature information tailored to solid-color scenes; other point, line, plane, or semantic features that can well express an AWB problem scene would also serve, and the same idea can solve similar AWB problems.
The concept of using history information to solve problems that cannot be solved from a single frame or a single moment can also be extended to other fields, such as auto exposure (AE), auto focus (AF), and high dynamic range (HDR) image processing.
The method records only the image, features, and AWB information of history frames; if information such as the position and pose of the history frames and their features is obtained by other computer vision methods, problems faced in more imaging fields can be addressed.
The various embodiments described herein may be used as separate solutions or may be combined according to their inherent logic, and such combinations also fall within the scope of the present application.
Having described the method embodiments provided herein, embodiments of the apparatus provided herein are described below. It should be understood that the descriptions of the apparatus embodiments and the descriptions of the method embodiments correspond to each other, and thus, descriptions of details not described may be referred to the above method embodiments, which are not repeated herein for brevity.
As shown in fig. 10, the embodiment of the present application further provides an apparatus 1000 for white balance processing. The apparatus 1000 may be used to perform the method 200 provided by the method embodiments above. As shown in fig. 10, the apparatus 1000 includes an acquisition unit 1010, a matching unit 1020, and a white balance processing unit 1030.
An acquisition unit 1010 is configured to acquire image features of a solid-color image frame.
A matching unit 1020 for obtaining a non-solid color matching frame from one or more non-solid color image frames, wherein the image features of the non-solid color matching frame match the image features of the solid color image frame.
The white balance processing unit 1030 is configured to perform white balance processing on the solid-color image frame according to white balance information of the non-solid-color matching frame.
In the embodiment of the present application, by performing image feature matching on a non-solid color image frame and a solid color image frame, one non-solid color image frame with matched image features (i.e., the non-solid color matching frame mentioned in the present application) is regarded as the same scene as the solid color image frame. The same scene is judged through image feature matching, which is equivalent to considering the content of the image frames in the same scene judgment process, so that the accuracy and rationality of the same scene judgment can be well ensured. Because the non-solid color matching frame and the solid color image frame are well guaranteed to be in the same scene, the solid color image frame is subjected to white balance processing according to the white balance information of the non-solid color matching frame, and the white balance processing effect of the solid color image frame can be effectively improved.
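As a rough illustration of the matching step described above (not the patented implementation), same-scene judgment can be sketched as nearest-neighbour matching between per-frame feature vectors; the feature representation, distance metric, and threshold below are assumptions:

```python
import numpy as np

def find_matching_frame(solid_features, history, max_dist=0.25):
    """Return the non-solid history frame whose image features best match
    the current solid-color frame's features, or None if nothing is close.

    solid_features: 1-D feature vector of the current solid-color frame
    history: list of dicts with keys 'features', 'is_solid', 'awb'
    """
    best, best_dist = None, max_dist
    for frame in history:
        if frame['is_solid']:  # only non-solid frames carry reliable AWB info
            continue
        dist = np.linalg.norm(frame['features'] - solid_features)
        if dist < best_dist:
            best, best_dist = frame, dist
    return best
```

If no history frame is close enough, no matching frame exists and the solid-color frame cannot inherit white balance information this way.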
Optionally, in some embodiments, the image features include color attributes of solid color blocks, and/or geometric attributes of solid color blocks.
Optionally, in some embodiments, the image features of the non-solid color matching frame match the image features of the solid color image frame, representing: the non-solid color matching frame includes solid color blocks having color attributes that match the color attributes of the solid color blocks in the solid color image frame, and/or the non-solid color matching frame includes solid color blocks having geometric attributes that match the geometric attributes of the solid color blocks in the solid color image frame.
Optionally, in some embodiments, the acquiring unit 1010 is configured to acquire image features of a current pure color image frame; the matching unit 1020 is configured to obtain a non-solid color matching frame from the first history information, where the first history information includes image features of a plurality of history image frames, the plurality of history image frames includes one or more history non-solid color image frames, and each image frame recorded in the first history information has a common view feature with at least one other image frame recorded in the first history information, and the common view feature represents an image feature that is common to at least two image frames.
Optionally, in some embodiments, image features of the historical solid-color image frames are also included in the first historical information.
Optionally, in some embodiments, the first history information includes a set of co-view features, wherein the set of co-view features includes co-view features of the image frames recorded in the first history information; wherein, matching unit 1020 is configured to: acquiring matching features matched with the image features of the current pure-color image frame from the common-view features included in the common-view feature set; and acquiring a non-solid color matching frame from the image frames recorded by the first historical information and corresponding to the matching features.
Optionally, in some embodiments, the first history information includes frame identification of each image frame recorded in the first history information and identification information of whether the image frame is a non-solid color frame, and further includes white balance information of the recorded non-solid color image frame.
Optionally, in some embodiments, the apparatus 1000 further includes a first updating unit configured to add the first image frame to the first history information to update the first history information in a case where the first image frame has a common view feature with at least one image frame recorded in the first history information, wherein the first image frame represents an image frame processed in real time.
Optionally, in some embodiments, the first history information has a first window threshold, the first window threshold represents a maximum number of image frames that can be recorded in the first history information, and the first updating unit is configured to delete one or more image frames recorded in the first history information if the number of image frames recorded in the first history information exceeds the first window threshold.
Optionally, in some embodiments, the first updating unit is configured to delete one or more image frames recorded earliest in the first history information if the number of image frames recorded in the first history information exceeds the first window threshold.
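The window-threshold behaviour of the first history information described above (record real-time frames, drop the earliest ones once the maximum is exceeded) can be sketched with a bounded deque; the threshold value and record layout are assumptions for illustration only:

```python
from collections import deque

WINDOW_THRESHOLD = 8  # assumed first window threshold

# a deque with maxlen drops the earliest entry automatically when full
history = deque(maxlen=WINDOW_THRESHOLD)

def record_frame(frame_id, is_solid, awb=None):
    # each record keeps the frame identification, a solid/non-solid flag,
    # and white balance information for non-solid frames
    history.append({'id': frame_id, 'is_solid': is_solid, 'awb': awb})

for i in range(10):
    record_frame(i, is_solid=(i % 3 == 0))
# only the 8 most recently recorded frames (ids 2..9) remain
```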
Optionally, in some embodiments, the obtaining unit 1010 is configured to obtain the image feature of the current pure-color image frame in a case where the current shooting scene is determined to be a pure-color scene.
Optionally, in some embodiments, the apparatus 1000 further includes a determining unit, configured to determine that the current shooting scene is a solid-color scene if the number of solid-color image frames recorded in the second history information reaches a threshold, where a plurality of history image frames are recorded in the second history information.
Optionally, in some embodiments, the plurality of historical image frames recorded in the second historical information are consecutive.
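The scene-level decision above (declare a solid-color scene when the number of solid-color frames recorded in the second history information reaches a threshold) might look like the following sketch; the window size and threshold value are assumptions:

```python
from collections import deque

def is_solid_scene(recent_flags, threshold=5):
    """recent_flags: booleans for the consecutive frames in the second
    history information, True if that frame was detected as solid-color."""
    return sum(recent_flags) >= threshold

second_history = deque([True, True, False, True, True, True], maxlen=8)
# 5 of the last 6 frames were solid-color, so the scene is judged solid
```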
Optionally, in some embodiments, the apparatus 1000 further comprises a second updating unit for adding the image frames processed in real time to the second history information to update the second history information.
Optionally, in some embodiments, the second history information has a second window threshold, the second window threshold represents a maximum number of image frames that can be recorded in the second history information, and the second updating unit is configured to delete one or more image frames recorded earliest in the second history information if the number of image frames recorded in the second history information exceeds the second window threshold.
Optionally, in some embodiments, the obtaining unit 1010 is configured to: acquiring a first image feature of an R/G channel of a pure color image frame; acquiring a second image feature of a B/G channel of the pure color image frame; and fusing the first image features and the second image features to obtain the image features of the pure-color image frames.
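One hedged reading of the channel-feature step above: compute per-pixel R/G and B/G ratio planes from an RGB image and fuse simple statistics from each. The histogram features and concatenation-based fusion below are illustrative assumptions, not the claimed method:

```python
import numpy as np

def channel_ratio_features(rgb, bins=16):
    """rgb: H x W x 3 float array. Returns fused R/G and B/G features."""
    eps = 1e-6
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    rg = r / (g + eps)  # source of the first image feature (R/G channel)
    bg = b / (g + eps)  # source of the second image feature (B/G channel)
    f1, _ = np.histogram(rg, bins=bins, range=(0.0, 4.0), density=True)
    f2, _ = np.histogram(bg, bins=bins, range=(0.0, 4.0), density=True)
    return np.concatenate([f1, f2])  # fusion by concatenation (assumed)
```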
As shown in fig. 11, the embodiment of the present application further provides an apparatus 1100 for white balance processing. Apparatus 1100 may be used to perform method 600 in the above method embodiments. As shown in fig. 11, the apparatus 1100 includes an acquisition unit 1110 and a white balance processing unit 1120.
An obtaining unit 1110, configured to obtain, in a case where the current shooting scene is a solid-color scene, a non-solid-color matching frame of the current image frame from one or more historical non-solid-color image frames;
the white balance processing unit 1120 is configured to perform white balance processing on the current image frame according to the white balance information of the non-solid color matching frame.
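Applying the matching frame's white balance information could be as simple as reusing its per-channel gains, blended toward the gains estimated on the current frame for robustness (echoing the filtering mentioned earlier). The gain representation and smoothing factor are assumptions:

```python
import numpy as np

def apply_awb(rgb, ref_gains, cur_gains=None, alpha=0.7):
    """rgb: H x W x 3 float image in [0, 1]; ref_gains: (r_gain, b_gain)
    from the non-solid matching frame; cur_gains: possibly unreliable gains
    estimated on the solid-color frame itself. A simple low-pass blend."""
    if cur_gains is None:
        r_gain, b_gain = ref_gains
    else:
        r_gain = alpha * ref_gains[0] + (1 - alpha) * cur_gains[0]
        b_gain = alpha * ref_gains[1] + (1 - alpha) * cur_gains[1]
    out = rgb.copy()
    out[..., 0] *= r_gain  # correct the red channel
    out[..., 2] *= b_gain  # correct the blue channel
    return np.clip(out, 0.0, 1.0)
```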
Due to the instability of single-frame image feature extraction, an image frame that is objectively a solid-color frame may fail to be identified as such. In this method, the current frame is treated as a solid-color frame to be processed only when the current shooting scene is judged to be a solid-color scene, which ensures the stability of the solid-color scene judgment and improves the robustness of the algorithm.
In addition, in the case where the current shooting scene is a solid-color scene, white balance processing is performed with respect to the current frame as a solid-color frame, so that erroneous processing of a non-solid-color image frame as a solid-color image frame can be avoided, and thus unnecessary calculation overhead can be avoided.
Optionally, as shown in fig. 11, in some embodiments, the apparatus 1100 further includes a determining unit 1130, configured to determine that the current shooting scene is a solid-color scene if the number of solid-color image frames recorded in the history information reaches a threshold, where a plurality of history image frames are recorded in the history information.
Optionally, in some embodiments, the plurality of historical image frames recorded in the historical information are consecutive.
Optionally, in some embodiments, the apparatus 1100 further comprises an updating unit for adding the image frames processed in real time to the history information to update the history information.
Optionally, in some embodiments, the history information has a second window threshold, the second window threshold representing a maximum number of image frames that can be recorded in the history information, and the updating unit is configured to delete one or more image frames recorded earliest in the history information if the number of image frames recorded in the history information exceeds the second window threshold.
Optionally, in some embodiments, the acquiring unit 1110 is configured to: acquire image features of the current solid-color image frame; and acquire the non-solid color matching frame from one or more historical non-solid color image frames according to the image features of the current solid-color image frame, where the image features of the non-solid color matching frame match those of the current solid-color image frame. For example, the acquisition unit 1110 is configured to perform steps S210 and S220 in the above method embodiments.
As shown in fig. 12, the embodiment of the present application further provides an apparatus 1200 for white balance processing. The apparatus 1200 comprises a processor 1210, the processor 1210 being coupled to a memory 1220, the memory 1220 being for storing computer programs or instructions, the processor 1210 being for executing the computer programs or instructions stored by the memory 1220, such that the method in the above method embodiments is performed.
Optionally, as shown in fig. 12, the apparatus 1200 may further include a memory 1220.
Optionally, as shown in fig. 12, the apparatus 1200 may further include a data interface 1230, where the data interface 1230 is used for data transmission with the outside world.
Optionally, the apparatus 1200 is used to implement the method 200 in the above embodiments.
Alternatively, the apparatus 1200 is used to implement the method 600 in the above embodiments.
Alternatively, the apparatus 1200 is used to implement the method 700 in the above embodiments.
The present application also provides a computer-readable medium storing program code for execution by a device, the program code comprising instructions for performing the method in the above embodiments.
The present application also provides a computer program product comprising instructions which, when run on a computer, cause the computer to perform the method of the above embodiments.
The embodiment of the application also provides a chip, which includes a processor and a data interface, where the processor reads, through the data interface, instructions stored in a memory to perform the method in the above embodiments.
Optionally, as an implementation, the chip may further include a memory storing instructions, and the processor is configured to execute the instructions stored in the memory; when the instructions are executed, the processor performs the method in the foregoing embodiments.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs. The terminology used herein in the description of the application is for the purpose of describing particular embodiments only and is not intended to be limiting of the application.
It should be noted that the various numbers of the first or second, etc. are referred to herein for convenience of description only and are not intended to limit the scope of the embodiments of the present application.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
It will be clear to those skilled in the art that, for convenience and brevity of description, specific working procedures of the above-described systems, apparatuses and units may refer to corresponding procedures in the foregoing method embodiments, and are not repeated herein.
In the several embodiments provided in this application, it should be understood that the disclosed systems, devices, and methods may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative, e.g., the division of the units is merely a logical function division, and there may be additional divisions when actually implemented, e.g., multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices or units, which may be in electrical, mechanical or other form.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in each embodiment of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present application, or the part contributing to the prior art, or a part of the technical solution, may be embodied in the form of a software product. The software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to perform all or part of the steps of the methods described in the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash disk (UFD, also simply referred to as a U-disk), a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.
The foregoing is merely specific embodiments of the present application, but the scope of the present application is not limited thereto, and any person skilled in the art can easily think about changes or substitutions within the technical scope of the present application, and the changes and substitutions are intended to be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.
Claims (35)
- A method of white balance processing, comprising:
acquiring image features of a solid-color image frame;
obtaining a non-solid color matching frame from one or more non-solid color image frames, wherein image features of the non-solid color matching frame match image features of the solid-color image frame;
and performing white balance processing on the solid-color image frame according to the white balance information of the non-solid color matching frame.
- The method of claim 1, wherein the image features comprise color attributes of solid color blocks, and/or geometric attributes of solid color blocks.
- The method of claim 2, wherein the image features of the non-solid color matching frame match the image features of the solid color image frame, representing:
the non-solid color matching frame includes solid color blocks having color attributes matching the color attributes of the solid color blocks in the solid color image frame; and/or
the non-solid color matching frame comprises solid color blocks with geometric properties matching those of the solid color blocks in the solid color image frame.
- A method according to any of claims 1-3, wherein the solid-color image frame is a current solid-color image frame; wherein the acquiring the non-solid color matching frame from the one or more non-solid color image frames comprises:
obtaining the non-solid color matching frame from first history information, wherein the first history information includes image features of a plurality of history image frames including the one or more non-solid color image frames, and each image frame recorded in the first history information has a common view feature with at least one other image frame recorded in the first history information, the common view feature representing image features common to at least two image frames.
- The method of claim 4, wherein the first history information further includes image features of historical solid color image frames.
- The method of claim 4 or 5, wherein the first history information comprises a set of co-view features, wherein the set of co-view features comprises co-view features of image frames recorded in the first history information;
wherein the obtaining the non-solid color matching frame from the first history information includes:
acquiring matching features matched with the image features of the current solid-color image frame from common-view features included in the common-view feature set;
and acquiring the non-solid color matching frame from the image frames which are recorded by the first history information and correspond to the matching features.
- The method of claim 6, wherein the first history information includes frame identification of each image frame recorded in the first history information and identification information of whether the image frame is a non-solid color frame, and further includes white balance information of the recorded non-solid color image frame.
- The method according to any one of claims 4-7, further comprising:
in the case where a first image frame has a common view characteristic with at least one image frame recorded in the first history information, adding the first image frame to the first history information to update the first history information,
wherein the first image frame represents a real-time processed image frame.
- The method of claim 8, wherein the first history information has a first window threshold that represents a maximum number of image frames that the first history information can record, the method further comprising:
deleting one or more image frames recorded in the first history information if the number of the image frames recorded in the first history information exceeds the first window threshold.
- The method of claim 9, wherein the deleting one or more image frames recorded in the first history information comprises:
deleting one or more image frames recorded earliest in the first history information.
- The method according to any one of claims 1-10, wherein the solid-color image frame is a current solid-color image frame; and the acquiring the image features of the solid-color image frame comprises:
acquiring the image features of the current solid-color image frame under the condition that the current shooting scene is determined to be a solid-color scene.
- The method as recited in claim 11, further comprising:
determining that the current shooting scene is a solid-color scene under the condition that the number of the solid-color image frames recorded in second history information reaches a threshold value, wherein a plurality of historical image frames are recorded in the second history information.
- The method of claim 12, wherein the plurality of historical image frames recorded in the second historical information are consecutive.
- The method according to claim 12 or 13, further comprising:
adding the image frames processed in real time to the second history information to update the second history information.
- The method of claim 14, wherein the second history information has a second window threshold that represents a maximum number of image frames that the second history information can record, the method further comprising:
deleting one or more image frames recorded earliest in the second history information if the number of the image frames recorded in the second history information exceeds the second window threshold.
- The method of any of claims 1-15, wherein the acquiring image features of the solid-color image frame comprises:
acquiring a first image feature of an R/G channel of the solid-color image frame;
acquiring a second image feature of a B/G channel of the solid-color image frame;
and fusing the first image feature and the second image feature to obtain the image features of the solid-color image frame.
- An apparatus for white balance processing, comprising:
an acquisition unit for acquiring image features of a solid-color image frame;
a matching unit configured to obtain a non-solid color matching frame from one or more non-solid color image frames, where image features of the non-solid color matching frame match image features of the solid-color image frame;
and a white balance processing unit for performing white balance processing on the solid-color image frame according to the white balance information of the non-solid color matching frame.
- The apparatus of claim 17, wherein the image features comprise color attributes of solid color blocks, and/or geometric attributes of solid color blocks.
- The apparatus of claim 18, wherein the image features of the non-solid color matching frame match the image features of the solid color image frame, representing:
the non-solid color matching frame includes solid color blocks having color attributes matching the color attributes of the solid color blocks in the solid color image frame; and/or
the non-solid color matching frame comprises solid color blocks with geometric properties matching those of the solid color blocks in the solid color image frame.
- The apparatus of any of claims 17-19, wherein the solid-color image frame is a current solid-color image frame; the matching unit is configured to obtain the non-solid color matching frame from first history information, where the first history information includes image features of a plurality of history image frames including the one or more non-solid color image frames, and each image frame recorded in the first history information has a common view feature with at least one other image frame recorded in the first history information, the common view feature representing an image feature common to at least two image frames.
- The apparatus of claim 20, wherein the first history information further includes image features of historical color-only image frames.
- The apparatus of claim 20 or 21, wherein the first history information comprises a set of co-view features, wherein the set of co-view features comprises co-view features of image frames recorded in the first history information;
wherein the matching unit is configured to:
acquire matching features matched with the image features of the current solid-color image frame from common-view features included in the common-view feature set;
and acquire the non-solid color matching frame from the image frames which are recorded by the first history information and correspond to the matching features.
- The apparatus of claim 22, wherein the first history information includes frame identification of each image frame recorded in the first history information and identification information of whether the image frame is a non-solid color frame, and further includes white balance information of the recorded non-solid color image frame.
- The apparatus according to any one of claims 20-23, further comprising:
a first updating unit configured to add a first image frame to the first history information to update the first history information in a case where the first image frame has a common view feature with at least one image frame recorded in the first history information,
wherein the first image frame represents a real-time processed image frame.
- The apparatus of claim 24, wherein the first history information has a first window threshold indicating a maximum number of image frames that can be recorded in the first history information, and wherein the first updating unit is configured to delete one or more image frames recorded in the first history information if the number of image frames recorded in the first history information exceeds the first window threshold.
- The apparatus of claim 25, wherein the first updating unit is configured to delete one or more image frames recorded earliest in the first history information if a number of image frames recorded in the first history information exceeds the first window threshold.
- The apparatus of any of claims 17-26, wherein the solid-color image frame is a current solid-color image frame; the acquisition unit is used for acquiring the image characteristics of the current pure-color image frame under the condition that the current shooting scene is determined to be the pure-color scene.
- The apparatus as recited in claim 27, further comprising:
a judging unit for determining that the current shooting scene is a solid-color scene under the condition that the number of the solid-color image frames recorded in second history information reaches a threshold value, wherein a plurality of historical image frames are recorded in the second history information.
- The apparatus of claim 28, wherein the plurality of historical image frames recorded in the second history information are consecutive.
- The apparatus according to claim 28 or 29, further comprising:
a second updating unit configured to add an image frame processed in real time to the second history information to update the second history information.
- The apparatus of claim 30, wherein the second history information has a second window threshold, the second window threshold representing a maximum number of image frames that can be recorded in the second history information, and the second updating unit is configured to delete one or more image frames recorded earliest in the second history information if the number of image frames recorded in the second history information exceeds the second window threshold.
- The apparatus according to any one of claims 17-31, wherein the acquisition unit is configured to:
acquire a first image feature of an R/G channel of the solid-color image frame;
acquire a second image feature of a B/G channel of the solid-color image frame;
and fuse the first image feature and the second image feature to obtain the image features of the solid-color image frame.
- An apparatus for white balance processing, comprising:
a memory for storing executable instructions;
a processor for invoking and executing the executable instructions in the memory to perform the method of any of claims 1-16.
- A computer readable storage medium, characterized in that it has stored therein program instructions which, when executed by a processor, implement the method of any of claims 1 to 16.
- A computer program product, characterized in that it comprises a computer program code for implementing the method of any one of claims 1 to 16 when said computer program code is run on a computer.
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/CN2020/108851 WO2022032564A1 (en) | 2020-08-13 | 2020-08-13 | White balance processing method and device |
Publications (1)
Publication Number | Publication Date |
---|---|
CN116134808A true CN116134808A (en) | 2023-05-16 |
Family
ID=80246778
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202080104052.7A Pending CN116134808A (en) | 2020-08-13 | 2020-08-13 | White balance processing method and device |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN116134808A (en) |
WO (1) | WO2022032564A1 (en) |
- 2020-08-13 CN CN202080104052.7A patent/CN116134808A/en active Pending
- 2020-08-13 WO PCT/CN2020/108851 patent/WO2022032564A1/en active Application Filing
Also Published As
Publication number | Publication date |
---|---|
WO2022032564A1 (en) | 2022-02-17 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN110473185B (en) | Image processing method and device, electronic equipment and computer readable storage medium | |
CN108335279B (en) | Image fusion and HDR imaging | |
CN113766125B (en) | Focusing method and device, electronic equipment and computer readable storage medium | |
US8488896B2 (en) | Image processing apparatus and image processing method | |
CN106899781B (en) | Image processing method and electronic equipment | |
CN110443766B (en) | Image processing method and device, electronic equipment and readable storage medium | |
CN109712177B (en) | Image processing method, image processing device, electronic equipment and computer readable storage medium | |
CN107622497B (en) | Image cropping method and device, computer readable storage medium and computer equipment | |
CN111915505A (en) | Image processing method, image processing device, electronic equipment and storage medium | |
CN113674303B (en) | Image processing method, device, electronic equipment and storage medium | |
CN110121031B (en) | Image acquisition method and device, electronic equipment and computer readable storage medium | |
CN113298735A (en) | Image processing method, image processing device, electronic equipment and storage medium | |
CN107682611B (en) | Focusing method and device, computer readable storage medium and electronic equipment | |
CN107578372B (en) | Image processing method, image processing device, computer-readable storage medium and electronic equipment | |
CN108052883B (en) | User photographing method, device and equipment | |
CN114418879A (en) | Image processing method, image processing device, electronic equipment and storage medium | |
CN110365897B (en) | Image correction method and device, electronic equipment and computer readable storage medium | |
CN113793257B (en) | Image processing method and device, electronic equipment and computer readable storage medium | |
CN107454318B (en) | Image processing method, image processing device, mobile terminal and computer readable storage medium | |
CN110276730B (en) | Image processing method and device and electronic equipment | |
CN109118427B (en) | Image light effect processing method and device, electronic equipment and storage medium | |
CN107451971A (en) | Low-illumination blind deconvolution image restoration method combining dark channel and Gaussian priors | |
CN107770446B (en) | Image processing method, image processing device, computer-readable storage medium and electronic equipment | |
JP5509621B2 (en) | Image processing apparatus, camera, and program | |
CN107292853B (en) | Image processing method, image processing device, computer-readable storage medium and mobile terminal |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||