CN117119316A - Image processing method, electronic device, and readable storage medium - Google Patents


Info

Publication number
CN117119316A
Authority
CN
China
Prior art keywords
parameter
color
light source
image
parameters
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311387337.8A
Other languages
Chinese (zh)
Inventor
刘添鑫
王枫桥
尹彦卿
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Honor Device Co Ltd
Original Assignee
Honor Device Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Honor Device Co Ltd filed Critical Honor Device Co Ltd
Priority to CN202311387337.8A
Publication of CN117119316A
Legal status: Pending

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/80Camera processing pipelines; Components thereof
    • H04N23/84Camera processing pipelines; Components thereof for processing colour signals
    • H04N23/88Camera processing pipelines; Components thereof for processing colour signals for colour balance, e.g. white-balance circuits or colour temperature control

Abstract

The application provides an image processing method, an electronic device, and a readable storage medium. By designing a number of color adjustment parameters associated with ISP parameters, the user is provided with color adjustment parameters that can be used to adjust the colors of a captured photograph. Target ISP parameter values for the corresponding shooting scene are then determined from the users' adjustments of these color adjustment parameters. Furthermore, when it is detected that the user again takes a photograph belonging to that shooting scene, the photograph can be automatically color-adjusted according to the target ISP parameter values determined for the scene, so that the color effect of the adjusted photograph matches the color effect seen by the human eye.

Description

Image processing method, electronic device, and readable storage medium
Technical Field
The present application relates to the field of image technology, and in particular, to an image processing method, an electronic device, and a readable storage medium.
Background
When the color effect of photographs taken by a terminal device is tuned based on an image signal processor (ISP), a tuning engineer usually performs manual tuning before the device leaves the factory for photographs taken in frequently occurring scenes (hereinafter, typical scenes). For example, for typical scenes such as nighttime, daytime, portrait, and city, the tuning engineer repeatedly adjusts the ISP parameters, using the color effect observed by the engineer's own eyes as a reference, until the displayed color effect of the photograph is close to the visual color effect of the human eye, and then takes the ISP parameter values at that point as the target ISP parameter values for the corresponding typical scene. Afterwards, when a user takes a photograph in one of these typical scenes, the terminal device can automatically adjust the color effect of the photograph according to the target ISP parameter values for that scene, restoring the color effect observed by the human eye and providing the user with a better visual experience.
However, faced with the massive number of scenes a user may encounter in actual use, manual tuning before the device leaves the factory cannot achieve full scene coverage. Therefore, for atypical scenes that were not manually tuned before the device left the factory, the shooting effect may fail to meet the user's color requirements. For example, in these atypical scenes, the terminal device may directly display the captured photograph to the user, or adjust a photograph taken in the atypical scene using target ISP parameter values tuned for some other typical scene; in either case the color effect of the photograph may differ greatly from the actual colors seen by the human eye.
Disclosure of Invention
The application provides an image processing method, an electronic device, and a readable storage medium, by which the displayed color effect of a photograph taken by the user can be adjusted to be close to the visual color effect of the human eye.
In a first aspect, the present application provides an image processing method applied to an electronic device, including: acquiring a first image to be processed; acquiring a first color parameter value of a color parameter corresponding to a first shooting scene to which the first image belongs; and performing color adjustment on the first image based on the first color parameter value to obtain a second image, wherein the first color parameter value is determined based on a first adjustment parameter value, the first adjustment parameter value is a parameter value of an adjustment parameter input by a user for performing color adjustment on a third image, and the first image and the third image belong to the same shooting scene.
For example, the first image to be processed may be a photograph taken by a user operating the electronic device, may be a photograph taken in real time, or may be a photograph already stored in the electronic device or obtained by the electronic device from another storage medium. Herein, other storage media include, but are not limited to, servers, cloud, etc.
As another example, the first shooting scene to which the first image belongs may be a typical scene or an atypical scene as described below. It can be understood that, if the first shooting scene is a typical scene, the image processing method provided by the application can optimize the first color parameter value of the typical scene; since the optimization is performed based on the first adjustment parameter value input by the user, the result can better meet the user's requirements. If the first shooting scene is an atypical scene, the first color parameter value of the atypical scene can be determined based on the image processing method provided by the application, so that the color effect of photographs taken in the atypical scene matches the user's visual perception.
Here, the color parameter may be an ISP parameter hereinafter, and the first color parameter value may be a target ISP parameter value hereinafter. The adjustment parameter may be a color adjustment parameter, hereinafter, and the first adjustment parameter value may be a target color adjustment parameter value, hereinafter. The second image may be an image obtained by color-adjusting a photographed picture based on the target ISP parameter value, hereinafter. The third image may be a photograph taken by the user having entered the first adjustment parameter value hereinafter. The third image may be a photograph taken in real time, or may be a photograph already stored in the electronic device or obtained by the electronic device from another storage medium. Herein, other storage media include, but are not limited to, servers, cloud, etc.
In a possible implementation of the first aspect described above, the color parameter includes at least one of the following parameters of the ISP: hue intensity parameters, saturation intensity parameters, hue weight parameters, saturation weight parameters in the CCM module of ISP; light source weight parameters and light source parameters in the AWB module of the ISP.
For example, the hue intensity parameter may be CCM parameter 1 hereinafter, representing the hue parameter segment intensity (hue_idxX) with index X; the saturation intensity parameter may be CCM parameter 2 hereinafter, representing the saturation parameter segment intensity (saturation_idxX) with index X; the hue weight parameter may be CCM parameter 3 hereinafter, representing the hue parameter segment weight (hue_idxX_weight) with index X; the saturation weight parameter may be CCM parameter 4 hereinafter, representing the saturation parameter segment weight (saturation_idxX_weight) with index X; the light source weight parameter in the AWB module of the ISP may be AWB parameter 1 hereinafter, representing the light source weight (lightsource_weight). The method for determining the index X is described in detail below and is not repeated here.
In one possible implementation of the first aspect, the light source parameters include a first light source confidence parameter, a first light source red channel gain parameter, a first light source red channel offset parameter, a first light source blue channel gain parameter, and a first light source blue channel offset parameter of the first light source, and a second light source confidence parameter, a second light source red channel gain parameter, a second light source red channel offset parameter, a second light source blue channel gain parameter, and a second light source blue channel offset parameter of the second light source; among the plurality of light sources illuminating the first image, the first light source is the light source covering the largest number of pixels of the first image, and the second light source is the light source covering the second largest number of pixels.
For example, the first light source confidence parameter of the first light source may be AWB parameter 2 hereinafter, representing the first light source confidence (daylight_locus_prob_top1_lightsource); the first light source red channel gain parameter may be AWB parameter 3 hereinafter, representing the first light source red channel gain (r_gain_top1_lightsource); the first light source red channel offset parameter may be AWB parameter 4 hereinafter, representing the first light source red channel offset (r_gain_locus_offset_top1_lightsource); the first light source blue channel gain parameter may be AWB parameter 5 hereinafter, representing the first light source blue channel gain (b_gain_top1_lightsource); the first light source blue channel offset parameter may be AWB parameter 6 hereinafter, representing the first light source blue channel offset (b_gain_offset_top1_lightsource); the second light source confidence parameter of the second light source may be AWB parameter 7 hereinafter, representing the second light source confidence (daylight_locus_prob_top2_lightsource); the second light source red channel gain parameter may be AWB parameter 8 hereinafter, representing the second light source red channel gain (r_gain_top2_lightsource); the second light source red channel offset parameter may be AWB parameter 9 hereinafter, representing the second light source red channel offset (r_gain_locus_offset_top2_lightsource); the second light source blue channel gain parameter may be AWB parameter 10 hereinafter, representing the second light source blue channel gain (b_gain_top2_lightsource); the second light source blue channel offset parameter may be AWB parameter 11 hereinafter, representing the second light source blue channel offset (b_gain_offset_top2_lightsource).
In a possible implementation of the first aspect, the adjustment parameters include at least one of the following parameters: a first white balance parameter corresponding to the light source weight parameter; a second white balance parameter corresponding to the first light source confidence parameter, the first light source red channel gain parameter, and the first light source red channel offset parameter; a third white balance parameter corresponding to the first light source confidence parameter, the first light source blue channel gain parameter, and the first light source blue channel offset parameter; a fourth white balance parameter corresponding to the second light source confidence parameter, the second light source red channel gain parameter, and the second light source red channel offset parameter; a fifth white balance parameter corresponding to the second light source confidence parameter, the second light source blue channel gain parameter, and the second light source blue channel offset parameter; a first color correction parameter corresponding to the hue intensity parameter and the saturation intensity parameter; a second color correction parameter corresponding to the hue weight parameter; and a third color correction parameter corresponding to the saturation weight parameter.
For example, the first white balance parameter may be white balance parameter 1 hereinafter, representing a light source weight parameter for adjusting the color temperature of the photograph, which may be named lightsource_weight; the second white balance parameter may be white balance parameter 2 hereinafter, representing the red channel gain parameter of the first light source, which may be named r_gain_top1_lightsource; the third white balance parameter may be white balance parameter 3 hereinafter, representing the blue channel gain parameter of the first light source, which may be named b_gain_top1_lightsource; the fourth white balance parameter may be white balance parameter 4 hereinafter, representing the red channel gain parameter of the second light source, which may be named r_gain_top2_lightsource; the fifth white balance parameter may be white balance parameter 5 hereinafter, representing the blue channel gain parameter of the second light source, which may be named b_gain_top2_lightsource; the first color correction parameter may be color parameter 1 hereinafter, representing an ROI parameter for determining the hue and saturation of the color adjustment region, which may be named ROI; the second color correction parameter may be color parameter 2 hereinafter, representing a hue parameter for adjusting the hue of the ROI, which may be named HUE; the third color correction parameter may be color parameter 3 hereinafter, representing a saturation parameter for adjusting the saturation of the photograph, which may be named SATURATION.
In a possible implementation of the first aspect, the correspondence between each adjustment parameter and the color parameters is as follows:
first white balance parameter = light source weight parameter;
second white balance parameter = first light source confidence parameter × first light source red channel gain parameter × first light source red channel offset parameter;
third white balance parameter = first light source confidence parameter × first light source blue channel gain parameter × first light source blue channel offset parameter;
fourth white balance parameter = second light source confidence parameter × second light source red channel gain parameter × second light source red channel offset parameter;
fifth white balance parameter = second light source confidence parameter × second light source blue channel gain parameter × second light source blue channel offset parameter;
the first color correction parameter corresponds to the hue intensity parameter and the saturation intensity parameter;
second color correction parameter = hue weight parameter;
third color correction parameter = saturation weight parameter.
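The white balance correspondences above can be inverted to recover the per-channel ISP gains from the user-facing adjustment values. The Python sketch below is a hypothetical illustration only: the names (wb1–wb5, prob_top1, r_offset_top1, and so on) are invented for this example, the confidence and offset factors are treated as fixed, known quantities, and the inversion step itself is an assumption about how the mapping is used rather than something the text states.

```python
def awb_params_from_adjustments(adj, fixed):
    """Recover AWB ISP parameter values from the five white balance
    adjustment values by inverting the products listed above.

    adj:   {"wb1"..."wb5"} user adjustment values (invented names)
    fixed: known confidence and offset factors of the top-1/top-2
           light sources, treated here as constants (an assumption)
    """
    return {
        # first white balance parameter = light source weight parameter
        "lightsource_weight": adj["wb1"],
        # wb2 = confidence * red gain * red offset -> solve for the gain
        "r_gain_top1": adj["wb2"] / (fixed["prob_top1"] * fixed["r_offset_top1"]),
        "b_gain_top1": adj["wb3"] / (fixed["prob_top1"] * fixed["b_offset_top1"]),
        "r_gain_top2": adj["wb4"] / (fixed["prob_top2"] * fixed["r_offset_top2"]),
        "b_gain_top2": adj["wb5"] / (fixed["prob_top2"] * fixed["b_offset_top2"]),
    }
```

For instance, since the second white balance parameter equals confidence × red channel gain × red channel offset, dividing by the known confidence and offset isolates the first light source's red channel gain.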
In a possible implementation of the first aspect described above, the first color parameter value of the color parameter is determined by: acquiring M first adjustment parameter values of each adjustment parameter based on the M third images; based on the corresponding relation between each adjusting parameter and the color parameter, N second color parameter values corresponding to the M first adjusting parameter values are calculated, wherein the N second color parameter values all meet the value range of the color parameter; a first color parameter value for the color parameter is determined based on the N second color parameter values.
For example, the M third images may be the photographs adjusted by multiple users in step 801 below, and the M first adjustment parameter values may be multiple target color adjustment parameter values that the multiple users set for the same color adjustment parameter. The N second color parameter values may be the multiple first ISP parameter values of one ISP parameter determined based on the multiple target color adjustment parameter values in step 804 below.
In a possible implementation of the first aspect, determining the first color parameter value of the color parameter based on the N second color parameter values includes: calculating differences between the N second color parameter values and standard values of the color parameters; selecting L second color parameter values with differences within a preset difference range from the N second color parameter values; a first color parameter value for the color parameter is determined based on the L second color parameter values.
Here, the standard value of the color parameter may be a preset standard value of the ISP parameter hereinafter, and the difference between the N second color parameter values and the standard value of the color parameter may be a deviation value between each first ISP parameter value of the ISP parameter hereinafter and the preset standard value. The lower limit of the preset difference range may be the minimum discard threshold hereinafter, and the upper limit of the preset difference range may be the maximum discard threshold hereinafter. The L second color parameter values may be a plurality of first ISP parameter values in a subsequence of the parameter sequence below.
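The screening step described above can be sketched in a few lines of Python. This is a minimal illustration with invented names; the discard thresholds are taken as given inputs:

```python
def filter_by_deviation(second_values, standard, min_discard, max_discard):
    """Keep only the second color parameter values whose deviation from
    the preset standard value lies inside the preset difference range
    [min_discard, max_discard]; the rest are discarded."""
    kept = []
    for value in second_values:
        deviation = abs(value - standard)
        if min_discard <= deviation <= max_discard:
            kept.append(value)
    return kept
```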
In a possible implementation of the first aspect, determining the first color parameter value of the color parameter based on the L second color parameter values includes: if the color parameter belongs to the CCM module of the ISP, the first color parameter value X of the color parameter is the weighted average
X = (A1·x1 + A2·x2 + … + AL·xL) / (A1 + A2 + … + AL)
where x1, x2, …, xL are the L second color parameter values of the color parameter, and A1, A2, …, AL are the weights of the L second color parameter values, determined based on the L second color parameter values and the standard value of the color parameter; the smaller the difference between a second color parameter value and the standard value, the larger the weight of that second color parameter value.
Here, the second color parameter value belonging to the CCM module may be the first CCM parameter value hereinafter. The first color parameter value X may be the target CCM parameter value of the CCM parameter hereinafter. The standard value of the color parameter may be the standard value of the preference distribution of the CCM parameter hereinafter.
In a possible implementation of the first aspect, determining the first color parameter value of the color parameter based on the L second color parameter values includes: if the color parameter belongs to the AWB module of the ISP, the first color parameter value Y of the color parameter is the weighted average
Y = (B1·y1 + B2·y2 + … + BL·yL) / (B1 + B2 + … + BL)
where y1, y2, …, yL are the L second color parameter values of the color parameter, and B1, B2, …, BL are the weights of the L second color parameter values, determined based on the pregain parameter values of the third images corresponding to the L second color parameter values and the standard value of the pregain parameter; the smaller the difference between a third image's pregain parameter value and the standard value, the larger the weight of the second color parameter value corresponding to that third image. Each third image includes EXIF information, and the EXIF information includes the pregain parameter.
Here, the second color parameter value belonging to the AWB module may be the first AWB parameter value hereinafter. The first color parameter value Y may be a target AWB parameter value of the AWB parameter hereinafter.
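The CCM and AWB cases share the same weighted-average form and differ only in where the weights come from: CCM weights come from each value's distance to the standard value, while AWB weights come from each third image's pregain distance to the pregain standard. The Python sketch below uses an inverse-distance weight 1 / (1 + |difference|); the text only fixes the monotone relationship (smaller difference, larger weight), so the exact weighting form here is an assumption for illustration.

```python
def weighted_target(values, weights):
    """Target parameter value: normalized weighted average of the
    L second color parameter values."""
    return sum(w * v for w, v in zip(weights, values)) / sum(weights)

def ccm_weights(values, standard):
    """CCM weights A_i: a value closer to the standard value gets a
    larger weight (inverse-distance form is an assumption)."""
    return [1.0 / (1.0 + abs(v - standard)) for v in values]

def awb_weights(pregains, pregain_standard):
    """AWB weights B_i: derived from each third image's EXIF pregain;
    closer to the pregain standard means a larger weight (same
    assumed form)."""
    return [1.0 / (1.0 + abs(p - pregain_standard)) for p in pregains]
```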
In a possible implementation of the first aspect, the first shooting scene to which the first image and/or the third image belongs is determined by: acquiring EXIF information of the first image and/or the third image, wherein the EXIF information includes a CCT parameter value and/or an LV parameter value; and determining the first shooting scene to which the first image and/or the third image belongs based on the CCT parameter value and/or the LV parameter value, wherein the first shooting scene has a corresponding CCT parameter value interval and/or LV parameter value interval, and the CCT parameter value of the first image and/or the third image lies in the CCT parameter value interval and/or the LV parameter value lies in the LV parameter value interval.
For example, the CCT parameter value interval and the LV parameter value interval of the first shooting scene may be data segments divided based on the numerical ranges of the CCT parameter and the LV parameter in step 802 below.
After determining the first shooting scene to which the third image belongs, the first color parameter value determined according to the first adjustment parameter value of the third image may be associated with the first shooting scene. Further, after determining the first shooting scene to which the first image belongs based on EXIF information of the first image, the first color parameter value may be obtained based on a correspondence between the first shooting scene and the first color parameter value, so as to perform color adjustment on the first image according to the first color parameter value.
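This interval-based scene lookup can be sketched as follows, assuming Python; the scene table, its names, and the interval values are illustrative only:

```python
def classify_scene(cct, lv, scene_intervals):
    """Return the first shooting scene whose CCT interval and LV interval
    both contain the photo's EXIF values, or None if no scene matches.

    scene_intervals: {scene_name: (cct_lo, cct_hi, lv_lo, lv_hi)}
    """
    for scene, (cct_lo, cct_hi, lv_lo, lv_hi) in scene_intervals.items():
        if cct_lo <= cct <= cct_hi and lv_lo <= lv <= lv_hi:
            return scene
    return None
```

A photo whose CCT and LV fall outside every predefined interval would correspond to an atypical scene, for which a new scene entry and its target color parameter values can be created from user adjustments.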
In a possible implementation of the first aspect, the image processing method further includes: detecting a first operation of a user, and displaying a first interface, wherein the first interface comprises a third image and a color adjustment control corresponding to the color adjustment parameter; and acquiring a first color adjustment parameter value adjusted by the user on the color adjustment control.
For example, the first interface may be a color adjustment interface hereinafter.
In one possible implementation of the first aspect, the first operation is a click operation on a display control in a second interface, where the second interface is a capturing interface where a user captures a third image.
For example, the display control of the second interface may be a photo browsing icon in the shooting interface below.
In a possible implementation of the first aspect, the first operation is a click operation on an edit control of the selected third image.
For example, the selected third image may be displayed in a photo browsing interface, hereinafter, and the editing control may be an editing icon of the photo browsing interface, hereinafter.
In a second aspect, the present application provides an electronic device comprising: one or more processors; one or more memories; the one or more memories store one or more programs that, when executed by the one or more processors, cause the electronic device to perform the image processing methods provided by the above-described first aspect and the various possible implementations of the above-described first aspect.
In a third aspect, the present application provides a computer readable medium having stored thereon instructions which, when executed on a computer, cause the computer to perform the image processing method provided by the above first aspect and the various possible implementations of the above first aspect.
In a fourth aspect, embodiments of the present application also provide a computer program product comprising instructions which, when run on a computer, cause the computer to perform the image processing method provided by any one of the possible implementations of the first aspect described above.
Drawings
FIG. 1 shows a schematic view of a scenario in which color adjustment is performed based on ISP parameters manually adjusted before the equipment leaves the factory;
FIG. 2 is a schematic view of an application scenario in which an image processing method performs color adjustment according to an embodiment of the present application;
fig. 3 is a schematic flow chart of image processing implemented by using a terminal device as an execution subject according to an embodiment of the present application;
FIG. 4a is a schematic diagram of another application scenario in which an image processing method performs color adjustment according to an embodiment of the present application;
FIG. 4b illustrates a color adjustment interface schematic according to an embodiment of the present application;
FIG. 4c illustrates another color adjustment interface schematic according to an embodiment of the present application;
FIG. 4d illustrates a user operation diagram according to an embodiment of the present application;
FIG. 4e illustrates an interface diagram of a prompt window, in accordance with an embodiment of the present application;
FIG. 4f illustrates another user operation schematic in accordance with an embodiment of the application;
FIG. 4g illustrates a schematic diagram of an operation for selecting a region of interest, in accordance with an embodiment of the present application;
FIG. 5 is a diagram illustrating a mapping of color tuning parameters to ISP parameters according to an embodiment of the present application;
FIG. 6a is a diagram of an interface for determining whether a user's desired effect is met, according to an embodiment of the application;
FIG. 6b is a schematic diagram of another interface for determining whether a user's desired effect is met, according to an embodiment of the application;
FIG. 7 illustrates a schematic diagram of a photo browsing interface, in accordance with an embodiment of the present application;
fig. 8 is a schematic flow chart of determining a target ISP parameter value by using terminal equipment as a main body according to an embodiment of the present application;
FIG. 9a is a schematic diagram showing a display effect achieved based on the image processing method of the present application according to an embodiment of the present application;
FIG. 9b is a schematic diagram showing another display effect realized based on the image processing method of the present application according to an embodiment of the present application;
FIG. 10 is an interactive schematic diagram of image processing based on a common implementation of a terminal device and a server according to an embodiment of the present application;
fig. 11 is a schematic structural diagram of a terminal device according to an embodiment of the present application;
fig. 12 is a block diagram showing a software structure of a terminal device system according to an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions, and advantages of the embodiments of the present application clearer, the technical solutions of the embodiments are described in detail below with reference to the accompanying drawings and specific embodiments.
In order to facilitate understanding of the solutions in the embodiments of the present application, some concepts and terms related to the embodiments of the present application are explained below.
(1) Image signal processor (ISP): processes the output signal of the front-end image sensor and includes an automatic white balance (AWB) module, a color correction matrix (CCM) module, and the like. The AWB module can perform color constancy processing for the color temperature environment in which an image is captured, based on the ISP parameters in the AWB module, to solve the problem of image color imbalance; the ISP parameters in the AWB module are referred to herein as AWB parameters. The CCM module can color-correct the captured photograph according to a color correction matrix, based on the ISP parameters in the module, to bring the pixel values of the photograph close to the standard values of a color chart; the ISP parameters in the CCM module are referred to herein as CCM parameters.
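To make the roles of the two modules concrete, the following hypothetical Python sketch applies per-channel AWB gains and then a 3×3 color correction matrix to a single linear RGB pixel. A real ISP pipeline performs these steps in hardware over full sensor data, so this is only a toy illustration with invented names:

```python
def apply_awb_and_ccm(rgb, r_gain, b_gain, ccm):
    """Apply AWB per-channel gains, then a 3x3 color correction matrix,
    to one linear RGB pixel (all values are illustrative floats)."""
    r, g, b = rgb
    balanced = (r * r_gain, g, b * b_gain)   # AWB: scale R and B channels
    # CCM: matrix-vector multiply pulling colors toward chart standards
    return tuple(
        sum(ccm[i][j] * balanced[j] for j in range(3)) for i in range(3)
    )
```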
(2) Exchangeable image file format (EXIF): a photograph in EXIF format includes EXIF information recording attribute information of the captured photograph. The attribute information includes parameters such as correlated color temperature (CCT) and luminance (LV). Based on the EXIF information, the terminal device and/or the server can classify the shooting scene in which a photograph was taken. For example, the terminal device and/or the server may place photographs with identical or similar EXIF information into the same shooting scene, where "similar" means that the difference of the same parameter in the EXIF information falls within a set threshold range.
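The similarity test just described can be sketched as follows (Python, with invented names; the compared parameters and thresholds are illustrative):

```python
def same_scene(exif_a, exif_b, thresholds):
    """Two photos belong to the same shooting scene when, for every
    compared EXIF parameter (e.g. CCT, LV), the difference between the
    two photos is within the set threshold for that parameter."""
    return all(abs(exif_a[k] - exif_b[k]) <= t for k, t in thresholds.items())
```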
As described above, because ISP parameters are tuned manually by engineers before the device leaves the factory, the tuning of the color effect of photographs taken by a terminal device cannot achieve full coverage of shooting scenes. Consequently, for the aforementioned atypical scenes, which have no manually tuned target ISP parameters, the color effect of captured photographs may fail to meet the user's requirements.
Fig. 1 shows a schematic view of a scenario in which a terminal device 10 performs color adjustment based on ISP parameters manually adjusted before shipment.
Referring to fig. 1, the user takes a photograph in an atypical scene, which may be, for example, a night portrait scene, based on the shooting interface 101 of the terminal device 10, and clicks the photo browsing icon 101b in the shooting interface 101. In response to the user's click operation on the photo browsing icon 101b, the terminal device 10 displays the photo browsing interface 102. Since the shooting scene of the photograph is an atypical scene, no target ISP parameter value was manually tuned for this scene before the device left the factory, so the terminal device 10 cannot perform color adjustment on the photograph for this scene after the user takes it. As a result, the photograph displayed in the photo browsing interface 102 may be the image 101a as directly captured by the terminal device 10, or an image adjusted based on the target ISP parameter values of some other typical scene. It will be appreciated that the color effect of the image 101a may differ significantly from what is seen by the human eye.
In order to solve the above problems, the present application provides an image processing method. Specifically, the method designs a number of color adjustment parameters associated with ISP parameters and provides the user with these color adjustment parameters, with which the captured photograph can be color-adjusted. Target ISP parameter values for the corresponding shooting scene are determined according to the users' adjustments of the color adjustment parameters. Further, when it is detected that the user again takes a photograph belonging to the same shooting scene, the photograph may be automatically color-adjusted according to the target ISP parameter values determined for that scene.
For example, when a user takes a photo in a typical scene, the photo may be color-adjusted based on the target ISP parameter value corresponding to that typical scene. When a user takes a photo in an atypical scene, a color adjustment interface in which the color adjustment parameters can be adjusted may be displayed, so that the user can tune the color adjustment parameters of the photo, and the target ISP parameter value of the shooting scene to which the photo belongs is determined according to the tuning result.
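By way of a non-limiting sketch (all names and store contents here are hypothetical, not part of the embodiment), the branch just described — auto-adjusting typical scenes with stored target ISP parameter values and falling back to the color adjustment interface for atypical scenes — might look like:

```python
# Hypothetical dispatch logic: scenes with stored target ISP parameter values
# are adjusted automatically; unknown (atypical) scenes trigger the
# user-facing color adjustment interface. The store contents are made up.
target_isp_values = {
    "daylight_outdoor": {"r_gain": 1.8, "b_gain": 1.4},  # a typical scene
}

def handle_photo(scene):
    """Return the action the device would take for a photo of `scene`."""
    if scene in target_isp_values:
        return ("auto_adjust", target_isp_values[scene])
    return ("show_color_adjustment_interface", None)

action, params = handle_photo("night_portrait")  # atypical: no stored values
```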
Fig. 2 shows a schematic view of a scene of color adjustment by an image processing method according to an embodiment of the present application.
Referring to fig. 2, the user takes a photograph in an atypical scene, which may be, for example, a night portrait scene or the like, based on the photographing interface 201 of the terminal device 10, and clicks the photograph browsing icon 201b in the photographing interface 201. In response to a click operation of the photo browsing icon 201b by the user, the terminal device 10 displays a color adjustment interface 202, and the color adjustment interface 202 may display an image 201a and a color adjustment area 202a acquired by the terminal device 10 in response to a photographing operation by the user. Color adjustment region 202a may display the aforementioned number of color adjustment parameters and their corresponding color adjustment controls, each of which may correspond to ISP parameters in one or more AWB modules and/or CCM modules.
Based on the color adjustment interface 202 described above, when the user is not satisfied with the color effect of the image 201a, the ISP parameters in the AWB module and/or CCM module may be adjusted by manually adjusting the color adjustment parameters in the color adjustment area 202a to adjust the color effect of the photograph to conform to the visual color effect of the human eye. For example, a user may adjust image 201a to be an image 202b that conforms to the visual color effects of the human eye. Further, in response to a click operation of the confirm icon 202c in the color adjustment interface 202 by the user, the terminal device 10 may display the image 202b in the photo browsing interface 203. And, the terminal device 10 may take the color adjustment parameter value corresponding to the image 202b as a target color adjustment parameter value to determine a target ISP parameter value corresponding to a shooting scene in which the image 201a is located based on an association relationship between the color adjustment parameter and the ISP parameter and the target color adjustment parameter value, wherein the shooting scene in which the image 201a is located may be determined based on EXIF information of the image 201a and/or the image 202b.
Further, if the user takes a photo again in the atypical scene, the terminal device 10 may directly display, in the color adjustment interface 202 and/or the photo browsing interface 203, the photo color-adjusted based on the target ISP parameter value corresponding to that atypical scene.
The implementation procedure of the image processing method provided by the application will be described in detail with reference to specific embodiments.
Specifically, the following Embodiment 1 describes a specific implementation of the image processing method provided by the present application, with a terminal device as the execution subject.
Embodiment 1
This embodiment describes in detail, with reference to the accompanying drawings, a specific implementation of the image processing method provided by the present application, with a terminal device as the execution subject.
Fig. 3 is a schematic flow chart of image processing performed by using a terminal device as an execution subject according to an embodiment of the present application. It will be appreciated that the execution subject of each step of the flow shown in fig. 3 may be the terminal device 10. For convenience of description, in the following description of each step, the execution subject of each step will not be described again.
Specifically, the image processing method implemented by using the terminal equipment as an execution main body provided by the application can comprise the following steps:
301: in response to a user operation to view a taken photograph, a color-toning interface is displayed that includes a color-toning control.
In one example, when a user takes a photo and clicks the photo browsing icon in the photographing interface, the terminal device may display the color adjustment interface in response to the click on the photo browsing icon. For a schematic view of the interface in which the user takes the photo, reference may be made to fig. 2; details are not repeated here. Specifically, when the user takes a photo in a typical scene, the photo may be color-adjusted directly based on the target ISP parameter value corresponding to that typical scene. When the user takes a photo in an atypical scene, the color adjustment interface may be displayed so that the user can tune the color adjustment parameters of the photo based on the color adjustment control. It will be appreciated that, for photos taken in a typical scene, the color adjustment interface may also be displayed in response to a user operation, so that the target ISP parameter value corresponding to the typical scene can be adjusted according to the user's needs.
In another example, fig. 4a shows another schematic view of a scenario in which the terminal device 10 performs color adjustment based on the image processing method provided by the present application, according to an embodiment of the present application.
Referring to fig. 4a, the user views an image 401a taken in an atypical scene, such as a night portrait scene, based on the photo browsing interface 401 of the terminal device 10, and clicks the edit icon 401b in the photo browsing interface 401. In response to the click on the edit icon 401b, the terminal device 10 displays the color adjustment interface 410. Based on the color adjustment interface 410, when the user is not satisfied with the color effect of the image 401a, the user can adjust the color effect of the photo to conform to the visual color effect of the human eye by manually adjusting the color adjustment parameters in the color adjustment area 410b. For example, the user may adjust the image 401a into an image that conforms to the visual color effect of the human eye, and the terminal device 10 may then display the adjusted image in the photo browsing interface 401.
An interface schematic diagram of the color adjustment interface is specifically described below. For example, FIG. 4b illustrates a color adjustment interface schematic according to an embodiment of the present application.
Referring to fig. 4b, the color adjustment interface 410 may include a photo display area 410a and a color adjustment area 410b. The photo display area 410a is used for displaying the photo taken by the user, and the color adjustment area 410b includes several color adjustment parameters and their color adjustment controls, which may be, for example, the input box 410c, the input box 410d, the input box 410e, and the input box 410f in fig. 4b. The default display content of each input box may be the numerical range of the color adjustment parameter corresponding to that input box.
For another example, FIG. 4c shows another color adjustment interface schematic in accordance with an embodiment of the application.
Referring to fig. 4c, the color adjustment interface 411 may include a photo display area 411a and a color adjustment area 411b. The photo display area 411a is used for displaying the photo taken by the user, and the color adjustment area 411b includes several color adjustment parameters and their color adjustment controls, which may be, for example, the slider 411c, the slider 411d, the slider 411e, and the slider 411f in fig. 4c. The lower and upper limits of each slider may correspond to the numerical range of the color adjustment parameter corresponding to that slider.
The interface shown in fig. 4c is not a limiting illustration of the type of color adjustment control or of where the color adjustment parameters are displayed; in other embodiments, the color adjustment interface may be displayed in a layout style different from that shown in fig. 4c.
302: a first color adjustment parameter value set by a user through a color adjustment control is obtained.
For example, a user may adjust color adjustment parameters based on provided color adjustment controls.
For example, fig. 4d shows a schematic diagram of a user operation according to an embodiment of the present application.
Referring to fig. 4d, the user may input specific values of color adjustment parameter 1, color adjustment parameter 2, color adjustment parameter 3, ..., and color adjustment parameter n based on the input boxes 410c, 410d, 410e, and 410f of the color adjustment interface 410, respectively. The terminal device 10 may use the values input by the user as the first color adjustment parameter values corresponding to the respective color adjustment parameters. Here, the terminal device 10 may provide a prompt window to the user based on the parameter attributes of the respective color adjustment parameters, where the parameter attributes of a color adjustment parameter may include its data type, numerical range, and the like. When a color adjustment parameter value input by the user does not conform to the parameter attributes of that color adjustment parameter, the user may be prompted, by popping up a prompt window, to input a value that conforms to the parameter attributes.
Fig. 4e shows a schematic view of a prompt window according to an embodiment of the application.
Referring to fig. 4e, when the value 1350 input by the user in the input box 410c exceeds the numerical range of 0 to 300 of color adjustment parameter 1, the terminal device 10 may pop up the prompt window 410g to prompt the user to input a value within the correct range.
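As a minimal sketch of the range check behind the prompt window of fig. 4e (the function name, the attribute dictionary, and the integer 0-to-300 attributes are illustrative assumptions, not part of the embodiment):

```python
# Illustrative validation of a user-entered value against a color adjustment
# parameter's attributes (data type and numerical range).
def validate_param(value, attr):
    """Return (ok, message); `attr` holds the parameter's declared attributes."""
    if not isinstance(value, attr["type"]):
        return False, "expected a value of type " + attr["type"].__name__
    if not attr["min"] <= value <= attr["max"]:
        return False, f"enter a value between {attr['min']} and {attr['max']}"
    return True, "ok"

# Color adjustment parameter 1, with the 0-300 range from the fig. 4e example.
PARAM1_ATTR = {"type": int, "min": 0, "max": 300}

ok, msg = validate_param(1350, PARAM1_ATTR)  # 1350 exceeds 0-300 -> prompt window
```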
Fig. 4f shows another user operation schematic according to an embodiment of the application.
Referring to fig. 4f, the user may determine specific values of color adjustment parameters 1, 2, 3, and n by sliding the sliders of the slide bars 411c, 411d, 411e, and 411f of the color adjustment interface 411, respectively. The terminal device 10 may use the value indicated by each slider after the user's operation as the first color adjustment parameter value corresponding to the respective color adjustment parameter.
Here, it is understood that the terminal device 10 may provide the user with a region selection function.
Fig. 4g shows a schematic representation of an adjustment region selection according to an embodiment of the application.
Referring to fig. 4g, taking the color adjustment interface 411 as an example, the user may select, in the photo display region 411a, a region of interest (region of interest, ROI) whose color is to be adjusted; for example, the user may select a face region in the photo display region 411a as the ROI region 411g. Further, specific values of color adjustment parameters 1, 2, 3, and n for the ROI region 411g may be determined based on the slide bars 411c, 411d, 411e, and 411f of the color adjustment interface 411, so as to perform color adjustment on the ROI region 411g.
303: and performing color adjustment on the shot photo based on the first color adjustment parameter value to obtain a first preview image.
Illustratively, each color adjustment parameter provided to the user by the terminal device 10 may be a fusion of one or more ISP parameters, which may be AWB parameters in the AWB module and/or CCM parameters in the CCM module.
In one example, a number of color adjustment parameters may be determined based on the AWB parameters in the AWB module, the CCM parameters in the CCM module, and their respective parameter attributes. The parameter attributes of the AWB parameters and CCM parameters may include data type, numerical range, difference range between parameters, and the like. The above numerical ranges and difference ranges can be adaptively adjusted for different application scenarios and are not limited herein.
Specifically, fig. 5 shows a mapping diagram of color adjustment parameters and ISP parameters according to an embodiment of the present application.
Referring to fig. 5, the ISP parameters fused into the color adjustment parameters may include a plurality of different types of AWB parameters and CCM parameters; the AWB parameters may be one or more of AWB parameters 1 to 11 in the AWB module, and the CCM parameters may be one or more of CCM parameters 1 to 4 in the CCM module.
Wherein AWB parameter 1 represents a light source Weight (lightsource_weight);
AWB parameter 2, representing a first light source confidence (daylight_locus_prob_top1_lightsource);
AWB parameter 3, representing a first light source red channel gain (r_gain_top1_lightsource);
AWB parameter 4, representing a first light source red channel offset (r_gain_locus_offset_top1_lightsource);
AWB parameter 5, representing a first light source blue channel gain (b_gain_top1_lightsource);
AWB parameter 6, representing a first light source blue channel offset (b_gain_locus_offset_top1_lightsource);
AWB parameter 7, representing a second light source confidence (daylight_locus_prob_top2_lightsource);
AWB parameter 8, representing a second light source red channel gain (r_gain_top2_lightsource);
AWB parameter 9, representing a second light source red channel offset (r_gain_locus_offset_top2_lightsource);
AWB parameter 10, representing a second light source blue channel gain (b_gain_top2_lightsource);
AWB parameter 11, representing a second light source blue channel offset (b_gain_locus_offset_top2_lightsource).
The first light source and the second light source are the two light sources that cover the most pixels of the photo among the plurality of light sources present when the user takes the photo.
CCM parameter 1, representing the HUE parameter segment intensity with index X (hue_idxX);
CCM parameter 2, representing the SATURATION parameter segment intensity with index X (saturation_idxX);
CCM parameter 3, representing the HUE parameter segment weight with index X (hue_idxX_weight);
CCM parameter 4, representing the SATURATION parameter segment weight with index X (saturation_idxX_weight).
Wherein the index X may be determined from the color space of the ROI selected by the user based on step 302 described above. For example, terminal device 10 may map the user-selected ROI from an RGB color space model to a YUV color space model based on a color space conversion module in the ISP, and calculate index value X based on the chrominance components in the YUV color space model.
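A minimal sketch of how such an index could be computed is shown below, assuming BT.601-style RGB-to-YUV coefficients and an eight-segment hue split; the embodiment does not specify the conversion matrix or the number of segments, so both are illustrative assumptions.

```python
import math

def roi_hue_index(r, g, b, num_segments=8):
    """Map a mean ROI color from RGB to YUV chrominance and quantize the
    hue angle of (Cb, Cr) into a segment index X in [0, num_segments)."""
    u = -0.169 * r - 0.331 * g + 0.5 * b   # Cb-like chrominance, centered at 0
    v = 0.5 * r - 0.419 * g - 0.081 * b    # Cr-like chrominance, centered at 0
    angle = math.atan2(v, u) % (2 * math.pi)
    return int(angle / (2 * math.pi) * num_segments)

x = roi_hue_index(200, 120, 80)  # e.g. the mean color of a face ROI
```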
Furthermore, the AWB parameters 1 to 11 and the CCM parameters 1 to 4 described above can be mapped to a small number of color adjustment parameters open to the user. For example, with continued reference to fig. 5, the color adjustment parameters may include a white balance parameter and a color parameter. Specifically, AWB parameters 1 to AWB parameters 11 in the foregoing AWB module may be mapped to white balance parameters 1 to 5, and CCM parameters 1 to 4 in the foregoing CCM module may be mapped to color parameters 1 to 3.
Furthermore, the first color adjustment parameter value may be restored to a first ISP parameter value according to a mapping relationship between the AWB parameter 1 to AWB parameter 11 and the white balance parameter 1 to white balance parameter 5, and a mapping relationship between the CCM parameter 1 to CCM parameter 4 and the color parameter 1 to color parameter 3, wherein the first ISP parameter value may include a first AWB parameter value and/or a first CCM parameter value. And further, performing color adjustment on the shot photo according to the restored first AWB parameter value and/or the first CCM parameter value to obtain a first preview image.
Here, the process of restoring the first color adjustment parameter value to the first ISP parameter value is substantially the same as the process of restoring the second color adjustment parameter value to the second ISP parameter value in step 305. To avoid repetition, the restoration process is described in more detail in step 305 below.
304: and acquiring a second color adjustment parameter value set by the user through the color adjustment control.
Illustratively, the terminal device 10 displays the first preview image obtained in the foregoing step 303 on a color adjustment interface, such as the foregoing color adjustment interface 410 or the foregoing color adjustment interface 411. The first preview image displayed at the color adjustment interface may not conform to the user's expectations and therefore the user may set the second color adjustment parameter values via the color adjustment control to again color adjust the taken photograph based on step 305 described below.
In one way for the terminal device 10 to determine whether the preview image meets the user's expectations, fig. 6a shows a schematic diagram of an expectation determination.
Referring to fig. 6a, taking the color adjustment interface 611 as an example, the terminal device 10 may set a time threshold; if the user performs no sliding adjustment operation on the slide bars 611c, 611d, 611e, and 611f within the time threshold, a popup window 611h may be displayed asking the user to confirm whether the adjustment is complete. If the user clicks "no", it may be determined that the first preview image does not meet the user's expectations, and the second color adjustment parameter value set by the user through the color adjustment control may then be obtained.
Alternatively, in another way of determining user satisfaction, fig. 6b shows another expectation determination schematic.
Referring to fig. 6b, taking color adjustment interface 611 as an example, a confirm button 611i may be provided in color adjustment area 611 b. If the user does not click the confirm button 611i, it may be determined that the first preview image does not meet the user's expectations, then the second color adjustment parameter value set by the user through the color adjustment control may be continuously obtained.
It will be appreciated that, if the second preview image obtained based on the second color adjustment parameter values still does not meet the user's expectations, the user may make repeated adjustments through the color adjustment controls until a preview image meeting the expectations is obtained.
305: and performing color adjustment on the shot photo based on the second color adjustment parameter value to obtain a second preview image.
For example, the terminal device 10 may determine the second ISP parameter value corresponding to the second color adjustment parameter value based on the aforementioned mapping relationship between AWB parameter 1 to AWB parameter 11 and white balance parameter 1 to white balance parameter 5, and the mapping relationship between CCM parameter 1 to CCM parameter 4 and color parameter 1 to color parameter 3. The second color adjustment parameter value may include a second white balance parameter value and a second color parameter value.
Specifically, the white balance parameter 1, which represents a light source weight parameter for adjusting the color temperature of a photo, may be named as lightsource_weight;
white balance parameter 2, which represents the red channel gain parameter of the first light source, may be named r_gain_top1_lightsource;
white balance parameter 3, which represents a blue channel gain parameter of the first light source, may be named b_gain_top1_lightsource;
white balance parameter 4, representing the red channel gain parameter of the second light source, may be named r_gain_top2_lightsource;
white balance parameter 5, which represents the blue channel gain parameter of the second light source, may be named b_gain_top2_lightsource.
Here, the names of the aforementioned white balance parameters 1 to 5 are not limited to the description.
The mapping relationship between the white balance parameters 1 to 5 and the AWB parameters 1 to 11 may be, for example:
white balance parameter 1 (light source weight parameter) = AWB parameter 1 (light source weight);
white balance parameter 2 (red channel gain parameter of the first light source) = AWB parameter 2 (first light source confidence) × AWB parameter 3 (first light source red channel gain) × AWB parameter 4 (first light source red channel offset);
white balance parameter 3 (blue channel gain parameter of the first light source) = AWB parameter 2 (first light source confidence) × AWB parameter 5 (first light source blue channel gain) × AWB parameter 6 (first light source blue channel offset);
white balance parameter 4 (red channel gain parameter of the second light source) = AWB parameter 7 (second light source confidence) × AWB parameter 8 (second light source red channel gain) × AWB parameter 9 (second light source red channel offset);
white balance parameter 5 (blue channel gain parameter of the second light source) = AWB parameter 7 (second light source confidence) × AWB parameter 10 (second light source blue channel gain) × AWB parameter 11 (second light source blue channel offset).
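Under the stated mapping, the forward computation from AWB parameter values to white balance parameter values can be sketched as follows; the dictionary keys are shortened from the parameter names above, and all numeric values are made up for illustration.

```python
# Illustrative forward mapping from AWB parameters 1-11 to white balance
# parameters 1-5: each gain-type white balance parameter is the product of a
# light source confidence, a channel gain, and a channel offset.
def awb_to_white_balance(awb):
    return {
        "lightsource_weight": awb["lightsource_weight"],              # WB param 1
        "r_gain_top1_lightsource": (awb["daylight_locus_prob_top1"]   # WB param 2
                                    * awb["r_gain_top1"]
                                    * awb["r_gain_locus_offset_top1"]),
        "b_gain_top1_lightsource": (awb["daylight_locus_prob_top1"]   # WB param 3
                                    * awb["b_gain_top1"]
                                    * awb["b_gain_locus_offset_top1"]),
        "r_gain_top2_lightsource": (awb["daylight_locus_prob_top2"]   # WB param 4
                                    * awb["r_gain_top2"]
                                    * awb["r_gain_locus_offset_top2"]),
        "b_gain_top2_lightsource": (awb["daylight_locus_prob_top2"]   # WB param 5
                                    * awb["b_gain_top2"]
                                    * awb["b_gain_locus_offset_top2"]),
    }

example_awb = {  # made-up AWB parameter values for two detected light sources
    "lightsource_weight": 0.7,
    "daylight_locus_prob_top1": 0.9, "r_gain_top1": 1.8,
    "r_gain_locus_offset_top1": 1.05,
    "b_gain_top1": 1.4, "b_gain_locus_offset_top1": 0.95,
    "daylight_locus_prob_top2": 0.1, "r_gain_top2": 2.1,
    "r_gain_locus_offset_top2": 1.0,
    "b_gain_top2": 1.2, "b_gain_locus_offset_top2": 1.0,
}
wb = awb_to_white_balance(example_awb)
```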
Specifically, color parameter 1, which represents an ROI parameter for determining hue and saturation of a color adjustment region, may be named ROI;
Color parameter 2, representing a HUE parameter for adjusting the HUE of the ROI, may be named HUE;
color parameter 3, which represents a SATURATION parameter for adjusting the SATURATION of a photograph, may be named SATURATION.
Here, the names of the foregoing color parameters 1 to 3 are not limited to the description.
The mapping relationship between the color parameters 1 to 3 and the CCM parameters 1 to 4 may be, for example:
color parameter 1 (ROI parameter, ROI) corresponds to CCM parameter 1 (hue parameter segment intensity with index X) and CCM parameter 2 (saturation parameter segment intensity with index X) of the user-selected ROI region;
color parameter 2 (hue parameter) = CCM parameter 3 (hue parameter segment weight with index X);
color parameter 3 (saturation parameter) = CCM parameter 4 (saturation parameter segment weight with index X).
Further, based on the data types of AWB parameters 1 to 11, the data associations between AWB parameters 1 to 11, and the above mapping relationships between white balance parameters 1 to 5 and AWB parameters 1 to 11, the second AWB parameter value corresponding to the second white balance parameter value among the second color adjustment parameter values input by the user can be determined.
Accordingly, based on the data types of CCM parameters 1 to 4, the data associations between CCM parameters 1 to 4, and the above mapping relationships between color parameters 1 to 3 and CCM parameters 1 to 4, the second CCM parameter value corresponding to the second color parameter value among the second color adjustment parameter values input by the user can be determined. Here, the algorithms for determining the AWB parameter values and/or the CCM parameter values may include optimal-solution algorithms such as a greedy algorithm.
Further, the second AWB parameter value and the second CCM parameter value may be used as the second ISP parameter value. The terminal device 10 may color adjust the taken photograph based on the second ISP parameter value to obtain a second preview image and display the second preview image on a color adjustment interface, such as the color adjustment interface 410 or the color adjustment interface 411 described above.
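As a simplified, non-authoritative sketch of the restoration direction in step 305: because white balance parameter 2 is a product of three AWB parameters, one possible strategy is to hold the confidence and offset at their current values and solve for the free gain, clamping it to the parameter's numerical range. The embodiment mentions optimal-solution methods such as a greedy algorithm; the closed-form inversion and the 0.5-to-4.0 gain range below are assumptions for illustration only.

```python
def restore_r_gain_top1(target_wb2, confidence, offset, gain_range=(0.5, 4.0)):
    """Solve AWB parameter 3 (first light source red channel gain) from
    white balance parameter 2 = confidence * gain * offset, clamped to range."""
    if confidence * offset == 0:
        raise ValueError("confidence and offset must be non-zero")
    gain = target_wb2 / (confidence * offset)
    lo, hi = gain_range  # clamp to the parameter's assumed numerical range
    return min(max(gain, lo), hi)

g = restore_r_gain_top1(target_wb2=1.701, confidence=0.9, offset=1.05)
```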
306: a user operation confirming that the second preview image is a display image of the photo browsing interface is detected.
For example, referring to fig. 6a, the terminal device 10 may set a time threshold, and if the user does not perform the sliding adjustment operation on the sliding bars 611c, 611d, 611e, 611f within the time threshold, the popup window 611h may be displayed to confirm whether the adjustment is completed to the user. If the user clicks "yes", it may be determined that the second preview image meets the user's expectations, and the terminal device 10 may confirm that the second preview image is taken as a display image of the photo browsing interface.
Alternatively, in another way of determining user satisfaction, referring to fig. 6b described above, using color adjustment interface 611 as an example, a confirm button 611i may be provided in color adjustment area 611 b. If the user clicks the confirm button 611i, it may be determined that the second preview image meets the user's expectations, the terminal device 10 may confirm the second preview image as a display image of the photo browsing interface.
307: and displaying the second preview image on a photo browsing interface.
Illustratively, the terminal device 10 may display the second preview image that meets the user's expectations in the photo browsing interface based on the relevant content of the foregoing step 306.
For example, FIG. 7 illustrates a schematic diagram of a photo browsing interface, according to an embodiment of the present application.
Referring to fig. 7, the second preview image that meets the user's expectations may be an image 701a. Further, the terminal apparatus 10 may display the image 701a in the photo browsing interface 701.
308: the second color adjustment parameter value is taken as the target color adjustment parameter value.
Illustratively, the fact that the second preview image meets the user's expectations indicates that the terminal device 10 may determine the target ISP parameter value corresponding to the scene in which the photo was taken based on the second color adjustment parameter value. Therefore, the second color adjustment parameter value may be used as the target color adjustment parameter value for determining the target ISP parameter value.
309: and determining a target ISP parameter value corresponding to the same shooting scene according to the target color adjustment parameter value.
Illustratively, based on the foregoing, shooting scenes of taken photos may be divided based on the EXIF information of the taken photos; therefore, with continued reference to fig. 5, a correspondence between the color adjustment parameters and the EXIF information of the taken photo may be established. Specifically, the EXIF information of the taken photo may include EXIF parameters 1, 2, 3, 4, 5, and 6.
Wherein EXIF parameter 1 represents a correlated color temperature (correlated color temperature, CCT) parameter;
EXIF parameter 2, representing a luminance (LV) parameter;
EXIF parameter 3, representing a light source probability (light source probability, lightSourceProb) parameter;
EXIF parameter 4, representing a spatial gain (spat_gain) parameter;
EXIF parameter 5, representing an equivalent gain (eqv_gain) parameter;
EXIF parameter 6, representing a color shift (pregain) parameter.
Specifically, after the target color adjustment parameter value adjusted by the user is determined based on the foregoing step 308, a correspondence relationship between the target color adjustment parameter value and EXIF information of the taken photograph may be established. For example, a correspondence between the target color adjustment parameter value and the CCT parameter value and LV parameter value of the photograph may be established.
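A minimal sketch of recording this correspondence is shown below; all names are hypothetical, and keying on exact (CCT, LV) pairs is a simplification of the segment-based scene division described later.

```python
# Hypothetical store: user-confirmed target color adjustment parameter values,
# keyed by the CCT and LV values from the EXIF information of the adjusted photo.
scene_records = {}

def record_target_params(cct, lv, target_params):
    """Append one user's target color adjustment values under the (CCT, LV) key."""
    scene_records.setdefault((cct, lv), []).append(target_params)

# Two users tuning photos with the same CCT/LV contribute to the same record.
record_target_params(2800, 3, {"white_balance_1": 0.7, "color_3": 0.4})
record_target_params(2800, 3, {"white_balance_1": 0.6, "color_3": 0.5})
```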
It will be appreciated that the terminal device 10 may also upload the target color adjustment parameter value and EXIF information of the photographed picture to the server, and then the server determines the target ISP parameter value corresponding to the same photographed scene based on the target color adjustment parameter value and EXIF information of the photographed picture.
Taking as an example that the terminal device 10 determines the target ISP parameter values corresponding to the same shooting scene, fig. 8 shows a flow chart of determining the target ISP parameter values by the terminal device 10. As shown in fig. 8, the terminal device 10 may receive target color adjustment parameter values adjusted by a plurality of users together with their corresponding CCT parameter values and LV parameter values, and determine the target ISP parameter values corresponding to a shooting scene based on the flow of steps 801 to 813 shown in fig. 8. To preserve the narrative flow, the specific flow of determining the target ISP parameter values by the terminal device 10 shown in fig. 8 will be described below.
310: in response to a user operation to take a photograph, EXIF information of the taken photograph is acquired.
For example, after the terminal device 10 determines target ISP parameter values corresponding to a plurality of photographing scenes based on target color adjustment parameters and EXIF information transmitted by a plurality of users, if the user photographs again, EXIF information of the photographed photographs may be obtained to adjust color effects of the photographed photographs based on the following steps 311 and 312.
311: a shooting scene to which the shot picture belongs is determined based on the EXIF information.
For example, the terminal device 10 may determine the shooting scene to which the taken photo belongs based on the CCT parameter and the LV parameter in the EXIF information of the taken photo. To avoid repetition, how the shooting scene to which a taken photo belongs is determined will be explained in detail in the following description of fig. 8.
312: and performing color adjustment on the shot photo based on the target ISP parameter value corresponding to the shot scene.
For example, after determining a shooting scene to which a shot belongs, the terminal device 10 may adjust the color effect of the shot according to the target ISP parameter value corresponding to the shooting scene.
For example, fig. 9a shows a schematic diagram of a display effect of the terminal device 10 according to an embodiment of the present application.
Referring to fig. 9a, when the user clicks the photo browsing icon 901a in the photographing interface 901, the terminal device 10 may display the resulting image 902a in the photo browsing interface 902 after color-adjusting the photographed photo based on the target ISP parameter value. It will be appreciated that the image 902a in the photo browsing interface 902 may now achieve a visual effect that meets the user's expectations.
As another example, fig. 9b shows another display effect diagram of the terminal device 10 according to an embodiment of the present application.
Referring to fig. 9b, when the user clicks the photo browsing icon 901a in the photographing interface 901, the terminal device 10 may display the resulting image 902a in the color adjustment interface 903 after color-adjusting the taken photo based on the target ISP parameter value. It will be appreciated that, if the image 902a in the color adjustment interface 903 has not yet reached the effect desired by the user of the terminal device 10, the user may continue to input color adjustment parameter values based on the color adjustment interface 903, so that the terminal device 10 can dynamically optimize the target ISP parameter values of each shooting scene.
The specific flow of determining the target ISP parameter value by the terminal device 10 in step 309 will be described in detail with reference to the flow chart shown in fig. 8.
Specifically, as shown in fig. 8, an implementation process for determining a target ISP parameter value by using a terminal device 10 as an execution subject according to an embodiment of the present application may include the following steps:
801: receive target color adjustment parameter values from a plurality of users and their corresponding EXIF information.
For example, the terminal device 10 may receive, from a plurality of terminal devices, the target color adjustment parameter values adjusted by different users together with the EXIF information of the photos adjusted with those parameter values, and/or receive the target color adjustment parameter values adjusted by users through different clients on the same terminal device together with the EXIF information of the corresponding adjusted photos. Here, a client may be an application program capable of performing color adjustment on a shot photo based on the image processing method provided in this embodiment. The received EXIF information of an adjusted photo may include the CCT parameter value and LV parameter value of the photo.
It will be appreciated that the target color adjustment parameter values adjusted by different users may be different for photographs taken with the same or different EXIF information. Thus, each color adjustment parameter may have a plurality of target color adjustment parameter values.
802: divide different shooting scenes based on the EXIF information.
For example, shot photos may be divided into several shooting scenes based on the numerical ranges of several parameters in the EXIF information. For example, a shooting scene may be determined according to the numerical ranges of the CCT parameter and the LV parameter. Specifically, the CCT parameter values and LV parameter values may be divided into data segments based on the value ranges of the CCT parameter and the LV parameter. Shot photos whose CCT parameter value and LV parameter value both fall within the same pair of data segments can then be assigned to the same shooting scene. The specific EXIF parameter types used to divide shooting scenes are not limited here.
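The scene-division step above can be sketched in code. The following is a minimal Python sketch, assuming hypothetical CCT and LV segment boundaries (the patent does not specify the actual value ranges); the function names are likewise illustrative.

```python
# Sketch of step 802: photos whose CCT and LV values fall into the
# same pair of data segments belong to the same shooting scene.
# The segment boundaries below are illustrative assumptions.

CCT_BINS = [0, 3000, 4500, 6000, 10000]  # hypothetical CCT segments (kelvin)
LV_BINS = [0, 4, 8, 12, 20]              # hypothetical LV segments

def bin_index(value, edges):
    """Return the index of the data segment containing `value`."""
    for i in range(len(edges) - 1):
        if edges[i] <= value < edges[i + 1]:
            return i
    return len(edges) - 2  # clamp out-of-range values to the last segment

def shooting_scene(cct, lv):
    """Identify a shooting scene by its pair of segment indices."""
    return (bin_index(cct, CCT_BINS), bin_index(lv, LV_BINS))
```

Under these illustrative boundaries, a photo with CCT 3200 K and LV 5 and a photo with CCT 4000 K and LV 6 would be assigned to the same shooting scene.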
803: take the ISP parameter values corresponding to the target color adjustment parameter values in the same shooting scene as the first ISP parameter values of that shooting scene.
For example, the received target color adjustment parameter values may be classified by shooting scene based on the data segments of EXIF information corresponding to the respective shooting scenes. Further, the ISP parameter values corresponding to the respective target color adjustment parameter values belonging to the same shooting scene may be determined and used as the first ISP parameter values for determining the target ISP parameter value corresponding to that shooting scene. In particular, the first ISP parameter values may include the parameter values of AWB parameter 1 to AWB parameter 11, and CCM parameter 1 to CCM parameter 4, corresponding to the target color adjustment parameter values.
Here, the specific content of the ISP parameter values corresponding to the color adjustment parameter values can be referred to in the foregoing step 305, which is not described herein.
804: calculate the deviation value of each first ISP parameter value in the same shooting scene.
Illustratively, as described in the foregoing step 801, each color adjustment parameter may have a plurality of target color adjustment parameter values. Accordingly, an ISP parameter may also have a plurality of first ISP parameter values determined based on those target color adjustment parameter values. The deviation value between each first ISP parameter value of an ISP parameter and the preset standard value of that ISP parameter may then be calculated. The deviation value may be, for example, the difference between the first ISP parameter value and the preset standard value.
Here, it is understood that the preset standard values of the ISP parameters in different shooting scenes may be different, and a limitation is not made on the preset standard values of the ISP parameters corresponding to the shooting scenes.
805: determine whether each deviation value is less than a minimum discard threshold.
If the determination is yes, indicating that the deviation value is less than the minimum discard threshold, the following step 807 may be performed;
if the determination is negative, indicating that the deviation value is greater than or equal to the minimum discard threshold, then step 806 may be performed as follows.
For example, if the deviation between the first ISP parameter value of the ISP parameter and the preset standard value is small, it is indicated that there is no need to adjust the target ISP parameter value of the ISP parameter according to the first ISP parameter value. Accordingly, to avoid unnecessary resource waste, a minimum discard threshold corresponding to each ISP parameter may be set to delete the first ISP parameter value having a smaller difference from the preset standard value based on step 807 described below. Here, the minimum discard threshold corresponding to the ISP parameter is not described in a limiting manner.
806: determine whether each deviation value is greater than a maximum discard threshold.
If the determination is yes, indicating that the deviation value is greater than the maximum discard threshold, the following step 807 may be performed;
if the determination is negative, indicating that the deviation value is less than or equal to the maximum discard threshold, then step 808, described below, may be performed.
For example, if the deviation between the first ISP parameter value of the ISP parameter and the preset standard value is large, it is indicated that the first ISP parameter value may exceed the reasonable range of the ISP parameter in the shooting scene. Therefore, to avoid unnecessary resource waste, a maximum discard threshold corresponding to each ISP parameter may be set to delete the first ISP parameter value having a larger difference from the preset standard value based on step 807 described below. Here, the maximum discard threshold corresponding to the ISP parameter is not described in a limiting manner.
807: delete the first ISP parameter value corresponding to the deviation value.
Illustratively, based on the foregoing steps 805 and 806, the first ISP parameter value having a deviation from the preset standard value less than the minimum discard threshold or greater than the maximum discard threshold may be deleted, while the first ISP parameter value having a deviation from the preset standard value between the minimum discard threshold and the maximum discard threshold is retained. Further, the influence of unreasonable data of user feedback on the target ISP parameter value can be reduced while the target ISP parameter value is determined based on the user feedback.
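Steps 804 to 807 amount to a band-pass filter on the deviation values. The following is a minimal Python sketch under two interpretive assumptions: the deviation is taken as the absolute difference from the preset standard value, and values whose deviation exactly equals a threshold are retained.

```python
# Sketch of steps 804-807: keep only first ISP parameter values whose
# deviation from the preset standard value lies between the minimum
# and maximum discard thresholds; the rest are deleted.

def filter_first_values(values, standard, min_discard, max_discard):
    kept = []
    for v in values:
        deviation = abs(v - standard)  # assumed: absolute difference
        # below min_discard: too close to the standard to be useful;
        # above max_discard: outside the reasonable range for the scene
        if min_discard <= deviation <= max_discard:
            kept.append(v)
    return kept
```

For instance, with a standard value of 300, a minimum discard threshold of 5 and a maximum of 200, the values 298 and 300 are discarded as too close to the standard, 900 is discarded as out of range, and only 350 is retained.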
808: add the first ISP parameter value corresponding to the deviation value to the parameter sequence of the shooting scene.
For example, a parameter sequence corresponding to each shooting scene may be set. The parameter sequence may include several subsequences, and each subsequence may correspond to one ISP parameter. Specifically, a subsequence may include the first ISP parameter values of one ISP parameter that were retained through the foregoing steps 805 to 807. Here, the ISP parameters corresponding to the subsequences may be one or more of AWB parameters 1 to 11 and CCM parameters 1 to 4.
809: determine whether the number of first ISP parameter values in the parameter sequence is greater than a preset threshold.
If the determination is yes, it indicates that the number of the first ISP parameter values is greater than the preset threshold, the following step 810 may be performed;
if the determination result is no, which indicates that the number of the first ISP parameter values is less than or equal to the preset threshold, the foregoing steps 801 to 809 may be repeatedly performed.
For example, if there are fewer first ISP parameter values to determine the target ISP parameter value, the determined target ISP parameter value may lack stability. Thus, the number of first ISP parameter values in each sub-sequence may be compared to a preset threshold. If the number of the first ISP parameter values is smaller than the preset threshold value, the target color adjustment parameters and the corresponding EXIF information from different users can be continuously received so as to enlarge the data volume for determining the target ISP parameter values. It will be appreciated that the preset thresholds for the various ISP parameters may be different and are not described in a limiting manner herein.
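Steps 808 and 809 can be sketched as follows in Python; the nested-mapping layout, the scene and parameter names, and the preset threshold of 3 are all illustrative assumptions, since the patent leaves these unspecified.

```python
# Sketch of steps 808-809: per-scene parameter sequences, one
# subsequence per ISP parameter, with a preset sample-count threshold.
from collections import defaultdict

PRESET_THRESHOLD = 3  # hypothetical; may differ per ISP parameter

# scene -> ISP parameter name -> subsequence of retained first values
scene_sequences = defaultdict(lambda: defaultdict(list))

def add_sample(scene, param_name, value):
    """Step 808: append a retained first ISP parameter value."""
    scene_sequences[scene][param_name].append(value)

def ready_to_compute(scene, param_name):
    """Step 809: proceed only once enough samples have accumulated."""
    return len(scene_sequences[scene][param_name]) > PRESET_THRESHOLD
```

Until `ready_to_compute` returns true for a subsequence, the flow keeps collecting target color adjustment parameter values and EXIF information from users, as steps 801 to 809 describe.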
810: determine whether the first ISP parameter value belongs to the AWB module or the CCM module.
If the first ISP parameter value belongs to the CCM module, the following step 811 may be performed;
if it belongs to the AWB module, the following step 812 may be performed.
Illustratively, different policies may be employed to determine target ISP parameter values corresponding to ISP parameters of different modules based on steps 811 and 812 described below.
811: calculate the CCM statistic of the first ISP parameter values of the CCM module according to the users' preference distribution.
For example, the first ISP parameter values of the CCM module may be first CCM parameter values, i.e., the parameter values of CCM parameters 1 to 4 determined based on the target color adjustment parameter values adjusted by users.
The parameter values of CCM parameters 1 to 4 determined based on users' target color adjustment parameter values are strongly affected by users' personal preferences. Accordingly, for CCM parameters 1 to 4, the CCM statistic of each CCM parameter may be calculated according to the users' preference distribution, so that the CCM statistic can serve as the target CCM parameter value of that CCM parameter based on step 813 described below.
Specifically, the parameter values of CCM parameters 1 to 4 determined from the target color adjustment parameter values provided by a large number of users can be analyzed based on big data technology to determine the users' preference distribution for the CCM parameters. For example, the average of the parameter values of each of CCM parameters 1 to 4 may be calculated as a standard value characterizing the users' preference distribution for that CCM parameter.
Further, the weight of a first CCM parameter value of each CCM parameter may be determined based on the difference between that first CCM parameter value and its standard value. Taking a standard value of 300 for CCM parameter 1 and first CCM parameter values of 200, 300 and 400 as an example: the difference between the first CCM parameter value of 300 and the standard value is 0, so its weight may be 1; the differences between the first CCM parameter values of 200 and 400 and the standard value are both 100, so their weights may be 0.5. It will be appreciated that the above is merely an example, and the present application does not limit the specific rules for setting the weights.
Further, CCM statistics of CCM parameters may be calculated based on the plurality of first CCM parameter values and weights thereof. For example, the CCM statistics of the CCM parameters may be weighted averages calculated based on the respective first CCM parameter values and their weights. Further, the weighted average of the CCM parameters may be used as the target CCM parameter value for the CCM parameter based on step 813 described below.
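The computation in steps 811 and 813 can be sketched as below. The sketch takes the mean of the first CCM parameter values as the standard value, as in the text, and assumes a linear weight decay chosen only to reproduce the two weights given in the text's example (difference 0 → weight 1, difference 100 → weight 0.5); the patent does not fix a specific weighting rule.

```python
# Sketch of step 811: CCM statistic as a weighted average, where each
# first CCM parameter value is weighted by its closeness to the
# standard value derived from the users' preference distribution.

def ccm_statistic(first_values):
    standard = sum(first_values) / len(first_values)  # preference standard
    # assumed rule: weight falls linearly with the difference,
    # 1 at difference 0, 0.5 at difference 100, floored at 0
    weights = [max(0.0, 1.0 - abs(v - standard) / 200.0) for v in first_values]
    return sum(v * w for v, w in zip(first_values, weights)) / sum(weights)
```

For the text's example of first CCM parameter values 200, 300 and 400 with standard value 300, the weights come out as 0.5, 1 and 0.5, and the weighted average is 300.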
812: calculate the AWB statistic of the first ISP parameter values of the AWB module according to the camera color cast parameter.
For example, the first ISP parameter values of the AWB module may be first AWB parameter values, i.e., the parameter values of AWB parameters 1 to 11 determined based on the target color adjustment parameter values adjusted by users.
The parameter values of AWB parameters 1 to 11 determined based on users' target color adjustment parameter values are strongly affected by the camera color cast of the terminal device. Accordingly, for AWB parameters 1 to 11, the AWB statistic of each AWB parameter may be calculated according to the color cast parameter of the terminal device's camera, so that the AWB statistic can serve as the target AWB parameter value of that AWB parameter based on step 813 described below.
Specifically, the camera color cast parameter of the terminal device may be the pregain parameter in the EXIF information of the shot photo, and the pregain parameter has a corresponding standard value. Thus, after target color adjustment parameter values and corresponding EXIF information from a plurality of users are acquired, the AWB parameter weight corresponding to each terminal device may be determined according to the difference between its pregain parameter value and the pregain standard value. Take a pregain standard value of 256, a pregain parameter value of 400 from one user, and a pregain parameter value of 256 from another user as an example. The difference between the pregain parameter value of 256 and the standard value is 0, so the AWB parameter weight corresponding to that user's terminal device may be 1. The difference between the pregain parameter value of 400 and the standard value is large, so, to reduce the influence of the first AWB parameter values from that user's terminal device on the target AWB parameter value, the corresponding AWB parameter weight may be set to a small value, for example 0.4. It will be appreciated that the above is merely an example, and the present application does not limit the specific rules for setting the weights.
Further, AWB statistics for the AWB parameters may be calculated based on the plurality of first AWB parameter values and their corresponding AWB parameter weights. For example, the AWB statistics of the AWB parameters may be weighted averages calculated based on the respective first AWB parameter values and their corresponding AWB parameter weights. Further, the weighted average of the AWB parameters may be taken as the target AWB parameter value for the AWB parameters based on step 813 described below.
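The computation in step 812 can be sketched similarly. The pregain standard value of 256 comes from the text's example; the decay rule is an assumption, since the text only fixes a weight of 1 at zero deviation and a small weight (e.g. 0.4) at a large deviation.

```python
# Sketch of step 812: AWB statistic as a weighted average, where each
# first AWB parameter value is weighted by how close the contributing
# device's pregain value is to the pregain standard value.

PREGAIN_STANDARD = 256  # standard value taken from the text's example

def awb_weight(pregain):
    """Assumed rule: weight 1 at zero deviation, decaying linearly
    with the deviation and floored at 0.4."""
    deviation = abs(pregain - PREGAIN_STANDARD)
    return max(0.4, 1.0 - deviation / 256.0)

def awb_statistic(samples):
    """samples: list of (first_awb_value, device_pregain) pairs."""
    total_weight = sum(awb_weight(p) for _, p in samples)
    return sum(v * awb_weight(p) for v, p in samples) / total_weight
```

The design mirrors the CCM case: a weighted average, but with weights driven by per-device color cast rather than by the users' preference distribution.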
813: take the AWB statistic and the CCM statistic as the target ISP parameter values corresponding to the shooting scene.
For example, after the AWB statistic and the CCM statistic are determined in the foregoing steps 811 and 812, the AWB statistic and the CCM statistic may be used as the target ISP parameter value corresponding to the shooting scene.
Specifically, after determining the weighted average of the CCM parameters in a certain shooting scene based on the foregoing step 811, the weighted average may be used as the target CCM parameter value of the CCM parameters corresponding to the shooting scene. After determining the weighted average of the AWB parameters in a certain shooting scene based on the foregoing step 812, the weighted average may be used as the target AWB parameter value of the AWB parameter corresponding to the shooting scene. Further, the target ISP parameter values may include the aforementioned target CCM parameter values and the aforementioned target AWB parameter values.
Based on the above detailed description of the flows shown in fig. 3 and fig. 8, it can be understood that, for each shooting scene, the image processing method provided by the present application can determine the target ISP parameters corresponding to that shooting scene based on the results of users' adjustment of the color adjustment parameters. Therefore, for atypical scenes, the corresponding target ISP parameters can also be determined based on this method, so that the color effect of photos taken in atypical scenes can likewise meet users' visual requirements.
Specifically, the following embodiment 2 provides a specific implementation procedure for implementing the image processing method provided by the present application, with the terminal device and the server side as execution subjects.
Example 2
This embodiment specifically explains, with reference to the accompanying drawings, an image processing method jointly implemented by the terminal device 10 and the server side 20 according to the present application.
It will be appreciated that the terminal device 10 to which the image processing method provided in this embodiment is applicable may include, but is not limited to, a mobile phone, a tablet computer, a desktop computer, a laptop computer, a handheld computer, a netbook, an augmented reality (augmented reality, AR)/virtual reality (virtual reality, VR) device, a smart television, a smart watch or other wearable device, an in-vehicle device, a portable game console, a portable music player, and other terminal devices having one or more processors. The applicable server side 20 may include, but is not limited to, a cloud server, a physical server, and the like.
The image processing method jointly implemented by the terminal device 10 and the server side 20 provided by the present application will be specifically described below with reference to the accompanying drawings.
Fig. 10 shows an interaction diagram of image processing jointly implemented by the terminal device 10 and the server side 20 according to an embodiment of the present application.
Referring to fig. 10, the image processing method specifically includes the following steps:
1001: the terminal device 10 displays a color adjustment interface including a color adjustment control in response to a user operation to view a taken photograph.
For example, for the specific content of the terminal device 10 displaying, in response to a user operation to view a shot photo, a color adjustment interface including a color adjustment control, refer to the description in the foregoing step 301; details are not repeated here.
1002: the terminal device 10 obtains the first color adjustment parameter value set by the user through the color adjustment control.
For example, for the specific content of the terminal device 10 obtaining the first color adjustment parameter value set by the user through the color adjustment control, refer to the description in the foregoing step 302; details are not repeated here.
1003: the terminal device 10 color-adjusts the photographed picture based on the first color-adjustment parameter value, resulting in a first preview image.
For example, for the specific content of the terminal device 10 color-adjusting the shot photo based on the first color adjustment parameter value to obtain the first preview image, refer to the description in the foregoing step 303; details are not repeated here.
1004: the terminal device 10 obtains a second color adjustment parameter value set by the user via the color adjustment control.
For example, for the specific content of the terminal device 10 obtaining the second color adjustment parameter value set by the user through the color adjustment control, refer to the description in the foregoing step 304; details are not repeated here.
1005: the terminal device 10 color-adjusts the taken picture based on the second color-adjustment parameter value to obtain a second preview image.
For example, for the specific content of the terminal device 10 color-adjusting the shot photo based on the second color adjustment parameter value to obtain the second preview image, refer to the description in the foregoing step 305; details are not repeated here.
1006: the terminal device 10 detects a user operation confirming the second preview image as a display image of the photo preview interface.
For example, for the specific content of the terminal device 10 detecting a user operation confirming the second preview image as the display image of the photo browsing interface, refer to the related description in the foregoing step 306; details are not repeated here.
1007: the terminal device 10 displays the second preview image on the photo browsing interface.
For example, the specific content of the second preview image displayed by the terminal device 10 in the photo browsing interface may be referred to the description in the foregoing step 307, which is not described herein.
1008: the terminal device 10 takes the second color adjustment parameter value as the target color adjustment parameter value.
For example, the specific content of the terminal device 10 regarding the second color adjustment parameter value as the target color adjustment parameter value may be referred to the description in the foregoing step 308, which is not described herein.
1009: the terminal device 10 transmits the target color adjustment parameter value and the EXIF information of the shot photo to the server side 20.
For example, after determining the target color adjustment parameter value of the shot photo based on the foregoing step 1008, the terminal device 10 may transmit the target color adjustment parameter value and the EXIF information of the shot photo to the server side 20. The transmitted EXIF information may include, for example, the CCT parameter value, LV parameter value, and pregain parameter value of the shot photo. The types of EXIF parameters to be transmitted are not limited here.
1010: the server 20 determines the target ISP parameter value corresponding to the same shooting scene according to the target color adjustment parameter value.
Here, the process by which the server 20 determines the target ISP parameter value corresponding to the same shooting scene according to the target color adjustment parameter values is substantially the same as the process of determining the target ISP parameter value by the terminal device shown in fig. 8, and is not described herein again.
1011: the server 20 transmits EXIF information and target ISP parameter values corresponding to the shooting scene to the terminal device 10.
For example, after determining the shooting scenes and their corresponding target ISP parameter values, the server 20 may update each shooting scene and its corresponding target ISP parameter value into the installation package and/or update package of the photographing and/or picture processing application program, so that the terminal device 10 obtains each shooting scene and its corresponding target ISP parameter value by downloading or updating that application program. The specific manner in which the server 20 sends the shooting scenes and the corresponding target ISP parameter values to the terminal device 10 is not limited here.
1012: the terminal device 10 acquires EXIF information of the photographed picture in response to a user operation of photographing the picture.
For example, for the specific content of the terminal device 10 acquiring, in response to a user operation of taking a photo, the exchangeable image file (EXIF) information of the shot photo, refer to the description in the foregoing step 310; details are not repeated here.
1013: the terminal device 10 determines a shooting scene to which the shot photograph belongs based on the EXIF information.
For example, for the specific content of the terminal device 10 determining, based on the EXIF information of the shot photo, the shooting scene to which the shot photo belongs, refer to the description in the foregoing step 311; details are not repeated here.
1014: the terminal device 10 performs color adjustment on the photographed picture based on the target ISP parameter value corresponding to the photographed scene.
For example, the specific content of the terminal device 10 for performing color adjustment on the shot photo based on the target ISP parameter value corresponding to the shot scene can be referred to the related description in the foregoing step 312, which is not described herein.
It can be understood that the image processing method provided in this embodiment is implemented through interaction between the terminal device and the server: the work of receiving the target color adjustment parameter values and corresponding EXIF information sent by a plurality of terminal devices, and of determining each shooting scene and its corresponding target ISP parameter value from the received data, is transferred to the server. This achieves reasonable resource allocation and effectively improves the running speed and performance of the terminal device.
Fig. 11 shows a schematic structural diagram of the terminal device 10 according to an embodiment of the present application.
The terminal device 10 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (universal serial bus, USB) interface 130, a charge management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, keys 190, a motor 191, an indicator 192, a camera 193, a display 194, and a subscriber identification module (subscriber identification module, SIM) card interface 195, etc. The sensor module 180 may include a pressure sensor 180A, a gyro sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It should be understood that the structure illustrated in the embodiments of the present application does not constitute a specific limitation on the terminal device 10. In other embodiments of the present application, the terminal device 10 may include more or fewer components than illustrated, some components may be combined, some components may be split, or the components may be arranged differently. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 110 may include one or more processing units, such as: the processor 110 may include an application processor (application processor, AP), a modem processor, a graphics processor (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), a controller, a video codec, a digital signal processor (digital signal processor, DSP), a baseband processor, and/or a neural network processor (neural-network processing unit, NPU), etc. Wherein the different processing units may be separate devices or may be integrated in one or more processors.
In some embodiments, the processor 110 may include one or more interfaces. The interface may include an integrated circuit (inter-integrated circuit, I2C) interface, or the like. The I2C interface is a bidirectional synchronous serial bus, and includes a serial data line (SDA) and a Serial Clock Line (SCL). In some embodiments, the processor 110 may contain multiple sets of I2C buses. The processor 110 may be coupled to the touch sensor 180K, charger, flash, camera 193, etc., respectively, through different I2C bus interfaces. For example: the processor 110 may be coupled to the touch sensor 180K through an I2C interface, so that the processor 110 and the touch sensor 180K communicate through an I2C bus interface to implement a touch function of the terminal device 10.
It should be understood that the interfacing relationship between the modules illustrated in the embodiment of the present application is only illustrative, and does not constitute a structural limitation of the terminal device 10. In other embodiments of the present application, the terminal device 10 may also use different interfacing manners, or a combination of multiple interfacing manners, in the foregoing embodiments.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating the low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then transmits the demodulated low frequency baseband signal to the baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs sound signals through an audio device (not limited to the speaker 170A, the receiver 170B, etc.), or displays images or video through the display screen 194. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 150 or other functional module, independent of the processor 110.
The terminal device 10 implements display functions through a GPU, a display screen 194, an application processor, and the like. The GPU is a microprocessor for image processing, and is connected to the display 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
The display screen 194 is used to display images, videos, and the like. The display 194 includes a display panel. The display panel may employ a liquid crystal display (liquid crystal display, LCD), an organic light-emitting diode (organic light-emitting diode, OLED), an active-matrix organic light-emitting diode (active-matrix organic light-emitting diode, AMOLED), a flexible light-emitting diode (flexible light-emitting diode, FLED), a Mini-LED, a Micro-OLED, a quantum dot light-emitting diode (quantum dot light emitting diodes, QLED), or the like. In some embodiments, the terminal device 10 may include 1 or N display screens 194, N being a positive integer greater than 1.
The terminal apparatus 10 may implement a photographing function through an ISP, a camera 193, a video codec, a GPU, a display screen 194, an application processor, and the like.
The ISP is used to process data fed back by the camera 193. For example, when photographing, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electrical signal, and the camera photosensitive element transmits the electrical signal to the ISP for processing, so that the electrical signal is converted into an image visible to the naked eye. ISP can also optimize the noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in the camera 193.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image onto the photosensitive element. The photosensitive element may be a charge coupled device (charge coupled device, CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal, which is then transferred to the ISP to be converted into a digital image signal. The ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into an image signal in a standard RGB, YUV, or the like format. In some embodiments, the terminal device 10 may include 1 or N cameras 193, N being a positive integer greater than 1.
The digital signal processor is used to process digital signals; in addition to digital image signals, it can process other digital signals. For example, when the terminal device 10 selects a frequency point, the digital signal processor is used to perform a Fourier transform on the frequency point energy, and the like.
Video codecs are used to compress or decompress digital video. The terminal device 10 may support one or more video codecs. In this way, the terminal device 10 can play or record video in a plurality of encoding formats, for example: moving picture experts group (moving picture experts group, MPEG) 1, MPEG2, MPEG3, MPEG4, and the like.
The NPU is a neural-network (neural-network, NN) computing processor. By drawing on the structure of biological neural networks, for example the transmission mode between human brain neurons, it can rapidly process input information and can also continuously self-learn. Applications such as intelligent cognition of the terminal device 10 can be implemented through the NPU, for example: image recognition, face recognition, speech recognition, text understanding, and the like.
The touch sensor 180K is also referred to as a "touch panel". The touch sensor 180K may be disposed on the display screen 194; the touch sensor 180K and the display screen 194 together form a touch screen. The touch sensor 180K is used to detect a touch operation acting on or near it. The touch sensor may pass the detected touch operation to the application processor to determine the type of the touch event. Visual output related to the touch operation may be provided through the display screen 194. In other embodiments, the touch sensor 180K may also be disposed on the surface of the terminal device 10 at a location different from that of the display screen 194.
The keys 190 include a power key, a volume key, and the like. The keys 190 may be mechanical keys or touch keys. The terminal device 10 may receive key inputs and generate key signal inputs related to user settings and function control of the terminal device 10.
Fig. 12 shows a software architecture block diagram of a terminal device system according to an embodiment of the application.
The system of the terminal device 10 may employ a layered architecture, an event driven architecture, a micro-core architecture, a micro-service architecture, or a cloud architecture. In the embodiment of the application, taking an Android system with a layered architecture as an example, a software structure of the terminal device 10 is illustrated.
The layered architecture divides the software into several layers, each with a clear role and division of labor. The layers communicate with each other through software interfaces. In some embodiments, the Android system is divided into four layers, which are, from top to bottom: the application layer, the application framework layer, the Android runtime (Android runtime) and system libraries, and the kernel layer.
As shown in fig. 11, the application layer may include a series of application packages. The application packages may include applications such as camera, gallery, calendar, phone, map, navigation, WLAN, Bluetooth, music, video, and messaging.
The application framework layer provides an application programming interface (application programming interface, API) and programming framework for application programs of the application layer. The application framework layer includes a number of predefined functions.
The application framework layer may include a window manager, a content provider, a view system, a telephony manager, a resource manager, a notification manager, and the like.
The window manager is used for managing window programs. The window manager can acquire the size of the display screen, judge whether a status bar exists, lock the screen, intercept the screen and the like.
The content provider is used to store and retrieve data and make such data accessible to applications. The data may include video, images, audio, calls made and received, browsing history and bookmarks, phonebooks, etc.
The view system includes visual controls, such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views. For example, a display interface including a text message notification icon may include a view displaying text and a view displaying a picture.
The telephony manager is used to provide the communication functions of the terminal device 10, for example, management of call states (including connected, hung up, and the like).
The resource manager provides various resources for the application program, such as localization strings, icons, pictures, layout files, video files, and the like.
The notification manager enables applications to display notification information in the status bar. It can be used to convey notification-type messages, which may automatically disappear after a short stay without requiring user interaction. For example, the notification manager is used to notify that a download is complete, to give a message reminder, and the like. The notification manager may also present notifications in the form of charts or scroll-bar text in the status bar at the top of the system, such as notifications of applications running in the background, or notifications that appear on the screen in the form of a dialog window. For example, text information is prompted in the status bar, a prompt tone is emitted, the terminal device vibrates, or an indicator light blinks.
The Android runtime includes a core library and a virtual machine. The Android runtime is responsible for scheduling and management of the Android system.
The core library consists of two parts: one part is the functions that the Java language needs to call, and the other part is the core library of Android.
The application layer and the application framework layer run in the virtual machine. The virtual machine executes the Java files of the application layer and the application framework layer as binary files. The virtual machine is used to perform functions such as object life cycle management, stack management, thread management, security and exception management, and garbage collection.
The system library may include a plurality of functional modules. For example: surface manager (surface manager), media library (media library), three-dimensional graphics processing library (e.g., openGL ES), 2D graphics engine (e.g., SGL), etc.
The surface manager is used to manage the display subsystem and provides a fusion of 2D and 3D layers for multiple applications.
The media library supports playback and recording of a variety of commonly used audio and video formats, as well as still image files, and the like. The media library may support a variety of audio and video encoding formats, for example MPEG4, H.264, MP3, AAC, AMR, JPG, and PNG.
The three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, synthesis, layer processing and the like.
The 2D graphics engine is a drawing engine for 2D drawing.
The kernel layer is a layer between hardware and software. The kernel layer contains at least a display driver, a camera driver, an audio driver, and a sensor driver.
The embodiment of the application also provides a computer program product for realizing the image processing method provided by each embodiment.
Embodiments of the disclosed mechanisms may be implemented in hardware, software, firmware, or a combination of these implementation approaches. Embodiments of the application may be implemented as computer programs or program code executing on a programmable system comprising at least one processor, a storage system (including volatile and non-volatile memory and/or storage elements), at least one input device, and at least one output device.
Program code may be applied to input instructions to perform the functions described herein and to generate output information. The output information may be applied to one or more output devices in a known manner. For the purposes of this application, a processing system includes any system having a processor such as, for example, a digital signal processor (DSP), a microcontroller, an application specific integrated circuit (application specific integrated circuit, ASIC), or a microprocessor.
The program code may be implemented in a high-level procedural or object-oriented programming language to communicate with the processing system. The program code may also be implemented in assembly or machine language, if desired. Indeed, the mechanisms described in the present application are not limited in scope to any particular programming language. In either case, the language may be a compiled or an interpreted language.
In some cases, the disclosed embodiments may be implemented in hardware, firmware, software, or any combination thereof. The disclosed embodiments may also be implemented as instructions carried by or stored on one or more transitory or non-transitory machine-readable (e.g., computer-readable) storage media, which may be read and executed by one or more processors. For example, the instructions may be distributed over a network or through other computer-readable media. Thus, a machine-readable medium may include any mechanism for storing or transmitting information in a form readable by a machine (e.g., a computer), including, but not limited to, floppy diskettes, optical disks, magneto-optical disks, read-only memories (read-only memory, ROM), random access memories (random access memory, RAM), erasable programmable read-only memories (erasable programmable read-only memory, EPROM), electrically erasable programmable read-only memories (electrically erasable programmable read-only memory, EEPROM), magnetic or optical cards, flash memory, or tangible machine-readable memory used to transmit information over the Internet via electrical, optical, acoustical, or other forms of propagated signal (e.g., carrier waves, infrared signals, digital signals, etc.). Thus, a machine-readable medium includes any type of machine-readable medium suitable for storing or transmitting electronic instructions or information in a form readable by a machine (e.g., a computer).
Reference in the specification to "one embodiment" or "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one example implementation or technique disclosed in accordance with embodiments of the application. The appearances of the phrase "in one embodiment" in various places in the specification are not necessarily all referring to the same embodiment.
The disclosure of the embodiments of the application also relates to an apparatus for performing the operations herein. The apparatus may be specially constructed for the required purposes, or it may comprise a general-purpose computer selectively activated or reconfigured by a computer program stored in the computer. Such a computer program may be stored in a computer-readable medium, such as, but not limited to, any type of disk including floppy disks, optical disks, CD-ROMs, magneto-optical disks, read-only memories (ROMs), random access memories (RAMs), EPROMs, EEPROMs, magnetic or optical cards, application specific integrated circuits (ASICs), or any type of medium suitable for storing electronic instructions, each of which may be coupled to a computer system bus. Furthermore, the computers referred to in the specification may include a single processor or may employ architectures with multiple processors for increased computing power.
Additionally, the language used in the specification has been principally selected for readability and instructional purposes, and may not have been selected to delineate or circumscribe the disclosed subject matter. Accordingly, the present disclosure of embodiments is intended to be illustrative, but not limiting, of the scope of the concepts discussed herein.

Claims (15)

1. An image processing method applied to an electronic device, comprising:
acquiring a first image to be processed;
acquiring a first color parameter value of a color parameter corresponding to a first shooting scene to which the first image belongs;
performing color adjustment on the first image based on the first color parameter value to obtain a second image,
the first color parameter value is determined based on a first adjustment parameter value, wherein the first adjustment parameter value is a parameter value of an adjustment parameter input by a user for performing color adjustment on a third image, and the third image belongs to the same shooting scene as the first image.
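The flow of claim 1 can be sketched as follows. This is a minimal illustration only: the averaging step and the per-channel gain model are assumptions for demonstration, not the patent's actual ISP processing, and all function names are hypothetical.

```python
# Hedged sketch of claim 1: derive a scene-specific color parameter value from
# adjustment values the user previously entered for a "third image" of the
# same shooting scene, then apply it to the newly acquired first image.

def derive_first_color_parameter(user_adjustment_values):
    """Aggregate the user's earlier adjustment parameter values (assumed: mean)."""
    return sum(user_adjustment_values) / len(user_adjustment_values)

def color_adjust(pixel, gain):
    """Apply the derived value as a simple per-channel gain (illustrative only)."""
    return tuple(min(255, round(c * gain)) for c in pixel)

# The user tuned two earlier images of the same scene with gains 1.1 and 1.3;
# the first image is then color-adjusted with the derived value to yield the
# second image.
first_value = derive_first_color_parameter([1.1, 1.3])
second_image = [color_adjust(p, first_value) for p in [(100, 120, 140)]]
```

The key point the sketch shows is that the adjustment happens without the user re-tuning each new image: earlier per-scene input is reused automatically.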
2. The method of claim 1, wherein the color parameters include at least one of the following parameters of the ISP:
a hue intensity parameter, a saturation intensity parameter, a hue weight parameter, and a saturation weight parameter in the CCM module of the ISP;
a light source weight parameter and light source parameters in the AWB module of the ISP.
3. The method of claim 2, wherein the light source parameters comprise a first light source confidence parameter, a first light source red channel gain parameter, a first light source red channel offset parameter, a first light source blue channel gain parameter, a first light source blue channel offset parameter for the first light source, and,
a second light source confidence parameter, a second light source red channel gain parameter, a second light source red channel offset parameter, a second light source blue channel gain parameter, a second light source blue channel offset parameter for the second light source;
the first light source is, among a plurality of light sources used for shooting the first image, the light source covering the largest number of pixels of the first image, and the second light source is the light source covering the next largest number of pixels of the first image after the first light source.
4. A method according to claim 3, wherein the adjustment parameters comprise at least one of the following parameters:
a first white balance parameter corresponding to the light source weight parameter;
a second white balance parameter corresponding to the first light source confidence parameter, the first light source red channel gain parameter, the first light source red channel offset parameter;
a third white balance parameter corresponding to the first light source confidence parameter, the first light source blue channel gain parameter, the first light source blue channel offset parameter;
a fourth white balance parameter corresponding to the second light source confidence parameter, the second light source red channel gain parameter, the second light source red channel offset parameter;
a fifth white balance parameter corresponding to the second light source confidence parameter, the second light source blue channel gain parameter, the second light source blue channel offset parameter;
a first color correction parameter corresponding to the hue intensity parameter, the saturation intensity parameter;
a second color correction parameter corresponding to the hue weight parameter;
a third color correction parameter corresponding to the saturation weight parameter.
5. The method of claim 4, wherein the adjustment parameters and the color parameters correspond as follows:
first white balance parameter = light source weight parameter;
second white balance parameter = first light source confidence parameter × first light source red channel gain parameter × first light source red channel offset parameter;
third white balance parameter = first light source confidence parameter × first light source blue channel gain parameter × first light source blue channel offset parameter;
fourth white balance parameter = second light source confidence parameter × second light source red channel gain parameter × second light source red channel offset parameter;
fifth white balance parameter = second light source confidence parameter × second light source blue channel gain parameter × second light source blue channel offset parameter;
the first color correction parameter corresponds to the hue intensity parameter and the saturation intensity parameter;
second color correction parameter = hue weight parameter;
third color correction parameter = saturation weight parameter.
6. The method of claim 1, wherein the first color parameter value of the color parameter is determined by:
acquiring M first adjustment parameter values of each adjustment parameter based on M third images;
calculating, based on the correspondence between each adjustment parameter and the color parameter, N second color parameter values corresponding to the M first adjustment parameter values, wherein each of the N second color parameter values falls within the value range of the color parameter;
determining the first color parameter value of the color parameter based on the N second color parameter values.
7. The method of claim 6, wherein said determining said first color parameter value for the color parameter based on said N second color parameter values comprises:
Calculating differences between the N second color parameter values and standard values of the color parameters;
selecting L second color parameter values with the difference value within a preset difference value range from the N second color parameter values;
the first color parameter value for the color parameter is determined based on the L second color parameter values.
8. The method of claim 7, wherein said determining said first color parameter value for the color parameter based on said L second color parameter values comprises:
if the color parameter belongs to the CCM module of the ISP, the first color parameter value X of the color parameter is:
X = A1·x1 + A2·x2 + … + AL·xL,
wherein x1, x2, …, xL are the L second color parameter values of the color parameter, and A1, A2, …, AL are the weights of the L second color parameter values, determined based on the L second color parameter values and a standard value of the color parameter, wherein the smaller the difference between a second color parameter value and the standard value, the larger the weight of that second color parameter value.
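Claims 7 and 8 together describe a filter-then-weight scheme, which can be sketched as follows. The inverse-distance weighting is one plausible realization of "smaller difference, larger weight"; the claims do not fix a concrete weight formula, and all names are hypothetical.

```python
# Hedged sketch of claims 7 and 8 for a CCM color parameter: keep only the
# second color parameter values close enough to the standard value, then
# combine them with normalized weights that grow as the distance to the
# standard value shrinks.

def first_color_parameter(values, standard, max_diff):
    # Claim 7: select the L values whose difference from the standard value
    # lies within the preset difference range.
    kept = [v for v in values if abs(v - standard) <= max_diff]
    if not kept:
        raise ValueError("no second color parameter value within range")
    # Claim 8: assign larger weights to values closer to the standard value
    # (inverse-distance weighting is an assumption, not the patent's formula).
    raw = [1.0 / (abs(v - standard) + 1e-6) for v in kept]
    total = sum(raw)
    return sum((w / total) * v for w, v in zip(raw, kept))
```

In this sketch the outlier value 5.0 would be discarded by the claim 7 filter, so the result stays near the standard value even when one user adjustment is extreme.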
9. The method of claim 7, wherein said determining said first color parameter value for the color parameter based on said L second color parameter values comprises:
if the color parameter belongs to the AWB module of the ISP, the first color parameter value Y of the color parameter is:
Y = B1·y1 + B2·y2 + … + BL·yL,
wherein y1, y2, …, yL are the L second color parameter values of the color parameter, and B1, B2, …, BL are the weights of the L second color parameter values, determined based on the pregain parameter values of the third images corresponding to the L second color parameter values and a standard value of the pregain parameter, wherein the smaller the difference between the pregain parameter value of a third image and the standard value, the larger the weight of the second color parameter value corresponding to that third image,
wherein the third image includes EXIF information, and the EXIF information includes the pregain parameter.
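Claim 9 differs from claim 8 only in where the weights come from: the pregain recorded in each third image's EXIF information, rather than the parameter values themselves. A hedged sketch, with the inverse-distance form again an assumption:

```python
# Illustrative combination for an AWB color parameter per claim 9: each
# candidate value is weighted by how close its source image's EXIF pregain
# is to a standard pregain. Function and argument names are invented.

def awb_first_color_parameter(values, pregains, standard_pregain):
    """values[i] came from a third image whose EXIF pregain is pregains[i]."""
    raw = [1.0 / (abs(p - standard_pregain) + 1e-6) for p in pregains]
    total = sum(raw)
    return sum((w / total) * v for w, v in zip(raw, values))
```

With this weighting, a third image shot under near-standard pregain dominates the result, while images with unusual gain contribute almost nothing.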
10. The method according to claim 1, characterized in that the first shooting scene to which the first image and/or the third image belong is determined by:
acquiring EXIF information of the first image and/or the third image, wherein the EXIF information comprises CCT parameter values and/or LV parameter values;
determining the first shooting scene to which the first image and/or the third image belongs based on the CCT parameter value and/or the LV parameter value of the first image and/or the third image,
wherein the first shooting scene has a corresponding CCT parameter value interval and/or LV parameter value interval,
and the CCT parameter value of the first image and/or the third image is located in the CCT parameter value interval and/or the LV parameter value is located in the LV parameter value interval.
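The interval test of claim 10 can be sketched as a simple lookup. The scene names and interval values below are invented for illustration; the patent does not disclose concrete intervals.

```python
# Hypothetical scene table: each first shooting scene is defined by a CCT
# interval (correlated color temperature, kelvin) and an LV (light value)
# interval, both taken here as assumptions.
SCENES = {
    "indoor_warm": {"cct": (2500, 4000), "lv": (3, 7)},
    "outdoor_day": {"cct": (5000, 7000), "lv": (10, 16)},
}

def match_scene(cct, lv):
    """Return the scene whose CCT and LV intervals contain the image's EXIF
    CCT and LV values, or None if no configured scene matches (claim 10)."""
    for name, rng in SCENES.items():
        if rng["cct"][0] <= cct <= rng["cct"][1] and rng["lv"][0] <= lv <= rng["lv"][1]:
            return name
    return None
```

Because both the first image and the third image are classified by the same table, a user adjustment recorded for one image is applied only to later images that land in the same scene.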
11. The method according to claim 1, wherein the method further comprises:
detecting a first operation of a user, and displaying a first interface, wherein the first interface comprises the third image and a color adjustment control corresponding to the adjustment parameter;
and acquiring the first adjustment parameter value adjusted by the user on the color adjustment control.
12. The method of claim 11, wherein the first operation is a click operation on a display control in a second interface, wherein the second interface is the shooting interface in which the user captured the third image.
13. The method of claim 11, wherein the first operation is a click operation on an edit control of the selected third image.
14. An electronic device, comprising: one or more processors; one or more memories; the one or more memories store one or more programs that, when executed by the one or more processors, cause the electronic device to perform the image processing method of any of claims 1-13.
15. A computer readable medium having stored thereon instructions which, when executed on a computer, cause the computer to perform the image processing method of any of claims 1 to 13.
CN202311387337.8A 2023-10-25 2023-10-25 Image processing method, electronic device, and readable storage medium Pending CN117119316A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311387337.8A CN117119316A (en) 2023-10-25 2023-10-25 Image processing method, electronic device, and readable storage medium

Publications (1)

Publication Number Publication Date
CN117119316A true CN117119316A (en) 2023-11-24

Family

ID=88813255

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311387337.8A Pending CN117119316A (en) 2023-10-25 2023-10-25 Image processing method, electronic device, and readable storage medium

Country Status (1)

Country Link
CN (1) CN117119316A (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110248170A (en) * 2018-03-09 2019-09-17 华为技术有限公司 Image color method of adjustment and device
CN110351537A (en) * 2019-07-31 2019-10-18 深圳前海达闼云端智能科技有限公司 White balance method, device, storage medium and the electronic equipment of Image Acquisition
CN114422682A (en) * 2022-01-28 2022-04-29 安谋科技(中国)有限公司 Photographing method, electronic device, and readable storage medium
CN115589539A (en) * 2022-11-29 2023-01-10 荣耀终端有限公司 Image adjusting method, device and storage medium
CN116055699A (en) * 2022-07-28 2023-05-02 荣耀终端有限公司 Image processing method and related electronic equipment


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination