CN116977190A - Image processing method, apparatus, device, storage medium, and program product - Google Patents


Info

Publication number
CN116977190A
CN116977190A (publication number); application CN202210398173.8A
Authority
CN
China
Prior art keywords
image
curve
mapping curve
feature representation
preset
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210398173.8A
Other languages
Chinese (zh)
Inventor
曾仙芳
富宸
程培
俞刚
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd filed Critical Tencent Technology Shenzhen Co Ltd
Priority claimed from CN202210398173.8A
Publication of CN116977190A
Legal status: Pending

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/0002Inspection of images, e.g. flaw detection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/90Determination of colour characteristics
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20081Training; Learning
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20084Artificial neural networks [ANN]

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Biophysics (AREA)
  • Data Mining & Analysis (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Biomedical Technology (AREA)
  • Quality & Reliability (AREA)
  • Computational Linguistics (AREA)
  • Health & Medical Sciences (AREA)
  • Evolutionary Computation (AREA)
  • General Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Image Processing (AREA)

Abstract

The application discloses an image processing method, apparatus, device, storage medium, and program product, relating to the technical field of image processing. The method includes the following steps: acquiring a first image; acquiring a preset mapping curve; performing feature analysis on the first image to obtain a curve correction coefficient corresponding to the first image; correcting the curve parameters of the preset mapping curve with the curve correction coefficient to obtain a target mapping curve; and adjusting the color parameters of the first image based on the target mapping curve to obtain a second image corresponding to the first image. When the color parameters of the first image are adjusted, only the correction coefficient needs to be calculated rather than the whole mapping curve, so the amount of computation is greatly reduced and the efficiency of the image color enhancement processing is improved.

Description

Image processing method, apparatus, device, storage medium, and program product
Technical Field
Embodiments of the present application relate to the field of image processing technologies, and in particular, to an image processing method, apparatus, device, storage medium, and program product.
Background
When an image is shot with a mobile terminal, the result is affected by factors such as the equipment, the environment, and the shooting technique; problems such as local overexposure, a washed-out picture, and dull colors can occur, so color enhancement processing is often required to obtain an optimized image.
In the related art, the mapping curve between images can be fitted directly by training a deep neural network within a deep bilateral learning algorithm; the target image to be color-enhanced is input into the trained algorithm to obtain an optimized image.
However, fitting the mapping curve in this algorithm is difficult and requires the deep neural network to contain many parameters; the amount of computation is large, the target image is color-enhanced slowly, and the efficiency of the color enhancement processing is low.
Disclosure of Invention
The embodiment of the application provides an image processing method, an image processing device, a storage medium and a program product, which can improve the efficiency of color enhancement processing. The technical scheme is as follows:
in one aspect, there is provided an image processing method, the method including:
acquiring a first image, wherein the first image is an image to be subjected to color parameter adjustment;
acquiring a preset mapping curve, wherein the preset mapping curve corresponds to preset curve parameters;
performing feature analysis on the first image to obtain a curve correction coefficient corresponding to the first image;
correcting curve parameters of the preset mapping curve by the curve correction coefficient to obtain a target mapping curve;
And carrying out color parameter adjustment on the first image based on the target mapping curve to obtain a second image corresponding to the first image, wherein the second image is an image subjected to color parameter adjustment on the basis of the first image.
In another aspect, there is provided an image processing apparatus including:
the acquisition module is used for acquiring a first image, wherein the first image is an image to be subjected to color parameter adjustment;
the acquisition module is further used for acquiring a preset mapping curve, wherein the preset mapping curve corresponds to preset curve parameters;
the analysis module is used for carrying out feature analysis on the first image to obtain a curve correction coefficient corresponding to the first image;
the correction module is used for correcting the curve parameters of the preset mapping curve by the curve correction coefficient to obtain a target mapping curve;
and the adjusting module is used for adjusting the color parameters of the first image based on the target mapping curve to obtain a second image corresponding to the first image, wherein the second image is an image after the color parameters are adjusted on the basis of the first image.
In another aspect, a computer device is provided, where the computer device includes a processor and a memory, where the memory stores at least one instruction, at least one program, a set of codes, or a set of instructions, where the at least one instruction, the at least one program, the set of codes, or the set of instructions are loaded and executed by the processor to implement the image processing method according to any one of the embodiments of the present application.
In another aspect, a computer readable storage medium is provided, where at least one program code is stored, where the at least one program code is loaded and executed by a processor to implement an image processing method according to any one of the embodiments of the present application.
In another aspect, a computer program product or computer program is provided, the computer program product or computer program comprising computer instructions stored in a computer readable storage medium. The processor of the computer device reads the computer instructions from the computer-readable storage medium, and the processor executes the computer instructions, so that the computer device performs the image processing method according to any one of the embodiments of the present application.
The technical scheme provided by the embodiment of the application has the beneficial effects that at least:
the curve correction coefficient is obtained through characteristic analysis of the first image (the image to be subjected to color parameter adjustment), the preset mapping curve (the mapping curve of the preset curve parameter) is corrected based on the curve correction coefficient, and the color parameter adjustment is performed on the first image based on the adjusted preset mapping curve.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings required for the description of the embodiments will be briefly described below, and it is apparent that the drawings in the following description are only some embodiments of the present application, and other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a schematic diagram of a gamma conversion curve provided by an exemplary embodiment of the present application;
FIG. 2 is a schematic diagram of a process for obtaining a corrected map curve according to an exemplary embodiment of the present application;
FIG. 3 is a schematic illustration of an implementation environment provided by an exemplary embodiment of the present application;
FIG. 4 is a flowchart of an image processing method provided by an exemplary embodiment of the present application;
FIG. 5 is a model training flow diagram provided by an exemplary embodiment of the present application;
fig. 6 is a flowchart of an image processing method provided by another exemplary embodiment of the present application;
FIG. 7 is a schematic diagram of a process for obtaining prediction correction coefficients according to an exemplary embodiment of the present application;
fig. 8 is a flowchart illustrating an image processing method according to an exemplary embodiment of the present application;
Fig. 9 is a flowchart of an image processing method provided by another exemplary embodiment of the present application;
FIG. 10 is a flow chart of parallel processing of image content provided by an exemplary embodiment of the present application;
FIG. 11 is a chart of subjective evaluation results of an image processing method provided by an exemplary embodiment of the present application;
FIG. 12 is a speed comparison chart between an image processing method provided by an exemplary embodiment of the present application and comparison method III;
fig. 13 is a block diagram of an image processing apparatus according to an exemplary embodiment of the present application;
fig. 14 is a block diagram of an image processing apparatus provided in another exemplary embodiment of the present application;
fig. 15 is a block diagram of a server according to an exemplary embodiment of the present application.
Detailed Description
For the purpose of promoting an understanding of the principles and advantages of the application, reference will now be made in detail to the embodiments of the application, some but not all of which are illustrated in the accompanying drawings. All other embodiments, which can be made by those skilled in the art based on the embodiments of the application without making any inventive effort, are intended to be within the scope of the application.
The terms "first," "second," and the like in this disclosure are used to distinguish between similar items having substantially the same function and purpose; it should be understood that "first" and "second" imply no logical or chronological dependency and place no limitation on number or order of execution.
First, a brief description will be made of terms involved in the embodiments of the present application:
Pixel: pixels are the smallest units that make up an image and can be understood as small squares of different colors. After digitization, an image is in fact a numeric matrix in the computer, and each pixel in the matrix has at least two properties: position and pixel value. The position is expressed in rows and columns; the pixel value represents the average brightness of the pixel square. In a grayscale image, the gray value is the pixel value.
Color space: an image can be mapped into different color spaces for representation in a computer. In a color image, each pixel corresponds to several channels; common multi-channel color spaces include the RGB (Red, Green, Blue) space and the HSV (Hue, Saturation, Value) space.
Image color enhancement: digital image processing is the processing and analysis of an image by a computer after the image has been digitized, and includes image transformation, image color enhancement, image restoration, image segmentation, image recognition and classification, and the like. Image color enhancement is one type of digital image processing; it mainly adjusts the color parameters of an image to optimize its display and improve subjective visual perception, for example by raising the saturation of the image so that the subject stands out from the background, achieving the effect of highlighting the subject of the picture.
In some embodiments, the color parameters of the image may be adjusted manually directly on the camera or the image processing software, and with the development of computer technology, the automatic adjustment of the image parameters may also be achieved through an image enhancement algorithm, where common image enhancement algorithms include: an image enhancement algorithm based on histogram equalization, an image enhancement algorithm based on a Laplacian, an image enhancement algorithm based on logarithmic transformation, an image enhancement algorithm based on gamma transformation, and the like.
The embodiments of the present application are mainly described taking an image enhancement algorithm based on gamma transformation as an example. The gamma transformation is mainly used for image correction: it adjusts the gray scale of images whose gray scale is too high (overexposed) or too low (too dark), improving the displayed brightness of each part of the image. The gamma transformation is given by Equation 1:
equation one: s is S 1 =cr γ
where r is the input gray scale; when performing the gamma transform, the 0-255 gray-scale dynamic range must be scaled to the 0-1 range, and the original dynamic range is restored after the transform, so r takes values in [0, 1]. S_1 is the gray scale after the gamma transform. c is the gray-scale coefficient, used to stretch the gray scale of the whole image, and usually takes the value 1. γ is the gamma factor, whose value determines the gray-scale mapping between the input image and the output image.
Referring to fig. 1, a set of gamma transformation curves 100 is shown, wherein the abscissa represents the input gray scale, the ordinate represents the output gray scale (both the input gray scale and the output gray scale are scaled), and the number above each curve represents the magnitude of the gamma factor γ, for example: the gamma factor gamma of the gamma conversion curve 101 is 0.1.
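As a concrete illustration, the gamma transform above can be sketched in a few lines of NumPy. This is an illustrative sketch, not the patent's implementation; the scaling of the 0-255 gray range to [0, 1] and back follows the description of the transform.

```python
import numpy as np

def gamma_transform(image: np.ndarray, gamma: float, c: float = 1.0) -> np.ndarray:
    """Apply S1 = c * r**gamma to an 8-bit grayscale image.

    The 0-255 gray range is scaled to [0, 1] before the transform and
    restored afterwards, as the description above requires.
    """
    r = image.astype(np.float64) / 255.0                 # scale to [0, 1]
    s = c * np.power(r, gamma)                           # the gamma transform
    return np.clip(np.round(s * 255.0), 0, 255).astype(np.uint8)

# gamma < 1 brightens dark regions; gamma > 1 darkens them (cf. fig. 1).
dark = np.full((2, 2), 64, dtype=np.uint8)
brightened = gamma_transform(dark, gamma=0.5)            # 64 -> 128
darkened = gamma_transform(dark, gamma=2.0)              # 64 -> 16
```

Sweeping the `gamma` argument here traces out the family of curves shown in fig. 1.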
In the related art, the processing method of image color enhancement is mainly implemented by the following modes:
(1) A professional designer retouches the image to be color-enhanced; this process is time-consuming and labor-intensive, the efficiency of image color enhancement is low, and the approach has no generality;
(2) An image with good color performance is synthesized automatically by an image color enhancement algorithm. The algorithm is trained on original images and annotated images (optimized images obtained from the originals); an image that needs color enhancement is then input into the algorithm, and the deep neural network it contains directly fits the mapping curve between images to obtain the optimized image, achieving the color enhancement effect. As the above shows, such an algorithm needs to adjust the parameters of every pixel in the image, so the fitting process is difficult and the deep neural network it contains must have many parameters; the amount of computation during enhancement is large, and the efficiency of the enhancement process is low.
Referring to fig. 2, firstly, a mapping curve is selected as the preset mapping curve 201, and the curve parameters of the preset mapping curve 201 can be set; secondly, image feature extraction and fusion are performed on the first image 202 with a convolutional network, so as to estimate the curve correction coefficient X of the first image 202 for the preset mapping curve 201, and the preset mapping curve 201 is corrected based on the curve correction coefficient X to obtain a corrected mapping curve 203; finally, color parameter adjustment is performed on the first image 202 using the corrected mapping curve 203. Through the above process, when adjusting the color parameters of the first image 202, only the curve correction coefficient X of the preset mapping curve 201 needs to be calculated, not the whole mapping curve, which greatly reduces the amount of computation and thereby improves the efficiency of the image color enhancement processing.
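To make the saving concrete, the correction step can be sketched as follows. This is an illustrative sketch, not the patent's exact formulation: the curve correction model that would predict the coefficient is elided (x stands in for its output), and multiplying the preset gamma factor by x is one assumed way the coefficient could correct the curve parameter.

```python
import numpy as np

def correct_and_apply(image: np.ndarray, preset_gamma: float, x: float) -> np.ndarray:
    """Correct the preset curve's parameter with coefficient x, then apply
    the resulting target mapping curve to the image.

    Assumption for illustration: the correction multiplies the preset gamma
    factor, i.e. corrected_gamma = preset_gamma * x. The model that would
    predict x from image features is omitted here.
    """
    corrected_gamma = preset_gamma * x
    r = image.astype(np.float64) / 255.0      # scale gray range to [0, 1]
    s = np.power(r, corrected_gamma)          # apply the target mapping curve
    return np.clip(np.round(s * 255.0), 0, 255).astype(np.uint8)

# Only the scalar x is predicted per image -- far cheaper than regressing
# a whole mapping curve over every pixel.
second_image = correct_and_apply(np.full((4, 4), 100, dtype=np.uint8),
                                 preset_gamma=1.0, x=0.8)
```

With x < 1 the corrected gamma drops below 1, so dark regions are brightened, matching the γ < 1 behavior of the gamma curves described earlier.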
Alternatively, the convolutional network in the above method may be replaced by a fully connected network or a transformer network; it should be noted that the fully connected and transformer variants compute more slowly than the convolutional variant.
Fig. 3 is a schematic diagram of an implementation environment provided by an exemplary embodiment of the present application, as shown in fig. 3, where the implementation environment includes a terminal 310, a server 320, and a communication network 330, where the terminal 310 and the server 320 are connected by the communication network 330, and in some alternative embodiments, the communication network 330 may be a wired network or a wireless network, and this embodiment is not limited to this.
In some alternative embodiments, terminal 310 is a smart phone, tablet, notebook, desktop computer, smart home appliance, smart car terminal, smart speaker, digital camera, etc., but is not limited thereto. Taking the terminal 310 implemented as a smart phone as an example, optionally, a target application program is installed in the terminal 310, which may be a conventional application program, a cloud application program, an applet or an application module in a host application program, or a web platform, which is not limited in this embodiment. Optionally, the target application is provided with an image color parameter adjustment function, which is schematically shown in fig. 3, when the color parameter adjustment is needed for the image, the terminal 310 uploads the first image (i.e. the image needed for the color parameter adjustment) to the server 320, the server 320 performs the color parameter adjustment for the first image, obtains the second image (i.e. the image after the color parameter adjustment is performed on the basis of the first image), and feeds back the second image to the terminal 310.
In some alternative embodiments, server 320 is configured to provide image color parameter adjustment services for a target application installed in terminal 310. Optionally, a curve correction model and a preset mapping curve are set in the server 320, where the preset mapping curve is a mapping curve with preset curve parameters. Illustratively, after receiving the first image, the server 320 inputs the first image into a curve correction model, extracts an image feature representation of the first image, performs feature analysis on the image feature representation to obtain a curve correction coefficient corresponding to the first image, corrects a preset mapping curve based on the curve correction coefficient to obtain a target mapping curve, adjusts color parameters of the first image based on the target mapping curve, obtains a second image, and finally feeds back the second image to the terminal 310, and after receiving the second image, the terminal 310 displays the second image through a target application program.
In some alternative embodiments, the curve correction model may also be deployed on the terminal 310 side, where the terminal 310 implements image color parameter adjustment locally, without the aid of the server 320, which is not limited in this embodiment.
It should be noted that the server 320 may be an independent physical server, a server cluster or distributed system composed of multiple physical servers, or a cloud server providing basic cloud computing services such as cloud services, cloud databases, cloud computing, cloud functions, cloud storage, network services, cloud communication, middleware services, domain name services, security services, CDN (Content Delivery Network), and big-data and artificial-intelligence platforms.
Cloud technology refers to a hosting technology that unifies resources such as hardware, software, and networks within a wide-area or local-area network to realize the computation, storage, processing, and sharing of data. It is the general term for the network, information, integration, management-platform, and application technologies applied under the cloud computing business model; it can form a resource pool used on demand, flexibly and conveniently. Cloud computing technology will become an important backbone: background services of networked systems, such as video websites, picture websites, and other portals, require large amounts of computing and storage resources. With the rapid development of the internet industry, each item may in the future carry its own identification mark that must be transmitted to a background system for logical processing; data of different levels will be processed separately, and industry data of all kinds need strong system support, which can only be realized through cloud computing. Optionally, server 320 may also be implemented as a node in a blockchain system.
It should be noted that, the information (including but not limited to user equipment information, user personal information, etc.), data (including but not limited to data for analysis, stored data, presented data, etc.), and signals related to the present application are all authorized by the user or are fully authorized by the parties, and the collection, use, and processing of the related data is required to comply with the relevant laws and regulations and standards of the relevant countries and regions. For example, the sample image and the reference image referred to in the present application are acquired with sufficient authorization.
The image processing method provided by the embodiment of the application can be at least applied to the following application scenes:
1. when a mobile terminal (such as a mobile phone) is used for shooting images, the problems of local exposure, excessive darkness of picture colors and the like of the shot images exist often due to the influence of factors such as shooting environment, shooting skills and shooting equipment, and the like, the shot images cannot meet the requirements of people on image quality.
2. When watching short videos in a short-video application, the quality of the videos is uneven; the pictures may have dull colors, unevenly distributed brightness, and similar problems, and it is difficult to improve the visual effect of the video pictures by adjusting color parameters manually.
It should be noted that the above application scenario is merely an illustrative example, and other application scenarios of the image processing method in the embodiment of the present application are not limited.
With reference to the above description and the implementation environment, the image processing method provided by the embodiment of the present application is described, and fig. 4 is a flowchart of an image processing method provided by the embodiment of the present application, and as shown in fig. 4, the method includes:
In step 401, a first image is acquired.
The first image is an image to be subjected to color parameter adjustment.
Optionally, the first image is at least one of an independent picture, a region of a picture that needs color parameter adjustment, a frame of a video, and the like.
Optionally, the adjusting the color parameter of the first image refers to adjusting the color parameter of the pixel point in the first image.
Optionally, the color parameter refers to a parameter affecting the color performance of the image, and illustratively, the color parameter includes at least one of contrast, brightness, saturation, hue, gray-scale, and the like.
Step 402, obtaining a preset mapping curve.
Optionally, the preset mapping curve serves as a reference for adjusting color parameters in the image. Illustratively, the process of adjusting the gray scale of the pixels in an image with the preset mapping curve may be: obtain the gray scale corresponding to each pixel of the image, input each gray scale into the preset mapping curve, and output the mapped gray scale for each pixel.
Optionally, the preset mapping curve is a reference color parameter adjustment curve, for example a logarithmic transformation curve or a gamma transformation curve. The logarithmic transformation curve expands the low-gray-scale part of the image to display more of its detail while compressing the high-gray-scale part, thereby emphasizing the low-gray-scale part of the image; the gamma transformation curve is mainly used for image correction, adjusting images whose gray scale is too high or too low and thereby alleviating problems such as overexposure.
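As an illustrative sketch (not the patent's implementation), applying a preset mapping curve to every pixel can be done with a 256-entry lookup table; the normalized logarithmic curve below is one assumed choice of reference curve.

```python
import numpy as np

def apply_curve_lut(image: np.ndarray, curve) -> np.ndarray:
    """Map every pixel of an 8-bit image through a preset mapping curve.

    The curve is evaluated once for all 256 possible gray scales to build a
    lookup table, so the per-pixel work is a single table lookup.
    """
    levels = np.arange(256) / 255.0                      # all input gray scales
    lut = np.clip(np.round(curve(levels) * 255.0), 0, 255).astype(np.uint8)
    return lut[image]

def log_curve(r: np.ndarray) -> np.ndarray:
    """A logarithmic transformation curve, normalized so curve(1.0) == 1.0."""
    return np.log1p(r) / np.log(2.0)

# Low gray scales are expanded (10 -> 14); high ones are compressed in slope.
out = apply_curve_lut(np.array([[10, 200]], dtype=np.uint8), log_curve)
```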
The preset mapping curve corresponds to preset curve parameters.
Optionally, the curve parameters are used to adjust the curve shape of the preset mapping curve, the preset mapping curves with different shapes may have different adjustment effects on the first image, which is schematically shown in fig. 1, when the preset mapping curve is implemented as a gamma transformation curve, the gamma factor γ is one of the curve parameters in the gamma transformation curve, and the shapes of the gamma transformation curves corresponding to different gamma factors γ are different, where:
(1) When gamma is larger than 1, the gray scale of the brighter part of the image can be mapped to smaller gray scale, and the gray scale of the darker part is changed less, so that the whole image appears to be darkened;
(2) When gamma is less than 1, the gray scale of the darker part of the image can be mapped to larger gray scale, and the gray scale of the lighter part has smaller change, so that the whole image looks bright;
(3) When γ=1, the input original image is not changed.
Step 403, performing feature analysis on the first image to obtain a curve correction coefficient corresponding to the first image.
The curve correction coefficient refers to a coefficient for correcting a curve parameter of a preset mapping curve.
In some alternative embodiments, performing feature analysis on the first image to obtain a curve correction coefficient corresponding to the first image refers to inputting the first image into a curve correction model and outputting to obtain the curve correction coefficient corresponding to the first image.
Alternatively, the curve correction model is a model trained from a sample image pair.
The sample image pair comprises a sample image and a reference image which form an image pair, wherein the reference image is an image of the sample image after color parameter adjustment.
Illustratively, the manner of acquisition of the reference image and the sample image includes at least one of the following:
(1) Acquiring an image subjected to picture repair by a professional designer as a reference image, and taking an image before picture repair as a sample image;
(2) A reference deep-learning image color enhancement algorithm automatically synthesizes high-quality images, annotators screen out the images with reasonable color distribution as reference images, and the images before processing are used as sample images.
The above-described reference image and sample image acquisition is merely illustrative, and embodiments of the present application are not limited thereto.
Referring to fig. 5, which illustrates a training process of a curve correction model according to an exemplary embodiment of the present application, as shown in fig. 5, training data of the curve correction model is a sample image 501 and a reference image 502, and the training steps of the curve correction model are as follows:
step one: the sample image 501 is amplified in a variety of images via the image amplification module 503, optionally the image amplification module 503 includes a plurality of amplification methods, and the sample image is amplified in at least one of the following ways:
(1) The image random rotation method is used for randomly rotating the image within a certain angle range;
(2) The image random scaling method scales the image within a certain resolution;
(3) The image random edge-supplementing method is that black edges are supplemented around the image, and the pixel width values of the black edges are random;
(4) The random horizontal flipping method flips the image horizontally with a certain probability;
(5) The random vertical flipping method flips the image vertically with a certain probability.
The above sample image augmentation methods are merely illustrative, and embodiments of the present application are not limited thereto.
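The augmentation steps above can be sketched in Python. The flip probabilities, padding width, and the restriction of rotation to 90-degree steps are illustrative assumptions, not values from the embodiment:

```python
import numpy as np

rng = np.random.default_rng(0)

def augment(img, pad_max=8):
    # (4) random horizontal flip with probability 0.5
    if rng.random() < 0.5:
        img = img[:, ::-1]
    # (5) random vertical flip with probability 0.5
    if rng.random() < 0.5:
        img = img[::-1, :]
    # (3) random black-border padding with a random pixel width
    pad = int(rng.integers(0, pad_max + 1))
    if pad:
        img = np.pad(img, pad, constant_values=0)
    # (1) rotation, restricted here to 90-degree steps for simplicity
    return np.rot90(img, k=int(rng.integers(0, 4)))

sample = np.arange(16, dtype=np.uint8).reshape(4, 4)
out = augment(sample)
```

In practice each augmentation would be applied independently with its own probability; the sketch only shows the flow.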
Step two: image noise is added to the augmented sample image.
Here, adding image noise means generating a random number and adding it to the pixel values of the image.
Illustratively, the image noise includes at least one of Gaussian noise, salt-and-pepper noise, Rayleigh noise, exponentially distributed noise, and uniformly distributed noise.
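Two of the noise types above can be sketched in numpy; the standard deviation and noise fraction are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

def add_gaussian_noise(img, sigma=10.0):
    """Add zero-mean Gaussian noise to an 8-bit image."""
    noisy = img.astype(np.float64) + rng.normal(0.0, sigma, img.shape)
    return np.clip(noisy, 0, 255).astype(np.uint8)

def add_salt_pepper_noise(img, amount=0.05):
    """Set a random fraction of pixels to 0 or 255 (salt-and-pepper)."""
    noisy = img.copy()
    mask = rng.random(img.shape)
    noisy[mask < amount / 2] = 0
    noisy[mask > 1 - amount / 2] = 255
    return noisy

img = np.full((8, 8), 128, dtype=np.uint8)
g = add_gaussian_noise(img)
sp = add_salt_pepper_noise(img)
```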
It should be noted that step one and step two are optional, independent steps: step one may be performed alone, step two may be performed alone, or both may be performed together.
Step three: the augmented sample image 501 with added image noise is input into the curve correction model 504 to obtain a first prediction correction coefficient; the preset mapping curve is adjusted by the first prediction correction coefficient, and the adjusted preset mapping curve is applied to the sample image 501 to obtain a predicted image I_n. The augmented sample image 501 without added image noise is input into the curve correction model 504 to obtain a second prediction correction coefficient; the preset mapping curve is adjusted by the second prediction correction coefficient, and the adjusted preset mapping curve is applied to the sample image 501 to obtain a predicted image I_o.
Step four: the reference image 502 is also diversified by the image augmentation module to obtain an augmented reference image I_t. The augmentation methods are the same as those for the sample image in step one and are not repeated here.
The sample image 501 and the reference image 502 may use the same augmentation method or different augmentation methods, which is not limited in the embodiments of the present application.
Step five: the predicted image I_n, the predicted image I_o, and the augmented reference image I_t are input into the loss calculation module 505 to calculate a loss, and the curve correction model 504 is trained based on the loss.
Optionally, the loss calculated in the loss calculation module includes at least one of the following losses:
(1) Pixel consistency loss, which constrains the predicted image I_o and the augmented reference image I_t to have consistent pixel values;
(2) Noise robustness loss, which constrains the predicted image I_n and the predicted image I_o to have consistent pixel values.
Illustratively, the curve correction model 504 is trained based on the pixel consistency loss, based on the noise robustness loss, or based on a joint loss of the pixel consistency loss and the noise robustness loss.
If the curve correction model 504 is trained based on the joint loss, the pixel consistency loss and the noise robustness loss are optionally weighted and summed to obtain the joint loss.
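The weighted joint loss can be sketched as follows, assuming mean squared error as the consistency measure (the embodiment does not fix the exact distance, so MSE and the weights are illustrative assumptions):

```python
import numpy as np

def pixel_consistency_loss(i_o, i_t):
    """MSE between predicted image I_o and augmented reference I_t."""
    return float(np.mean((i_o - i_t) ** 2))

def noise_robustness_loss(i_n, i_o):
    """MSE between I_n (noisy-input prediction) and I_o (clean-input prediction)."""
    return float(np.mean((i_n - i_o) ** 2))

def joint_loss(i_n, i_o, i_t, w_pix=1.0, w_noise=1.0):
    # weighted sum of the two losses; weights are illustrative
    return (w_pix * pixel_consistency_loss(i_o, i_t)
            + w_noise * noise_robustness_loss(i_n, i_o))

i_t = np.array([0.5, 0.5])
i_o = np.array([0.4, 0.6])
i_n = np.array([0.4, 0.6])
loss = joint_loss(i_n, i_o, i_t)  # 0.01 + 0.0 = 0.01
```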
Step 404: the curve parameters of the preset mapping curve are corrected with the curve correction coefficient to obtain the target mapping curve.
Optionally, the curve correction coefficient may correct a curve parameter of the preset mapping curve so as to change a shape of the preset mapping curve, thereby obtaining the target mapping curve.
As described above, the gamma transformation curve is given by formula one. Optionally, the curve correction coefficient is combined with the grayscale coefficient c or the gamma factor γ, where the combination includes at least one of summation, multiplication, and other calculations.
In some optional embodiments, the process of obtaining the target mapping curve further includes:
performing weighted summation on the curve correction coefficient and the preset mapping curve to obtain the target mapping curve.
Schematically, if the weights of the curve correction coefficient and the preset mapping curve are both 1, the target mapping curve is:
S_2 = c · r^γ + a
where r is the input gray level; S_2 is the gray level after the corrected gamma transformation; c is the grayscale coefficient; γ is the gamma factor; and a is the curve correction coefficient.
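The corrected curve S_2 = c·r^γ + a can be evaluated directly; the values of c, γ, and a below are illustrative, not from the embodiment:

```python
import numpy as np

def corrected_gamma(r, c=1.0, gamma=0.5, a=0.1):
    """Target mapping curve S_2 = c * r**gamma + a.

    r is the normalized input gray level in [0, 1]; a is the curve
    correction coefficient predicted by the curve correction model.
    """
    return c * np.power(r, gamma) + a

r = np.array([0.0, 0.25, 1.0])
s2 = corrected_gamma(r)  # ≈ [0.1, 0.6, 1.1]
```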
Step 405: color parameter adjustment is performed on the first image based on the target mapping curve to obtain a second image corresponding to the first image.
The second image is an image with color parameters adjusted on the basis of the first image.
Illustratively, with the gamma transformation curve as the preset mapping curve, denote the corrected preset mapping curve, i.e., the target mapping curve, as S_3. First, the gray level of each pixel in the first image is acquired; the gray level of each pixel is input into the S_3 curve to obtain the corrected gray level; and the pixels of the first image are modified based on the corrected gray levels to obtain the second image.
In summary, in the image processing method provided by the embodiments of the present application, a curve correction coefficient is obtained by performing feature analysis on the first image (the image to be subjected to color parameter adjustment), the preset mapping curve (a mapping curve with preset curve parameters) is corrected by the curve correction coefficient, and color parameter adjustment is performed on the first image based on the adjusted preset mapping curve.
Fig. 6 is a flowchart of an image processing method according to an embodiment of the present application, as shown in fig. 6, where the method includes:
in step 601, a first image is acquired.
The first image is an image to be subjected to color parameter adjustment.
Optionally, the manner of acquiring the first image includes at least one of:
First, an image is captured by an image capturing tool. The captured image is affected by factors such as the shooting environment, shooting skill, and shooting equipment, and may suffer from problems such as local overexposure or overly dark colors, so its color parameters need to be adjusted; the captured image is taken as the first image. The image capturing tool includes at least one of a digital camera, a mobile phone, a tablet computer, and the like.
Second, an image on a printed matter or a physical photo is scanned with a scanning tool. The scanned image is affected by factors such as the scanned object, the scanning environment, and the scanning angle, and may suffer from problems such as overexposure or a dim picture, so its color parameters need to be adjusted; the scanned image is taken as the first image. The scanning tool may be a scanner, or a target application with a scanning function installed on a mobile phone or tablet computer, and the like.
Third, an image is captured from a screen (for example, via the screenshot function of a mobile phone). Because the image quality of different device screens varies, the captured image may suffer from problems such as dark colors; when color parameter adjustment is needed, the captured image is taken as the first image.
The above-mentioned first image acquisition manner is merely an illustrative example, and the embodiment of the present application is not limited thereto.
Step 602, obtaining a preset mapping curve.
The preset mapping curve corresponds to preset curve parameters.
Optionally, the predetermined mapping curve is a quantized curve.
Quantization approximates a continuous curve with a finite number of mapping values. Graphically, a smooth curve is turned into a discrete graph composed of a finite number of points; this discrete graph is the quantized preset mapping curve.
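Quantizing a curve amounts to sampling it into a lookup table. A minimal sketch, assuming 256 sample points and a square-root curve as the stand-in continuous curve:

```python
import numpy as np

def quantize_curve(curve_fn, n=256):
    """Sample a continuous mapping curve at n points -> a lookup table."""
    x = np.linspace(0.0, 1.0, n)
    return x, curve_fn(x)

# illustrative continuous curve: r**0.5 on [0, 1]
xs, lut = quantize_curve(lambda r: np.power(r, 0.5))
```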
Step 603, inputting the first image into a curve correction model, and extracting an image feature representation of the first image.
The training process of the curve correction model is already described in step 403, and will not be described here again.
Optionally, the image feature representation of the first image is used to indicate the image features of the first image.
Optionally, the image feature representation of the first image is extracted by a feature extraction network, and the curve correction model may include one feature extraction network or may include a plurality of feature extraction networks.
It should be noted that, the feature extraction network includes at least one of a convolutional layer network, a fully-connected layer network, a Transformer layer network, and the like, which is not limited in this embodiment.
Optionally, the image features of the first image include at least one of texture features, global features, local features, semantic features, and the like. Texture features describe the surface properties of the whole image or of the content in a certain region of the image; schematically, if a water surface appears in the image, the texture features include features representing the ripples on the water surface. Global features describe the overall attributes of the image, and include color features, shape features, and the like. Local features are features extracted from a sub-image region, and include edges, corner points, lines, curves, and the like. Semantic features describe the semantics indicated by the content of the image; for example, if there is a cat in the image, then "cat" is a semantic feature of the image.
Optionally, the image feature representation of the first image may be an image feature representation, or may be a fusion feature representation of two or more image feature representations, and the extracting the image feature representation of the first image includes at least one of the following ways:
(1) Extracting a texture feature representation of the first image, the texture feature representation being indicative of an image texture feature of the first image; extracting a global feature representation of the first image based on the texture feature representation, the global feature representation being indicative of global image features of the first image; extracting a local feature representation of the first image based on the texture feature representation, the local feature representation being used to indicate image features of a sub-image region in the first image; extracting a semantic feature representation of the first image based on the texture feature representation, the semantic feature representation being used to indicate semantic content contained by the first image; and fusing the texture feature representation, the global feature representation, the local feature representation and the semantic feature representation to obtain an image feature representation of the first image.
Optionally, the process of obtaining the image feature representation of the first image includes: inputting the first image into a first feature extraction network, and extracting a texture feature representation of the first image through the first feature extraction network; inputting the texture feature representation and the first image to a second feature extraction network, and extracting a global feature representation of the first image through the second feature extraction network; inputting the texture feature representation and the first image to a third feature extraction network, and extracting a local feature representation of the first image through the third feature extraction network; inputting the texture feature representation and the first image to a fourth feature extraction network, and extracting semantic feature representation of the first image through the fourth feature extraction network; and finally, fusing the texture feature representation, the global feature representation, the local feature representation and the semantic feature representation to obtain the image feature representation of the first image.
(2) Extracting a texture feature representation of the first image; extracting a global feature representation of the first image; extracting a local feature representation of the first image; extracting semantic feature representations of the first image; and fusing the texture feature representation, the global feature representation, the local feature representation and the semantic feature representation to obtain an image feature representation of the first image.
Optionally, the process of obtaining the image feature representation of the first image includes: inputting the first image into a fifth feature extraction network, and extracting a texture feature representation of the first image through the fifth feature extraction network; inputting the first image into a sixth feature extraction network, and extracting global feature representation of the first image through the sixth feature extraction network; inputting the first image into a seventh feature extraction network, and extracting a local feature representation of the first image through the seventh feature extraction network; inputting the first image into an eighth feature extraction network, and extracting semantic feature representations of the first image through the eighth feature extraction network; and finally, fusing the texture feature representation, the global feature representation, the local feature representation and the semantic feature representation to obtain the image feature representation of the first image.
(3) A global feature representation of the first image is extracted as an image feature representation of the first image.
(4) A local feature representation of the first image is extracted as an image feature representation of the first image.
(5) A texture feature representation of the first image is extracted as an image feature representation of the first image.
(6) A semantic feature representation of the first image is extracted as an image feature representation of the first image.
The above-mentioned representation of the image features of the first image is merely an illustrative example, and embodiments of the present application are not limited thereto.
Step 604: feature analysis is performed on the image feature representation to obtain the curve correction coefficient corresponding to the first image.
Illustratively, with the image feature representation of the first image extracted in way (1), refer to fig. 7, which illustrates the process of obtaining the curve correction coefficient:
firstly, extracting texture features 702 of a first image 701, then respectively extracting global features 703, semantic features 704 and local features 705 of the first image 701 based on the texture features 702, finally, fusing the texture features 702, the global features 703, the semantic features 704 and the local features 705 of the first image 701 to obtain fused features 706, and performing feature analysis on the fused features 706 to obtain prediction correction coefficients 707, namely curve correction coefficients.
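The extract-then-fuse flow can be sketched end to end with toy stand-ins. In the embodiment the four extractors and the final "feature analysis" are learned networks; here each is a hypothetical hand-written function and the analysis is a linear head, purely to show the data flow:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in "feature extractors" (assumptions, not the patent's networks).
def texture_features(img):
    return np.array([img.std()])

def global_features(tex, img):
    return np.concatenate([tex, [img.mean()]])

def local_features(tex, img):
    h, w = img.shape
    return np.concatenate([tex, [img[: h // 2, : w // 2].mean()]])

def semantic_features(tex, img):
    return np.concatenate([tex, [float((img > 0.5).mean())]])

def predict_correction_coefficient(img, head_w):
    tex = texture_features(img)
    fused = np.concatenate([tex,
                            global_features(tex, img),
                            local_features(tex, img),
                            semantic_features(tex, img)])
    return float(fused @ head_w)  # "feature analysis" as a linear head

img = rng.random((8, 8))
coeff = predict_correction_coefficient(img, head_w=np.ones(7) * 0.1)
```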
Step 605, correcting curve parameters of a preset mapping curve by using curve correction coefficients to obtain a target mapping curve.
In some alternative embodiments, the curve correction coefficient includes n sub-coefficients, where n is an integer greater than 1, and the method for obtaining the target mapping curve includes:
carrying out segmentation processing on the preset mapping curves to obtain n segmentation mapping curves; and respectively correcting the n segmented mapping curves through the n sub-coefficients to obtain a target mapping curve.
The kth piecewise mapping curve is modified through the kth sub-coefficient, wherein k is a positive integer and is less than or equal to n.
Optionally, the method for obtaining n segmented mapping curves by segmenting the preset mapping curve includes at least one of the following modes:
1. Evenly segmenting the preset mapping curve along the abscissa of the coordinate system in which it lies to obtain n segmented mapping curves.
2. Evenly segmenting the preset mapping curve along the ordinate of the coordinate system in which it lies to obtain n segmented mapping curves.
Optionally, the method for obtaining n sub-coefficients in the curve correction coefficient includes at least one of the following ways:
1. Evenly segment the preset mapping curve along the abscissa of the coordinate system in which it lies to obtain n segmented mapping curves, treat the kth segmented mapping curve as a preset mapping curve, and obtain its correction coefficient, i.e., the kth sub-coefficient. Schematically, assume the preset mapping curve is divided into two segments, a first preset mapping curve and a second preset mapping curve; then a sub-coefficient A corresponding to the first preset mapping curve and a sub-coefficient B corresponding to the second preset mapping curve need to be obtained.
Illustratively, the process of obtaining the sub-coefficient a includes: acquiring a first preset mapping curve; inputting the first image into a curve correction model 1 to obtain a sub-coefficient A corresponding to the first image, wherein the sub-coefficient A is used for adjusting a first preset mapping curve, and the curve correction model 1 is a model obtained through pre-training.
Illustratively, the process of obtaining the sub-coefficient B includes: acquiring the second preset mapping curve; inputting the first image into a curve correction model 2 to obtain the sub-coefficient B corresponding to the first image, where the sub-coefficient B is used to adjust the second preset mapping curve, and the curve correction model 2 is a model obtained through pre-training.
Optionally, the acquisition processes of the sub-coefficients A and B may be performed synchronously or asynchronously.
2. Evenly segment the preset mapping curve along the ordinate of the coordinate system in which it lies to obtain n segmented mapping curves, treat the kth segmented mapping curve as a preset mapping curve, and obtain its correction coefficient, i.e., the kth sub-coefficient. For the process of obtaining the sub-coefficients, refer to the description of obtaining sub-coefficient A and sub-coefficient B in way 1.
Schematically, referring to fig. 1, the gamma transformation curve 101 is divided into 5 segments along the abscissa: [0, 0.2], (0.2, 0.4], (0.4, 0.6], (0.6, 0.8], and (0.8, 1]. Curve correction sub-coefficients are obtained for each of the 5 segments, and the 5 segments are corrected by the 5 sub-coefficients respectively to obtain the target mapping curve, where the kth segment is corrected by the kth sub-coefficient, k being a positive integer with k ≤ 5.
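The piecewise correction can be sketched as follows, assuming the base curve is r^γ and each sub-coefficient is added to its own abscissa segment (the additive combination mentioned earlier); γ and the sub-coefficient values are illustrative:

```python
import numpy as np

def piecewise_corrected_curve(r, sub_coeffs, gamma=0.5):
    """Split [0, 1] into len(sub_coeffs) equal abscissa segments and
    add the segment's sub-coefficient to the base gamma curve."""
    n = len(sub_coeffs)
    # segment index of each input value; the last segment is closed at 1
    idx = np.minimum((r * n).astype(int), n - 1)
    return np.power(r, gamma) + np.asarray(sub_coeffs)[idx]

r = np.array([0.1, 0.3, 0.5, 0.7, 0.9])
out = piecewise_corrected_curve(r, sub_coeffs=[0.0, 0.01, 0.02, 0.03, 0.04])
```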
Step 606, performing pixel level feature transformation on the first image to obtain a full resolution feature map corresponding to the first image.
Optionally, pixel-level feature transformation is performed on the first image through a feature transformation network to obtain the full-resolution feature map corresponding to the first image, where the feature transformation network includes at least one of a convolutional layer network, a fully-connected layer network, a Transformer layer network, and the like, which is not limited in this embodiment.
Optionally, pixel-level feature transformation refers to feature transformation of the pixels in the input first image; illustratively, pixels in HSV space are converted into pixels in RGB space.
In step 607, the target mapping curve is applied to the full resolution feature map, and the pixel-level mapping coefficient is obtained through linear interpolation.
Optionally, since the preset mapping curve is a quantized curve, the target mapping curve is also a quantized curve, so linear interpolation is used to expand the target mapping curve; illustratively, linear interpolation computes additional curve points from the existing curve points to expand the curve.
The target mapping curve is applied to the full-resolution feature map, and linear interpolation is performed on the target mapping curve to obtain the pixel-level mapping coefficients.
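A minimal sketch of this lookup-with-interpolation step, using `np.interp` as the linear interpolator and a quantized square-root curve as the illustrative target mapping curve:

```python
import numpy as np

# Quantized target mapping curve: 256 sampled (x, y) points of r**0.5.
xs = np.linspace(0.0, 1.0, 256)
ys = np.power(xs, 0.5)

def pixel_level_mapping(feature_map):
    """Look up each full-resolution feature value on the quantized curve,
    interpolating linearly between the stored curve points."""
    return np.interp(feature_map, xs, ys)

fmap = np.array([[0.0, 0.25], [0.5, 1.0]])
coeffs = pixel_level_mapping(fmap)
```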
At step 608, the pixel-level mapping coefficients are applied to the first image to obtain a second image.
Optionally, the pixel-level mapping coefficient corresponding to each pixel in the first image is applied to each pixel itself, so as to obtain the second image.
In summary, in the image processing method provided by the embodiments of the present application, a curve correction coefficient is obtained by performing feature analysis on the first image (the image to be subjected to color parameter adjustment), the preset mapping curve (a mapping curve with preset curve parameters) is corrected based on the curve correction coefficient, and color parameter adjustment is performed on the first image based on the adjusted preset mapping curve.
According to the image processing method provided by the embodiments of the present application, the curve correction coefficient is obtained by performing feature analysis on the image feature representation of the first image; since the image feature representation includes a fused representation of a plurality of image features, the obtained curve correction coefficient is more accurate.
Referring to fig. 8, a complete flowchart of an image processing method according to an embodiment of the present application is shown in fig. 8:
the specific steps for acquiring the second image comprise:
Step one: a preset mapping curve 801 is acquired.
Alternatively, a mapping curve is first selected which can enhance the color of the image, and then the curve parameters are set.
Step two: the first image 802 is input into the curve correction model for feature extraction and fusion processing to obtain a curve correction coefficient 803 for the preset mapping curve 801.
Step three: the preset mapping curve 801 is corrected by the curve correction coefficient 803 to obtain a corrected mapping curve 804.
Step four: the first image 802 is input into a deep convolutional neural network for pixel-level feature transformation to obtain a full-resolution feature map 805.
Step five: the corrected mapping curve 804 is applied to the full-resolution feature map 805, and pixel-level mapping coefficients 806 are obtained through linear interpolation.
Step six: the pixel-level mapping coefficients 806 are applied to the first image 802 to obtain an enhanced image 807.
It should be noted that the step numbers do not represent a fixed execution order; for example, step one and step four may be performed simultaneously.
Fig. 9 is a flowchart of an image processing method according to an embodiment of the present application, as shown in fig. 9, where the method includes:
step 901, a first image is acquired.
The first image is an image to be subjected to color parameter adjustment.
The process of acquiring the first image is already described in step 601, and will not be described here again.
Step 902, obtaining a preset mapping curve.
The preset mapping curve corresponds to preset curve parameters.
The specific process of obtaining the preset mapping curve in step 402 is already described in detail, and will not be described here again.
Step 903: blocking processing is performed on the first image to obtain at least two image blocks.
Optionally, before the blocking processing, the first image is adjusted to improve its resolution, so that the first image is implemented as a high-resolution image.
Optionally, the method for performing blocking processing on the first image to obtain at least two image blocks includes at least one of the following methods:
1. Evenly blocking the first image according to a preset number of blocks to obtain at least two image blocks.
Optionally, the first image is rectangular in shape, the number of preset blocks is even, and at least two image blocks with the same size are obtained by equally dividing the first image according to the number of preset blocks.
2. Randomly blocking the first image to obtain at least two image blocks.
Optionally, the first image is cut by a plurality of random line segments to obtain at least two image blocks with different sizes.
Alternatively, the positions of the random line segments may be crossed or parallel.
Illustratively, taking even blocking of the first image according to the preset number of blocks as an example, refer to fig. 10: gridding processing is performed on the first image 1001 to obtain a plurality of image blocks 1002, each of which has the same size.
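The gridding step can be sketched as follows, under the assumption that the image dimensions are divisible by the grid shape:

```python
import numpy as np

def split_into_blocks(img, rows, cols):
    """Evenly split an image into rows*cols equally sized blocks.

    Assumes the image height/width are divisible by rows/cols.
    """
    h, w = img.shape[:2]
    bh, bw = h // rows, w // cols
    return [img[i * bh:(i + 1) * bh, j * bw:(j + 1) * bw]
            for i in range(rows) for j in range(cols)]

img = np.arange(64).reshape(8, 8)
blocks = split_into_blocks(img, 2, 2)
```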
Step 904: feature analysis is performed on the at least two image blocks respectively to obtain at least two block correction coefficients corresponding to the at least two image blocks.
Wherein the ith image block corresponds to the ith block correction coefficient, and the curve correction coefficient is a set of at least two block correction coefficients.
Optionally, the process of obtaining at least two block correction coefficients specifically includes:
Step one: input the at least two image blocks into the curve correction model, and extract the image block feature representations corresponding to each of the at least two image blocks in parallel.
Illustratively, the categories of image feature representations include global feature representations, local feature representations, semantic feature representations, texture feature representations, and the like.
Optionally, the image block feature representation of each of the at least two image blocks comprises one image block feature representation or two or more image block feature representations.
The method for extracting the image block feature representation corresponding to each of the at least two image blocks is substantially the same as the method for extracting the image feature representation of the first image, and is specifically described in step 603, which is not repeated herein.
Step two: perform feature analysis on the image block feature representation of each of the at least two image blocks respectively to obtain the at least two block correction coefficients corresponding to the at least two image blocks.
Schematically, referring to fig. 10, the plurality of image blocks 1002 obtained by the gridding processing are input into the curve correction model; the global, local, semantic, and texture features of each image block 1002 are extracted in parallel by the model and fused to obtain the image block feature representation of each image block 1002; feature analysis is then performed on each representation to obtain the mapping curve correction coefficient corresponding to each image block 1002.
In step 905, the curve parameters of the preset mapping curve are corrected by the curve correction coefficient, so as to obtain the target mapping curve.
The process of obtaining the target mapping curve when the curve correction coefficient is obtained directly from the first image is described in detail in step 404 and is not repeated here.
Optionally, the curve correction coefficient is implemented as at least two sets of block correction coefficients, and the process of obtaining the target mapping curve through the at least two sets of block correction coefficients includes:
and respectively acting the at least two block correction coefficients on a preset mapping curve to obtain block target mapping curves corresponding to the at least two block correction coefficients.
The i-th block correction coefficient corresponds to an i-th block target mapping curve, and the target mapping curve is a set of at least two block target mapping curves.
Schematically, referring to fig. 10, the mapping curve correction coefficient of each image block 1002 is applied to the preset mapping curve 1003 to obtain a corrected mapping curve for each image block 1002. Each corrected mapping curve corresponds one-to-one to the content of its image block 1002 and is the unique mapping curve of that image block.
Optionally, when the preset mapping curve is implemented as a more complex mapping curve, each image block is labeled, and the deep convolutional neural network is trained with the labeled image blocks so that adjacent image blocks connect smoothly.
Step 906, performing color parameter adjustment on the first image based on the target mapping curve to obtain a second image corresponding to the first image.
The second image is an image after the color parameter adjustment is performed on the basis of the first image.
The process of obtaining the second image when the target mapping curve is obtained directly from the first image is described in detail in step 405 and is not repeated here.
Optionally, the target mapping curve is implemented as a set of at least two block target mapping curves, and the process of obtaining the second image by the set of at least two block target mapping curves includes:
and respectively carrying out color parameter adjustment on at least two image blocks corresponding to the at least two block target mapping curves based on the at least two block target mapping curves to obtain a second image corresponding to the first image.
The ith image block is adjusted by the ith block target mapping curve.
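A minimal numpy sketch of this per-block adjustment, assuming each block's corrected curve is r^γ plus its own block correction coefficient (the additive combination mentioned earlier) and that the blocks tile the image evenly:

```python
import numpy as np

def apply_block_curves(blocks, block_coeffs, rows, cols, gamma=0.5):
    """Adjust block i with its own curve (here r**gamma + a_i) and
    stitch the adjusted blocks back into the full second image."""
    adjusted = [np.power(b, gamma) + a for b, a in zip(blocks, block_coeffs)]
    row_imgs = [np.hstack(adjusted[r * cols:(r + 1) * cols]) for r in range(rows)]
    return np.vstack(row_imgs)

blocks = [np.full((2, 2), 0.25) for _ in range(4)]
second = apply_block_curves(blocks, [0.0, 0.01, 0.02, 0.03], rows=2, cols=2)
```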
In summary, in the image processing method provided by the embodiments of the present application, a curve correction coefficient is obtained by performing feature analysis on the first image (the image to be subjected to color parameter adjustment), the preset mapping curve (a mapping curve with preset curve parameters) is corrected based on the curve correction coefficient, and color parameter adjustment is performed on the first image based on the adjusted preset mapping curve.
According to the image processing method provided by the embodiments of the present application, the parallelism of image processing is improved through the blocking processing of the first image: a block target mapping curve is obtained independently for each image block, each block target mapping curve is then applied to its corresponding image block, and the color parameters of each image block are adjusted, thereby achieving a more intelligent color enhancement effect.
Fig. 11 shows the subject evaluation results of an image processing method according to an exemplary embodiment of the present application. As shown in fig. 11, among the 105 subjects surveyed, a total of 54 subjects (about 51%) considered the present method 1101 better; a total of 38 subjects (about 36%) considered sample method one 1102 better; and a total of 25 subjects (about 24%) considered sample method two 1103 better.
Under subjective evaluation, as can be seen from the data 1100 shown in fig. 11, the image processing method according to the exemplary embodiment of the present application has a better enhancement effect and a higher evaluation.
It is noted that the above-mentioned evaluation result is data acquired after the subject is authorized.
Fig. 12 is a speed comparison diagram between the image processing method according to an exemplary embodiment of the present application and sample method three. Fig. 12 shows the image processing speed at 1080×1920 resolution, where the black bars represent the method 1201 proposed by the present application and the white bars represent sample method three 1202.
From the data 1200 shown in fig. 12, it can be seen that the proposed method 1201 has a significant advantage over sample method three 1202 in processing speed, providing a 40%-60% performance acceleration across different device models.
It is noted that the above speed comparison data are data acquired after authorization.
Referring to fig. 13, a block diagram of an image processing apparatus according to an exemplary embodiment of the present application is shown, where the apparatus includes the following modules:
an obtaining module 1310, configured to obtain a first image, where the first image is an image to be subjected to color parameter adjustment;
the obtaining module 1310 is further configured to obtain a preset mapping curve, where the preset mapping curve corresponds to a preset curve parameter;
an analysis module 1320, configured to perform feature analysis on the first image to obtain a curve correction coefficient corresponding to the first image;
the correction module 1330 is configured to correct the curve parameter of the preset mapping curve with the curve correction coefficient to obtain a target mapping curve;
the adjusting module 1340 is configured to perform color parameter adjustment on the first image based on the target mapping curve, so as to obtain a second image corresponding to the first image, where the second image is an image after performing color parameter adjustment on the basis of the first image.
In some alternative embodiments, referring to fig. 14, the analysis module 1320 includes:
the processing submodule 1321 is configured to input the first image into a curve correction model and output a curve correction coefficient corresponding to the first image, where the curve correction model is a model obtained by training on sample image pairs;
the sample image pair comprises a sample image and a reference image which form an image pair, and the reference image is an image of the sample image after color parameter adjustment.
In some alternative embodiments, the processing sub-module 1321 includes:
a feature extraction unit 1322 for inputting the first image into the curve correction model, and extracting an image feature representation of the first image;
and a feature analysis unit 1323, configured to perform feature analysis on the image feature representation, so as to obtain a curve correction coefficient corresponding to the first image.
In some optional embodiments, the feature extraction unit 1322 is further configured to extract a texture feature representation of the first image, where the texture feature representation is used to indicate an image texture feature of the first image;
the feature extraction unit 1322 is further configured to extract a global feature representation of the first image based on the texture feature representation, the global feature representation being used to indicate global image features of the first image;
the feature extraction unit 1322 is further configured to extract a local feature representation of the first image based on the texture feature representation, the local feature representation being used to indicate an image feature of a sub-image region in the first image;
the feature extraction unit 1322 is further configured to extract a semantic feature representation of the first image based on the texture feature representation, where the semantic feature representation is used to indicate semantic content contained in the first image.
The feature extraction unit 1322 further includes:
A feature fusion subunit 1324, configured to fuse the texture feature representation, the global feature representation, the local feature representation, and the semantic feature representation to obtain an image feature representation of the first image.
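In the embodiment the four branches are learned network layers, so the sketch below replaces each branch with a hand-crafted stand-in statistic purely to illustrate the fusion step (concatenation into one image feature representation). Every statistic chosen here is an illustrative assumption, not the patent's actual features.

```python
import numpy as np

def fuse_feature_representation(image):
    """Toy stand-in for the four-branch extraction and fusion above.

    Operates on a 2-D grayscale array; each branch below is an assumed
    proxy for the corresponding learned representation.
    """
    img = np.asarray(image, dtype=np.float64)
    # "Texture" branch proxy: mean absolute gradients along both axes.
    gy, gx = np.gradient(img)
    texture = np.array([np.abs(gx).mean(), np.abs(gy).mean()])
    # "Global" branch proxy: whole-image statistics.
    global_feat = np.array([img.mean(), img.std()])
    # "Local" branch proxy: per-quadrant means (sub-image regions).
    h, w = img.shape
    local = np.array([img[:h // 2, :w // 2].mean(), img[:h // 2, w // 2:].mean(),
                      img[h // 2:, :w // 2].mean(), img[h // 2:, w // 2:].mean()])
    # "Semantic" branch proxy: coarse intensity histogram as a content cue.
    semantic, _ = np.histogram(img, bins=4, range=(0, 256))
    # Fusion: concatenate the four representations into one vector.
    return np.concatenate([texture, global_feat, local,
                           semantic.astype(np.float64)])
```

The fused vector is what the feature analysis unit would consume to produce the curve correction coefficient.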
In some alternative embodiments, the analysis module 1320 includes:
a partitioning sub-module 1325, configured to perform a partitioning process on the first image to obtain at least two image blocks;
the analysis module 1320 is further configured to perform feature analysis on the at least two image blocks, to obtain at least two block correction coefficients corresponding to the at least two image blocks, where an i-th image block corresponds to an i-th block correction coefficient, and the curve correction coefficient is a set of the at least two block correction coefficients.
In some optional embodiments, the blocking submodule 1325 is configured to average block the first image according to a preset number of blocks to obtain the at least two image blocks; or, the method is used for randomly partitioning the first image to obtain the at least two image blocks.
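Both partitioning strategies can be sketched as below. The row-major grid for average partitioning and horizontal strips with random cut points for random partitioning are illustrative choices, since the embodiment does not fix a block geometry.

```python
import numpy as np

def average_blocks(image, rows, cols):
    """Evenly partition `image` into rows * cols blocks (preset count)."""
    h, w = image.shape[:2]
    bh, bw = h // rows, w // cols
    return [image[r * bh:(r + 1) * bh, c * bw:(c + 1) * bw]
            for r in range(rows) for c in range(cols)]

def random_blocks(image, n, rng=None):
    """Randomly partition `image` into n horizontal strips.

    Draws n - 1 distinct random cut rows (an assumed form of "random
    partitioning") and slices between consecutive cuts.
    """
    rng = np.random.default_rng(rng)
    h = image.shape[0]
    cuts = np.sort(rng.choice(np.arange(1, h), size=n - 1, replace=False))
    edges = [0, *cuts.tolist(), h]
    return [image[edges[i]:edges[i + 1]] for i in range(n)]
```

Each resulting block would then be analyzed independently for its block correction coefficient.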
In some optional embodiments, the correction module 1330 is further configured to directly correct the preset mapping curve with the curve correction coefficient to obtain a target mapping curve.
In some alternative embodiments, the curve correction coefficients include n sub-coefficients, where n is an integer greater than 1; the correction module 1330 includes:
a segmentation submodule 1331, configured to segment the preset mapping curve to obtain n segmented mapping curves;
the correction module 1330 is further configured to correct the n segmented mapping curves through n sub-coefficients, so as to obtain the target mapping curve, where the kth segmented mapping curve is corrected through the kth sub-coefficient, k is a positive integer and k is less than or equal to n.
In some optional embodiments, the segmentation submodule 1331 is configured to perform average segmentation processing on the preset mapping curve according to an abscissa of a coordinate axis where the preset mapping curve is located, so as to obtain n segmentation mapping curves; or, the method is used for carrying out average segmentation processing on the preset mapping curve according to the ordinate of the coordinate axis where the preset mapping curve is located, so as to obtain n segmented mapping curves.
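A minimal sketch of abscissa-wise segmentation and per-segment correction follows. Two assumptions are made for illustration: the preset mapping curve is densely sampled as arrays, and correction by a sub-coefficient is multiplicative (the embodiment does not fix the correction form).

```python
import numpy as np

def segment_and_correct(curve_x, curve_y, sub_coeffs):
    """Split the preset curve into n equal segments along the abscissa
    and correct the k-th segment with the k-th sub-coefficient.

    curve_y holds the preset curve sampled at curve_x; returns the
    target mapping curve's ordinate values.
    """
    n = len(sub_coeffs)
    y = np.asarray(curve_y, dtype=np.float64).copy()
    seg = len(curve_x) // n
    for k, coeff in enumerate(sub_coeffs):
        lo = k * seg
        hi = (k + 1) * seg if k < n - 1 else len(curve_x)  # last segment takes the remainder
        y[lo:hi] *= coeff
    return y
```

Ordinate-wise segmentation would instead bucket samples by their y-values; the per-segment correction step is otherwise identical.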
In some alternative embodiments, the adjustment module 1340 includes:
a feature transformation submodule 1341, configured to perform pixel-level feature transformation on the first image to obtain the full-resolution feature map corresponding to the first image;
A linear interpolation submodule 1342, configured to apply the target mapping curve to the full-resolution feature map, and obtain a pixel-level mapping coefficient through linear interpolation;
an action submodule 1343, configured to apply the pixel-level mapping coefficient to the first image to obtain the second image.
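The three submodules above can be sketched with `np.interp`, whose piecewise-linear lookup matches the linear-interpolation step. Two illustrative assumptions: the pixel-level feature transform is reduced to a simple normalization (it is learned in the embodiment), and the pixel-level mapping coefficient acts as a per-pixel multiplier.

```python
import numpy as np

def adjust_colors(first_image, curve_x, curve_y):
    """Sketch of the adjustment module 1340 (submodules 1341-1343)."""
    # 1341: pixel-level feature transform -> full-resolution feature map
    #       (here: normalization to [0, 1], a stand-in for the learned one).
    feature_map = first_image.astype(np.float64) / 255.0
    # 1342: apply the target mapping curve (curve_x, curve_y) to the map;
    #       np.interp yields a pixel-level mapping coefficient per pixel.
    coeff = np.interp(feature_map, curve_x, curve_y)
    # 1343: apply the coefficients to the first image -> second image.
    second = np.clip(first_image * coeff, 0, 255)
    return second.astype(np.uint8)
```

Because the curve is evaluated per pixel at full resolution, the output keeps the first image's resolution while its color parameters follow the target mapping curve.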
In summary, in the image processing apparatus provided by the embodiments of the present application, a curve correction coefficient is obtained by performing feature analysis on the first image (the image to be subjected to color parameter adjustment); the preset mapping curve (a mapping curve with preset curve parameters) is corrected based on the curve correction coefficient; and color parameter adjustment is performed on the first image based on the corrected mapping curve.
It should be noted that: the image processing apparatus provided in the above embodiment is only exemplified by the division of the above functional modules, and in practical application, the above functional allocation may be performed by different functional modules according to needs, that is, the internal structure of the device is divided into different functional modules, so as to perform all or part of the functions described above. In addition, the embodiments of the image processing apparatus and the image processing method provided in the foregoing embodiments belong to the same concept, and specific implementation processes of the embodiments of the method are detailed in the method embodiments, which are not repeated herein.
Fig. 15 shows a schematic structural diagram of a server according to an exemplary embodiment of the present application. The server may be the server shown in fig. 3. Specifically, the server includes the following structures:
the server 1500 includes a central processing unit (Central Processing Unit, CPU) 1501, a system Memory 1504 including a random access Memory (Random Access Memory, RAM) 1502 and a Read Only Memory (ROM) 1503, and a system bus 1505 connecting the system Memory 1504 and the central processing unit 1501. The server 1500 also includes a mass storage device 1506 for storing an operating system 1513, application programs 1514, and other program modules 1515.
The mass storage device 1506 is connected to the central processing unit 1501 through a mass storage controller (not shown) connected to the system bus 1505. The mass storage device 1506 and its associated computer-readable media provide non-volatile storage for the server 1500. That is, the mass storage device 1506 may include a computer readable medium (not shown) such as a hard disk or compact disc read only memory (Compact Disc Read Only Memory, CD-ROM) drive.
Computer readable media may include computer storage media and communication media without loss of generality. Computer storage media includes volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules or other data. Computer storage media includes RAM, ROM, erasable programmable read-only memory (Erasable Programmable Read Only Memory, EPROM), electrically erasable programmable read-only memory (Electrically Erasable Programmable Read Only Memory, EEPROM), flash memory or other solid state memory technology, CD-ROM, digital versatile disks (Digital Versatile Disc, DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices. Of course, those skilled in the art will recognize that computer storage media are not limited to the ones described above. The system memory 1504 and mass storage device 1506 described above may be collectively referred to as memory.
According to various embodiments of the application, the server 1500 may also be operated through a remote computer connected via a network such as the Internet. That is, the server 1500 may be connected to the network 1512 through a network interface unit 1511 coupled to the system bus 1505, or the network interface unit 1511 may be used to connect to other types of networks or remote computer systems (not shown).
The memory also includes one or more programs, which are stored in the memory and configured to be executed by the CPU.
The embodiment of the application also provides a computer device which can be implemented as a terminal or a server as shown in fig. 3. The computer device includes a processor and a memory in which at least one instruction, at least one program, a set of codes, or a set of instructions is stored, the at least one instruction, the at least one program, the set of codes, or the set of instructions being loaded and executed by the processor to implement the image processing method provided by the above-described method embodiments.
Embodiments of the present application also provide a computer readable storage medium having stored thereon at least one instruction, at least one program, a code set, or an instruction set, the at least one instruction, the at least one program, the code set, or the instruction set being loaded and executed by a processor to implement the image processing method provided by the above-mentioned method embodiments.
Embodiments of the present application also provide a computer program product or computer program comprising computer instructions stored in a computer readable storage medium. The processor of the computer device reads the computer instructions from the computer-readable storage medium, and the processor executes the computer instructions, so that the computer device executes the image processing method provided by each of the above-described method embodiments.
Alternatively, the computer-readable storage medium may include: read Only Memory (ROM), random access Memory (RAM, random Access Memory), solid state disk (SSD, solid State Drives), or optical disk, etc. The random access memory may include resistive random access memory (ReRAM, resistance Random Access Memory) and dynamic random access memory (DRAM, dynamic Random Access Memory), among others. The foregoing embodiment numbers of the present application are merely for the purpose of description, and do not represent the advantages or disadvantages of the embodiments.
It will be understood by those skilled in the art that all or part of the steps for implementing the above embodiments may be implemented by hardware, or may be implemented by a program for instructing relevant hardware, where the program may be stored in a computer readable storage medium, and the storage medium may be a read-only memory, a magnetic disk or an optical disk, etc.
The foregoing description of the preferred embodiments of the present application is not intended to limit the application; rather, the scope of the application is defined by the appended claims.

Claims (14)

1. An image processing method, the method comprising:
acquiring a first image, wherein the first image is an image to be subjected to color parameter adjustment;
acquiring a preset mapping curve, wherein the preset mapping curve corresponds to preset curve parameters;
performing feature analysis on the first image to obtain a curve correction coefficient corresponding to the first image;
correcting curve parameters of the preset mapping curve by the curve correction coefficient to obtain a target mapping curve;
and carrying out color parameter adjustment on the first image based on the target mapping curve to obtain a second image corresponding to the first image, wherein the second image is an image subjected to color parameter adjustment on the basis of the first image.
2. The method of claim 1, wherein the performing feature analysis on the first image to obtain a curve correction coefficient corresponding to the first image includes:
Inputting the first image into a curve correction model, and outputting to obtain a curve correction coefficient corresponding to the first image, wherein the curve correction model is a model obtained by training a sample image pair;
the sample image pair comprises a sample image and a reference image which form an image pair, and the reference image is an image of the sample image after color parameter adjustment.
3. The method according to claim 2, wherein inputting the first image into a curve correction model and outputting the curve correction coefficient corresponding to the first image includes:
inputting the first image into the curve correction model, and extracting image characteristic representation of the first image;
and carrying out feature analysis on the image feature representation to obtain a curve correction coefficient corresponding to the first image.
4. A method according to claim 3, wherein said extracting an image feature representation of the first image comprises:
extracting a texture feature representation of the first image, the texture feature representation being indicative of image texture features of the first image;
extracting a global feature representation of the first image based on the texture feature representation, the global feature representation being indicative of global image features of the first image;
Extracting a local feature representation of the first image based on the texture feature representation, the local feature representation being indicative of image features of sub-regions in the first image;
extracting a semantic feature representation of the first image based on the texture feature representation, the semantic feature representation being used to indicate semantic content contained by the first image;
and fusing the texture feature representation, the global feature representation, the local feature representation and the semantic feature representation to obtain an image feature representation of the first image.
5. The method according to any one of claims 1 to 4, wherein the performing feature analysis on the first image to obtain a curve correction coefficient corresponding to the first image includes:
partitioning the first image to obtain at least two image blocks;
and respectively carrying out feature analysis on the at least two image blocks to obtain at least two block correction coefficients corresponding to the at least two image blocks, wherein the ith image block corresponds to the ith block correction coefficient, and the curve correction coefficient is a set of the at least two block correction coefficients.
6. The method of claim 5, wherein the performing the blocking process on the first image results in at least two image blocks, comprising:
carrying out average partitioning on the first image according to a preset number of blocks to obtain the at least two image blocks; or,
and randomly partitioning the first image to obtain the at least two image blocks.
7. The method according to any one of claims 1 to 4, further comprising:
and directly correcting the preset mapping curve by using the curve correction coefficient to obtain a target mapping curve.
8. The method according to any one of claims 1 to 4, wherein the curve modification coefficients include n sub-coefficients, n being an integer greater than 1;
correcting the curve parameters of the preset mapping curve by the curve correction coefficient to obtain a target mapping curve, wherein the method comprises the following steps:
carrying out segmentation processing on the preset mapping curve to obtain n segmented mapping curves;
and respectively correcting the n segmented mapping curves through n sub-coefficients to obtain the target mapping curve, wherein the kth segmented mapping curve is corrected through the kth sub-coefficient, k is a positive integer and k is less than or equal to n.
9. The method of claim 8, wherein the segmenting the preset mapping curve to obtain n segmented mapping curves comprises:
Carrying out average segmentation processing on the preset mapping curve according to the abscissa of the coordinate axis where the preset mapping curve is located to obtain n segmented mapping curves;
or,
and carrying out average segmentation processing on the preset mapping curve according to the ordinate of the coordinate axis where the preset mapping curve is located, so as to obtain n segmentation mapping curves.
10. The method according to any one of claims 1 to 4, wherein the performing color parameter adjustment on the first image based on the target mapping curve to obtain a second image corresponding to the first image includes:
performing pixel-level feature transformation on the first image to obtain the full-resolution feature map corresponding to the first image;
the target mapping curve is acted on the full-resolution feature map, and a pixel-level mapping coefficient is obtained through linear interpolation;
and applying the pixel-level mapping coefficient to the first image to obtain the second image.
11. An image processing apparatus, characterized in that the apparatus comprises:
the acquisition module is used for acquiring a first image, wherein the first image is an image to be subjected to color parameter adjustment;
the acquisition module is further used for acquiring a preset mapping curve, wherein the preset mapping curve corresponds to preset curve parameters;
The analysis module is used for carrying out feature analysis on the first image to obtain a curve correction coefficient corresponding to the first image;
the correction module is used for correcting the curve parameters of the preset mapping curve by the curve correction coefficient to obtain a target mapping curve;
and the adjusting module is used for adjusting the color parameters of the first image based on the target mapping curve to obtain a second image corresponding to the first image, wherein the second image is an image after the color parameters are adjusted on the basis of the first image.
12. A computer device comprising a processor and a memory, wherein the memory has stored therein at least one program that is loaded and executed by the processor to implement the image processing method of any of claims 1 to 10.
13. A computer readable storage medium having stored therein at least one program code loaded and executed by a processor to implement the image processing method of any one of claims 1 to 10.
14. A computer program product comprising a computer program which, when executed by a processor, implements the image processing method according to any one of claims 1 to 10.
CN202210398173.8A 2022-04-15 2022-04-15 Image processing method, apparatus, device, storage medium, and program product Pending CN116977190A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210398173.8A CN116977190A (en) 2022-04-15 2022-04-15 Image processing method, apparatus, device, storage medium, and program product

Publications (1)

Publication Number Publication Date
CN116977190A true CN116977190A (en) 2023-10-31

Family

ID=88483626

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210398173.8A Pending CN116977190A (en) 2022-04-15 2022-04-15 Image processing method, apparatus, device, storage medium, and program product

Country Status (1)

Country Link
CN (1) CN116977190A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117745713A (en) * 2024-01-24 2024-03-22 广东省建筑工程监理有限公司 Slope protection structure deformation detection method and system based on image processing


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination