CN113139942A - Training method and device of image processing model, electronic equipment and storage medium - Google Patents

Training method and device of image processing model, electronic equipment and storage medium

Info

Publication number
CN113139942A
CN113139942A (application CN202110432554.9A)
Authority
CN
China
Prior art keywords
image
fuzzy
blurred
blur
degree
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202110432554.9A
Other languages
Chinese (zh)
Other versions
CN113139942B (en)
Inventor
胡杰 (Hu Jie)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd filed Critical Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN202110432554.9A priority Critical patent/CN113139942B/en
Publication of CN113139942A publication Critical patent/CN113139942A/en
Application granted granted Critical
Publication of CN113139942B publication Critical patent/CN113139942B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/0002Inspection of images, e.g. flaw detection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/045Combinations of networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods
    • G06N3/084Backpropagation, e.g. using gradient descent
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20081Training; Learning
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20084Artificial neural networks [ANN]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30168Image quality inspection
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02TCLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T10/00Road transport of goods or passengers
    • Y02T10/10Internal combustion engine [ICE] based vehicles
    • Y02T10/40Engine management systems

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • General Engineering & Computer Science (AREA)
  • Biophysics (AREA)
  • Computational Linguistics (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • Artificial Intelligence (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • Biomedical Technology (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Health & Medical Sciences (AREA)
  • Quality & Reliability (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Image Processing (AREA)

Abstract

The application discloses a training method and apparatus for an image processing model, an electronic device, and a storage medium, wherein the method comprises the following steps: performing blurring processing on each sharp image in a first image sample set according to a plurality of blur kernels to obtain a plurality of blurred images, wherein the plurality of blur kernels comprise blur kernels of a plurality of blur levels, and the blur level of each blurred image corresponds to the blur level of the corresponding blur kernel; generating a blur degree image corresponding to each blurred image according to the blur level score of each blurred image, wherein the gray value of each pixel point in each blur degree image corresponds to the blur level score of the blurred image corresponding to that blur degree image; obtaining a second image sample set according to the plurality of blurred images and the corresponding blur degree images; and training an initial model according to the second image sample set to obtain the image processing model. The method can automatically calibrate the blur degree score of a blurred sample image when the sample set is constructed, which reduces the calibration cost.

Description

Training method and device of image processing model, electronic equipment and storage medium
Technical Field
The present application relates to the field of image processing technologies, and in particular, to a method and an apparatus for training an image processing model, an electronic device, and a storage medium.
Background
With rapid progress of the technology level and the living standard, electronic devices (such as smart phones, tablet computers, and the like) are widely used by people, the shooting function of the electronic devices is stronger and stronger, and non-professionals can shoot images with higher quality through the electronic devices. In an image deblurring scheme, in order to effectively deblur and save computing resources, the blurring degree of some interested objects (such as human faces and the like) is usually detected so as to determine whether to perform deblurring processing according to the blurring degree. However, in the scheme of performing blur degree detection through artificial intelligence, when a model is trained, the blur degree of an image needs to be calibrated manually, which results in higher labor cost.
Disclosure of Invention
In view of the foregoing problems, the present application provides a method and an apparatus for training an image processing model, an electronic device, and a storage medium.
In a first aspect, an embodiment of the present application provides a method for training an image processing model, where the method includes: acquiring a first image sample set, wherein the first image sample set comprises a plurality of clear images containing a target object; performing blurring processing on each sharp image in the first image sample set according to a plurality of blurring kernels to obtain a plurality of blurred images, wherein the plurality of blurring kernels comprise blurring kernels with a plurality of blurring levels, and the blurring level of each blurred image corresponds to the blurring level of the corresponding blurring kernel; generating a fuzzy degree image corresponding to each fuzzy image according to the fuzzy grade score of each fuzzy image, wherein the fuzzy grade score of each fuzzy image corresponds to the fuzzy grade of each fuzzy image, and the gray value of each pixel point in each fuzzy degree image corresponds to the fuzzy grade score of the fuzzy image corresponding to each fuzzy degree image; obtaining a second image sample set according to the plurality of blurred images and the blurred degree image corresponding to each blurred image; and training an initial model according to the second image sample set to obtain an image processing model, wherein the image processing model is used for outputting a fuzzy degree image corresponding to the input image according to the input image.
In a second aspect, an embodiment of the present application provides an apparatus for training an image processing model, where the apparatus includes: a first sample set acquisition module, a first image acquisition module, a second image acquisition module, a second sample set acquisition module, and a model training module. The first sample set acquisition module is configured to acquire a first image sample set comprising a plurality of sharp images containing a target object. The first image acquisition module is configured to perform blurring processing on each sharp image in the first image sample set according to a plurality of blur kernels to obtain a plurality of blurred images, where the plurality of blur kernels comprise blur kernels of a plurality of blur levels, and the blur level of each blurred image corresponds to the blur level of the corresponding blur kernel. The second image acquisition module is configured to generate a blur degree image corresponding to each blurred image according to the blur level score of each blurred image, where the blur level score of each blurred image corresponds to its blur level, and the gray value of each pixel point in each blur degree image corresponds to the blur level score of the corresponding blurred image. The second sample set acquisition module is configured to obtain a second image sample set according to the plurality of blurred images and the blur degree image corresponding to each blurred image. The model training module is configured to train an initial model according to the second image sample set to obtain an image processing model, the image processing model being configured to output, for an input image, the corresponding blur degree image.
In a third aspect, an embodiment of the present application provides an electronic device, including: one or more processors; a memory; one or more application programs, wherein the one or more application programs are stored in the memory and configured to be executed by the one or more processors, the one or more programs configured to perform the method of training an image processing model as provided in the first aspect above.
In a fourth aspect, an embodiment of the present application provides a computer-readable storage medium, where a program code is stored in the computer-readable storage medium, and the program code may be called by a processor to execute the training method for the image processing model provided in the first aspect.
According to the scheme provided by the application, a first image sample set is acquired, the first image sample set comprising a plurality of sharp images containing a target object. Each sharp image in the first image sample set is then blurred according to a plurality of blur kernels to obtain a plurality of blurred images, where the plurality of blur kernels comprise blur kernels of a plurality of blur levels and the blur level of each blurred image corresponds to the blur level of the corresponding blur kernel. A blur degree image corresponding to each blurred image is generated according to the blur level score of that blurred image; the blur level score of each blurred image corresponds to its blur level, and the gray value of each pixel point in each blur degree image corresponds to the blur level score of the corresponding blurred image. A second image sample set is then obtained from the plurality of blurred images and their corresponding blur degree images, and an initial model is trained on the second image sample set to obtain an image processing model that outputs, for an input image, the corresponding blur degree image. In this way, when training an image processing model for detecting the blur degree of a target object, the blur degree score can be calibrated automatically, which reduces the cost of manual calibration, increases the calibration speed, and improves model training efficiency.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed in the description of the embodiments are briefly introduced below. It is obvious that the drawings in the following description are only some embodiments of the present application; for those skilled in the art, other drawings can be obtained from these drawings without creative effort.
FIG. 1 shows a flowchart of a method for training an image processing model according to one embodiment of the present application.
Fig. 2 is a schematic diagram illustrating a display effect of a blur kernel according to an embodiment of the present application.
Fig. 3 is a schematic diagram illustrating another display effect of the blur kernel according to an embodiment of the present application.
Fig. 4 is a schematic diagram illustrating still another display effect of the blur kernel according to an embodiment of the present application.
Fig. 5 is a schematic diagram illustrating still another display effect of a blur kernel according to an embodiment of the present application.
FIG. 6 shows a flowchart of a method for training an image processing model according to another embodiment of the present application.
Fig. 7 is a flowchart illustrating step S240 in a training method of an image processing model according to another embodiment of the present application.
FIG. 8 shows a flowchart of a method of training an image processing model according to yet another embodiment of the present application.
FIG. 9 illustrates a schematic diagram of a composite image provided by an embodiment of the present application.
Fig. 10 is a schematic view illustrating an application scenario of an image processing model according to an embodiment of the present application.
FIG. 11 shows a flowchart of a method for training an image processing model according to yet another embodiment of the present application.
FIG. 12 shows a block diagram of an apparatus for training an image processing model according to an embodiment of the present application.
Fig. 13 is a block diagram of an electronic device for executing a training method of an image processing model according to an embodiment of the present application.
Fig. 14 shows a storage unit for storing or carrying program code that implements a training method of an image processing model according to an embodiment of the present application.
Detailed Description
In order to make the technical solutions better understood by those skilled in the art, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application.
At present, electronic devices are ubiquitous in daily life, with coverage approaching the entire population. The camera module has become a main feature of electronic devices: a user can conveniently and quickly take photos and videos through the camera function of an electronic device, and users also frequently upload the captured images to the Internet to share with others. Electronic devices have essentially replaced traditional cameras for taking photographs.
When photographing with an electronic device, a handheld device may shake naturally, or the photograph may be taken while the device is moving, so that the captured image is blurred. Deblurring techniques address such blur in images. Image deblurring is an important research direction in image processing; its aim is to recover detail information lost to blur in a blurred image. In related image deblurring schemes, if relevant target objects (such as human faces, animals, vehicles, and the like) appear in an image, they attract the most attention, so defects in the region where a target object is located are amplified relative to the background region, and a blurred target object in particular is difficult for a user to accept. Generally, in order to effectively remove the blur of a blurred face region while saving computing resources, the blur degree of the target object needs to be detected first, to determine whether the region where the target object is located needs deblurring.
Currently, blur degree estimation for images mainly adopts deep-learning-based methods. The general approach for face blur estimation is as follows: detect the region where the target object is located in a captured image; have an annotator estimate the blur level of that region (for example, on a scale of 0 to 16, where 0 means sharp with no blur and 16 means severe blur); use the final annotation result as the blur degree map of the blurred face, forming a sample pair with the detected target region; build a training data set from such pairs; and then train a model to obtain a model for blur degree estimation.
The inventor found that constructing a data set by detecting and cropping the region where the target object is located and manually calibrating the blur degree has two drawbacks: on the one hand, sample images whose blur degrees are uniformly distributed are difficult to find, so the sample distribution of the final data set is not uniform; on the other hand, manual calibration is slow and labor costs are high.
In view of the above problems, the inventor provides a training method and apparatus for an image processing model, an electronic device, and a storage medium, which can automatically calibrate the blur degree score when constructing the image sample set during training of an image processing model for detecting the blur degree of a target object, reducing manual calibration cost and increasing calibration speed, thereby improving the training efficiency of the model. The specific training method of the image processing model is described in detail in the following embodiments.
Referring to fig. 1, fig. 1 is a flowchart illustrating a method for training an image processing model according to an embodiment of the present application. In a specific embodiment, the training method of the image processing model is applied to the training apparatus 400 of the image processing model shown in fig. 12 and the electronic device 100 (fig. 13) equipped with the training apparatus 400 of the image processing model. The following will describe a specific process of this embodiment by taking an electronic device as an example, and it is understood that the electronic device applied in this embodiment may be a smart phone, a tablet computer, a smart watch, smart glasses, a notebook computer, a server, and the like, which is not limited herein. As will be described in detail with respect to the flow shown in fig. 1, the method for training the image processing model may specifically include the following steps:
step S110: a first set of image samples is acquired, the first set of image samples comprising a plurality of sharp images containing a target object.
In the embodiment of the present application, the first image sample set may include a plurality of sharp images, and each sharp image includes a target object. The target object may be a human face, a vehicle, an animal, or the like, which can be detected and recognized. The first image sample set may include sharp images corresponding to a plurality of types of target objects, and may also include sharp images of one type of target object. As can be understood, if the first image sample set includes sharp images corresponding to multiple types of target objects, then the image processing model obtained by training according to the second image sample set generated by the first image sample set can perform blur degree detection on images of different types of target objects; if the first image sample set comprises a clear image of a type of target object, the image processing model obtained by subsequent training can more accurately detect the blurring degree of the image containing the type of target object.
In some embodiments, the above sharp images may be obtained from an open-source image library, for example, when the above target object is a human face, the sharp human face images may be obtained from the open-source human face image library to construct the first image sample set. Of course, the specific manner of acquiring the above clear image may not be limited, and for example, the above clear image may be obtained by shooting the target object.
Step S120: and carrying out blurring processing on each sharp image in the first image sample set according to a plurality of blurring kernels to obtain a plurality of blurred images, wherein the plurality of blurring kernels comprise blurring kernels with a plurality of blurring levels, and the blurring level of each blurred image corresponds to the blurring level of the corresponding blurring kernel.
In this embodiment, the electronic device may perform blurring processing on each sharp image in the first image sample set according to a plurality of kinds of blurring kernels to obtain a plurality of blurred images.
In the embodiment of the present application, the blur kernel is in fact a matrix: convolving a sharp image with this matrix makes the image blurred, hence the name. The blur kernel is one kind of convolution kernel, and the image convolution operation is in essence a matrix convolution. That is, image blurring can be regarded as the process of obtaining a blurred image by convolving a blur kernel over a sharp image, for example, according to the following formula:

B = I ⊗ K + N

where B is the blurred image, I is the sharp image, K is the blur kernel, N is additive noise, and ⊗ denotes the convolution operation.
In the embodiment of the application, the plurality of blur kernels comprise blur kernels of a plurality of blur levels, so that the data in the subsequently generated sample set is uniformly distributed and contains sample images of different blur levels. The blur level of each blurred image corresponds to the blur level of its corresponding blur kernel; that is, for a given blurred image, its blur level is the blur level of the blur kernel used to generate it. The blur level indicates the degree of blur.
In some embodiments, the plurality of blur kernels may be blur kernels of different blur degrees acquired in advance, for example, blur kernels obtained beforehand by performing blur kernel estimation on blurred images of different blur degrees.
In other embodiments, in order to enrich the blur kernels used for generating blurred images, and thus enrich the subsequently generated sample images, a plurality of initial blur kernels may be acquired, preset enhancement processing may then be performed on the initial blur kernels to obtain a plurality of blur kernels corresponding to each initial blur kernel, and the initial blur kernels together with their derived blur kernels may be used as the blur kernels for blurring the sharp images. The plurality of initial blur kernels may include blur kernels of a plurality of blur degree levels. The image obtained by visualizing a blur kernel contains an ellipse; applying a corresponding transformation to this image changes its shape, thereby yielding a different blur kernel.
As a possible implementation, the preset enhancement processing on each initial blur kernel may include size scaling. Size scaling changes the length and width of the initial blur kernel, that is, the lengths of the major and minor axes of the kernel's ellipse, to obtain a new blur kernel. It is understood that the blur degree of a blur kernel can be measured as a weighted function of its major-axis and minor-axis lengths; that is, the blur degree is quantized according to those lengths. Size scaling therefore changes the axis lengths of the initial blur kernel and thereby produces blur kernels of different blur levels.
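The quantization just described, blur degree as a weighted function of the kernel's major- and minor-axis lengths, can be sketched as follows. The weights, the offset, and the 0 to 16 level range are illustrative assumptions; the embodiment only states that the degree is derived from the two axis lengths.

```python
def blur_level_score(major_axis, minor_axis, w_major=0.7, w_minor=0.3, max_level=16):
    """Quantize a kernel's blur degree from its ellipse axis lengths.
    The weights (0.7/0.3), the -1 offset, and the 0..16 range are
    illustrative assumptions, not values from the embodiment."""
    weighted = w_major * major_axis + w_minor * minor_axis
    # A 1-pixel (identity) kernel maps to level 0 (perfectly sharp);
    # longer smears map to higher integer levels, capped at max_level.
    return int(round(min(max(weighted - 1.0, 0.0), max_level)))

# A longer motion streak scores a higher blur level than a short one.
print(blur_level_score(1, 1), blur_level_score(5, 3), blur_level_score(31, 15))  # prints: 0 3 16
```

Because scaling the kernel changes exactly these axis lengths, the same function assigns each scaled kernel (and hence each blurred sample generated from it) its blur level automatically.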
As a possible implementation, the preset enhancement processing may include rotating each initial blur kernel. Rotation changes the directions of the kernel's major and minor axes, yielding a new blur kernel whose blur direction differs from that of the initial kernel.
As a possible implementation, the preset enhancement processing may include applying an affine transformation to each initial blur kernel. In geometry, an affine transformation is a linear transformation followed by a translation, mapping one vector space into another; for a blur kernel, an affine transformation changes the kernel's shape, so different blur kernels can be obtained.
As a possible implementation, the preset enhancement processing may include translating each initial blur kernel. Translating the initial blur kernel means shifting the region corresponding to the kernel within its visualized image, so that the region occupies a different position and a new blur kernel is obtained; with such a changed kernel, different regions of an image can be blurred.
As a possible implementation, the preset enhancement processing may include edge filling on each initial blur kernel. Edge filling pads the border of the region corresponding to the kernel in its visualized image, which increases the size of the blur kernel and thus changes the lengths of its major and minor axes, thereby producing blur kernels of different blur levels.
It is understood that the preset enhancement processing for the initial blur kernel may include one of the above embodiments, and may also include a combination of multiple above embodiments.
In addition, when the preset enhancement processing of the initial blur kernel combines multiple of the above embodiments, the processing may be applied to the initial kernel in sequence. For example, referring to fig. 2, a blur kernel may be visualized for display: fig. 2 shows an initial blur kernel, where the visualized kernel is a binary image whose white portion is the region corresponding to the kernel. Scaling the size of this initial kernel yields the blur kernel shown in fig. 3; rotating the kernel of fig. 3 yields the kernel shown in fig. 4; scaling the kernel of fig. 4 again yields the kernel shown in fig. 5. Of course, these kernels are only examples and do not limit the specific blur kernels.
By applying such enhancement to blur kernels of different blur levels, the kernels used for generating sample images can be enriched, so that the blur levels, blur directions, and the like of the blurred images in the subsequently generated sample set are enriched, allowing the subsequently trained image processing model to handle a wider range of images to be processed.
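The enhancement pipeline above (size scaling, rotation, translation, edge filling) can be sketched with `scipy.ndimage`, assuming the standard scientific Python stack is available; parameter choices and the helper's name are illustrative, and the affine-transform variant is omitted (it could use `ndimage.affine_transform`).

```python
import numpy as np
from scipy import ndimage  # assumed available (standard scientific stack)

def augment_kernel(kernel, scale=1.0, angle=0.0, translate=(0, 0), pad=0):
    """Sketch of the kernel-enrichment step: size scaling, rotation,
    translation, and edge filling applied to one initial blur kernel."""
    k = ndimage.zoom(kernel, scale, order=1)              # size scaling
    k = ndimage.rotate(k, angle, reshape=False, order=1)  # rotation
    k = ndimage.shift(k, translate, order=1)              # translation
    if pad > 0:                                           # edge filling
        k = np.pad(k, pad, mode="constant")
    k = np.clip(k, 0.0, None)
    return k / k.sum()                                    # keep it a valid kernel

# One horizontal 5-pixel motion streak enriched into six variant kernels.
base = np.zeros((9, 9)); base[4, 2:7] = 1.0
variants = [augment_kernel(base, scale=s, angle=a)
            for s in (1.0, 1.5) for a in (0, 45, 90)]
```

Each variant keeps unit mass, so convolving a sharp image with it changes only the blur, not the brightness.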
In some embodiments, if regions other than the one where the target object is located occupy a large proportion of an acquired sharp image, the sharp image may be cropped to increase the proportion occupied by the target object; the cropped sharp image is then subjected to the blurring processing described above.
Step S130: and generating a fuzzy degree image corresponding to each fuzzy image according to the fuzzy grade score of each fuzzy image, wherein the fuzzy grade score of each fuzzy image corresponds to the fuzzy grade of each fuzzy image, and the gray value of each pixel point in each fuzzy degree image corresponds to the fuzzy grade score of the fuzzy image corresponding to each fuzzy degree image.
In the embodiment of the application, to avoid manually estimating and labeling the blur level of each blurred image, the blur level of each blurred image can be quantized into a blur level score used as its label. In addition, since the region where the target object is located may be only part of the image to be processed, or the image may contain multiple target objects, in order to distinguish the blur degree of the target region from other regions, a blur degree image corresponding to each blurred image may be generated according to that image's blur level score, with the gray value of each pixel point in the blur degree image corresponding to the score. Different blur level scores correspond to different gray values, so scores can be distinguished by gray value.
In some embodiments, if the blur degree score ranges over 0 to N, where N is a positive integer not greater than 255 and the score is an integer, the gray values of the pixel points in the blur degree image may likewise range over 0 to N. For example, if the blur degree score ranges from 0 to 16, the gray values of pixel points in the blur degree image range from 0 to 16.
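Generating such a label is a one-line operation: every pixel of the blur degree image takes the sample's integer score as its gray value. A minimal sketch (the helper name is illustrative):

```python
import numpy as np

def make_blur_degree_map(shape, score):
    """Build the blur degree image for one blurred sample: every pixel's
    gray value equals the sample's blur level score (an integer in 0..N,
    N <= 255, per the embodiment)."""
    return np.full(shape, score, dtype=np.uint8)

# Label for a 64x64 blurred sample whose blur level score is 12.
label = make_blur_degree_map((64, 64), 12)
```

Because the score is stored directly as a gray value, no separate annotation file is needed; the label is itself an image the model can regress toward.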
Step S140: and obtaining a second image sample set according to the plurality of blurred images and the blurred degree image corresponding to each blurred image.
In the embodiment of the present application, after the blur degree image corresponding to each blurred image is generated, a second image sample set may be constructed according to the plurality of blurred images and the blur degree image corresponding to each blurred image.
In some embodiments, each blurred image and its corresponding blur degree image may be treated as one sample pair, i.e., the blur degree image serves as the label of the blurred image. Alternatively, a plurality of blurred images may be combined into one composite blurred image containing a plurality of target objects; a blur degree image corresponding to the composite blurred image is then formed from the blur degree images of the individual blurred images, and the composite blurred image together with its blur degree image is used as one sample pair.
Step S150: and training an initial model according to the second image sample set to obtain an image processing model, wherein the image processing model is used for outputting a fuzzy degree image corresponding to the input image according to the input image.
In this embodiment, after obtaining the second image sample set, the initial model may be trained according to the second image sample set to obtain an image processing model for outputting a blur degree image corresponding to the input image according to the input image.
In some embodiments, the initial model may be a machine learning model, or a deep learning model, such as an encoder-decoder model (encoder-decoder model), an hourglass network, a self-encoding network, or the like; for another example, to facilitate the deployment of the image processing model to an electronic device of the mobile terminal class, the initial model may be a model based on the Unet network.
Alternatively, the initial model may be a convolutional neural network. Illustratively, the initial model comprises a coding network and a decoding network, wherein an image input into the coding network is activated by convolution, Batch Normalization (BN) and an activation function (Relu) and then outputs an image characteristic, and the decoding network performs convolution, batch normalization and activation of the Relu function on the input image characteristic and then outputs a fuzzy degree image after passing through a plurality of residual blocks and convolution layers.
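A minimal sketch of such an encoder-decoder network is shown below in PyTorch. The layer counts, channel widths and strides are assumptions for illustration; the application only specifies the convolution + batch normalization + ReLU pattern, residual blocks, and a single-channel blur degree map output:

```python
import torch
import torch.nn as nn

class ResidualBlock(nn.Module):
    """Two conv-BN layers with a skip connection, as in the decoder's residual blocks."""
    def __init__(self, ch):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(ch, ch, 3, padding=1), nn.BatchNorm2d(ch), nn.ReLU(inplace=True),
            nn.Conv2d(ch, ch, 3, padding=1), nn.BatchNorm2d(ch))

    def forward(self, x):
        return torch.relu(x + self.body(x))

class BlurNet(nn.Module):
    """Encoder extracts features via conv + BN + ReLU; decoder restores a
    one-channel blur degree image through residual blocks and convolutions."""
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1),
            nn.BatchNorm2d(32), nn.ReLU(inplace=True))
        self.decoder = nn.Sequential(
            ResidualBlock(32),
            nn.ConvTranspose2d(32, 32, 4, stride=2, padding=1),
            nn.BatchNorm2d(32), nn.ReLU(inplace=True),
            nn.Conv2d(32, 1, 3, padding=1))

    def forward(self, x):
        return self.decoder(self.encoder(x))

model = BlurNet()
out = model(torch.zeros(1, 3, 64, 64))  # single-channel blur degree map
```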
In some embodiments, when the initial model is trained according to the second image sample set, the blurred images in the second image sample set may be used as input and input to the initial model, so as to obtain an output result of the initial model; and then obtaining the difference between the output result and the fuzzy degree image corresponding to the input fuzzy image as a loss value, and performing iterative training on the initial model until the initial model meets the preset condition.
Optionally, after the loss value is obtained, the parameters in the initial model may be updated according to the gradient of the back propagation algorithm back propagation loss value until the preset condition is satisfied. The preset conditions may be: the loss value is smaller than the preset value, the loss value is not changed any more, or the training times reach the preset times, and the like. It can be understood that after the initial model is subjected to iterative training for a plurality of training periods according to the second image sample set, wherein each training period includes iterative training for a plurality of times, and parameters of the initial model are continuously optimized, so that the loss value is smaller and smaller, and finally the loss value is reduced to a fixed value or smaller than the preset value, at this time, the initial model is converged; of course, it may be determined that the initial model has converged after the training times reach the preset times.
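The iterative training described above might be sketched as follows. The L1 loss, the Adam optimizer and the stopping threshold are assumptions; the application only requires computing a loss between the model output and the blur degree image, back-propagating it, and stopping once a preset condition is met:

```python
import torch
from torch import nn, optim

def train(model, loader, max_epochs=10, eps=1e-6):
    # Iterate until the loss falls below a preset value (eps) or the
    # number of training epochs reaches a preset count (max_epochs).
    loss_fn = nn.L1Loss()            # assumed loss; the application names none
    opt = optim.Adam(model.parameters())
    last = float("inf")
    for _ in range(max_epochs):
        for blurred, degree in loader:
            pred = model(blurred)          # output result of the model
            loss = loss_fn(pred, degree)   # difference from the blur degree image
            opt.zero_grad()
            loss.backward()                # back-propagate the loss gradient
            opt.step()                     # update the model parameters
            last = loss.item()
        if last < eps:                     # preset stopping condition
            break
    return last

# Toy run on a stand-in model and a single fabricated batch
tiny = nn.Conv2d(1, 1, 3, padding=1)
data = [(torch.rand(2, 1, 8, 8), torch.zeros(2, 1, 8, 8))]
final_loss = train(tiny, data, max_epochs=2)
```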
According to the training method of the image processing model provided in the embodiment of the application, a first image sample set including a plurality of sharp images containing a target object is acquired; each sharp image is blurred with blur kernels of a plurality of blur levels to obtain a plurality of blurred images; a blur degree image corresponding to each blurred image is generated according to the blur level score of that blurred image; a second image sample set is obtained from the plurality of blurred images and their blur degree images; and an initial model is trained on the second image sample set to obtain the image processing model, which outputs, for an input image, the corresponding blur degree image. In this way, when the image sample set is constructed during training of the image processing model for detecting the blur degree of the target object, the blur level scores are calibrated automatically, which reduces the cost of manual calibration, speeds up calibration, and thereby improves the training efficiency of the model.
Referring to fig. 6, fig. 6 is a flowchart illustrating a method for training an image processing model according to another embodiment of the present application. The method for training the image processing model is applied to the electronic device, and will be described in detail with respect to the flow shown in fig. 6, where the method for training the image processing model may specifically include the following steps:
step S210: a first set of image samples is acquired, the first set of image samples comprising a plurality of sharp images containing a target object.
Step S220: and carrying out blurring processing on each sharp image in the first image sample set according to a plurality of blurring kernels to obtain a plurality of blurred images, wherein the plurality of blurring kernels comprise blurring kernels with a plurality of blurring levels, and the blurring level of each blurred image corresponds to the blurring level of the corresponding blurring kernel.
In the embodiment of the present application, step S210 and step S220 may refer to the contents of the foregoing embodiments, and are not described herein again.
Step S230: and acquiring the length of the long axis and the length of the short axis of the ellipse corresponding to each fuzzy core in the fuzzy cores.
In the embodiment of the present application, when a blur kernel is visualized, the resulting image contains an ellipse, and blur kernels of different blur degrees yield ellipses with different major and minor axis lengths. Therefore, the blur degree of a blur kernel can be measured, and thus quantified, by the major and minor axis lengths of its ellipse. Specifically, the major axis length and the minor axis length of the ellipse corresponding to each of the plurality of blur kernels may be acquired.
Step S240: and generating a fuzzy grade score corresponding to each fuzzy core based on the length of the long axis corresponding to each fuzzy core and the length of the short axis.
In this embodiment of the present application, after the major axis length and the minor axis length of the ellipse corresponding to each of the multiple kinds of blur kernels are obtained, the blur level score corresponding to each kind of blur kernel may be generated based on the major axis length and the minor axis length of the ellipse corresponding to each kind of blur kernel.
In some embodiments, referring to fig. 7, generating the blur level score corresponding to each blur kernel based on its major axis length and minor axis length may include:
Step S241: based on a first weight for the major axis and a second weight for the minor axis, performing weighted summation on the major axis length and the minor axis length corresponding to each blur kernel to obtain a sum value corresponding to each blur kernel;
Step S242: acquiring the ratio of the sum value corresponding to each blur kernel to a maximum sum value, where the maximum sum value is obtained by weighted summation, based on the first weight and the second weight, of the major axis length and the minor axis length of the blur kernel with the maximum blur degree;
Step S243: acquiring the product of the ratio corresponding to each blur kernel and a maximum blur level score value to obtain the blur level score corresponding to each blur kernel.
In this embodiment, the blur level score corresponding to each blur kernel may be calculated based on the following formula:

B = N_B * (a1*L1 + a2*L2) / M_L

where L1 is the major axis length; L2 is the minor axis length; B is the blur level score; a1 is the first weight corresponding to the major axis of the blur kernel; a2 is the second weight corresponding to the minor axis of the blur kernel; M_L is the sum value obtained by weighted summation, based on the first weight and the second weight, of the major axis length and the minor axis length of the blur kernel with the maximum blur degree among all blur kernels; and N_B is the specified maximum blur level score value. For example, if the blur level score ranges from 0 to 16, N_B takes the value 16.
When the calculated product is not an integer, it may be rounded (for example, to the nearest integer) so that blur degree images with distinct gray values can be generated subsequently.
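The scoring described above (a weighted sum of the axis lengths, normalized by the weighted sum for the blurriest kernel and scaled to the maximum score) can be sketched directly in code; the default weights a1 and a2 are illustrative values, as the application leaves them unspecified:

```python
def blur_level_score(l1, l2, max_l1, max_l2, a1=0.5, a2=0.5, n_b=16):
    # B = N_B * (a1*L1 + a2*L2) / M_L, rounded to an integer so that
    # distinct gray values can be assigned later.
    m_l = a1 * max_l1 + a2 * max_l2   # weighted axes of the blurriest kernel
    return round(n_b * (a1 * l1 + a2 * l2) / m_l)
```

For example, the blurriest kernel itself scores the maximum value 16, and a kernel with half its weighted axis lengths scores 8.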
In other embodiments, the major axis length and the minor axis length corresponding to each blur kernel may be weighted and summed to obtain a sum value corresponding to each blur kernel, and the sum value may be divided by a preset value to obtain a value from 0 to 255 as the blur level score; when the obtained value is not an integer, it may be rounded (for example, to the nearest integer) so that blur degree images with distinct gray values can be generated subsequently.
In still other embodiments, after the major axis length and the minor axis length corresponding to each blur kernel are weighted and summed to obtain a sum value, if none of the sum values exceeds the range 0 to 255, the sum value may be rounded and used directly as the blur level score.
Of course, the specific manner of quantifying the blur degree according to the length of the long axis and the length of the short axis of the blur kernel may not be limited.
Step S250: and determining the fuzzy grade score of each fuzzy image based on the fuzzy grade score corresponding to each fuzzy core.
In this embodiment of the application, after the blur level score corresponding to each blur kernel is obtained, the blur level score of each blurred image may be determined from it: each blurred image simply inherits the blur level score of the blur kernel used to generate it.
Step S260: and generating a fuzzy degree image corresponding to each fuzzy image according to the fuzzy grade score of each fuzzy image, wherein the fuzzy grade score of each fuzzy image corresponds to the fuzzy grade of each fuzzy image, and the gray value of each pixel point in each fuzzy degree image corresponds to the fuzzy grade score of the fuzzy image corresponding to each fuzzy degree image.
Step S270: and obtaining a second image sample set according to the plurality of blurred images and the blurred degree image corresponding to each blurred image.
Step S280: and training an initial model according to the second image sample set to obtain an image processing model, wherein the image processing model is used for outputting a fuzzy degree image corresponding to the input image according to the input image.
In the embodiment of the present application, the contents of the steps S260 to S270 may refer to the contents of other embodiments, and are not described herein again.
According to the training method of the image processing model provided in the embodiment of the application, a plurality of sharp images containing a target object are acquired; each sharp image is blurred with blur kernels of a plurality of blur levels to obtain a plurality of blurred images; a blur degree image corresponding to each blurred image is generated according to its blur level score; a second image sample set is obtained from the blurred images and their blur degree images; and an initial model is trained on the second image sample set to obtain the image processing model, which outputs, for an input image, the corresponding blur degree image. Thus, when the image sample set is constructed during training of the image processing model for detecting the blur degree of the target object, the blur level scores are calibrated automatically, reducing the cost of manual calibration, speeding up calibration, and thereby improving the training efficiency of the model. In addition, a blur level quantification scheme is provided: the blur degree corresponding to a blur kernel is quantified according to the major and minor axis lengths of the kernel's ellipse, thereby quantifying the blur degree of the blurred images it generates.
Referring to fig. 8, fig. 8 is a flowchart illustrating a method for training an image processing model according to another embodiment of the present application. The method for training the image processing model is applied to the electronic device, and will be described in detail with respect to the flow shown in fig. 8, where the method for training the image processing model may specifically include the following steps:
step S310: a first set of image samples is acquired, the first set of image samples comprising a plurality of sharp images containing a target object.
Step S320: and carrying out blurring processing on each sharp image in the first image sample set according to a plurality of blurring kernels to obtain a plurality of blurred images, wherein the plurality of blurring kernels comprise blurring kernels with a plurality of blurring levels, and the blurring level of each blurred image corresponds to the blurring level of the corresponding blurring kernel.
Step S330: and generating a fuzzy degree image corresponding to each fuzzy image according to the fuzzy grade score of each fuzzy image, wherein the fuzzy grade score of each fuzzy image corresponds to the fuzzy grade of each fuzzy image, and the gray value of each pixel point in each fuzzy degree image corresponds to the fuzzy grade score of the fuzzy image corresponding to each fuzzy degree image.
In the embodiment of the present application, steps S310 to S330 may refer to the contents of other embodiments, and are not described herein again.
Step S340: and randomly scaling the image size of each blurred image into an image size corresponding to a target size grade, wherein the target size grade is any one of a plurality of size grades.
In the embodiment of the present application, the regions corresponding to target objects in actual images to be processed differ in size. If images of different sizes are all scaled to one uniform size before blur degree prediction, some of them are scaled excessively, which may make the final blur prediction inaccurate. For example, faces in a photograph vary in size; scaling faces of different sizes to a uniform size for blur prediction can involve excessive scaling and thus inaccurate face blur prediction. Therefore, the scaling applied to images of different sizes can be controlled within a certain range, avoiding the inaccuracy caused by large-scale scaling.
In the model training process, the image size of each blurred image may be randomly scaled to an image size corresponding to the target size level. Wherein the target size level is any one of a plurality of size levels. Optionally, the target size level may be divided into three size levels, the three size levels correspond to a first size level, a second size level and a third size level respectively, and the first size level, the second size level and the third size level are sequentially increased corresponding to different image sizes, so that the blurred image may be scaled into images of the three size levels.
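This random scaling step might be sketched as follows; the three concrete image sizes and the nearest-neighbor resize are assumptions, since the application fixes neither:

```python
import random
import numpy as np

# Hypothetical image sizes for the first, second and third size levels
SIZE_LEVELS = {1: (56, 56), 2: (112, 112), 3: (224, 224)}

def resize_nearest(img, size):
    # Simple nearest-neighbor resize, enough for a label-preserving rescale
    h, w = size
    ys = np.arange(h) * img.shape[0] // h
    xs = np.arange(w) * img.shape[1] // w
    return img[np.ix_(ys, xs)]

def random_blur_scale(img, rng=random):
    # Pick one of the size levels at random and scale the blurred image to it
    level = rng.choice(sorted(SIZE_LEVELS))
    return level, resize_nearest(img, SIZE_LEVELS[level])

level, scaled = random_blur_scale(np.zeros((100, 100), dtype=np.uint8))
```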
Step S350: and dividing the blurred images into different blurred image combinations according to the size grades corresponding to the image sizes of the zoomed blurred images to obtain a plurality of blurred image combinations, wherein the blurred image combinations comprise a plurality of blurred images with different size grades.
In the embodiment of the application, an actual image to be processed may contain a plurality of target objects, and their number is uncertain; if each target object is cropped out and sent to the model for blur degree detection in sequence, detection is inefficient. For example, 0 to n faces (n being a positive integer) may appear in a photograph; cropping the n faces and feeding them to the network one by one for blur prediction makes the prediction time variable, long, and inefficient. Thus, when constructing the sample image set, sample images containing a plurality of target objects may be generated. Specifically, the plurality of blurred images may be divided into different blurred image combinations according to the size level corresponding to the image size of each scaled blurred image, so as to obtain a plurality of blurred image combinations, where each blurred image combination includes blurred images of different size levels.
Step S360: and generating a composite image corresponding to each blurred image combination in the plurality of blurred image combinations, wherein the composite image comprises the blurred images in the blurred image combination corresponding to the composite image.
In the embodiment of the present application, after obtaining the above multiple blurred image combinations, the blurred images in each blurred image combination may be combined to generate a combined image corresponding to each blurred image combination, where each combined image includes the blurred image in the blurred image combination corresponding to the combined image.
In some embodiments, when an image includes a plurality of target objects, the arrangement of the target objects can be normalized to improve the accuracy of the model's blur degree detection. When generating the composite image corresponding to each of the plurality of blurred image combinations, each blurred image of a combination may be set in the corresponding region of an image template according to its size level, thereby obtaining the composite image for that combination; the image template includes a setting region for blurred images of each size level. In this embodiment, only the position of each setting region needs to be fixed; its specific location is not limited, as long as the image template provides fixed setting regions corresponding to the different size levels.
Illustratively, when the size levels include a first, a second and a third size level that increase in sequence: considering the influence of picture size in actual shooting, the number of target objects is limited, generally no more than 10, and is widened to at most 16 in the embodiment of the application. The image template can thus be uniformly divided into 16 regions serving as the setting regions for blurred images of the first size level, or into 4 regions serving as the setting regions for blurred images of the second size level, while the whole area of the image template serves as the setting region for blurred images of the third size level. In addition, if a blurred image combination contains several blurred images of the first size level, they can be superimposed on a blurred image of the second size level to save placement space; if a blurred image of the third size level exists, it may be placed at the back of the image template, with blurred images of the second and first size levels superimposed on it in sequence, likewise to save placement space.
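The template layout described above can be sketched as follows. The 224-pixel template side and the simple row-major placement are assumptions, and the overlapping placement used to save space is omitted for clarity:

```python
import numpy as np

TEMPLATE = 224  # hypothetical side length of the square image template

# Cells per side for each size level: level 3 fills the whole template,
# level 2 uses one of 4 quadrants, level 1 uses one of 16 cells.
CELLS = {3: 1, 2: 2, 1: 4}

def compose(images_by_level):
    # images_by_level maps a size level to a list of blurred images whose
    # sizes already match that level's cell size (e.g. 56x56 for level 1).
    canvas = np.zeros((TEMPLATE, TEMPLATE), dtype=np.uint8)
    for level in (3, 2, 1):            # larger levels first, smaller on top
        n = CELLS[level]
        step = TEMPLATE // n
        for i, img in enumerate(images_by_level.get(level, [])):
            r, c = divmod(i, n)
            canvas[r * step:(r + 1) * step, c * step:(c + 1) * step] = img
    return canvas
```

Because the blur degree image of a composite image follows the same placement, the same function could be applied to the blur degree images of the member blurred images to build the composite label.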
Step S370: and generating a blurring degree image of the composite image according to a blurring degree image of a blurred image included in the composite image, and taking a set of the composite image and the blurring degree image corresponding to the composite image as a second image sample set.
In the embodiment of the present application, after the composite image corresponding to each blurred image combination is generated, the blur degree images of the blurred images in that combination may be synthesized in the same way as the blurred images themselves were combined, yielding the blur degree image of the composite image. After the blur degree image corresponding to each composite image is obtained, each composite image and its blur degree image are taken as a sample pair, and the set of these pairs forms the second image sample set, i.e., the sample set subsequently used to train the initial model.
Step S380: and training an initial model according to the second image sample set to obtain an image processing model, wherein the image processing model is used for outputting a fuzzy degree image corresponding to the input image according to the input image.
In the embodiment of the present application, step S380 may refer to the contents of other embodiments, which are not described herein again.
In some embodiments, when the image processing model is actually used to detect the blur degree, if the second image sample set was obtained as in the above embodiment, i.e., by setting the blurred images of each combination into the corresponding regions of the image template according to their size levels, then at application time the region corresponding to each target object is first detected. That region is scaled, within a small range, to the specified size level closest to its size (i.e., to one of the above size levels) and placed into the image template to obtain a composite input image, which is then input to the image processing model. This effectively avoids the impact of large-scale scaling on the accuracy of blur prediction. Moreover, since the arrangement of target-object regions in the sample images is specified during model training and the input image is normalized in the same way at application time, the accuracy of blur degree detection can be further improved. For example, referring to fig. 9, if the target object is a human face and the size levels include a first, a second and a third size level that increase in sequence, the blurred images may be arranged in the image template so that a composite image as shown in fig. 9 is obtained; referring to fig. 10, when the composite image shown in fig. 9 is input into the image processing model 301, the corresponding blur degree image is effectively obtained. This blur degree image contains a plurality of face regions whose blur degrees differ, so the gray values of the regions corresponding to different faces also differ.
According to the training method of the image processing model provided in the embodiment of the application, a plurality of sharp images containing a target object are acquired; each sharp image is blurred with blur kernels of a plurality of blur levels to obtain a plurality of blurred images; a blur degree image corresponding to each blurred image is generated according to its blur level score; a second image sample set is obtained from the blurred images and their blur degree images; and an initial model is trained on the second image sample set to obtain the image processing model, which outputs, for an input image, the corresponding blur degree image. Thus, when the image sample set is constructed during training of the image processing model for detecting the blur degree of the target object, the blur level scores are calibrated automatically, reducing the cost of manual calibration, speeding up calibration, and thereby improving the training efficiency of the model. In addition, considering that an actually processed image may contain a plurality of target objects, each generated sample image includes a plurality of blurred images. Furthermore, to avoid inaccurate blur detection caused by scaling the input image to a uniform size, blurred images of multiple size levels are generated and combined into composite images, so that in practical application the region where the target object is located does not need large-scale scaling: it only needs to match one of the size levels to meet the input standard, and can then be input to the image processing model to obtain the result.
Referring to fig. 11, fig. 11 is a flowchart illustrating a method for training an image processing model according to yet another embodiment of the present application. The method for training the image processing model is applied to the electronic device, and will be described in detail with respect to the flow shown in fig. 11, where the method for training the image processing model may specifically include the following steps:
step S410: a first set of image samples is acquired, the first set of image samples comprising a plurality of sharp images containing a target object.
Step S420: and carrying out blurring processing on each sharp image in the first image sample set according to a plurality of blurring kernels to obtain a plurality of blurred images, wherein the plurality of blurring kernels comprise blurring kernels with a plurality of blurring levels, and the blurring level of each blurred image corresponds to the blurring level of the corresponding blurring kernel.
Step S430: and generating a fuzzy degree image corresponding to each fuzzy image according to the fuzzy grade score of each fuzzy image, wherein the fuzzy grade score of each fuzzy image corresponds to the fuzzy grade of each fuzzy image, and the gray value of each pixel point in each fuzzy degree image corresponds to the fuzzy grade score of the fuzzy image corresponding to each fuzzy degree image.
Step S440: and obtaining a second image sample set according to the plurality of blurred images and the blurred degree image corresponding to each blurred image.
Step S450: and training an initial model according to the second image sample set to obtain an image processing model, wherein the image processing model is used for outputting a fuzzy degree image corresponding to the input image according to the input image.
In the embodiment of the present application, steps S410 to S450 may refer to the contents of other embodiments, and are not described herein again.
Step S460: and inputting the image to be processed into the image processing model to obtain a fuzzy degree image corresponding to the image to be processed and output by the image processing model.
For the image processing model in the foregoing embodiments, the embodiment of the present application further provides a method of applying the model. It should be noted that the training of the image processing model may be performed in advance; thereafter, whenever an image to be processed needs processing, the trained image processing model can be used directly, instead of being retrained for each image.
In the embodiment of the application, the electronic device may acquire an image to be processed that requires blur degree detection, where the image contains a target object. It can be understood that when an image of a target object is captured, the object may be moving, the photographer's hand may shake, or hardware factors in imaging may blur the image; since users generally want sharp images, there is a need to deblur such images. Before deblurring an image to be processed, its blur degree can be detected.
In some embodiments, when the electronic device is a mobile terminal provided with a camera, such as a smart phone, a tablet computer, or a smart watch, image acquisition may be performed through a front camera or a rear camera, so as to obtain an image to be processed.
In other embodiments, the electronic device may obtain the image to be processed locally, that is, the electronic device may obtain the image to be processed from a file stored locally, for example, when the electronic device is a mobile terminal, the image to be processed may be obtained from an album, that is, the electronic device may collect the image by a camera in advance and store the image in a local album, or download the image from a network in advance and store the image in a local album, and then read the image to be processed from the album when the detection of the degree of blur is required.
In still other embodiments, when the electronic device is a mobile terminal or a computer, the image to be processed may also be downloaded from a network, for example, the electronic device may download the required image to be processed from a corresponding server through a wireless network, a data network, and the like.
In still other embodiments, the electronic device may also receive an input to-be-processed image through an input operation of the user on another device, so as to obtain the to-be-processed image.
Of course, the way in which the electronic device specifically acquires the image to be processed is not limited.
After the image to be processed is acquired, the image to be processed can be input to the above image processing model to obtain a blur degree image output by the image processing model.
Step S470: and determining the fuzzy grade score of the image to be processed according to the gray value of each pixel point in the fuzzy degree image.
In the embodiment of the application, after the blur degree image output by the image processing model is obtained, the blur degree score of the image to be processed can be determined according to the gray value of each pixel point in the blur degree image.
In some embodiments, if the region corresponding to the target object occupies only part of the image to be processed, that region may be identified, the gray values of the pixel points within it may be counted, and the resulting blur level score of the region may be taken as the blur level score of the image to be processed. The gray values may be counted, for example, by averaging the gray values of all pixel points in the region; the specific statistical manner is not limited.
In other embodiments, if the image to be processed includes a plurality of target objects, the regions where the target objects are located may be identified respectively, and then the blur level score of the region where the target objects are located may be determined according to the gray value of each pixel point in the blur degree image. That is to say, the gray values of all the pixel points in the region where each target object is located can be counted to obtain the blur level score of the region where each target object is located, and the blur level score is used as the blur level score of the image to be processed.
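A minimal sketch of this per-region statistic (the region boxes are assumed to come from a separate target-object detector, and averaging is just one of the statistical options mentioned above):

```python
import numpy as np

def region_blur_score(blur_map, box):
    """Mean gray value of the blur degree map inside one target-object
    region, given as (x0, y0, x1, y1) in pixel coordinates."""
    x0, y0, x1, y1 = box
    return float(blur_map[y0:y1, x0:x1].mean())

def image_blur_scores(blur_map, boxes):
    """One blur level score per detected target object."""
    return [region_blur_score(blur_map, b) for b in boxes]
```

Each score in the returned list can then be compared against the score threshold of step S480.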
Step S480: when the blur level score of the image to be processed is greater than a score threshold, performing deblurring processing on the image to be processed.
In the embodiment of the application, after the blur level score of the image to be processed is determined, it may be compared with a score threshold. If the blur level score is greater than the score threshold, this indicates that the blur degree of the image to be processed is high, so the image to be processed may be deblurred; if the blur level score is not greater than the score threshold, this indicates that the blur degree is low, and deblurring may be skipped.
In some embodiments, a pre-trained deblurring model may be used to deblur the image to be processed; alternatively, the blur kernel corresponding to the image to be processed may be identified and the image deblurred by deconvolution. Of course, the specific implementation of the deblurring is not limited in the embodiments of the present application.
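As one hedged example of the deconvolution route, a basic Wiener filter can be applied once the blur kernel is known (this sketch assumes circular convolution and a fixed regularization constant; practical implementations handle image boundaries and noise estimation more carefully):

```python
import numpy as np

def wiener_deblur(blurred, kernel, nsr=1e-3):
    """Deconvolve an image with a known blur kernel via a Wiener filter in
    the frequency domain; nsr is a noise-to-signal regularization constant."""
    H = np.fft.fft2(kernel, s=blurred.shape)   # spectrum of the blur kernel
    G = np.fft.fft2(blurred)                   # spectrum of the blurred image
    W = np.conj(H) / (np.abs(H) ** 2 + nsr)    # regularized inverse filter
    return np.real(np.fft.ifft2(G * W))
```

Frequencies the kernel almost annihilates are suppressed rather than amplified, which is what makes the plain inverse filter unusable in practice.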
In some embodiments, if the image to be processed includes a plurality of target objects, the blur level score of the region where each target object is located may be compared with the score threshold, and when the blur level score of the region where any target object is located is greater than the score threshold, the image to be processed may be deblurred. In this case, deblurring may be applied only to the target region, that is, the region where a target object whose blur level score is greater than the score threshold is located. Deblurring only such regions saves power consumption and processing time, thereby improving processing efficiency.
In the embodiment of the application, during the training of the image processing model for detecting the blur degree of the target object, the blur level scores can be calibrated automatically when the image sample set is constructed, which reduces manual calibration cost and increases calibration speed, thereby improving the training efficiency of the model. In addition, an application scheme of the image processing model is provided: the image to be processed is input into the image processing model to obtain the output blur degree image, from which the blur level score of the image to be processed can be quickly determined; when the blur level score is greater than the score threshold, the image to be processed is deblurred, so that blur can be removed effectively while computing resources are saved.
Referring to fig. 12, a block diagram of a training apparatus 400 for an image processing model according to an embodiment of the present disclosure is shown. The training apparatus 400 is applied to the electronic device described above and includes: a first sample set acquisition module 410, a first image acquisition module 420, a second image acquisition module 430, a second sample set acquisition module 440, and a model training module 450. The first sample set acquisition module 410 is configured to acquire a first image sample set including a plurality of sharp images containing a target object. The first image acquisition module 420 is configured to blur each sharp image in the first image sample set according to a plurality of blur kernels to obtain a plurality of blurred images, where the blur kernels cover a plurality of blur levels and the blur level of each blurred image corresponds to that of the blur kernel used. The second image acquisition module 430 is configured to generate a blur degree image corresponding to each blurred image according to the blur level score of each blurred image, where that score corresponds to the blur level of the blurred image and the gray value of each pixel in the blur degree image corresponds to the score. The second sample set acquisition module 440 is configured to obtain a second image sample set from the plurality of blurred images and their corresponding blur degree images. The model training module 450 is configured to train an initial model on the second image sample set to obtain an image processing model that outputs, for an input image, the corresponding blur degree image.
In some embodiments, the training apparatus 400 for the image processing model may further include a length acquisition module, a score generation module, and a first score determination module. The length acquisition module is configured to acquire the length of the long axis and the length of the short axis of the ellipse corresponding to each of the plurality of blur kernels; the score generation module is configured to generate a blur level score corresponding to each blur kernel according to the lengths of its long axis and short axis; and the first score determination module is configured to determine the blur level score of each blurred image based on the blur level score of the corresponding blur kernel.
In one possible implementation, the score generation module may be configured to: perform a weighted summation of the long-axis length and short-axis length of each blur kernel, based on a first weight for the long axis and a second weight for the short axis, to obtain a sum value corresponding to each blur kernel; acquire the ratio of each kernel's sum value to a maximum sum value, where the maximum sum value is the weighted sum, under the same first and second weights, of the long-axis and short-axis lengths of the blur kernel corresponding to the maximum blur degree; and multiply each kernel's ratio by the maximum blur level score to obtain the blur level score corresponding to that blur kernel.
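The weighted-sum scoring described above can be sketched as follows (the weight values and the maximum score of 255 are illustrative placeholders chosen for this example, not values fixed by the patent):

```python
def kernel_blur_score(major, minor, max_major, max_minor,
                      w1=0.7, w2=0.3, max_score=255.0):
    """Blur level score of one blur kernel from the long-axis (major) and
    short-axis (minor) lengths of its ellipse. max_major/max_minor are the
    axis lengths of the kernel with the maximum blur degree."""
    s = w1 * major + w2 * minor              # weighted sum for this kernel
    s_max = w1 * max_major + w2 * max_minor  # weighted sum for the largest kernel
    return (s / s_max) * max_score           # ratio times the maximum score
```

Because the score is normalized by the largest kernel, the most blurred kernel maps to the maximum gray value and smaller kernels scale down proportionally.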
In some embodiments, the second sample set acquisition module 440 may be configured to: randomly scale the image size of each blurred image to the image size corresponding to a target size grade, the target size grade being any one of a plurality of size grades; divide the blurred images into different blurred image combinations according to the size grade of each scaled blurred image, each combination containing a plurality of blurred images of different size grades; generate a composite image for each blurred image combination, the composite image containing the blurred images of its combination; and generate the blur degree image of each composite image from the blur degree images of the blurred images it contains, taking the set of composite images and their corresponding blur degree images as the second image sample set.
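A sketch of the compositing step might look like the following; the template layout (region sizes, offsets, and canvas dimensions) is an assumption invented for this example, since the patent only requires that the template define one placement region per size grade:

```python
import numpy as np

# Illustrative template: one placement region per size grade.
GRADE_SIZE = {0: 32, 1: 64, 2: 128}
GRADE_OFFSET = {0: (0, 0), 1: (0, 64), 2: (64, 0)}  # (row, col) of each region

def compose(images_by_grade, canvas_hw=(192, 192)):
    """Place one blurred image per size grade into its template region to
    form a composite sample. Applying the same placement to the blur
    degree maps produces the composite's blur degree label image."""
    canvas = np.zeros(canvas_hw, dtype=np.uint8)
    for grade, img in images_by_grade.items():
        s = GRADE_SIZE[grade]
        r, c = GRADE_OFFSET[grade]
        canvas[r:r + s, c:c + s] = img  # img assumed already scaled to s x s
    return canvas
```

Training on such composites exposes the model to several blur levels and object scales in a single sample.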
In one possible implementation, when generating the composite image corresponding to each of the plurality of blurred image combinations, the second sample set acquisition module 440 may be configured to: place each blurred image of each combination in its corresponding region of an image template according to its size grade, the image template defining a placement region for the blurred image of each size grade, thereby obtaining the composite image for that combination.
In some embodiments, the training apparatus 400 for the image processing model may further include a blur kernel acquisition module and a blur kernel processing module. The blur kernel acquisition module is configured to acquire a plurality of initial blur kernels covering a plurality of blur degree grades; the blur kernel processing module is configured to perform preset enhancement processing on each initial blur kernel to obtain a plurality of blur kernels corresponding to it, the initial blur kernels and the kernels derived from them together serving as the blur kernels used to blur the sharp images.
In one possible implementation, the preset enhancement processing performed on each initial blur kernel by the blur kernel processing module includes one or more of the following:
performing size scaling processing on each initial blur kernel;
performing rotation processing on each initial blur kernel;
performing affine transformation processing on each initial blur kernel;
performing translation processing on each initial blur kernel; and
performing edge filling processing on each initial blur kernel.
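The enhancement options above can be sketched with `scipy.ndimage` as follows; the specific angle, zoom factor, shift, and padding width are example values, since the patent does not fix them:

```python
import numpy as np
from scipy import ndimage

def augment_kernel(kernel, angle=30.0, zoom=1.2, shift=(1, 0), pad=2):
    """Apply the listed enhancement operations to one initial blur kernel:
    size scaling, rotation, translation, and edge filling (an affine
    transformation could be added via ndimage.affine_transform)."""
    k = ndimage.zoom(kernel, zoom)               # size scaling
    k = ndimage.rotate(k, angle, reshape=False)  # rotation
    k = ndimage.shift(k, shift)                  # translation
    k = np.pad(k, pad)                           # edge filling with zeros
    k = np.clip(k, 0.0, None)                    # spline interpolation can go negative
    return k / k.sum() if k.sum() > 0 else k     # renormalize to unit mass
```

Renormalizing keeps the kernel energy at 1 so that blurring with the augmented kernel preserves overall image brightness.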
In some embodiments, the training apparatus 400 for the image processing model may further include an image input module, a second score determination module, and a deblurring module. The image input module is configured to input an image to be processed into the image processing model to obtain the blur degree image output by the model for that image; the second score determination module is configured to determine the blur level score of the image to be processed according to the gray value of each pixel point in the blur degree image; and the deblurring module is configured to deblur the image to be processed when its blur level score is greater than a score threshold.
In a possible implementation manner, the image to be processed includes a plurality of target objects, and the second score determining module may be configured to determine, according to a gray value of each pixel point in the blur degree image, a blur level score of a region in which each target object in the image to be processed is located, as the blur level score of the image to be processed; the deblurring module can be used for performing deblurring processing on the image to be processed when the blur level score of the region where any target object is located is larger than a score threshold value.
In this embodiment, the deblurring module may be configured to: and when the fuzzy grade score of the area where any target object is located is greater than the score threshold, performing deblurring processing on the target area in the image to be processed, wherein the target area is the area where the target object with the fuzzy grade score greater than the score threshold is located.
It can be clearly understood by those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described apparatuses and modules may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the several embodiments provided in the present application, the coupling between the modules may be electrical, mechanical or other type of coupling.
In addition, functional modules in the embodiments of the present application may be integrated into one processing module, or each of the modules may exist alone physically, or two or more modules are integrated into one module. The integrated module can be realized in a hardware mode, and can also be realized in a software functional module mode.
To sum up, in the scheme provided by the application, a first image sample set containing a plurality of sharp images of a target object is acquired. Each sharp image is then blurred with a plurality of blur kernels covering a plurality of blur levels to obtain a plurality of blurred images, the blur level of each blurred image corresponding to that of the kernel used. A blur degree image is generated for each blurred image according to its blur level score, which in turn corresponds to its blur level; the gray value of each pixel point in a blur degree image corresponds to the blur level score of the associated blurred image. A second image sample set is then obtained from the blurred images and their blur degree images, and an initial model is trained on it to obtain an image processing model that outputs the blur degree image corresponding to an input image. In this way, during the training of the image processing model for detecting the blur degree of the target object, the blur level scores can be calibrated automatically while the image sample set is constructed, which reduces manual calibration cost, increases calibration speed, and thereby improves the training efficiency of the model.
Referring to fig. 13, a block diagram of an electronic device according to an embodiment of the present application is shown. The electronic device 100 may be a computer device capable of running an application, such as a smart phone, a tablet computer, a smart watch, smart glasses, a notebook computer, or a server. The electronic device 100 in the present application may include one or more of the following components: a processor 110, a memory 120, and one or more applications, wherein the one or more applications may be stored in the memory 120 and configured to be executed by the one or more processors 110 to perform the method described in the foregoing method embodiments.
Processor 110 may include one or more processing cores. The processor 110 connects various parts within the overall electronic device 100 using various interfaces and lines, and performs various functions of the electronic device 100 and processes data by running or executing instructions, programs, code sets, or instruction sets stored in the memory 120 and calling data stored in the memory 120. Optionally, the processor 110 may be implemented in hardware using at least one of Digital Signal Processing (DSP), Field-Programmable Gate Array (FPGA), and Programmable Logic Array (PLA). The processor 110 may integrate one or more of a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), a modem, and the like. The CPU mainly handles the operating system, user interface, application programs, and so on; the GPU is responsible for rendering and drawing display content; the modem handles wireless communication. It is understood that the modem may not be integrated into the processor 110 but may instead be implemented by a separate communication chip.
The Memory 120 may include a Random Access Memory (RAM) or a Read-Only Memory (ROM). The memory 120 may be used to store instructions, programs, code sets, or instruction sets. The memory 120 may include a stored program area and a stored data area, wherein the stored program area may store instructions for implementing an operating system, instructions for implementing at least one function (such as a touch function, a sound playing function, an image playing function, etc.), instructions for implementing the various method embodiments described above, and the like. The data storage area may also store data created by the electronic device 100 during use (e.g., phone book, audio-video data, chat log data), and the like.
Referring to fig. 14, a block diagram of a computer-readable storage medium according to an embodiment of the present application is shown. The computer-readable medium 800 has stored therein a program code that can be called by a processor to execute the method described in the above-described method embodiments.
The computer-readable storage medium 800 may be an electronic memory such as a flash memory, an EEPROM (electrically erasable programmable read only memory), an EPROM, a hard disk, or a ROM. Alternatively, the computer-readable storage medium 800 includes a non-volatile computer-readable storage medium. The computer readable storage medium 800 has storage space for program code 810 to perform any of the method steps of the method described above. The program code can be read from or written to one or more computer program products. The program code 810 may be compressed, for example, in a suitable form.
Finally, it should be noted that the above embodiments are only used to illustrate the technical solutions of the present application, not to limit them. Although the present application has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not cause the essence of the corresponding technical solutions to depart from the spirit and scope of the embodiments of the present application.

Claims (13)

1. A method of training an image processing model, the method comprising:
acquiring a first image sample set, wherein the first image sample set comprises a plurality of clear images containing a target object;
performing blurring processing on each sharp image in the first image sample set according to a plurality of blurring kernels to obtain a plurality of blurred images, wherein the plurality of blurring kernels comprise blurring kernels with a plurality of blurring levels, and the blurring level of each blurred image corresponds to the blurring level of the corresponding blurring kernel;
generating a blur degree image corresponding to each blurred image according to the blur level score of each blurred image, wherein the blur level score of each blurred image corresponds to the blur level of each blurred image, and the gray value of each pixel point in each blur degree image corresponds to the blur level score of the blurred image corresponding to each blur degree image;
obtaining a second image sample set according to the plurality of blurred images and the blur degree image corresponding to each blurred image;
and training an initial model according to the second image sample set to obtain an image processing model, wherein the image processing model is used for outputting a blur degree image corresponding to an input image according to the input image.
2. The method according to claim 1, wherein before the generating of the blur degree image corresponding to each blurred image according to the blur level score of each blurred image, the method further comprises:
acquiring the length of the long axis and the length of the short axis of the ellipse corresponding to each blur kernel in the plurality of blur kernels;
generating a blur level score corresponding to each blur kernel according to the length of the long axis and the length of the short axis corresponding to each blur kernel;
and determining the blur level score of each blurred image based on the blur level score corresponding to each blur kernel.
3. The method of claim 2, wherein the generating the blur level score for each blur kernel according to the length of the long axis and the length of the short axis of each blur kernel comprises:
performing, based on a first weight of the long axis and a second weight of the short axis, a weighted summation of the length of the long axis and the length of the short axis corresponding to each blur kernel to obtain a sum value corresponding to each blur kernel;
acquiring a ratio of the sum value corresponding to each blur kernel to a maximum sum value, wherein the maximum sum value is the sum value obtained by weighted summation, based on the first weight and the second weight, of the long-axis length and the short-axis length of the blur kernel corresponding to the maximum blur degree;
and acquiring the product of the ratio corresponding to each blur kernel and the maximum blur level score to obtain the blur level score corresponding to each blur kernel.
4. The method according to claim 1, wherein obtaining a second image sample set according to the plurality of blurred images and the blur degree image corresponding to each blurred image comprises:
randomly scaling the image size of each blurred image into an image size corresponding to a target size grade, wherein the target size grade is any one of a plurality of size grades;
dividing the blurred images into different blurred image combinations according to the size grade corresponding to the image size of each scaled blurred image to obtain a plurality of blurred image combinations, wherein the blurred image combinations comprise a plurality of blurred images with different size grades;
generating a composite image corresponding to each blurred image combination in the plurality of blurred image combinations, wherein the composite image comprises blurred images in the blurred image combination corresponding to the composite image;
and generating a blurring degree image of the composite image according to a blurring degree image of a blurred image included in the composite image, and taking a set of the composite image and the blurring degree image corresponding to the composite image as a second image sample set.
5. The method of claim 4, wherein generating the composite image corresponding to each of the plurality of blurred image combinations comprises:
and according to the size grade of the blurred image in each blurred image combination, setting each blurred image of each blurred image combination in a corresponding area in an image template to obtain a composite image corresponding to each blurred image combination, wherein the image template comprises the setting area of the blurred image of each size grade.
6. The method of claim 1, wherein prior to said blurring each sharp image in said first image sample set according to a plurality of blur kernels, the method further comprises:
acquiring a plurality of initial blur kernels, wherein the plurality of initial blur kernels comprise blur kernels with a plurality of blur degree grades;
and performing preset enhancement processing on each initial blur kernel to obtain a plurality of blur kernels corresponding to each initial blur kernel, and taking the plurality of initial blur kernels and the plurality of blur kernels corresponding to each initial blur kernel as the blur kernels for blurring the sharp images.
7. The method according to claim 6, wherein the preset enhancement processing for each initial blur kernel comprises one or more of the following ways:
performing size scaling processing on each initial blur kernel;
performing rotation processing on each initial blur kernel;
performing affine transformation processing on each initial blur kernel;
performing translation processing on each initial blur kernel; and
performing edge filling processing on each initial blur kernel.
8. The method of any of claims 1-7, wherein after said training an initial model from said second set of image samples to obtain an image processing model, said method further comprises:
inputting an image to be processed into the image processing model to obtain a blur degree image corresponding to the image to be processed and output by the image processing model;
determining a blur level score of the image to be processed according to the gray value of each pixel point in the blur degree image;
and when the blur level score of the image to be processed is greater than a score threshold, performing deblurring processing on the image to be processed.
9. The method according to claim 8, wherein the image to be processed includes a plurality of target objects, and the determining the blur level score of the image to be processed according to the gray value of each pixel point in the blur degree image comprises:
determining a blur level score of the region where each target object in the image to be processed is located according to the gray value of each pixel point in the blur degree image, and taking it as the blur level score of the image to be processed;
and the performing deblurring processing on the image to be processed when the blur level score of the image to be processed is greater than the score threshold comprises:
performing deblurring processing on the image to be processed when the blur level score of the region where any target object is located is greater than the score threshold.
10. The method according to claim 9, wherein the performing deblurring processing on the image to be processed when the blur level score of the region where any target object is located is greater than the score threshold comprises:
performing deblurring processing on a target region in the image to be processed when the blur level score of the region where any target object is located is greater than the score threshold, wherein the target region is the region where a target object whose blur level score is greater than the score threshold is located.
11. An apparatus for training an image processing model, the apparatus comprising: a first sample set acquisition module, a first image acquisition module, a second image acquisition module, a second sample set acquisition module, and a model training module,
the first sample set acquisition module is used for acquiring a first image sample set, and the first image sample set comprises a plurality of sharp images containing a target object;
the first image acquisition module is used for performing blurring processing on each sharp image in the first image sample set according to a plurality of blur kernels to obtain a plurality of blurred images, wherein the plurality of blur kernels comprise blur kernels with a plurality of blur levels, and the blur level of each blurred image corresponds to the blur level of the corresponding blur kernel;
the second image acquisition module is used for generating a blur degree image corresponding to each blurred image according to the blur level score of each blurred image, wherein the blur level score of each blurred image corresponds to the blur level of each blurred image, and the gray value of each pixel point in each blur degree image corresponds to the blur level score of the blurred image corresponding to each blur degree image;
the second sample set acquisition module is used for obtaining a second image sample set according to the plurality of blurred images and the blur degree image corresponding to each blurred image;
and the model training module is used for training an initial model according to the second image sample set to obtain an image processing model, and the image processing model is used for outputting a blur degree image corresponding to an input image according to the input image.
12. An electronic device, comprising:
one or more processors;
a memory;
one or more applications, wherein the one or more applications are stored in the memory and configured to be executed by the one or more processors to perform the method of any of claims 1-10.
13. A computer-readable storage medium, having stored thereon program code that can be invoked by a processor to perform the method according to any one of claims 1 to 10.
CN202110432554.9A 2021-04-21 2021-04-21 Training method and device for image processing model, electronic equipment and storage medium Active CN113139942B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110432554.9A CN113139942B (en) 2021-04-21 2021-04-21 Training method and device for image processing model, electronic equipment and storage medium

Publications (2)

Publication Number Publication Date
CN113139942A true CN113139942A (en) 2021-07-20
CN113139942B CN113139942B (en) 2023-10-31

Family

ID=76813131


Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106920229A (en) * 2017-01-22 2017-07-04 北京奇艺世纪科技有限公司 Image obscuring area automatic testing method and system
CN108898579A (en) * 2018-05-30 2018-11-27 腾讯科技(深圳)有限公司 A kind of image definition recognition methods, device and storage medium
CN111462010A (en) * 2020-03-31 2020-07-28 腾讯科技(深圳)有限公司 Training method of image processing model, image processing method, device and equipment
US20200349680A1 (en) * 2018-04-04 2020-11-05 Tencent Technology (Shenzhen) Company Limited Image processing method and device, storage medium and electronic device
CN111914939A (en) * 2020-08-06 2020-11-10 平安科技(深圳)有限公司 Method, device and equipment for identifying blurred image and computer readable storage medium
CN112102185A (en) * 2020-09-04 2020-12-18 腾讯科技(深圳)有限公司 Image deblurring method and device based on deep learning and electronic equipment
CN112561879A (en) * 2020-12-15 2021-03-26 北京百度网讯科技有限公司 Blur evaluation model training method, image blur evaluation method and device


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Huang Lü'e; Wu Lushen; Chen Huawei: "DCNN fused with LSTM for image blur type and parameter identification", Journal of Basic Science and Engineering, no. 5 *

Also Published As

Publication number Publication date
CN113139942B (en) 2023-10-31

Similar Documents

Publication Publication Date Title
CN108229526B (en) Network training method, network training device, image processing method, image processing device, storage medium and electronic equipment
CN108961303B (en) Image processing method and device, electronic equipment and computer readable medium
CN111415358B (en) Image segmentation method, device, electronic equipment and storage medium
US8861884B1 (en) Training classifiers for deblurring images
CN109583449A (en) Character identifying method and Related product
CN112329702B (en) Method and device for rapid face density prediction and face detection, electronic equipment and storage medium
CN109035257B (en) Portrait segmentation method, device and equipment
CN113139917A (en) Image processing method, image processing device, electronic equipment and storage medium
CN111753839A (en) Text detection method and device
CN112132017B (en) Image processing method and device and electronic equipment
CN111814913A (en) Training method and device for image classification model, electronic equipment and storage medium
CN112418327A (en) Training method and device of image classification model, electronic equipment and storage medium
CN113888431A (en) Training method and device of image restoration model, computer equipment and storage medium
CN111882565B (en) Image binarization method, device, equipment and storage medium
CN115187715A (en) Mapping method, device, equipment and storage medium
CN110689496B (en) Method and device for determining noise reduction model, electronic equipment and computer storage medium
JP2006304062A5 (en)
CN111144156B (en) Image data processing method and related device
Wu et al. Towards robust text-prompted semantic criterion for in-the-wild video quality assessment
CN108734712B (en) Background segmentation method and device and computer storage medium
CN110516202B (en) Document generator acquisition method, document generation device and electronic equipment
CN113139942B (en) Training method and device for image processing model, electronic equipment and storage medium
JP6432182B2 (en) Service providing apparatus, method, and program
CN112991212A (en) Image processing method, image processing device, electronic equipment and storage medium
Nguyen et al. Joint image deblurring and binarization for license plate images using deep generative adversarial networks

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant