CN117437521A - Pruning method and device of image processing model and electronic equipment - Google Patents

Pruning method and device of image processing model and electronic equipment

Info

Publication number
CN117437521A
Authority
CN
China
Prior art keywords
image processing
image
processing model
target
channels
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311598978.8A
Other languages
Chinese (zh)
Inventor
周凡
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangzhou Huya Technology Co Ltd
Original Assignee
Guangzhou Huya Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangzhou Huya Technology Co Ltd filed Critical Guangzhou Huya Technology Co Ltd
Priority to CN202311598978.8A priority Critical patent/CN117437521A/en
Publication of CN117437521A publication Critical patent/CN117437521A/en
Pending legal-status Critical Current

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 - Arrangements for image or video recognition or understanding
    • G06V10/70 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/82 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06N - COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 - Computing arrangements based on biological models
    • G06N3/02 - Neural networks
    • G06N3/04 - Architecture, e.g. interconnection topology
    • G06N3/0464 - Convolutional networks [CNN, ConvNet]
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06N - COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 - Computing arrangements based on biological models
    • G06N3/02 - Neural networks
    • G06N3/08 - Learning methods
    • G06N3/082 - Learning methods modifying the architecture, e.g. adding, deleting or silencing nodes or connections

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Evolutionary Computation (AREA)
  • Computing Systems (AREA)
  • Artificial Intelligence (AREA)
  • General Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Software Systems (AREA)
  • Computational Linguistics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Data Mining & Analysis (AREA)
  • Molecular Biology (AREA)
  • General Engineering & Computer Science (AREA)
  • Mathematical Physics (AREA)
  • Medical Informatics (AREA)
  • Databases & Information Systems (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Image Analysis (AREA)

Abstract

The application provides a pruning method and device for an image processing model, and an electronic device, and relates to the technical field of image processing. The method includes: acquiring a sample, where the sample includes a sample image and a reference image corresponding to the sample image; determining a target channel among a plurality of channels of a plurality of convolution layers of a trained image processing model, and acquiring a plurality of processing result images obtained by the image processing model processing the sample image while a plurality of different channel parameters are applied to the target channel; determining a first contribution degree of the target channel according to the degrees of difference between the plurality of processing result images and the reference image; and determining, according to the first contribution degrees of a plurality of target channels, the importance of each channel to the image processing model, so that channels in the image processing model are pruned to obtain a target image processing model. In this way, the image processing effect of the image processing model can be preserved while the number of parameters and the amount of computation of the image processing model are significantly reduced.

Description

Pruning method and device of image processing model and electronic equipment
Technical Field
The present invention relates to the field of image processing technologies, and in particular, to a pruning method and apparatus for an image processing model, and an electronic device.
Background
With the development of technology, neural network models for image processing tasks such as image recognition and object detection have been widely used in various fields. In general, such neural network models have a huge number of parameters and consume a large amount of computing resources during computation.
At present, channel pruning is generally used to compress an image processing model and reduce its number of parameters. However, such pruning methods can hardly guarantee the image processing effect of the pruned image processing model, which may therefore produce poor results and fail to meet image processing requirements.
Disclosure of Invention
In order to overcome at least the above-mentioned shortcomings in the prior art, an object of the present application is to provide a pruning method and device for an image processing model, and an electronic device.
In a first aspect, an embodiment of the present application provides a pruning method of an image processing model, where the pruning method of the image processing model includes:
acquiring a sample, wherein the sample comprises a sample image and a reference image corresponding to the sample image;
determining a target channel in a plurality of channels of a plurality of convolution layers of a trained image processing model, and acquiring a plurality of processing result images obtained by processing the sample image by the image processing model under the condition that a plurality of different channel parameters are applied to the target channel;
determining a first contribution degree of the target channel according to the difference degrees between the processing result images and the reference image;
and pruning the channels of the image processing model according to the first contribution degrees of the target channels to obtain a target image processing model.
In one possible implementation manner, the step of pruning the channels of the image processing model according to the first contribution degrees of the target channels includes:
for each target channel, acquiring an average value of a plurality of first contribution degrees determined by the target channel based on a plurality of different samples as a second contribution degree of the target channel;
and pruning the channels of the image processing model according to the second contribution degrees of the target channels.
In one possible implementation manner, the step of pruning the channels of the image processing model according to the first contribution degrees of the multiple target channels to obtain a target image processing model includes:
pruning the channels of the image processing model according to the second contribution degrees of the target channels for each convolution layer;
and after each convolution layer completes channel pruning processing, obtaining the target image processing model.
In one possible implementation manner, the step of pruning the channels of the image processing model according to the first contribution degrees of the multiple target channels to obtain a target image processing model includes:
for each convolution layer, sorting the plurality of target channels according to a second contribution of the plurality of target channels;
and pruning the channels of the image processing model according to the sorting of the target channels and the preset channel removal proportion to obtain the target image processing model.
In one possible implementation, the degree of difference L(θ, x) between the processing result image and the reference image is calculated by:
L(θ, x) = ||y - x_hq||_2 = ||A(θ, x) - x_hq||_2
where x represents the sample image, x_hq represents the reference image corresponding to the sample image, y represents the processing result image, A represents the trained image processing model, θ represents the model parameters corresponding to the image processing model, and ||·||_2 represents the L2 norm;
the first contribution P_{j,c} of the c-th channel of the j-th convolution layer is calculated over S discrete steps, where θ_{α,j,c} represents the channel parameter of the c-th channel of the j-th convolution layer and S represents the number of discrete steps, s = 0, 1, 2, …, S.
In one possible implementation, the image processing model includes an image quality enhancement model; the sample image in the sample is an image with relatively low image quality, and the reference image in the sample is an image with relatively high image quality corresponding to the sample image.
In one possible implementation manner, the step of pruning the channels of the image processing model according to the first contribution degrees of the multiple target channels to obtain a target image processing model includes:
after the channel pruning processing is completed on each convolution layer, a reconstructed image processing model is obtained, the reconstructed image processing model is tested, and the reconstructed image processing model is adjusted according to a test result to obtain the target image processing model.
In a second aspect, an embodiment of the present application further provides a pruning device of an image processing model, including:
a receiving module, configured to obtain a sample, where the sample includes a sample image and a reference image corresponding to the sample image;
the processing module is used for determining a target channel in a plurality of channels of a plurality of convolution layers of the trained image processing model and acquiring a plurality of processing result images obtained by processing the sample image by the image processing model under the condition that a plurality of different channel parameters are applied to the target channel;
a determining module, configured to determine a first contribution of the target channel according to a degree of difference between a plurality of processing result images and the reference image;
and the pruning processing module is used for pruning the channels of the image processing model according to the first contribution degrees of the target channels to obtain a target image processing model.
In a third aspect, embodiments of the present application further provide an electronic device, including:
a memory for storing one or more programs;
and a processor, where the pruning method of the image processing model provided in the first aspect is implemented when the one or more programs are executed by the processor.
In a fourth aspect, embodiments of the present application further provide a computer readable storage medium having a computer program stored thereon, where the computer program, when executed by a processor, implements the pruning method of the image processing model provided in the first aspect.
Based on any one of the above aspects, the pruning method and device for an image processing model and the electronic device provided by the embodiments of the present application determine the importance of each channel to the image processing model by calculating the first contribution degree of each channel, so that channels in the image processing model can be pruned to obtain the target image processing model. In this way, the image processing effect of the image processing model can be preserved while the number of parameters and the amount of computation of the image processing model are significantly reduced.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings required for the embodiments are briefly described below. It should be understood that the following drawings illustrate only some embodiments of the present application and are therefore not to be considered limiting of the scope, and that other related drawings may be obtained from these drawings by a person skilled in the art without inventive effort.
Fig. 1 is a schematic flowchart of a pruning method of an image processing model according to this embodiment;
Fig. 2 is the first schematic diagram of the sub-steps of step S400 according to this embodiment;
Fig. 3 is the second schematic diagram of the sub-steps of step S400 according to this embodiment;
Fig. 4 is the third schematic diagram of the sub-steps of step S400 according to this embodiment;
Fig. 5 is a schematic structural diagram of an image quality enhancement model according to this embodiment;
Fig. 6 is a schematic block diagram of an electronic device according to this embodiment;
Fig. 7 is a schematic functional block diagram of a pruning device of an image processing model according to this embodiment.
Reference numerals: 700: electronic device; 710: processor; 720: computer-readable storage medium; 730: pruning device of the image processing model; 731: receiving module; 732: processing module; 733: determining module; 734: pruning processing module.
Detailed Description
For the purposes of making the objects, technical solutions and advantages of the embodiments of the present application more clear, the technical solutions of the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is apparent that the described embodiments are some embodiments of the present application, but not all embodiments. The components of the embodiments of the present application, which are generally described and illustrated in the figures herein, may be arranged and designed in a wide variety of different configurations.
Thus, the following detailed description of the embodiments of the present application, as provided in the accompanying drawings, is not intended to limit the scope of the application, as claimed, but is merely representative of selected embodiments of the application. All other embodiments, which can be made by one of ordinary skill in the art based on the embodiments herein without making any inventive effort, are intended to be within the scope of the present application.
It should be noted that: like reference numerals and letters denote like items in the following figures, and thus once an item is defined in one figure, no further definition or explanation thereof is necessary in the following figures.
In the description of the present application, it should be noted that the orientations or positional relationships indicated by terms such as "upper" and "lower" are based on the orientations or positional relationships shown in the drawings, or the orientations or positional relationships in which the product of the application is usually placed when in use. They are used merely for convenience of describing the present application and simplifying the description, and do not indicate or imply that the device or element referred to must have a specific orientation or be constructed and operated in a specific orientation, and therefore should not be construed as limiting the present application. Furthermore, the terms "first", "second", and the like are used merely to distinguish between descriptions and should not be construed as indicating or implying relative importance.
It should be noted that, in the case of no conflict, different features in the embodiments of the present application may be combined with each other.
The inventors have found through research that current general-purpose channel pruning algorithms mainly focus on balancing the effect and performance of a general model. However, such general-purpose channel pruning algorithms perform poorly on image processing vision tasks, because they usually consider only the numerical characteristics of the convolution kernels and do not consider the image processing effect produced when the convolution kernels are applied to images.
The present embodiment provides a solution to the above problem, and a detailed description of a specific embodiment of the present application will be given below with reference to the accompanying drawings.
Referring to fig. 1, fig. 1 illustrates a flowchart of a pruning method of an image processing model according to the present embodiment, where the method may include the following steps.
Step S100, a sample is obtained, wherein the sample comprises a sample image and a reference image corresponding to the sample image.
In this embodiment, a plurality of different samples may be obtained as a sample set. The sample set may be a data set formed by a plurality of sample images, extracted by a computer device from a data source (such as a database or a validation data set) according to actual requirements, together with the reference images corresponding to those sample images.
Specifically, the image processing model to be trained may be pre-trained on the acquired sample set, so as to obtain a trained image processing model.
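As an illustration only, such a paired sample set can be organized as a simple dataset of (sample image, reference image) pairs; the sketch below assumes PyTorch, and the class name PairedImageDataset and its path arguments are hypothetical rather than taken from this application.

```python
# Hypothetical sketch of a paired sample set: each item couples a sample image x
# with its corresponding reference image x_hq (paths and normalization are assumptions).
from torch.utils.data import Dataset
from torchvision.io import read_image

class PairedImageDataset(Dataset):
    def __init__(self, sample_paths, reference_paths):
        assert len(sample_paths) == len(reference_paths)
        self.sample_paths = sample_paths
        self.reference_paths = reference_paths

    def __len__(self):
        return len(self.sample_paths)

    def __getitem__(self, idx):
        x = read_image(self.sample_paths[idx]).float() / 255.0        # sample image
        x_hq = read_image(self.reference_paths[idx]).float() / 255.0  # reference image
        return x, x_hq
```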
Step S200, determining a target channel from a plurality of channels of a plurality of convolution layers of a trained image processing model, and obtaining a plurality of processing result images obtained by processing the sample image by the image processing model under the condition that a plurality of different channel parameters are applied to the target channel.
In this embodiment, the sample image obtained in step S100 may be input into the image processing model for processing, so as to obtain a processing result image. While the image processing model processes the sample image, only the channel parameters of the target channel are changed and the channel parameters of the other channels are left unchanged, so that a plurality of processing result images are obtained, each corresponding to a different channel parameter of the target channel. The target channel may be any one of the plurality of channels of the plurality of convolution layers. Since the channel parameters of the other channels remain consistent, the resulting processing result images reflect only the variation of the channel parameter of that single channel.
Specifically, some representative samples may be selected from the plurality of different samples acquired in step S100 as an evaluation data set, and the sample images in the evaluation data set may be processed by the image processing model.
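A minimal sketch of this step, assuming a PyTorch model whose convolution layers are enumerated in order; the helper name apply_channel_alpha and the choice of zero as the lower endpoint θ_0 of the channel parameter are assumptions rather than details taken from this application.

```python
# Hypothetical sketch: run the image processing model on a sample image while the
# parameters of one output channel of one convolution layer are interpolated between
# a zeroed state (theta_0 = 0, an assumption) and their trained values (theta_1).
import copy
import torch
import torch.nn as nn

def apply_channel_alpha(model, j, c, alpha, x):
    """Model output for image x with channel c of the j-th conv layer scaled by alpha."""
    probe = copy.deepcopy(model)            # leave the original model untouched
    convs = [m for m in probe.modules() if isinstance(m, nn.Conv2d)]
    conv = convs[j]
    with torch.no_grad():
        # theta_alpha = theta_0 + (theta_1 - theta_0) * alpha, here with theta_0 = 0
        conv.weight[c] *= alpha
        if conv.bias is not None:
            conv.bias[c] *= alpha
        return probe(x.unsqueeze(0)).squeeze(0)
```

Calling this helper with several values of α in [0, 1], while all other channels stay fixed, yields the plurality of processing result images described above.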
And step S300, determining a first contribution degree of the target channel according to the difference degrees between the processing result images and the reference image.
In this embodiment, in the case where a plurality of different channel parameters are applied to the target channel, the degree of difference in image quality between each processing result image obtained in step S200 and the corresponding reference image may be calculated, and the first contribution degree of the target channel may be calculated from these degrees of difference, so as to determine the importance of the target channel. A larger first contribution degree indicates that changes in the parameters of the target channel have a larger influence on the processing result of the image processing model, and thus the target channel is more important; a smaller first contribution degree indicates that changes in the parameters of the target channel have little influence on the processing result of the image processing model, and thus the target channel is less important.
Specifically, for any one of the sample images in the evaluation data set, a plurality of processing result images may be obtained by processing that sample image through the image processing model while a plurality of different channel parameters are applied to the target channel, and the degrees of difference in image quality between these processing result images and the corresponding reference image may be calculated, thereby obtaining the first contribution of the target channel for that sample. Each degree of difference is calculated between one processing result image and the corresponding reference image.
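Building on the hypothetical apply_channel_alpha helper sketched above, a first contribution for one sample can be estimated by stepping α over S discrete values and accumulating the change in the L2 difference to the reference image; this accumulation rule is an assumption about one reasonable realization, not the application's exact formula.

```python
# Hypothetical sketch: estimate the first contribution of channel c in conv layer j
# for a single sample (x, x_hq) by summing how the L2 difference to the reference
# image changes between consecutive alpha steps (accumulation rule is assumed).
import torch

def first_contribution(model, j, c, x, x_hq, steps=8):
    prev_diff, contribution = None, 0.0
    for s in range(steps + 1):                               # s = 0, 1, ..., S
        alpha = s / steps
        y = apply_channel_alpha(model, j, c, alpha, x)       # processing result image
        diff = torch.linalg.vector_norm(y - x_hq).item()     # L(theta_alpha, x)
        if prev_diff is not None:
            contribution += abs(diff - prev_diff)
        prev_diff = diff
    return contribution
```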
And step S400, pruning is carried out on the channels of the image processing model according to the first contribution degrees of the target channels, so as to obtain a target image processing model.
In this embodiment, the channels to be pruned may be determined according to the first contribution degrees of the multiple target channels obtained in step S300, and pruning processing may be performed on the channels of the image processing model to remove the channels with the smaller first contribution degrees, so as to obtain the target image processing model, where the image processing effect of the obtained target image processing model is not significantly affected.
Based on the above design, the pruning method of the image processing model provided by the embodiment of the present application may determine the importance of each channel to the image processing model by calculating the first contribution degree of each channel, so as to prune channels in the image processing model to obtain the target image processing model, so that the image processing effect of the image processing model can be ensured while the parameter number and the calculation amount of the image processing model are obviously reduced.
In one possible implementation, referring to fig. 2, step S400 may include the following sub-steps.
Step S411, for each target channel, acquiring an average value of the first contribution degrees determined by the target channel based on a plurality of different samples as a second contribution degree of the target channel.
In this embodiment, for each of the target channels, a plurality of first contribution degrees determined by the target channel based on a plurality of different samples may be calculated, and an average process is performed on the plurality of first contribution degrees of the target channel, so as to obtain an average value of the plurality of first contribution degrees as the second contribution degree of the target channel, where one sample corresponds to one first contribution degree of the target channel.
Specifically, the degree of difference between the reference image and the processing result image may be calculated according to the reference image and the processing result image corresponding to a plurality of different samples, and a plurality of first contribution degrees corresponding to a plurality of different samples may be calculated according to the degree of difference, so as to calculate an average value of a plurality of first contribution degrees, and obtain the second contribution degree.
And step S412, pruning is carried out on the channels of the image processing model according to the second contribution degrees of the target channels.
In this embodiment, the channels to be pruned may be determined according to the second contribution degrees calculated in step S411, and pruning processing may be performed on the channels of the image processing model to remove the channels with smaller second contribution degrees, so as to obtain the target image processing model. A larger second contribution degree indicates that the parameter change of the target channel has a larger influence on the processing result of the image processing model, and thus the target channel is more important to the image processing model; a smaller second contribution degree indicates that the parameter change of the target channel has little influence on the processing result of the image processing model, and thus the target channel is less important to the image processing model.
In the above design, the plurality of first contribution degrees are averaged to obtain the second contribution degree, and the channels of the image processing model are pruned according to the second contribution degree, which improves the accuracy and reliability of the pruning result.
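A minimal sketch of this averaging step, reusing the hypothetical first_contribution helper sketched earlier:

```python
# Hypothetical sketch: the second contribution of a channel is the mean of its first
# contributions over N different samples, one first contribution per sample.
def second_contribution(model, j, c, samples, steps=8):
    scores = [first_contribution(model, j, c, x, x_hq, steps) for x, x_hq in samples]
    return sum(scores) / len(scores)
```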
In one possible implementation, referring to fig. 3, step S400 may include the following sub-steps.
Step S421, pruning is carried out on the channels of the image processing model according to the second contribution degree of the target channels for each convolution layer.
Step S422, after each convolution layer completes the channel pruning processing, obtaining the target image processing model.
In this embodiment, the second contribution degree may be calculated for all channels in the image processing model, and then channel pruning processing may be performed for different convolution layers according to the second contribution degrees of the multiple target channels in each convolution layer, where after the channel pruning processing is completed for all the convolution layers, the target image processing model may be obtained.
In the above design, by calculating the second contribution degrees of the plurality of target channels in each convolution layer, the sensitivity of each target channel in each convolution layer can be analyzed separately, and the channels with smaller second contribution degrees can be removed, so that the number of parameters and the amount of computation of the image processing model are significantly reduced while its image processing effect is maintained.
In one possible implementation, referring to fig. 4, step S400 may include the following sub-steps.
Step S431, for each of the convolution layers, ordering a plurality of the target channels according to the second contribution degrees of the target channels.
In this embodiment, the second contribution may be calculated for all channels in the image processing model, and then the plurality of target channels in each convolution layer may be sorted in descending order of their second contribution degrees.
Step S432, pruning the channels of the image processing model according to the sorting of the plurality of target channels and the preset channel removal ratio, to obtain the target image processing model.
In this embodiment, for each convolution layer, pruning may be performed according to the sorting result obtained in step S431 and a preset channel removal ratio. For example, if the preset channel removal ratio is 0.2, the 20% of target channels at the bottom of the ranking in each convolution layer are removed. After every convolution layer has completed channel pruning, the target image processing model is obtained.
It should be noted that, for the same image processing model, the preset channel removal ratio of each convolution layer is the same in the pruning process.
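As an illustration of the sort-and-remove rule for one convolution layer (the function name and the dictionary representation of the scores are assumptions):

```python
# Hypothetical sketch: rank the channels of one convolution layer by second contribution
# and select the bottom fraction (e.g. removal_ratio = 0.2 removes the last 20%).
def channels_to_prune(second_contributions, removal_ratio=0.2):
    # second_contributions: {channel_index: second contribution} for one conv layer
    ranked = sorted(second_contributions, key=second_contributions.get, reverse=True)
    n_remove = int(len(ranked) * removal_ratio)
    return ranked[len(ranked) - n_remove:] if n_remove > 0 else []
```

Physically removing the selected output channels (together with the matching input channels of the following layer) is model-specific and is not sketched here.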
In one possible implementation, the degree of difference L(θ, x) between the processing result image and the reference image may be calculated by:
L(θ, x) = ||y - x_hq||_2 = ||A(θ, x) - x_hq||_2
where x represents the sample image, x_hq represents the reference image corresponding to the sample image, y represents the processing result image, A represents the trained image processing model, θ represents the model parameters of the image processing model, and ||·||_2 represents the L2 norm.
Specifically, x ∈ R^(C×H×W), x_hq ∈ R^(C×H1×W1), and y ∈ R^(C×H1×W1),
where C represents the number of channels of the sample image input to the convolution layer, H represents the height of the sample image, W represents the width of the sample image, H1 represents the height of the reference image, and W1 represents the width of the reference image.
In this embodiment, the number of channels C of the input sample image may be 3 (RGB) or 1 (YUV).
In the process in which the channel parameter of the c-th channel of the j-th convolution layer changes from θ_{0,j,c} to θ_{1,j,c}, the first contribution degree P_{j,c} of the c-th channel of the j-th convolution layer can be calculated from the degrees of difference L(θ_{α,j,c}, x) obtained as the channel parameter is interpolated along the path θ_{α,j,c} = θ_{0,j,c} + (θ_{1,j,c} - θ_{0,j,c}) × α, with α ∈ [0, 1].
Since the interpolation over α is evaluated at a finite number of points in practice, the first contribution P_{j,c} of the c-th channel of the j-th convolution layer may be changed to a discrete form, where θ_{α,j,c} represents the channel parameter of the c-th channel of the j-th convolution layer and S represents the number of discrete steps, s = 0, 1, 2, …, S.
Thus, the second contribution of the c-th channel of the j-th convolution layer may be expressed as the average of its first contributions over the samples, i.e., (1/N) Σ_{n=1}^{N} P_{j,c}(x_n), where N represents the number of samples and x_n represents the n-th sample image.
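One concrete form consistent with the definitions above is sketched below in LaTeX; the path-integral form, the discrete approximation, and the symbol Q_{j,c} for the second contribution are assumptions made for illustration and may differ from the expressions in the original application.

```latex
% Assumed continuous form: accumulate the change of the difference degree along the path
P_{j,c} = \int_{0}^{1} \left| \frac{\partial L(\theta_{\alpha,j,c}, x)}{\partial \alpha} \right| \mathrm{d}\alpha ,
\qquad \theta_{\alpha,j,c} = \theta_{0,j,c} + (\theta_{1,j,c} - \theta_{0,j,c})\,\alpha .

% Assumed discrete approximation over S steps, s = 0, 1, \dots, S
P_{j,c} \approx \sum_{s=1}^{S} \left| L\bigl(\theta_{s/S,\,j,c}, x\bigr) - L\bigl(\theta_{(s-1)/S,\,j,c}, x\bigr) \right| .

% Second contribution: average of the first contributions over the N samples
Q_{j,c} = \frac{1}{N} \sum_{n=1}^{N} P_{j,c}(x_n) .
```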
In one possible implementation, the image processing model may include an image quality enhancement model. The sample image in the sample may be an image with relatively low image quality, and the reference image in the sample may be an image with relatively high image quality corresponding to the sample image.
Specifically, the image quality enhancement model may use a model structure with a global residual, and may be used to enhance a low-quality image so as to obtain a high-quality image. The image quality enhancement model may differ between scenes, as may the sample image and the reference image in the corresponding sample. For example, referring to fig. 5, for an image quality enhancement scene at the same resolution, the sample image and the features generated by the backbone network may be added element-wise and output directly as the processing result image; in this case, if the resolution of the sample image is 720P, the resolutions of the reference image and the processing result image are also 720P. For a super-resolution image quality enhancement scene, the sample image may first be resized, and the resized sample image and the features generated by the backbone network may be added element-wise and output as the processing result image; in this case, if the resolution of the sample image is 720P, the resolutions of the reference image and the processing result image may be 1440P.
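A minimal PyTorch-style sketch of a global-residual enhancement model of this kind; the backbone depth, channel width, and the bilinear resize used for the super-resolution case are assumptions:

```python
# Hypothetical sketch of an image quality enhancement model with a global residual:
# the backbone predicts a correction that is added element-wise to the (optionally
# resized) sample image and output as the processing result image.
import torch.nn as nn
import torch.nn.functional as F

class GlobalResidualEnhancer(nn.Module):
    def __init__(self, channels=3, width=32, scale=1):
        super().__init__()
        self.scale = scale  # 1: same-resolution enhancement; 2: e.g. 720P -> 1440P
        self.backbone = nn.Sequential(
            nn.Conv2d(channels, width, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(width, width, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(width, channels, 3, padding=1),
        )

    def forward(self, x):
        base = x if self.scale == 1 else F.interpolate(
            x, scale_factor=self.scale, mode="bilinear", align_corners=False)
        return base + self.backbone(base)   # global residual: element-wise addition
```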
When the image processing model is an image quality enhancement model, in step S400 the effect of a channel parameter variation of a single channel on the enhancement result may be judged from the first contribution degree. A larger first contribution degree means the enhancement result is more sensitive to changes of the channel parameters of the target channel; a smaller first contribution degree means the enhancement result is less sensitive to such changes, i.e., whether or not the channel parameters of the target channel change has little influence on the enhancement effect. Therefore, target channels with smaller first contribution degrees can be removed, which reduces the number of parameters and the amount of computation of the image quality enhancement model without noticeably affecting its enhancement effect.
In a possible implementation manner, in step S400, when performing pruning processing on the channels of the image processing model according to the first contribution degrees of the multiple target channels to obtain a target image processing model, after each convolution layer completes channel pruning processing, a reconstructed image processing model may be obtained, a test is performed on the reconstructed image processing model, and the reconstructed image processing model is adjusted according to a test result to obtain the target image processing model.
In this embodiment, after the channel pruning processing is completed for each convolution layer, the reconstructed image processing model may be obtained. At this point, the reconstructed image processing model may be tested on a test data set to determine whether its performance shows a loss relative to the image processing model before pruning; if there is a loss, the reconstructed image processing model may be further trained and fine-tuned, so as to obtain the target image processing model. The test data set may be the evaluation data set obtained in step S200.
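A brief sketch of this test-and-fine-tune step; the L2-based evaluation metric, the tolerance, and the optimizer settings are assumptions chosen only to illustrate the flow:

```python
# Hypothetical sketch: compare the reconstructed (pruned) model with the unpruned model
# on a test data set and fine-tune the pruned model if its quality has dropped.
import torch
import torch.nn.functional as F

def mean_l2_error(model, test_samples):
    model.eval()
    with torch.no_grad():
        errs = [torch.linalg.vector_norm(model(x.unsqueeze(0)).squeeze(0) - x_hq).item()
                for x, x_hq in test_samples]
    return sum(errs) / len(errs)

def finetune_if_degraded(pruned, original, train_samples, test_samples,
                         tolerance=1e-3, epochs=1, lr=1e-4):
    if mean_l2_error(pruned, test_samples) <= mean_l2_error(original, test_samples) + tolerance:
        return pruned                            # no measurable loss: keep as-is
    optimizer = torch.optim.Adam(pruned.parameters(), lr=lr)
    pruned.train()
    for _ in range(epochs):
        for x, x_hq in train_samples:
            optimizer.zero_grad()
            loss = F.mse_loss(pruned(x.unsqueeze(0)).squeeze(0), x_hq)
            loss.backward()
            optimizer.step()
    return pruned
```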
Based on the same inventive concept, this embodiment further provides an electronic device 700; referring to fig. 6, fig. 6 illustrates a block diagram of the electronic device 700. The electronic device 700 includes a pruning device 730 of the image processing model, a computer-readable storage medium 720, and a processor 710.
The computer-readable storage medium 720 and the processor 710 are electrically connected to each other, directly or indirectly, to enable data transmission or interaction; for example, these components may be electrically connected to each other through one or more communication buses or signal lines. The pruning device 730 of the image processing model includes a plurality of software functional modules that may be stored in the computer-readable storage medium 720 in the form of software or firmware, or embedded in the operating system (OS) of the electronic device 700. The processor 710 is configured to execute the executable modules stored in the computer-readable storage medium 720, such as the software functional modules and computer programs included in the pruning device 730.
The computer-readable storage medium 720 may be, but is not limited to, a random access memory (RAM), a read-only memory (ROM), a programmable read-only memory (PROM), an erasable programmable read-only memory (EPROM), an electrically erasable programmable read-only memory (EEPROM), or the like. The computer-readable storage medium 720 is configured to store a program, and the processor 710 executes the program after receiving an execution instruction.
The processor 710 may be an integrated circuit chip with signal processing capability. The processor 710 may be a general-purpose processor, including a central processing unit (CPU), a network processor (NP), and the like; it may also be a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or another programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component, capable of implementing or executing the methods, steps, and logic blocks disclosed in the embodiments of the present application. The general-purpose processor may be a microprocessor, or the processor 710 may be any conventional processor, or the like.
Referring to fig. 7, a pruning device 730 of an image processing model is further provided in the embodiment of the present application. The pruning device 730 includes a plurality of functional modules that may be stored in the form of software in the computer-readable storage medium 720. Divided by function, the pruning device 730 may include a receiving module 731, a processing module 732, a determining module 733, and a pruning processing module 734, wherein:
the receiving module 731 may be configured to obtain a sample, where the sample includes a sample image and a reference image corresponding to the sample image.
In this embodiment, the receiving module 731 may be used to perform step S100 shown in fig. 1, and for a specific description of the receiving module 731, reference may be made to the description of step S100.
The processing module 732 may be configured to determine a target channel from a plurality of channels of a plurality of convolution layers of a trained image processing model, and obtain a plurality of processing result images obtained by processing the sample image by the image processing model when a plurality of different channel parameters are applied to the target channel.
In this embodiment, the processing module 732 may be used to perform step S200 shown in fig. 1, and for a specific description of the processing module 732, reference may be made to the description of step S200.
The determining module 733 may be configured to determine a first contribution of the target channel according to a degree of difference between a plurality of the processing result images and the reference image.
In this embodiment, the determining module 733 may be used to perform step S300 shown in fig. 1, and for a specific description of the determining module 733, reference may be made to a description of the step S300.
The pruning processing module 734 may be configured to prune the channels of the image processing model according to the first contribution degrees of the multiple target channels to obtain a target image processing model.
In this embodiment, the pruning processing module 734 may be configured to perform step S400 shown in fig. 1, and for a specific description of the pruning processing module 734, reference may be made to the description of step S400.
In summary, with the pruning method and device for an image processing model and the electronic device provided by the embodiments of the present application, the importance of each channel to the image processing model can be determined by calculating the first contribution degree of each channel, so that channels in the image processing model can be pruned to obtain the target image processing model. In this way, the image processing effect of the image processing model can be preserved while the number of parameters and the amount of computation of the image processing model are significantly reduced.
The foregoing description is only of the preferred embodiments of the present application and is not intended to limit the same, but rather, various modifications and variations may be made by those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principles of the present application should be included in the protection scope of the present application.
It will be evident to those skilled in the art that the present application is not limited to the details of the foregoing illustrative embodiments, and that the present application may be embodied in other specific forms without departing from the spirit or essential characteristics thereof. The present embodiments are, therefore, to be considered in all respects as illustrative and not restrictive, the scope of the application being indicated by the appended claims rather than by the foregoing description, and all changes which come within the meaning and range of equivalency of the claims are therefore intended to be embraced therein. Any reference sign in a claim should not be construed as limiting the claim concerned.

Claims (10)

1. A pruning method of an image processing model, the method comprising:
acquiring a sample, wherein the sample comprises a sample image and a reference image corresponding to the sample image;
determining a target channel in a plurality of channels of a plurality of convolution layers of a trained image processing model, and acquiring a plurality of processing result images obtained by processing the sample image by the image processing model under the condition that a plurality of different channel parameters are applied to the target channel;
determining a first contribution degree of the target channel according to the difference degrees between the processing result images and the reference image;
and pruning the channels of the image processing model according to the first contribution degrees of the target channels to obtain a target image processing model.
2. The pruning method of an image processing model according to claim 1, wherein the step of pruning the channels of the image processing model according to the first contribution degrees of the plurality of target channels includes:
for each target channel, acquiring an average value of a plurality of first contribution degrees determined by the target channel based on a plurality of different samples as a second contribution degree of the target channel;
and pruning the channels of the image processing model according to the second contribution degrees of the target channels.
3. The pruning method of an image processing model according to claim 2, wherein the step of pruning the channels of the image processing model according to the first contribution degrees of the plurality of target channels to obtain a target image processing model includes:
pruning the channels of the image processing model according to the second contribution degrees of the target channels for each convolution layer;
and after each convolution layer completes channel pruning processing, obtaining the target image processing model.
4. The pruning method of an image processing model according to claim 2, wherein the step of pruning the channels of the image processing model according to the first contribution degrees of the plurality of target channels to obtain a target image processing model includes:
for each convolution layer, sorting the plurality of target channels according to a second contribution of the plurality of target channels;
and pruning the channels of the image processing model according to the sorting of the target channels and the preset channel removal proportion to obtain the target image processing model.
5. The pruning method of an image processing model according to claim 1, wherein the degree of difference L(θ, x) between the processing result image and the reference image is calculated by:
L(θ, x) = ||y - x_hq||_2 = ||A(θ, x) - x_hq||_2
where x represents the sample image, x_hq represents the reference image corresponding to the sample image, y represents the processing result image, A represents the trained image processing model, θ represents the model parameters corresponding to the image processing model, and ||·||_2 represents the L2 norm; and
the first contribution P_{j,c} of the c-th channel of the j-th convolution layer is calculated over S discrete steps, where θ_{α,j,c} represents the channel parameter of the c-th channel of the j-th convolution layer and S represents the number of discrete steps, s = 0, 1, 2, …, S.
6. The pruning method of an image processing model according to claim 1, wherein the image processing model includes an image quality enhancement model; the sample image in the sample is an image with relatively low image quality, and the reference image in the sample is an image with relatively high image quality corresponding to the sample image.
7. The pruning method of an image processing model according to claim 1, wherein the step of pruning the channels of the image processing model according to the first contribution degrees of the plurality of target channels to obtain a target image processing model includes:
after the channel pruning processing is completed on each convolution layer, a reconstructed image processing model is obtained, the reconstructed image processing model is tested, and the reconstructed image processing model is adjusted according to a test result to obtain the target image processing model.
8. A pruning device of an image processing model, comprising:
a receiving module, configured to obtain a sample, where the sample includes a sample image and a reference image corresponding to the sample image;
the processing module is used for determining a target channel in a plurality of channels of a plurality of convolution layers of the trained image processing model and acquiring a plurality of processing result images obtained by processing the sample image by the image processing model under the condition that a plurality of different channel parameters are applied to the target channel;
a determining module, configured to determine a first contribution of the target channel according to a degree of difference between a plurality of processing result images and the reference image;
and the pruning processing module is used for pruning the channels of the image processing model according to the first contribution degrees of the target channels to obtain a target image processing model.
9. An electronic device, comprising:
a memory for storing one or more programs;
a processor, wherein the method of any one of claims 1-7 is implemented when the one or more programs are executed by the processor.
10. A computer readable storage medium, having stored thereon a computer program, characterized in that the computer program, when executed by a processor, implements the method according to any of claims 1-7.
CN202311598978.8A 2023-11-27 2023-11-27 Pruning method and device of image processing model and electronic equipment Pending CN117437521A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311598978.8A CN117437521A (en) 2023-11-27 2023-11-27 Pruning method and device of image processing model and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202311598978.8A CN117437521A (en) 2023-11-27 2023-11-27 Pruning method and device of image processing model and electronic equipment

Publications (1)

Publication Number Publication Date
CN117437521A true CN117437521A (en) 2024-01-23

Family

ID=89548072

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311598978.8A Pending CN117437521A (en) 2023-11-27 2023-11-27 Pruning method and device of image processing model and electronic equipment

Country Status (1)

Country Link
CN (1) CN117437521A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117808072A (en) * 2024-02-28 2024-04-02 苏州元脑智能科技有限公司 Model pruning method, image processing method, device, equipment and medium
CN117808072B (en) * 2024-02-28 2024-05-14 苏州元脑智能科技有限公司 Model pruning method, image processing method, device, equipment and medium

Similar Documents

Publication Publication Date Title
CN109977191B (en) Problem map detection method, device, electronic equipment and medium
CN108228761B (en) Image retrieval method and device supporting region customization, equipment and medium
CN110705718A (en) Model interpretation method and device based on cooperative game and electronic equipment
CN117437521A (en) Pruning method and device of image processing model and electronic equipment
CN112348765A (en) Data enhancement method and device, computer readable storage medium and terminal equipment
CN114155244B (en) Defect detection method, device, equipment and storage medium
US20230214989A1 (en) Defect detection method, electronic device and readable storage medium
CN111027450A (en) Bank card information identification method and device, computer equipment and storage medium
CN111523558A (en) Ship shielding detection method and device based on electronic purse net and electronic equipment
CN112233077A (en) Image analysis method, device, equipment and storage medium
CN112767354A (en) Defect detection method, device and equipment based on image segmentation and storage medium
CN112465050B (en) Image template selection method, device, equipment and storage medium
CN112329810B (en) Image recognition model training method and device based on significance detection
CN113723467A (en) Sample collection method, device and equipment for defect detection
CN116309612B (en) Semiconductor silicon wafer detection method, device and medium based on frequency decoupling supervision
CN110210314B (en) Face detection method, device, computer equipment and storage medium
CN111476144A (en) Pedestrian attribute identification model determination method and device and computer readable storage medium
CN111179245A (en) Image quality detection method, device, electronic equipment and storage medium
US20030185432A1 (en) Method and system for image registration based on hierarchical object modeling
CN115620083A (en) Model training method, face image quality evaluation method, device and medium
CN115374517A (en) Testing method and device for wiring software, electronic equipment and storage medium
CN114677504A (en) Target detection method, device, equipment terminal and readable storage medium
CN113139617A (en) Power transmission line autonomous positioning method and device and terminal equipment
CN112528500A (en) Evaluation method and evaluation equipment for scene graph construction model
US20230401670A1 (en) Multi-scale autoencoder generation method, electronic device and readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination