CN106228556B - Image quality analysis method and device - Google Patents


Info

Publication number
CN106228556B
Authority
CN
China
Prior art keywords
image
preset
neural network
gradient map
gradient
Prior art date
Legal status
Active
Application number
CN201610587219.5A
Other languages
Chinese (zh)
Other versions
CN106228556A (en)
Inventor
龙飞
杨松
陈志军
Current Assignee
Beijing Xiaomi Mobile Software Co Ltd
Original Assignee
Beijing Xiaomi Mobile Software Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Xiaomi Mobile Software Co Ltd
Priority to CN201610587219.5A
Publication of CN106228556A
Application granted
Publication of CN106228556B
Legal status: Active


Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/20 - Special algorithmic details
    • G06T2207/20081 - Training; Learning
    • G06T2207/30 - Subject of image; Context of image processing
    • G06T2207/30168 - Image quality inspection

Landscapes

  • Image Analysis (AREA)

Abstract

The disclosure relates to an image quality analysis method and device, and belongs to the technical field of image processing. The method comprises the following steps: performing gradient extraction on a picture to be processed to obtain a gradient map, and inputting the gradient map into a preset convolutional neural network; processing edge features of the gradient map according to a plurality of convolution kernels trained in advance in the preset convolutional neural network; and determining the image quality of the picture to be processed according to the degree of matching between the edge features of the gradient map and preset image edge features. Because the image quality of the picture to be processed is analyzed by the convolutional neural network, both the accuracy and the efficiency of determining the image quality are improved.

Description

Image quality analysis method and device
Technical Field
The present disclosure relates to the field of image processing technologies, and in particular to an image quality analysis method and apparatus.
Background
With the popularization of terminal devices such as smartphones, people have become accustomed to recording everyday moments by taking pictures with these devices. Some of the pictures taken may be blurred and therefore of low quality. To save storage on the terminal device, such low-quality pictures need to be detected so that they can be handled accordingly, for example by deletion.
Disclosure of Invention
To overcome the problems in the related art, embodiments of the present disclosure provide an image quality analysis method and apparatus. The technical solution is as follows:
According to a first aspect of embodiments of the present disclosure, there is provided an image quality analysis method, including:
performing gradient extraction on a picture to be processed to obtain a gradient map, and inputting the gradient map into a preset convolutional neural network;
processing edge features of the gradient map according to a plurality of convolution kernels trained in advance in the preset convolutional neural network;
and determining the image quality of the picture to be processed according to the degree of matching between the edge features of the gradient map and preset image edge features.
In the method described above, determining the image quality of the picture to be processed according to the degree of matching between the edge features of the gradient map and the preset image edge features includes:
when the edge features of the gradient map match preset sharp-image edge features, determining that the image quality of the picture to be processed is sharp;
and when the edge features of the gradient map match preset blurred-image edge features, determining that the image quality of the picture to be processed is blurred.
The method as described above, the inputting the gradient map into a preset convolutional neural network, comprising:
Normalizing the pixels in the gradient map to obtain a gray scale map corresponding to the gradient map;
And inputting the gray scale map into the preset convolution neural network.
The method as described above, further comprising:
carrying out gradient extraction on the sample picture to obtain a sample gradient map;
and inputting the sample gradient map into the preset convolutional neural network, and training a plurality of preset convolution kernels in the preset convolutional neural network to classify the edge characteristics of the blurred image and the edge characteristics of the sharp image.
In the method described above, the preset convolutional neural network comprises one convolutional layer and a plurality of convolution kernels.
According to a second aspect of the embodiments of the present disclosure, there is provided an image quality analysis apparatus, the apparatus including:
the first extraction module is used for carrying out gradient extraction on the picture to be processed to obtain a gradient map;
The input module is used for inputting the gradient map into a preset convolutional neural network;
The processing module is used for processing the edge features of the gradient map according to a plurality of convolution kernels trained in advance in the preset convolutional neural network;
And the determining module is used for determining the image quality of the picture to be processed according to the matching degree between the edge feature of the gradient map and the edge feature of a preset image.
In the apparatus described above, the determining module is configured to:
when the edge features of the gradient map match preset sharp-image edge features, determine that the image quality of the picture to be processed is sharp;
and when the edge features of the gradient map match preset blurred-image edge features, determine that the image quality of the picture to be processed is blurred.
The apparatus as described above, the input module to:
Normalizing the pixels in the gradient map to obtain a gray scale map corresponding to the gradient map;
and inputting the gray scale map into the preset convolution neural network.
The apparatus as described above, further comprising:
the second extraction module is used for carrying out gradient extraction on the sample picture to obtain a sample gradient image;
And the training module is used for inputting the sample gradient map into the preset convolutional neural network and training a plurality of preset convolutional kernels in the preset convolutional neural network so as to classify the edge characteristics of the blurred image and the edge characteristics of the clear image.
In the apparatus described above, the preset convolutional neural network comprises one convolutional layer and a plurality of convolution kernels.
According to a third aspect of the embodiments of the present disclosure, there is provided another image quality analysis apparatus, the apparatus including:
A processor;
A memory for storing processor-executable instructions;
Wherein the processor is configured to:
gradient extraction is carried out on a picture to be processed to obtain a gradient map, and the gradient map is input into a preset convolutional neural network;
processing the edge features of the gradient map according to a plurality of convolution kernels trained in advance in the preset convolutional neural network;
And determining the image quality of the picture to be processed according to the matching degree between the edge features of the gradient map and the edge features of a preset image.
The technical solution provided by the embodiments of the present disclosure can have the following beneficial effects:
Gradient extraction is performed on the picture to be processed to obtain a gradient map, the gradient map is input into a preset convolutional neural network, the edge features of the gradient map are processed according to a plurality of convolution kernels trained in advance in the network, and the image quality of the picture to be processed is determined according to the degree of matching between the edge features of the gradient map and the preset image edge features. Because the image quality of the picture to be processed is analyzed by the preset convolutional neural network, both the accuracy and the efficiency of determining the image quality are improved.
it is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and, together with the description, serve to explain the principles of the disclosure.
FIG. 1 is a flow diagram illustrating a method of image quality analysis in accordance with an exemplary embodiment;
FIG. 2 is a diagram illustrating an example method for image recognition based on a deep learning technique, according to an example embodiment;
FIG. 3 is a flow chart illustrating a method of image quality analysis according to another exemplary embodiment of the present disclosure;
FIG. 4 is a block diagram illustrating an image quality analysis apparatus according to an exemplary embodiment;
Fig. 5 is a block diagram illustrating an image quality analyzing apparatus according to another exemplary embodiment; and
Fig. 6 is a block diagram illustrating an image analysis apparatus according to still another exemplary embodiment.
with the foregoing drawings in mind, certain embodiments of the disclosure have been shown and described in more detail below. These drawings and written description are not intended to limit the scope of the disclosed concepts in any way, but rather to illustrate the concepts of the disclosure to those skilled in the art by reference to specific embodiments.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The implementations described in the exemplary embodiments below are not intended to represent all implementations consistent with the present disclosure. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the present disclosure, as detailed in the appended claims.
An image quality analysis method and apparatus according to embodiments of the present disclosure are described below with reference to the drawings.
Fig. 1 is a flow chart illustrating a method of image quality analysis according to an exemplary embodiment.
As shown in fig. 1, the image quality analysis method may include the following steps:
In step S110, gradient extraction is performed on the picture to be processed to obtain a gradient map, and the gradient map is input into a preset convolutional neural network.
In step S120, the edge features of the gradient map are processed according to a plurality of convolution kernels trained in advance in the preset convolutional neural network.
At present, convolutional neural networks offer high recognition efficiency and invariance to rotation and scaling, among other advantages. They do not require complex pre-processing of the picture to be processed: the original picture can be input directly. Because of this strong practicality, convolutional neural networks are increasingly widely applied in the field of image recognition.
Generally, a convolutional neural network has a multi-layer structure: one input layer, followed by several convolutional layers and down-sampling layers, and finally an output layer. The input layer receives the picture to be processed. Each convolutional layer comprises a plurality of feature maps of the same size, and each pixel of a feature map corresponds to a set of pixels in the windows at the corresponding positions of specified feature maps of the previous layer. Each down-sampling layer likewise comprises a plurality of feature maps of the same size; each feature map of a down-sampling layer corresponds to one feature map of the preceding convolutional layer, and each of its pixels corresponds to a sampling area of that feature map.
That is to say, the convolutional neural network performs image recognition based on deep learning. A distinctive feature of deep learning is that, during recognition, image features of the picture to be processed can be extracted at every level, from the bottom layers (edges and the like) to the upper layers (specific shapes); the more layers the convolutional neural network has, the higher-level the extracted image features become.
As shown in fig. 2, when performing face recognition on a picture to be processed, the convolutional neural network extracts edge features of the picture at the bottom layer, then extracts parts of facial organs and local textures at the middle layers, and finally extracts the image features of the whole face.
However, the image quality analysis method of the embodiment of the present disclosure uses the preset convolutional neural network to identify the image quality of the picture to be processed, and image quality depends on the bottom-layer edge features of the picture. Therefore, in the embodiment of the present disclosure, the preset convolutional neural network may comprise only one convolutional layer but a plurality of convolution kernels (e.g., 30), and image quality is analyzed by this preset convolutional neural network. In this embodiment, the convolution kernels serve to classify the edge features of blurred and sharp images.
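The single-convolutional-layer forward pass described above can be sketched as follows. This is a minimal illustrative sketch, not the patented implementation: the random kernel values and the 32 × 32 input size are assumptions (in the patent the kernels would be obtained by training), while the kernel count of 30 and the 7 × 7 size are taken from the examples in the text.

```python
import numpy as np

def conv_layer(gradient_map, kernels):
    """'Valid' 2-D cross-correlation of the gradient map with each kernel,
    producing one feature map per kernel (one layer, many kernels)."""
    kh, kw = kernels.shape[1:]
    h, w = gradient_map.shape
    out_h, out_w = h - kh + 1, w - kw + 1
    maps = np.empty((len(kernels), out_h, out_w))
    for k, kern in enumerate(kernels):
        for i in range(out_h):
            for j in range(out_w):
                maps[k, i, j] = np.sum(gradient_map[i:i + kh, j:j + kw] * kern)
    return maps

rng = np.random.default_rng(0)
kernels = rng.standard_normal((30, 7, 7))        # 30 kernels of 7x7
feature_maps = conv_layer(rng.standard_normal((32, 32)), kernels)
print(feature_maps.shape)                        # (30, 26, 26)
```

Each of the 30 feature maps is the response of one kernel across the gradient map; a classifier over these responses can then separate blurred from sharp edge features.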
In practical application, to improve the efficiency with which the convolutional neural network identifies image quality, gradient extraction may be performed on the picture to be processed in advance, for example with the Sobel edge detection algorithm. The gradient map, rather than the original picture, is then used as the input of the convolutional neural network, so that the relevant features of the picture to be processed can be extracted directly from the edge features according to the plurality of convolution kernels trained in advance in the network.
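The pre-extraction step can be sketched as below: an illustrative numpy implementation of the Sobel operator, assuming a 2-D grayscale input and zero padding at the borders. It is not code from the patent, which names the algorithm but gives no implementation.

```python
import numpy as np

def sobel_gradient_map(img):
    """Return the Sobel gradient magnitude of a 2-D grayscale image."""
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)  # horizontal
    ky = kx.T                                                          # vertical
    padded = np.pad(img.astype(float), 1)   # zero padding keeps the output size
    h, w = img.shape
    gx = np.zeros((h, w))
    gy = np.zeros((h, w))
    for i in range(h):
        for j in range(w):
            win = padded[i:i + 3, j:j + 3]
            gx[i, j] = np.sum(win * kx)
            gy[i, j] = np.sum(win * ky)
    return np.hypot(gx, gy)                 # gradient magnitude
```

On a flat region the response is zero, while a step edge produces a strong response, which is why the gradient map isolates exactly the edge information the network needs.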
In step S130, the image quality of the picture to be processed is determined according to the degree of matching between the edge features of the gradient map and the preset image edge features.
It can be understood that blurred-image edge features and sharp-image edge features are preset in the convolutional neural network. The edge features of the gradient map are then classified by the plurality of convolution kernels trained in advance in the preset convolutional neural network, so that the image quality of the picture to be processed can be determined from the classification result.
In one embodiment of the disclosure, if the preset convolutional neural network identifies that the edge features of the gradient map match the blurred-image edge features preset in the network, the pre-trained convolution kernels classify the edge features of the gradient map as blurred-quality edge features, and the image quality of the picture to be processed is thereby determined to be blurred.
In another embodiment of the present disclosure, if the preset convolutional neural network identifies that the edge features of the gradient map match the preset sharp-image edge features, the pre-trained convolution kernels classify the edge features of the gradient map as sharp-quality edge features, and the image quality of the picture to be processed is thereby determined to be sharp.
In one embodiment, a threshold may be preset. When the degree of matching between the edge features of the gradient map and the preset image edge features is smaller than the threshold, it is determined that the edge features of the gradient map do not match the preset image edge features; when the degree of matching is greater than or equal to the threshold, it is determined that they match.
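The threshold rule just described can be sketched as a small decision function. The threshold value of 0.5, the function name, and the "undetermined" fallback are assumptions for illustration; the patent states only that a threshold is preset and that a degree of matching at or above it counts as a match.

```python
def image_quality(sharp_match, blur_match, threshold=0.5):
    """Decide quality from match degrees against the preset sharp/blurred
    edge features; a degree >= threshold counts as a match."""
    if sharp_match >= threshold and sharp_match >= blur_match:
        return "sharp"
    if blur_match >= threshold:
        return "blurred"
    return "undetermined"   # neither preset feature matched

print(image_quality(0.8, 0.1))  # sharp
print(image_quality(0.2, 0.9))  # blurred
```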
To sum up, in the image quality analysis method of the embodiment of the present disclosure, gradient extraction is performed on the picture to be processed to obtain a gradient map, the gradient map is input into a preset convolutional neural network, the edge features of the gradient map are processed according to a plurality of convolution kernels trained in advance in the network, and the image quality of the picture to be processed is determined according to the degree of matching between the edge features of the gradient map and the preset image edge features. Because the image quality is analyzed by the convolutional neural network, both the accuracy and the efficiency of determining the image quality of the picture to be processed are improved.
Based on the above embodiments, it should be noted that after the convolutional neural network with one convolutional layer and a plurality of convolution kernels is built, it needs to be trained: by repeatedly stimulating the network with samples, the values of the preset convolution kernels and other parameters are continuously adjusted, yielding a trained convolutional neural network that can accurately identify image quality.
The image quality analysis method is described below together with the training of the convolutional neural network:
Fig. 3 is a flowchart illustrating an image quality analysis method according to another exemplary embodiment of the present disclosure. The image quality analysis method comprises the following steps:
In step S310, gradient extraction is performed on the sample picture to obtain a sample gradient map.
in step S320, the sample gradient map is input into a preset convolutional neural network, and a plurality of convolutional kernels preset in the preset convolutional neural network are trained to classify the blurred image edge feature and the sharp image edge feature.
For example, gradient extraction may be performed on the sample picture, and the sample gradient map may be input into the preset convolutional neural network, so that the values of the convolution kernels are continuously revised by training the plurality of convolution kernels preset in the network. Through this training, a plurality of converged convolution kernels can be obtained, for example a plurality of 7 × 7 convolution kernels, which are used to accurately classify blurred-image edge features and sharp-image edge features.
It should be emphasized that the training of the plurality of preset convolution kernels on sample pictures may be performed many times, with different sample pictures each time; the more training rounds, the more accurately the trained convolution kernels classify blurred-image and sharp-image edge features.
For example, after the training of the plurality of preset convolution kernels is completed, a picture to be processed or a sample picture only needs to be input into the trained convolutional neural network. The pre-trained convolution kernels then perform a binary classification, dividing pictures into those of blurred image quality and those of sharp image quality, so that the image quality of the picture to be processed can be determined from the classification result.
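The iterative training described above can be illustrated in a much-reduced form. The patent trains the convolution kernels themselves by repeated exposure to sample gradient maps; the sketch below deliberately swaps in a simpler stand-in, plain gradient descent on a logistic classifier over a single synthetic edge-strength feature, only to show how repeated parameter updates converge toward a blurred/sharp decision boundary. All numbers (feature means, learning rate, iteration count) are invented for the illustration.

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic stand-in features: one edge-strength number per sample picture,
# drawn so that sharp samples score higher than blurred ones (invented values).
sharp = rng.normal(5.0, 1.0, 100)
blurred = rng.normal(1.0, 1.0, 100)
x = np.concatenate([sharp, blurred])
y = np.concatenate([np.ones(100), np.zeros(100)])  # 1 = sharp, 0 = blurred

# Plain gradient descent on the logistic loss: the analogue of repeatedly
# "stimulating" the network with samples until the parameters converge.
w, b, lr = 0.0, 0.0, 0.1
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(w * x + b)))
    w -= lr * np.mean((p - y) * x)
    b -= lr * np.mean(p - y)

pred = (1.0 / (1.0 + np.exp(-(w * x + b)))) >= 0.5
accuracy = float(np.mean(pred == y))
```

After the loop, the learned boundary separates the two synthetic classes with high accuracy, mirroring how the trained kernels come to separate blurred from sharp edge features.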
In step S330, gradient extraction is performed on the picture to be processed to obtain a gradient map, and the pixels in the gradient map are normalized to obtain a gray scale map corresponding to the gradient map.
In step S340, the gray map is input to a preset convolutional neural network.
In step S350, the edge features of the gradient map are processed according to a plurality of convolution kernels trained in advance in a preset convolution neural network.
Specifically, in an embodiment of the present disclosure, when image quality analysis is performed on a picture to be processed, gradient extraction may first be performed on it to obtain a gradient map. The pixels in the gradient map are then normalized, with pixel values mapped into the range 0 to 255, to obtain a gray scale map corresponding to the gradient map. The gray scale map is input into the preset convolutional neural network, so that the relevant features of the picture to be processed can be extracted directly from the gray scale map according to the plurality of convolution kernels trained in advance in the network.
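The normalization into a 0-255 gray scale map can be sketched as follows. Linear min-max rescaling is an assumption here; the patent specifies only that pixel values are normalized into the range 0 to 255.

```python
import numpy as np

def gradient_to_gray(gradient_map):
    """Linearly rescale gradient values into 0-255 and return a uint8 gray map."""
    g = np.asarray(gradient_map, dtype=float)
    lo, hi = g.min(), g.max()
    if hi == lo:                      # constant map: avoid division by zero
        return np.zeros_like(g, dtype=np.uint8)
    return ((g - lo) / (hi - lo) * 255.0).astype(np.uint8)
```

The resulting gray map is then fed to the network in place of the raw gradient map.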
In step S360, if the edge features of the gradient map match the preset blurred-image edge features, it is determined that the image quality of the picture to be processed is blurred.
In step S370, if the edge features of the gradient map match the preset sharp-image edge features, it is determined that the image quality of the picture to be processed is sharp.
In one embodiment of the present disclosure, if the edge features of the gradient map match the preset blurred-image edge features, the plurality of pre-trained convolution kernels classify them as blurred-quality edge features, and the image quality of the picture to be processed is thereby determined to be blurred.
If the edge features of the gradient map match the preset sharp-image edge features, the plurality of pre-trained convolution kernels classify them as sharp-quality edge features, and the image quality of the picture to be processed is thereby determined to be sharp.
In summary, in the image quality analysis method of the embodiment of the present disclosure, the gradients of sample pictures are used to train a plurality of convolution kernels preset in the preset convolutional neural network, so that blurred-image and sharp-image edge features can be classified by the plurality of convolution kernels. Because the image quality of the picture to be processed is analyzed by the convolutional neural network, both the accuracy and the efficiency of determining the image quality are improved.
The following are embodiments of the image quality analysis apparatus of the present disclosure, which may be configured to perform the embodiments of the image quality analysis method of the present disclosure. For details not disclosed in the apparatus embodiments, please refer to the method embodiments of the present disclosure.
Fig. 4 is a block diagram illustrating an image quality analysis apparatus according to an exemplary embodiment.
As shown in fig. 4, the image quality analyzing apparatus includes:
The first extraction module 410 is configured to perform gradient extraction on the picture to be processed to obtain a gradient map.
an input module 420 configured to input the gradient map into a preset convolutional neural network.
And the processing module 430 is configured to process the edge features of the gradient map according to a plurality of convolution kernels trained in advance in a preset convolution neural network.
The determining module 440 is configured to determine the image quality of the picture to be processed according to the degree of matching between the edge features of the gradient map and the preset image edge features.
In practical application, in order to improve the efficiency with which the convolutional neural network identifies image quality, the first extraction module 410 may perform gradient extraction on the picture to be processed in advance, for example using the Sobel edge detection algorithm. The input module 420 then uses the gradient map, rather than the original picture, as the input of the convolutional neural network, so that the processing module 430 can extract the relevant features of the picture to be processed directly from the edge features according to the plurality of convolution kernels trained in advance in the network.
In an embodiment of the present disclosure, if the convolutional neural network identifies that the edge features of the gradient map match the blurred-image edge features preset in the network, the processing module 430 classifies the edge features of the gradient map as blurred-quality edge features through the plurality of pre-trained convolution kernels, so that the determining module 440 determines that the image quality of the picture to be processed is blurred.
In another embodiment of the present disclosure, if the preset convolutional neural network identifies that the edge features of the gradient map match the preset sharp-image edge features, the processing module 430 classifies the edge features of the gradient map as sharp-quality edge features through the plurality of pre-trained convolution kernels, so that the determining module 440 determines that the image quality of the picture to be processed is sharp.
To sum up, the image quality analysis apparatus of the embodiment of the present disclosure performs gradient extraction on the picture to be processed to obtain a gradient map, inputs the gradient map into a preset convolutional neural network, processes the edge features of the gradient map according to a plurality of convolution kernels trained in advance in the network, and determines the image quality of the picture to be processed according to the degree of matching between the edge features of the gradient map and the preset image edge features. Because the image quality is analyzed by the convolutional neural network, both the accuracy and the efficiency of determining the image quality of the picture to be processed are improved.
Based on the above embodiments, it should be noted that after the convolutional neural network with one convolutional layer and a plurality of convolution kernels is built, it needs to be trained: by repeatedly stimulating the network with samples, the values of the preset convolution kernels and other parameters are continuously adjusted, yielding a trained convolutional neural network that can accurately identify image quality.
The image quality analysis apparatus is described below together with the training of the convolutional neural network:
Fig. 5 is a block diagram illustrating an image quality analysis apparatus according to another exemplary embodiment. Building on the apparatus shown in fig. 4, the image analysis apparatus further includes:
And a second extraction module 450 configured to perform gradient extraction on the sample picture to obtain a sample gradient map.
and a training module 460 configured to input the sample gradient map into a preset convolutional neural network, train a plurality of preset convolution kernels in the convolutional neural network, and classify the blurred image edge feature and the sharp image edge feature.
Specifically, the second extraction module 450 may perform gradient extraction on the sample picture, and the training module 460 inputs the sample gradient map into the preset convolutional neural network so as to continuously revise the values of the convolution kernels by training the plurality of convolution kernels preset in the network. Through this training, a plurality of converged convolution kernels can be obtained, for example a plurality of 7 × 7 convolution kernels, which accurately classify blurred-image and sharp-image edge features.
Further, after the training of the plurality of preset convolution kernels is completed, the input module 420 only needs to input the gradient map extracted by the first extraction module 410 from a picture to be processed or a sample picture into the trained convolutional neural network. The processing module 430 can then perform a binary classification through the plurality of pre-trained convolution kernels, dividing pictures into those of blurred image quality and those of sharp image quality, so that the determining module 440 can determine the image quality of the picture to be processed from the classification result.
Specifically, in an embodiment of the present disclosure, when image quality analysis is performed on a picture to be processed, the first extraction module 410 may perform gradient extraction on the picture to obtain a gradient map; the input module 420 normalizes the pixels in the gradient map, mapping the pixel values into the range 0 to 255 to obtain a gray scale map corresponding to the gradient map, and then inputs the gray scale map into the preset convolutional neural network, so that the processing module 430 can extract the relevant features of the picture directly from the gray scale map according to the plurality of convolution kernels trained in advance in the network.
In summary, the image quality analysis apparatus according to the embodiments of the present disclosure trains a plurality of convolution kernels preset in a convolutional neural network using the gradient map of a sample picture, so that blurred-image edge features and sharp-image edge features can be classified according to the plurality of convolution kernels. The image quality of the picture to be processed is thus analyzed through the convolutional neural network, which improves the accuracy and efficiency of determining the image quality of the picture to be processed.
Fig. 6 is a block diagram illustrating an image quality analysis apparatus according to still another exemplary embodiment. For example, the apparatus 1000 may be a mobile phone, a computer, a tablet device, a personal digital assistant, or the like.
Referring to fig. 6, the apparatus 1000 may include one or more of the following components: a processing component 1002, a memory 1004, a power component 1006, a multimedia component 1008, an audio component 1010, an input/output (I/O) interface 1012, a sensor component 1014, and a communication component 1016.
The processing component 1002 generally controls the overall operation of the device 1000, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations. The processing component 1002 may include one or more processors 1020 to execute instructions to perform all or a portion of the steps of the methods described above. Further, the processing component 1002 may include one or more modules that facilitate interaction between the processing component 1002 and other components. For example, the processing component 1002 may include a multimedia module to facilitate interaction between the multimedia component 1008 and the processing component 1002.
The memory 1004 is configured to store various types of data to support operations at the apparatus 1000. Examples of such data include instructions for any application or method configured to operate on device 1000, contact data, phonebook data, messages, pictures, videos, and so forth. The memory 1004 may be implemented by any type or combination of volatile or non-volatile memory devices such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disks.
The power component 1006 provides power to the various components of the device 1000. The power component 1006 may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power for the device 1000.
The multimedia component 1008 includes a touch-sensitive display screen that provides an output interface between the device 1000 and a user. In some embodiments, the touch display screen may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). The touch panel includes one or more touch sensors to sense touch, swipe, and gesture actions on the touch panel. The touch sensors may not only sense the boundary of a touch or swipe action, but also detect the duration and pressure associated with the touch or swipe operation. In some embodiments, the multimedia component 1008 includes a front-facing camera and/or a rear-facing camera. The front camera and/or the rear camera may receive external multimedia data when the device 1000 is in an operating mode, such as a shooting mode or a video mode. Each of the front camera and the rear camera may be a fixed optical lens system or may have focal length and optical zoom capability.
The audio component 1010 is configured to output and/or input audio signals. For example, audio component 1010 includes a Microphone (MIC) configured to receive external audio signals when apparatus 1000 is in an operational mode, such as a call mode, a recording mode, and a voice recognition mode. The received audio signal may further be stored in the memory 1004 or transmitted via the communication component 1016. In some embodiments, audio component 1010 further comprises a speaker configured to output audio signals.
I/O interface 1012 provides an interface between processing component 1002 and peripheral interface modules, which may be keyboards, click wheels, buttons, etc. These buttons may include, but are not limited to: a home button, a volume button, a start button, and a lock button.
The sensor assembly 1014 includes one or more sensors configured to provide status assessments of various aspects of the apparatus 1000. For example, the sensor assembly 1014 may detect the open/closed state of the device 1000, the relative positioning of components (such as the display and keypad of the device 1000), a change in position of the device 1000 or of a component of the device 1000, the presence or absence of user contact with the device 1000, the orientation or acceleration/deceleration of the device 1000, and a change in temperature of the device 1000. The sensor assembly 1014 may include a proximity sensor configured to detect the presence of a nearby object without any physical contact. The sensor assembly 1014 may also include a light sensor, such as a CMOS or CCD image sensor, configured for use in imaging applications. In some embodiments, the sensor assembly 1014 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 1016 is configured to facilitate wired or wireless communication between the apparatus 1000 and other devices. The device 1000 may access a wireless network based on a communication standard, such as WiFi, 2G or 3G, or a combination thereof. In an exemplary embodiment, the communication component 1016 receives a broadcast signal or broadcast-related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communication component 1016 further includes a Near Field Communication (NFC) module to facilitate short-range communications. For example, the NFC module may be implemented based on Radio Frequency Identification (RFID) technology, Infrared Data Association (IrDA) technology, Ultra Wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the apparatus 1000 may be implemented by one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), controllers, micro-controllers, microprocessors or other electronic components configured to perform the image quality analysis methods described above (the methods shown in fig. 1-3).
In an exemplary embodiment, there is also provided a non-transitory computer-readable storage medium comprising instructions, such as the memory 1004 comprising instructions, executable by the processor 1020 of the device 1000 to perform the above-described method. For example, the non-transitory computer-readable storage medium may be a ROM, a Random Access Memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like.
A non-transitory computer readable storage medium in which instructions, when executed by a processor of a terminal, enable the terminal to perform a method of image quality analysis, the method comprising:
performing gradient extraction on a picture to be processed to obtain a gradient map, and inputting the gradient map into a preset convolutional neural network;
processing the edge features of the gradient map according to a plurality of convolution kernels trained in advance in the preset convolutional neural network;
and determining the image quality of the picture to be processed according to the degree of matching between the edge features of the gradient map and the edge features of a preset image.
other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.

Claims (8)

1. An image quality analysis method, characterized by comprising the steps of:
performing gradient extraction on a picture to be processed to obtain a gradient map, and inputting the gradient map into a preset convolutional neural network;
processing the edge features of the gradient map according to a plurality of convolution kernels trained in advance in the preset convolutional neural network;
determining the image quality of the picture to be processed according to the degree of matching between the edge features of the gradient map and preset image edge features, wherein the image quality depends on the lowest-level edge features of the picture to be processed;
wherein the preset convolutional neural network comprises one convolutional layer and a plurality of convolution kernels;
the method further comprising:
performing gradient extraction on a sample picture to obtain a sample gradient map;
and inputting the sample gradient map into the preset convolutional neural network, and training the plurality of preset convolution kernels in the preset convolutional neural network to classify blurred-image edge features and sharp-image edge features.
2. The method as claimed in claim 1, wherein the determining the image quality of the picture to be processed according to the degree of matching between the edge features of the gradient map and the preset image edge features comprises:
when the edge features of the gradient map match preset sharp-image edge features, determining that the image quality of the picture to be processed is sharp;
and when the edge features of the gradient map match preset blurred-image edge features, determining that the image quality of the picture to be processed is blurred.
3. The method of claim 1 or 2, wherein said inputting the gradient map into a preset convolutional neural network comprises:
normalizing the pixels in the gradient map to obtain a grayscale map corresponding to the gradient map;
and inputting the grayscale map into the preset convolutional neural network.
4. An image quality analysis apparatus, characterized by comprising:
a first extraction module, configured to perform gradient extraction on a picture to be processed to obtain a gradient map;
an input module, configured to input the gradient map into a preset convolutional neural network;
a processing module, configured to process the edge features of the gradient map according to a plurality of convolution kernels trained in advance in the preset convolutional neural network;
a determining module, configured to determine the image quality of the picture to be processed according to the degree of matching between the edge features of the gradient map and preset image edge features, wherein the image quality depends on the lowest-level edge features of the picture to be processed;
wherein the preset convolutional neural network comprises one convolutional layer and a plurality of convolution kernels;
the apparatus further comprising:
a second extraction module, configured to perform gradient extraction on a sample picture to obtain a sample gradient map;
and a training module, configured to input the sample gradient map into the preset convolutional neural network and train the plurality of preset convolution kernels in the preset convolutional neural network to classify blurred-image edge features and sharp-image edge features.
5. The apparatus of claim 4, wherein the determining module is configured to:
determine that the image quality of the picture to be processed is sharp when the edge features of the gradient map match preset sharp-image edge features;
and determine that the image quality of the picture to be processed is blurred when the edge features of the gradient map match preset blurred-image edge features.
6. The apparatus of claim 4 or 5, wherein the input module is configured to:
normalize the pixels in the gradient map to obtain a grayscale map corresponding to the gradient map;
and input the grayscale map into the preset convolutional neural network.
7. An image quality analysis apparatus, characterized by comprising:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to:
perform gradient extraction on a picture to be processed to obtain a gradient map, and input the gradient map into a preset convolutional neural network;
process the edge features of the gradient map according to a plurality of convolution kernels trained in advance in the preset convolutional neural network;
and determine the image quality of the picture to be processed according to the degree of matching between the edge features of the gradient map and preset image edge features, wherein the image quality depends on the lowest-level edge features of the picture to be processed;
the preset convolutional neural network comprising one convolutional layer and a plurality of convolution kernels;
the processor being further configured to:
perform gradient extraction on a sample picture to obtain a sample gradient map;
and input the sample gradient map into the preset convolutional neural network, and train the plurality of preset convolution kernels in the preset convolutional neural network to classify blurred-image edge features and sharp-image edge features.
8. A non-transitory computer-readable storage medium, on which a computer program is stored, which, when executed by a processor, carries out the steps of the method according to any one of claims 1 to 3.
CN201610587219.5A 2016-07-22 2016-07-22 image quality analysis method and device Active CN106228556B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610587219.5A CN106228556B (en) 2016-07-22 2016-07-22 image quality analysis method and device


Publications (2)

Publication Number Publication Date
CN106228556A CN106228556A (en) 2016-12-14
CN106228556B true CN106228556B (en) 2019-12-06

Family

ID=57532670

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610587219.5A Active CN106228556B (en) 2016-07-22 2016-07-22 image quality analysis method and device

Country Status (1)

Country Link
CN (1) CN106228556B (en)

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108241821A (en) * 2016-12-23 2018-07-03 北京三星通信技术研究有限公司 Image processing equipment and method
CN107945134B (en) * 2017-11-30 2020-10-09 北京小米移动软件有限公司 Image processing method and device
CN109960581B (en) * 2017-12-26 2021-06-01 Oppo广东移动通信有限公司 Hardware resource allocation method and device, mobile terminal and storage medium
CN108537787B (en) * 2018-03-30 2020-12-15 中国科学院半导体研究所 Quality judgment method for face image
CN109615620B (en) * 2018-11-30 2021-01-08 腾讯科技(深圳)有限公司 Image compression degree identification method, device, equipment and computer readable storage medium
CN109785312B (en) * 2019-01-16 2020-10-09 创新奇智(广州)科技有限公司 Image blur detection method and system and electronic equipment
US10891537B2 (en) 2019-03-20 2021-01-12 Huawei Technologies Co., Ltd. Convolutional neural network-based image processing method and image processing apparatus
CN111311620A (en) * 2020-01-19 2020-06-19 贵州黔驰信息股份有限公司 Method, device, computer storage medium and terminal for realizing edge detection
CN113326720A (en) * 2020-02-29 2021-08-31 湖南超能机器人技术有限公司 Image blur detection method and device based on contour depth learning
CN114242209A (en) * 2021-11-02 2022-03-25 深圳市智影医疗科技有限公司 Medical image preprocessing method and system

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103544705A (en) * 2013-10-25 2014-01-29 华南理工大学 Image quality testing method based on deep convolutional neural network
CN104318562A (en) * 2014-10-22 2015-01-28 百度在线网络技术(北京)有限公司 Method and device for confirming quality of internet images
CN105069779A (en) * 2015-07-20 2015-11-18 童垸林 Building ceramic surface pattern quality detection method
CN105550750A (en) * 2015-12-21 2016-05-04 长沙网动网络科技有限公司 Method for improving identification precision of convolutional neural network
CN105631457A (en) * 2015-12-17 2016-06-01 小米科技有限责任公司 Method and device for selecting picture

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN100559881C (en) * 2008-05-09 2009-11-11 中国传媒大学 A kind of method for evaluating video quality based on artificial neural net
US8712157B2 (en) * 2011-04-19 2014-04-29 Xerox Corporation Image quality assessment


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Convolutional Neural Networks for No-Reference Image Quality Assessment; Le Kang, Peng Ye, Yi Li, David Doermann; CVPR 2014; 2014; pp. 1-8 *
Gradient-based multi-input convolutional neural network (基于梯度的多输入卷积神经网络); Fei Jianchao, Rui Ting, Zhou You, Fang Husheng, Zhu Huijie; Opto-Electronic Engineering (光电工程); March 2015; Vol. 42, No. 3; pp. 33-38 *

Also Published As

Publication number Publication date
CN106228556A (en) 2016-12-14

Similar Documents

Publication Publication Date Title
CN106228556B (en) image quality analysis method and device
CN106557768B (en) Method and device for recognizing characters in picture
CN111310616B (en) Image processing method and device, electronic equipment and storage medium
CN109446994B (en) Gesture key point detection method and device, electronic equipment and storage medium
CN108629354B (en) Target detection method and device
US10007841B2 (en) Human face recognition method, apparatus and terminal
US11455491B2 (en) Method and device for training image recognition model, and storage medium
WO2023087741A1 (en) Defect detection method and apparatus, and electronic device, storage medium and computer program product
CN110619350B (en) Image detection method, device and storage medium
CN106127751B (en) Image detection method, device and system
CN109934275B (en) Image processing method and device, electronic equipment and storage medium
CN107784279B (en) Target tracking method and device
CN108062547B (en) Character detection method and device
CN107563994B (en) Image significance detection method and device
CN108668080B (en) Method and device for prompting degree of dirt of lens and electronic equipment
CN106557759B (en) Signpost information acquisition method and device
CN111461182B (en) Image processing method, image processing apparatus, and storage medium
CN110569835B (en) Image recognition method and device and electronic equipment
CN110532956B (en) Image processing method and device, electronic equipment and storage medium
CN106295499A (en) Age estimation method and device
CN109034150B (en) Image processing method and device
CN108717542B (en) Method and device for recognizing character area and computer readable storage medium
US11961278B2 (en) Method and apparatus for detecting occluded image and medium
CN110717399A (en) Face recognition method and electronic terminal equipment
CN111666941A (en) Text detection method and device and electronic equipment

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant