CN113052815A - Image definition determining method and device, storage medium and electronic equipment - Google Patents

Image definition determining method and device, storage medium and electronic equipment

Info

Publication number
CN113052815A
CN113052815A (application CN202110309640.0A)
Authority
CN
China
Prior art keywords
gradient
value
target object
image
object image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202110309640.0A
Other languages
Chinese (zh)
Other versions
CN113052815B (en)
Inventor
黄加紫
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN202110309640.0A
Publication of CN113052815A
Application granted
Publication of CN113052815B
Legal status: Active

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/0002 Inspection of images, e.g. flaw detection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/10 Segmentation; Edge detection
    • G06T7/12 Edge-based segmentation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/10 Segmentation; Edge detection
    • G06T7/181 Segmentation; Edge detection involving edge growing; involving edge linking
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/40 Extraction of image or video features
    • G06V10/44 Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/40 Extraction of image or video features
    • G06V10/46 Descriptors for shape, contour or point-related descriptors, e.g. scale invariant feature transform [SIFT] or bags of words [BoW]; Salient regional features
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10004 Still image; Photographic image
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20112 Image segmentation details
    • G06T2207/20132 Image cropping
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30168 Image quality inspection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/40 Extraction of image or video features
    • G06V10/46 Descriptors for shape, contour or point-related descriptors, e.g. scale invariant feature transform [SIFT] or bags of words [BoW]; Salient regional features
    • G06V10/469 Contour-based spatial representations, e.g. vector-coding
    • G06V10/473 Contour-based spatial representations, e.g. vector-coding using gradient analysis

Abstract

The disclosure provides an image definition determining method, an image definition determining device, a computer-readable storage medium, and an electronic device, and relates to the technical field of image processing. The image definition determining method includes: extracting contour information of a target object image, and performing gradient calculation on the contour information to obtain a first gradient value; determining a gradient statistic based on a noise gradient threshold and a process parameter value, wherein the process parameter value is determined based on the horizontal and vertical gradients of the contour information; performing gradient calculation on the target object image to obtain a second gradient value; and calculating the definition score of the target object image by combining the first gradient value, the gradient statistic, the second gradient value, and the size information of the target object image. The method and the device can obtain a robust image definition evaluation result, which is beneficial to improving the processing effect of subsequent algorithms.

Description

Image definition determining method and device, storage medium and electronic equipment
Technical Field
The present disclosure relates to the field of image processing technologies, and in particular, to an image sharpness determining method, an image sharpness determining apparatus, a computer-readable storage medium, and an electronic device.
Background
With the popularization of electronic devices with a photographing function, more and more users use electronic devices to acquire images. In practice, it is often necessary to apply algorithmic processing to a captured image to meet the requirements of different scenes. Taking a face image as an example, after the original face image is shot, subsequent processing such as beautification, makeup retouching, and background blurring can be carried out.
Because shooting scenes differ and the cameras of different electronic devices differ, the definition of captured images may vary considerably. If subsequent processing does not distinguish between definition levels, the image processing results may be poor.
To address this, the definition of an image can be determined before subsequent processing, and differentiated processing can then be applied according to that definition. However, current processes for determining image definition are not robust.
Disclosure of Invention
The present disclosure provides an image sharpness determining method, an image sharpness determining apparatus, a computer-readable storage medium, and an electronic device, thereby overcoming, at least to some extent, a problem of poor robustness in a process of determining image sharpness.
According to a first aspect of the present disclosure, there is provided an image sharpness determining method, including: extracting contour information of a target object image, and performing gradient calculation on the contour information to obtain a first gradient value; determining a gradient statistic based on a noise gradient threshold and a process parameter value, wherein the process parameter value is determined based on the horizontal and vertical gradients of the contour information; performing gradient calculation on the target object image to obtain a second gradient value; and calculating a sharpness score of the target object image by combining the first gradient value, the gradient statistic, the second gradient value, and the size information of the target object image.
According to a second aspect of the present disclosure, there is provided an image sharpness determining apparatus, including: a first gradient calculation module, configured to extract contour information of a target object image and perform gradient calculation on the contour information to obtain a first gradient value; a gradient statistic determination module, configured to determine a gradient statistic based on a noise gradient threshold and a process parameter value, wherein the process parameter value is determined based on the horizontal and vertical gradients of the contour information; a second gradient calculation module, configured to perform gradient calculation on the target object image to obtain a second gradient value; and a sharpness score calculation module, configured to calculate a sharpness score of the target object image by combining the first gradient value, the gradient statistic, the second gradient value, and the size information of the target object image.
According to a third aspect of the present disclosure, there is provided a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the image sharpness determination method described above.
According to a fourth aspect of the present disclosure, there is provided an electronic device comprising a processor; a memory for storing one or more programs which, when executed by the processor, cause the processor to implement the image sharpness determination method described above.
In the technical solutions provided by some embodiments of the present disclosure, contour information of a target object image is extracted and a first gradient value corresponding to the contour information is obtained; a process parameter value is determined based on the horizontal and vertical gradients of the contour information, and a gradient statistic is determined based on a noise gradient threshold and that process parameter value; a second gradient value is obtained directly from the target object image; and a sharpness score of the target object image is calculated by combining the first gradient value, the gradient statistic, the second gradient value, and the size information of the target object image. The image definition score obtained by this scheme is determined from the characteristics of the image itself, without other reference data; the algorithm is therefore robust, and a robust image definition evaluation result can be obtained. In addition, performing subsequent image processing based on the image definition obtained in this way can improve the subsequent processing effect.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and together with the description, serve to explain the principles of the disclosure. It is to be understood that the drawings in the following description are merely exemplary of the disclosure, and that other drawings may be derived from those drawings by one of ordinary skill in the art without the exercise of inventive faculty. In the drawings:
FIG. 1 shows a schematic diagram of an exemplary system architecture for an image sharpness determination scheme of an embodiment of the present disclosure;
FIG. 2 illustrates a schematic structural diagram of an electronic device suitable for use in implementing embodiments of the present disclosure;
fig. 3 schematically illustrates a flow chart of an image sharpness determination method according to an exemplary embodiment of the present disclosure;
FIG. 4 shows a schematic block diagram of calculating a sharpness score according to an embodiment of the present disclosure;
FIG. 5 schematically illustrates a flow chart of the overall process of determining the sharpness of a face image according to one embodiment of the present disclosure;
FIG. 6 is a flow chart schematically illustrating the use of scoring factors to derive human face image sharpness;
fig. 7 schematically shows a block diagram of an image sharpness determining apparatus according to an exemplary embodiment of the present disclosure;
fig. 8 schematically shows a block diagram of an image sharpness determining apparatus according to another exemplary embodiment of the present disclosure.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. Example embodiments may, however, be embodied in many different forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of example embodiments to those skilled in the art. The described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are provided to give a thorough understanding of embodiments of the disclosure. One skilled in the relevant art will recognize, however, that the subject matter of the present disclosure can be practiced without one or more of the specific details, or with other methods, components, devices, steps, and the like. In other instances, well-known technical solutions have not been shown or described in detail to avoid obscuring aspects of the present disclosure.
Furthermore, the drawings are merely schematic illustrations of the present disclosure and are not necessarily drawn to scale. The same reference numerals in the drawings denote the same or similar parts, and thus their repetitive description will be omitted. Some of the block diagrams shown in the figures are functional entities and do not necessarily correspond to physically or logically separate entities. These functional entities may be implemented in the form of software, or in one or more hardware modules or integrated circuits, or in different networks and/or processor devices and/or microcontroller devices.
The flow charts shown in the drawings are merely illustrative and do not necessarily include all of the steps. For example, some steps may be decomposed, and some steps may be combined or partially combined, so that the actual execution sequence may be changed according to the actual situation. In addition, all of the following terms "first", "second", and "third" are used for distinguishing purposes only, and should not be construed as limiting the present disclosure.
In some techniques for determining image sharpness, the sharpness must be calculated using auxiliary information beyond the image itself, such as a reference image or multi-frame video information. Such techniques only work for small objects, their robustness is limited, and they cannot meet the requirements of scenes in which high-resolution images are captured.
In view of this, the present disclosure provides a new image sharpness determination scheme.
Fig. 1 shows a schematic diagram of an exemplary system architecture of an image sharpness determination scheme of an embodiment of the present disclosure.
As shown in fig. 1, the system architecture may include a terminal device 11 and a server 12. The terminal device 11 and the server 12 may be connected via a network, and the connection type of the network may include, for example, a wired line, a wireless communication link, or an optical fiber cable.
It should be understood that the number of terminal devices 11 and servers 12 is merely illustrative. There may be any number of terminal devices and servers, as desired for implementation. For example, the server 12 may be a server cluster composed of a plurality of servers, and the like. The server 12 may also be referred to as a cloud or cloud server.
The terminal device 11 may interact with the server 12 through a network to receive or transmit messages and the like. Although fig. 1 illustrates a smart phone as an example, the terminal device 11 may also be a tablet computer, a smart wearable device, a personal computer, or another device with a shooting function. The terminal device 11 may also be referred to as a terminal, a mobile terminal, a smart terminal, etc.
In the case where the image sharpness determining process of the exemplary embodiment of the present disclosure is executed by the terminal device 11, the terminal device 11 may extract contour information of the target object image, perform gradient calculation on the contour information, and obtain a first gradient value. A gradient statistic is determined based on a noise gradient threshold and a process parameter value utilized in calculating the first gradient value, the process parameter value determined based on a horizontal gradient and a vertical gradient of the contour information.
In addition, the terminal device 11 may also perform gradient calculation on the target object image to obtain a second gradient value.
Next, the terminal device 11 may calculate a sharpness score of the target object image in combination with the first gradient value, the gradient statistic value, the second gradient value, and the size information of the target object image.
After calculating the sharpness score, the terminal device 11 may invoke a subsequent algorithm corresponding to the score to further process the target object image. It should be understood that the image processing algorithms invoked for different definition scores may differ, or that definition scores in different score ranges may correspond to different image processing algorithms; a mapping relationship between image processing algorithms and definition scores may be pre-constructed, as sketched below, and the disclosure is not limited thereto. The subsequent algorithm may include, but is not limited to, beautification, background blurring, color enhancement, blurring, and the like.
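As an illustrative sketch of such a pre-constructed mapping in Python (the score ranges and handler routines are assumptions; the disclosure leaves them open):

```python
def process_by_sharpness(image, score, handlers):
    """handlers: list of (low, high, fn) tuples mapping definition-score
    ranges to processing routines (beautification, background blurring, ...).
    The ranges and the routines themselves are application-specific assumptions."""
    for low, high, fn in handlers:
        if low <= score < high:
            return fn(image)
    return image  # no matching range: leave the image unchanged
```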
In addition, the terminal device 11 may also upload the calculated definition score to the server 12, and the server 12 stores the definition score, so that the terminal device 11 acquires the definition score from the server 12 when the target object image needs to be further processed, and invokes a corresponding subsequent algorithm.
In the case where the server 12 executes the image sharpness determining process according to the exemplary embodiment of the present disclosure, the server 12 may first acquire a target object image from the terminal device 11. In the case where the target object image is an image containing the target object that is cut out from an original image, the server 12 may instead acquire the original image from the terminal device 11, recognize the target object with an image recognition algorithm, and obtain the target object image after cropping the target object.
Next, the server 12 may extract contour information of the target object image, and perform gradient calculation on the contour information to obtain a first gradient value. A gradient statistic is determined based on a noise gradient threshold and a process parameter value utilized in calculating the first gradient value, the process parameter value determined based on a horizontal gradient and a vertical gradient of the contour information.
In addition, the server 12 may also perform gradient calculation on the target object image to obtain a second gradient value.
The server 12 may then calculate a sharpness score for the target object image in conjunction with the first gradient value, the gradient statistic, the second gradient value, and the size information of the target object image.
After calculating the sharpness score, the server 12 may invoke a subsequent algorithm corresponding to the score to further process the target object image. It should be understood that the image processing algorithms invoked for different definition scores may differ, or that definition scores in different score ranges may correspond to different image processing algorithms; a mapping relationship between image processing algorithms and definition scores may be pre-constructed, and the disclosure is not limited thereto. The subsequent algorithm may include, but is not limited to, beautification, background blurring, color enhancement, blurring, and the like.
Furthermore, the server 12 may feed back the calculated sharpness score to the terminal device 11, and the sharpness score may be stored by the terminal device 11. Therefore, when the terminal device 11 further processes the target object image, a corresponding subsequent algorithm can be called according to the definition score. Or, when the terminal device 11 performs subsequent processing on the target object image, the terminal device 11 may send a score obtaining request to the server 12, in which case, the server 12 may feed back the score to the terminal device 11 in response to the request, so that the terminal device 11 invokes a corresponding subsequent algorithm to perform further processing on the target object image.
It should be noted that any step of determining the image definition in the scheme may be performed by the terminal device 11 or the server 12, and the present disclosure is not limited thereto.
FIG. 2 shows a schematic diagram of an electronic device suitable for use in implementing exemplary embodiments of the present disclosure. The terminal device of the exemplary embodiment of the present disclosure may be configured as in fig. 2. It should be noted that the electronic device shown in fig. 2 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiments of the present disclosure.
The electronic device of the present disclosure includes at least a processor and a memory for storing one or more programs, which when executed by the processor, make the processor implement the image sharpness determining method of the exemplary embodiments of the present disclosure.
Specifically, as shown in fig. 2, the electronic device 200 may include: a processor 210, an internal memory 221, an external memory interface 222, a Universal Serial Bus (USB) interface 230, a charging management Module 240, a power management Module 241, a battery 242, an antenna 1, an antenna 2, a mobile communication Module 250, a wireless communication Module 260, an audio Module 270, a speaker 271, a receiver 272, a microphone 273, an earphone interface 274, a sensor Module 280, a display 290, a camera Module 291, an indicator 292, a motor 293, a button 294, and a Subscriber Identity Module (SIM) card interface 295. The sensor module 280 may include a depth sensor, a pressure sensor, a gyroscope sensor, an air pressure sensor, a magnetic sensor, an acceleration sensor, a distance sensor, a proximity light sensor, a fingerprint sensor, a temperature sensor, a touch sensor, an ambient light sensor, a bone conduction sensor, and the like.
It is to be understood that the illustrated structure of the embodiments of the present disclosure does not constitute a specific limitation to the electronic device 200. In other embodiments of the present disclosure, electronic device 200 may include more or fewer components than shown, or combine certain components, or split certain components, or a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 210 may include one or more processing units, such as: the Processor 210 may include an Application Processor (AP), a modem Processor, a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a controller, a video codec, a Digital Signal Processor (DSP), a baseband Processor, and/or a Neural Network Processor (NPU), and the like. The different processing units may be separate devices or may be integrated into one or more processors. Additionally, a memory may be provided in processor 210 for storing instructions and data.
The electronic device 200 may implement a shooting function through the ISP, the camera module 291, the video codec, the GPU, the display screen 290, the application processor, and the like. In some embodiments, the electronic device 200 may include 1 or N camera modules 291, where N is a positive integer greater than 1, and if the electronic device 200 includes N cameras, one of the N cameras is a main camera.
Internal memory 221 may be used to store computer-executable program code, including instructions. The internal memory 221 may include a program storage area and a data storage area. The external memory interface 222 may be used to connect an external memory card, such as a Micro SD card, to extend the memory capability of the electronic device 200.
The present disclosure also provides a computer-readable storage medium, which may be contained in the electronic device described in the above embodiments; or may exist separately without being assembled into the electronic device.
A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present disclosure, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
A computer readable storage medium may transmit, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable storage medium may be transmitted using any appropriate medium, including but not limited to: wireless, wire, fiber optic cable, RF, etc., or any suitable combination of the foregoing.
The computer-readable storage medium carries one or more programs which, when executed by an electronic device, cause the electronic device to implement the method as described in the embodiments below.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams or flowchart illustration, and combinations of blocks in the block diagrams or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units described in the embodiments of the present disclosure may be implemented by software, or may be implemented by hardware, and the described units may also be disposed in a processor. Wherein the names of the elements do not in some way constitute a limitation on the elements themselves.
The following will explain taking as an example that the terminal device executes the image sharpness determining method of the present disclosure, in which case the image sharpness determining means may be configured in the terminal device.
Fig. 3 schematically shows a flowchart of an image sharpness determination method of an exemplary embodiment of the present disclosure. Referring to fig. 3, the image sharpness determining method may include the steps of:
and S32, extracting outline information of the target object image, and performing gradient calculation on the outline information to obtain a first gradient value.
In an exemplary embodiment of the present disclosure, the target object image is an image including the target object and having a size consistent with a size of the target object. The present disclosure does not limit the type of the target object, and may be an arbitrarily set object such as a human face, a portrait, an animal, and an automobile.
According to some embodiments of the present disclosure, the terminal device may directly acquire the target object image. For example, the terminal device can capture a target object image by a camera equipped therewith; for another example, the terminal device may obtain the target object image from another device or a server, and the source of the target object image is not limited by the present disclosure.
According to other embodiments of the present disclosure, the target object image may be an image obtained by cropping an original image. Specifically, first, the terminal device may obtain an original image, for example, the terminal device may capture the original image through a camera equipped with the terminal device; for another example, the terminal device may obtain the original image from another device or a server, and the source of the original image is not limited by the disclosure.
Next, the terminal device may identify a target object from the original image, and crop an area where the target object is located to obtain a target object area. The present disclosure does not limit the specific process of image recognition, and the process of recognizing the target object from the image may be implemented by way of machine learning, for example.
In an embodiment in which the target object image is cut out from the original image, the position of the target object image in the original image may be characterized, for example, by left, top, height, and width. However, this manner of characterization is merely exemplary, and other manners of implementing the characterization of the location may also be employed. For example, the central point of the target object is taken, and then the expansion amounts in four directions, i.e., upward, downward, left, and right, from the central point are determined, so as to obtain the region of the target object image.
It should be understood that the target object image described in the present disclosure is generally a rectangular image, and in embodiments where the target object image is cropped from the original image, the target object image is a rectangular area on the original image that contains the target object.
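A minimal Python sketch of this cropping step, assuming the detector supplies a left/top/width/height box as described above:

```python
import numpy as np

def crop_target(original: np.ndarray, left: int, top: int,
                width: int, height: int) -> np.ndarray:
    """Cut the rectangular target region out of the original image.
    Rows index the vertical (top) direction, columns the horizontal (left)."""
    return original[top:top + height, left:left + width]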
After the target object image is acquired, the terminal device may extract contour information of the target object image. It should be noted that the contour information in the present disclosure corresponds to low-frequency information of the target object image, and the contour information of the target object image can be extracted by means of low-pass filtering. For example, the contour information may be face contour information, facial contour information, or portrait contour information obtained after the target object image is subjected to low-pass filtering processing. In addition, the contour information is information relative to detail information (i.e., high frequency information), which generally characterizes information having a relatively small frequency in the spatial domain.
Specifically, first, the terminal device may determine a filter operator, where the filter operator is used to perform a low-pass filtering operation on the target object image to filter out high-frequency components. The low-frequency component of the target object (i.e., substantially free of noise information and high-frequency components) can be acquired by the low-pass filtering. In one embodiment of the present disclosure, the selected filter shape is a Gaussian distribution, the size can range from 3 × 3 to 9 × 9, and the standard deviation lies in the interval [0.5, 2]. It should be appreciated that the selected filter operator is globally effective on the target object image for the same imaging scene (e.g., cell phone front shot, sensitivity ISO less than 600, etc.).
It is easily understood that different values can be set for the above parameter settings depending on the scene. For example, in a noise reduction scenario, a numerically larger standard deviation may be selected.
In addition, the selection of the filter operator can also be related to the proportion of the target object within the target object image. In the case where the target object is a human face, i.e., where the selection is related to the size of the face, the filter operator is determined according to the proportion of the face in the image.
According to some embodiments of the present disclosure, a mapping relationship between a shooting scene and a filter operator may be pre-constructed, and the filter operators corresponding to different shooting scenes may be determined through the mapping relationship.
After the filter operator is determined, the terminal device may perform a filtering operation on the target object image by using the filter operator to obtain the contour information of the target object image.
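A minimal sketch of this low-pass step with OpenCV, assuming a 5 × 5 Gaussian kernel with standard deviation 1.0 (both within the ranges quoted above):

```python
import cv2
import numpy as np

def extract_contour_info(target: np.ndarray, ksize: int = 5,
                         sigma: float = 1.0) -> np.ndarray:
    """Low-pass filter the target image to keep its low-frequency (contour)
    component; ksize is odd in [3, 9] and sigma lies in [0.5, 2] per the text."""
    gray = cv2.cvtColor(target, cv2.COLOR_BGR2GRAY) if target.ndim == 3 else target
    return cv2.GaussianBlur(gray.astype(np.float32), (ksize, ksize), sigma)
```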
The terminal device may perform gradient calculation on the profile information to obtain a first gradient value.
First, the terminal device may determine the horizontal and vertical gradients of the contour information. That is, the gradient components of the contour information in the x and y directions are calculated: the first derivative is taken to obtain a gradient Gx in the x direction and a gradient Gy in the y direction.
Next, process parameter values are calculated using the horizontal and vertical gradients of the contour information. The process parameter value is also a quantity representing the gradient, and the process parameter value Gw can be determined by formula 1.
Gw = (Gx² + Gy²)^0.5 (formula 1)
Subsequently, a first gradient value is calculated in combination with the process parameter value and the size information of the target object image. Wherein the size information of the target object image can be characterized by a length M and a width N. Thus, the first gradient value G1 can be calculated by equation 2.
G1 = ΣGw / (M × N) (formula 2)
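A sketch of this computation in Python; since formula 2 appears only as an image in the source, taking G1 as the average of Gw over the M × N pixels is an assumption consistent with the surrounding text:

```python
import numpy as np

def gradient_stats(channel: np.ndarray):
    """Per-pixel gradient magnitude Gw (formula 1) and its average over the
    M-by-N image; the averaging used for formula 2 is an assumption, since
    the source renders that formula only as an image."""
    gy, gx = np.gradient(channel.astype(np.float64))  # first derivatives in y and x
    gw = np.sqrt(gx ** 2 + gy ** 2)                   # formula 1
    g1 = float(gw.sum() / gw.size)                    # assumed formula 2
    return gw, g1
```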
S34, determining a gradient statistic value based on a noise gradient threshold value and a process parameter value; wherein the process parameter values are determined based on the horizontal and vertical gradients of the profile information.
In an exemplary embodiment of the present disclosure, the noise gradient threshold may be an empirical threshold obtained in advance through big data analysis, denoted as δ, and has a value ranging from 0 to 1, which is usually set to a small value, for example, 0.1.
The gradient statistics may be determined by thresholding, and it should be noted that the gradient statistics described in this disclosure characterize noise-independent gradient statistics.
The terminal device may determine a magnitude relationship between the process parameter value and the noise gradient threshold, and determine the gradient statistic based on the magnitude relationship. Specifically, an intermediate parameter Gwn may be used to determine the gradient statistic, and if the process parameter value determined in step S32 is greater than the noise gradient threshold, the intermediate parameter Gwn is determined as the process parameter value; if the process parameter value is less than or equal to the noise gradient threshold, then the intermediate parameter Gwn is determined to be 0.
The gradient statistic Gn can be determined using equations 3 and 4.
Gwn = Gw, if Gw > δ; Gwn = 0, if Gw ≤ δ (formula 3)
Gn = ΣGwn / (M × N) (formula 4)
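A sketch of this statistic; formula 3 follows the thresholding rule stated above, while the averaging assumed for formula 4 mirrors formula 2:

```python
import numpy as np

def noise_free_gradient_stat(gw: np.ndarray, delta: float = 0.1) -> float:
    """Keep Gw where it exceeds the noise gradient threshold delta, otherwise
    zero (formula 3, as stated in the text); formula 4 is assumed to average
    the result over the whole image, mirroring formula 2."""
    gwn = np.where(gw > delta, gw, 0.0)  # formula 3
    return float(gwn.sum() / gwn.size)   # assumed formula 4
```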
S36, performing gradient calculation on the target object image to obtain a second gradient value.
Similar to the gradient calculation performed on the contour information in step S32, gradient calculation may be performed on the target object image itself. That is, by substituting the target object image for the contour information in step S32, the second gradient value G2 corresponding to the target object image is obtained; the details of this process are not repeated here.
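Under the same assumptions, the helpers sketched above can simply be reused, where `target_gray` denotes the grayscale target object image:

```python
contour = extract_contour_info(target_gray)    # low-frequency component
gw, g1 = gradient_stats(contour)               # first gradient value (S32)
_, g2 = gradient_stats(target_gray)            # second gradient value (S36)
gn = noise_free_gradient_stat(gw, delta=0.1)   # gradient statistic (S34)
```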
And S38, calculating the definition score of the target object image by combining the first gradient value, the gradient statistic value, the second gradient value and the size information of the target object image.
Referring to fig. 4, in one aspect, a first scoring factor may be calculated using the first gradient value and the second gradient value; on the other hand, a second scoring factor may be calculated using the first gradient value, the second gradient value, and size information of the target object image; in yet another aspect, a third scoring factor may be calculated using the first gradient value, the gradient statistic, and the second gradient value. And calculating the definition score of the target object image by using the first scoring factor, the second scoring factor and the third scoring factor.
Specifically, the first scoring factor, the second scoring factor, and the third scoring factor may be weighted and summed to calculate the sharpness score of the target object image. The weights corresponding to the first scoring factor, the second scoring factor, and the third scoring factor all take values in [0, 1].
In some embodiments of the present disclosure, the weights of the first scoring factor and the second scoring factor are positively correlated with the sensitivity ISO corresponding to the target object image, and the weight of the third scoring factor is negatively correlated with that sensitivity. That is, the greater the sensitivity ISO, the greater the weights of the first and second scoring factors and the smaller the weight of the third scoring factor; the smaller the sensitivity ISO, the smaller the weights of the first and second scoring factors and the larger the weight of the third scoring factor. It should be understood that such a positive or negative correlation may be linear or non-linear, and the disclosure is not limited thereto.
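The exact weight mapping is left open by the disclosure; one simple monotonic choice, a clamped linear ramp with an assumed iso_max, is sketched below:

```python
def iso_weights(iso: float, iso_max: float = 6400.0):
    """Return (w1, w2, w3) in [0, 1]: w1 and w2 grow with ISO while w3
    shrinks. The linear ramp and the iso_max value are illustrative
    assumptions; the text only requires the stated monotonic relationships."""
    t = min(max(iso / iso_max, 0.0), 1.0)
    return t, t, 1.0 - t
```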
After the first gradient value G1, the gradient statistic Gn, and the second gradient value G2 are determined, an object high-frequency residual FE (FE = G2 − G1) and a noise-independent high-frequency residual FN (FN = Gn − G1) can be obtained.
On this basis, the above-mentioned scoring factors can be calculated.
For the first scoring factor, which may also be referred to as a high frequency ratio factor Hr, i.e. the ratio of the object high frequency residual FE to the first gradient value G1, as shown in equation 5:
Hr = FE / G1 (formula 5)
it will be appreciated that the greater the first scoring factor, the sharper the image.
For the second scoring factor, which may also be referred to as the average high-frequency factor Ha, the average high-frequency information is considered comprehensively based on the size of the object, and the calculation manner is shown in equation 6:
Ha = FE / (M × N) (formula 6)
it will be appreciated that the greater the second scoring factor, the sharper the image.
For the third scoring factor, which may also be referred to as a noise-independent factor Hn, which is noise-independent information obtained by comprehensively considering the noise-independent high-frequency residual FN and the second gradient value G2, the calculation method is as shown in equation 7:
Hn = FN / G2 (formula 7)
it is understood that the larger the third scoring factor, the sharper the image and the less disturbed by noise.
Next, using Hr, Ha, and Hn in combination with the weights w1, w2, and w3 respectively obtained based on the sensitivity ISO, the sharpness score CR of the target object image can be calculated. As shown in equation 8:
CR = w1 · Hr + w2 · Ha + w3 · Hn (formula 8)
As described above, w1, w2, and w3 are all weights related to sensitivity ISO, and have a value range of [0,1], w1 and w2 are positively related to sensitivity ISO, and w3 is negatively related to sensitivity ISO.
It should be understood that the sharpness score CR is a real number greater than 0, and the greater the sharpness score CR, the sharper the target object image.
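Putting formulas 5 to 8 together under the assumptions noted above (the form of formula 6 is assumed, and a small eps guards against division by zero):

```python
def sharpness_score(g1: float, g2: float, gn: float,
                    m: int, n: int, iso: float, eps: float = 1e-12) -> float:
    fe = g2 - g1                        # object high-frequency residual FE
    fn = gn - g1                        # noise-independent high-frequency residual FN
    hr = fe / max(g1, eps)              # formula 5: high-frequency ratio factor
    ha = fe / (m * n)                   # assumed form of formula 6
    hn = fn / max(g2, eps)              # formula 7: noise-independent factor
    w1, w2, w3 = iso_weights(iso)       # ISO-dependent weights (sketch above)
    return w1 * hr + w2 * ha + w3 * hn  # formula 8
```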
The whole process of determining the sharpness of the face image according to an embodiment of the present disclosure will be described with reference to fig. 5.
In step S502, the terminal device may determine a face region from the original image.
In step S504, the terminal device may select a filter operator according to the shooting scene, for example, the filter operator may be selected according to the proportion of the face to the image.
In step S506, the terminal device performs low-pass filtering processing on the face region by using the selected filtering operator to obtain a low-frequency component of the face region.
In step S508, after obtaining the low frequency component of the face region, the terminal device may perform low frequency gradient statistics to obtain the first gradient value.
In step S510, the terminal device may perform gradient statistics on the face region to obtain original gradient statistics of the face, so as to obtain the second gradient value.
In step S512, the terminal device may perform noise-independent gradient statistics on the face region to obtain the gradient statistics.
In step S514, the terminal device may perform high-frequency residual error statistics by using the first gradient value, the second gradient value, and the gradient statistic value determined in steps S508 to S512, so as to obtain a high-frequency residual error of the face and a high-frequency residual error unrelated to noise.
In step S516, the terminal device may construct a scoring factor and solve the value of the scoring factor. Specifically, the process of calculating the first scoring factor, the second scoring factor and the third scoring factor is implemented.
In step S518, the scoring factors may be weighted and summed to obtain a sharpness score of the face region, and the sharpness score of the face is output.
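Steps S502 to S518 can be strung together roughly as follows, reusing the helpers sketched earlier; face detection itself (S502) is assumed to supply the bounding box:

```python
import cv2
import numpy as np

def face_sharpness(original: np.ndarray, face_box, iso: float) -> float:
    """End-to-end sketch of steps S502-S518 under the assumptions above;
    face_box = (left, top, width, height) from any face detector."""
    left, top, width, height = face_box
    face = crop_target(original, left, top, width, height)            # S502
    gray = face if face.ndim == 2 else cv2.cvtColor(face, cv2.COLOR_BGR2GRAY)
    gray = gray.astype(np.float32)
    low = extract_contour_info(gray)                                  # S504-S506
    gw, g1 = gradient_stats(low)                                      # S508
    _, g2 = gradient_stats(gray)                                      # S510
    gn = noise_free_gradient_stat(gw)                                 # S512
    m, n = gray.shape                                                 # S514
    return sharpness_score(g1, g2, gn, m, n, iso)                     # S516-S518
```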
The process of calculating the face sharpness using the scoring factor is described with reference to fig. 6.
In step S612, a high-frequency residual of the face and a low-frequency gradient are input, where the low-frequency gradient corresponds to the first gradient value. In step S614, the high-frequency ratio factor may be calculated. In step S616, a weight corresponding to the high-frequency ratio factor, which is related to the sensitivity ISO, is determined, and the high-frequency ratio factor is combined with the corresponding weight, that is, the high-frequency ratio factor is multiplied by the corresponding weight.
In step S622, a face high-frequency residual and a face region area are input. In step S624, an average high frequency factor may be calculated. In step S626, a weight corresponding to the average high-frequency factor, which is related to the sensitivity ISO, is determined, and the average high-frequency factor is combined with the corresponding weight, that is, the average high-frequency factor is multiplied by the corresponding weight.
In step S632, a noise-independent high-frequency residual and a face gradient are input, wherein the face gradient corresponds to the second gradient value. In step S634, a noise independent factor may be calculated. In step S636, a weight corresponding to the noise-independent factor, which is related to the sensitivity ISO, is determined, and the noise-independent factor is combined with the corresponding weight, i.e., the noise-independent factor is multiplied by the corresponding weight.
In step S640, the results of step S616, step S626 and step S636 may be added to calculate the face sharpness.
It should be noted that the face sharpness (i.e., the face sharpness score) may be used to guide subsequent algorithms, so that faces of different sharpness are processed differently and the processing effect of the algorithm improves. Taking a beautification algorithm as an example, after the face sharpness is determined with the sharpness determining method of the present disclosure, the skin-smoothing strength can be reduced for a high-sharpness face to keep it clean and natural, while for a blurry, noisy low-sharpness face the smoothing strength can be increased.
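One way to act on this guidance, with an assumed mapping from the sharpness score to a smoothing strength:

```python
def smoothing_strength(score: float, s_min: float = 0.2, s_max: float = 0.9,
                       score_hi: float = 2.0) -> float:
    """Higher sharpness -> lighter skin smoothing; blurry, noisy faces get
    heavier smoothing. The bounds and the score_hi scale are assumptions."""
    t = min(max(score / score_hi, 0.0), 1.0)
    return s_max - t * (s_max - s_min)
```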
In addition, the scheme can serve as a boundary condition for scene-adaptive algorithms. For example, scenes in which face definition is extremely good or extremely poor are often limit scenes for many post-processing algorithms, and the definition score obtained by this scheme can be used to judge whether a scene is such a limit scene, so as to prompt the user or change the shooting strategy.
It should be noted that although the various steps of the methods of the present disclosure are depicted in the drawings in a particular order, this does not require or imply that these steps must be performed in this particular order, or that all of the depicted steps must be performed, to achieve desirable results. Additionally or alternatively, certain steps may be omitted, multiple steps combined into one step execution, and/or one step broken down into multiple step executions, etc.
Further, an image sharpness determination apparatus is also provided in the present exemplary embodiment.
Fig. 7 schematically shows a block diagram of an image sharpness determining apparatus of an exemplary embodiment of the present disclosure. Referring to fig. 7, the image sharpness determining apparatus 7 according to an exemplary embodiment of the present disclosure may include a first gradient calculating module 71, a gradient statistic determining module 73, a second gradient calculating module 75, and a sharpness score calculating module 77.
Specifically, the first gradient calculation module 71 may be configured to extract contour information of the target object image, and perform gradient calculation on the contour information to obtain a first gradient value; the gradient statistics determination module 73 may be configured to determine a gradient statistic based on the noise gradient threshold and the process parameter value, wherein the process parameter value is determined based on the horizontal and vertical gradients of the contour information; the second gradient calculation module 75 may be configured to perform gradient calculation on the target object image to obtain a second gradient value; the sharpness score calculation module 77 may be configured to calculate a sharpness score of the target object image by combining the first gradient value, the gradient statistic, the second gradient value, and the size information of the target object image.
According to an exemplary embodiment of the present disclosure, the sharpness score calculation module 77 may be configured to perform: calculating a first scoring factor using the first gradient value and the second gradient value; calculating a second scoring factor using the first gradient value, the second gradient value, and the size information of the target object image; calculating a third scoring factor using the first gradient value, the gradient statistic, and the second gradient value; and calculating the sharpness score of the target object image using the first scoring factor, the second scoring factor, and the third scoring factor.
According to an exemplary embodiment of the present disclosure, the process by which the sharpness score calculation module 77 calculates the sharpness score may be configured to perform: determining the weights of the first scoring factor, the second scoring factor, and the third scoring factor, respectively; and performing a weighted summation of the first scoring factor, the second scoring factor, and the third scoring factor to obtain the sharpness score of the target object image.
According to an exemplary embodiment of the present disclosure, the weights of the first scoring factor and the second scoring factor are positively correlated with the sensitivity corresponding to the target object image, and the weight of the third scoring factor is negatively correlated with the sensitivity corresponding to the target object image.
According to an exemplary embodiment of the present disclosure, the gradient statistics determination module 73 may be configured to perform: determining the magnitude relation between the process parameter value and the noise gradient threshold value; and determining a gradient statistic value based on the magnitude relation between the process parameter value and the noise gradient threshold value.
According to an exemplary embodiment of the present disclosure, the first gradient calculation module may be configured to perform: determining a horizontal gradient and a vertical gradient of the contour information; calculating a process parameter value by using the horizontal gradient and the vertical gradient of the contour information; and calculating a first gradient value by combining the process parameter value and the size information of the target object image.
According to an exemplary embodiment of the present disclosure, referring to fig. 8, the image sharpness determining apparatus 8 may further include an image acquisition module 81, compared to the image sharpness determining apparatus 7.
In particular, the image acquisition module 81 may be configured to perform: acquiring an original image; and identifying a target object from the original image, and cutting the area where the target object is located to obtain a target object image.
Since each functional module of the image definition determining apparatus according to the embodiment of the present disclosure is the same as that in the embodiment of the method described above, it is not described herein again.
Through the above description of the embodiments, those skilled in the art will readily understand that the exemplary embodiments described herein may be implemented by software, or by software in combination with necessary hardware. Therefore, the technical solution according to the embodiments of the present disclosure may be embodied in the form of a software product, which may be stored in a non-volatile storage medium (which may be a CD-ROM, a usb disk, a removable hard disk, etc.) or on a network, and includes several instructions to enable a computing device (which may be a personal computer, a server, a terminal device, or a network device, etc.) to execute the method according to the embodiments of the present disclosure.
Furthermore, the above-described figures are merely schematic illustrations of processes included in methods according to exemplary embodiments of the present disclosure, and are not intended to be limiting. It will be readily understood that the processes shown in the above figures are not intended to indicate or limit the chronological order of the processes. In addition, it is also readily understood that these processes may be performed synchronously or asynchronously, e.g., in multiple modules.
It should be noted that although in the above detailed description several modules or units of the device for action execution are mentioned, such a division is not mandatory. Indeed, the features and functionality of two or more modules or units described above may be embodied in one module or unit, according to embodiments of the present disclosure. Conversely, the features and functions of one module or unit described above may be further divided into embodiments by a plurality of modules or units.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is to be limited only by the terms of the appended claims.

Claims (10)

1. An image sharpness determination method, characterized by comprising:
extracting contour information of a target object image, and performing gradient calculation on the contour information to obtain a first gradient value;
determining a gradient statistic based on the noise gradient threshold and the process parameter value; wherein the process parameter values are determined based on a horizontal gradient and a vertical gradient of the contour information;
performing gradient calculation on the target object image to obtain a second gradient value;
and calculating the definition score of the target object image by combining the first gradient value, the gradient statistic value, the second gradient value and the size information of the target object image.
2. A method for determining sharpness of an image according to claim 1, wherein calculating a sharpness score of the target object image in combination with the first gradient value, the gradient statistic value, the second gradient value, and size information of the target object image, comprises:
calculating a first scoring factor using the first gradient value and the second gradient value;
calculating a second scoring factor using the first gradient value, the second gradient value, and size information of the target object image;
calculating a third scoring factor using the first gradient value, the gradient statistic value, and the second gradient value;
and calculating the definition score of the target object image by using the first scoring factor, the second scoring factor and the third scoring factor.
3. An image sharpness determination method according to claim 2, wherein calculating a sharpness score of the target object image using the first scoring factor, the second scoring factor, and the third scoring factor includes:
determining weights of the first scoring factor, the second scoring factor and the third scoring factor, respectively;
and carrying out weighted summation on the first scoring factor, the second scoring factor and the third scoring factor to obtain the definition score of the target object image.
4. A method for determining a sharpness of an image according to claim 3, wherein the weight of the first scoring factor and the weight of the second scoring factor are in a positive correlation with the sensitivity corresponding to the target object image, and the weight of the third scoring factor is in a negative correlation with the sensitivity corresponding to the target object image.
5. The image sharpness determination method according to claim 1, wherein performing gradient calculation on the contour information to obtain the first gradient value comprises:
determining the horizontal gradient and the vertical gradient of the contour information;
calculating the process parameter value using the horizontal gradient and the vertical gradient of the contour information;
and calculating the first gradient value by combining the process parameter value and the size information of the target object image.
6. The image sharpness determination method according to claim 1, wherein determining the gradient statistic value based on the noise gradient threshold and the process parameter value utilized in calculating the first gradient value comprises:
determining a magnitude relationship between the process parameter value and the noise gradient threshold;
and determining the gradient statistic value based on the magnitude relationship between the process parameter value and the noise gradient threshold.
7. The image sharpness determination method according to any one of claims 1 to 6, further comprising:
acquiring an original image;
and identifying a target object from the original image, and cropping a region in which the target object is located to obtain the target object image.
8. An image sharpness determination apparatus, comprising:
a first gradient calculation module, configured to extract contour information of a target object image, and perform gradient calculation on the contour information to obtain a first gradient value;
a gradient statistic determination module, configured to determine a gradient statistic value based on a noise gradient threshold and a process parameter value, wherein the process parameter value is determined based on a horizontal gradient and a vertical gradient of the contour information;
a second gradient calculation module, configured to perform gradient calculation on the target object image to obtain a second gradient value;
and a sharpness score calculation module, configured to calculate a sharpness score of the target object image by combining the first gradient value, the gradient statistic value, the second gradient value, and size information of the target object image.
9. A computer-readable storage medium on which a computer program is stored, wherein the computer program, when executed by a processor, implements the image sharpness determination method according to any one of claims 1 to 7.
10. An electronic device, comprising:
a processor;
and a memory for storing one or more programs that, when executed by the processor, cause the processor to implement the image sharpness determination method according to any one of claims 1 to 7.
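By way of illustration only, and without limiting the claims, the gradient computation recited in claims 1, 5 and 6 might be sketched in Python as follows. The operator choices (Canny for contour extraction, Sobel for the horizontal and vertical gradients), the use of the gradient magnitude as the process parameter value, the normalisation by image size, and every function name are assumptions; the claims do not fix concrete formulas.

    import cv2
    import numpy as np

    def gradient_features(target_bgr, noise_gradient_threshold=4.0):
        # Claim 1, step 1: extract contour information of the target object
        # image (Canny is an assumed choice of contour extractor).
        gray = cv2.cvtColor(target_bgr, cv2.COLOR_BGR2GRAY)
        contour_info = cv2.Canny(gray, 50, 150)

        # Claim 5: horizontal and vertical gradients of the contour
        # information, combined into a per-pixel "process parameter value"
        # (assumed here to be the gradient magnitude).
        gx = cv2.Sobel(contour_info, cv2.CV_64F, 1, 0)
        gy = cv2.Sobel(contour_info, cv2.CV_64F, 0, 1)
        process_param = np.sqrt(gx ** 2 + gy ** 2)

        # Claim 5: first gradient value obtained by combining the process
        # parameter value with the size information of the target object image.
        h, w = gray.shape
        first_gradient = float(process_param.sum()) / (h * w)

        # Claim 6: gradient statistic value from the magnitude relationship
        # between the process parameter value and the noise gradient threshold
        # (assumed: a count of values exceeding the threshold).
        gradient_statistic = int(np.count_nonzero(process_param > noise_gradient_threshold))

        # Claim 1, step 3: second gradient value from a gradient calculation
        # on the target object image itself.
        sx = cv2.Sobel(gray, cv2.CV_64F, 1, 0)
        sy = cv2.Sobel(gray, cv2.CV_64F, 0, 1)
        second_gradient = float(np.sqrt(sx ** 2 + sy ** 2).sum()) / (h * w)

        return first_gradient, gradient_statistic, second_gradient, (h, w)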
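Under the same caveat, the scoring recited in claims 2 to 4 might be sketched as below. Only the inputs to each factor and the direction of the weight correlations follow the claims; the concrete factor formulas and the sensitivity normalisation scale are assumptions.

    def sharpness_score(first_gradient, gradient_statistic, second_gradient,
                        size, sensitivity=100.0):
        h, w = size
        eps = 1e-6  # guards against division by zero

        # Claim 2: first scoring factor from the first and second gradient
        # values (assumed: their ratio).
        f1 = first_gradient / (second_gradient + eps)

        # Claim 2: second scoring factor additionally folds in the size
        # information of the target object image.
        f2 = first_gradient * second_gradient / (h * w)

        # Claim 2: third scoring factor also uses the gradient statistic value.
        f3 = first_gradient * second_gradient / (gradient_statistic + eps)

        # Claim 4: the weights of the first two factors rise with the
        # sensitivity corresponding to the image (e.g. ISO), while the weight
        # of the third factor falls with it. The 800 scale is an assumption.
        s = min(sensitivity / 800.0, 1.0)
        w1, w2, w3 = s, s, 1.0 - s

        # Claim 3: weighted summation of the three scoring factors.
        return w1 * f1 + w2 * f2 + w3 * f3

On this sketch, a high-sensitivity shot leans on the first two factors, while a low-sensitivity shot leans on the noise-gated third factor, matching the positive and negative correlations recited in claim 4.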
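Lastly, the preprocessing of claim 7 reduces to cropping the detected target region out of the original image; the detect_target callback and its (x, y, w, h) box format are assumed here.

    def crop_target(original_bgr, detect_target):
        # Claim 7: identify the target object, then crop the region in which
        # it is located to obtain the target object image.
        x, y, w, h = detect_target(original_bgr)
        return original_bgr[y:y + h, x:x + w]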
CN202110309640.0A 2021-03-23 2021-03-23 Image definition determining method and device, storage medium and electronic equipment Active CN113052815B (en)

Priority Applications (1)

Application Number | Priority Date | Filing Date | Title
CN202110309640.0A (granted as CN113052815B) | 2021-03-23 | 2021-03-23 | Image definition determining method and device, storage medium and electronic equipment

Applications Claiming Priority (1)

Application Number | Priority Date | Filing Date | Title
CN202110309640.0A (granted as CN113052815B) | 2021-03-23 | 2021-03-23 | Image definition determining method and device, storage medium and electronic equipment

Publications (2)

Publication Number Publication Date
CN113052815A (en) 2021-06-29
CN113052815B (en) 2022-06-24

Family

ID: 76514674

Family Applications (1)

Application Number | Priority Date | Filing Date | Title
CN202110309640.0A (Active, granted as CN113052815B) | 2021-03-23 | 2021-03-23 | Image definition determining method and device, storage medium and electronic equipment

Country Status (1)

Country Link
CN (1) CN113052815B (en)

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040101169A1 (en) * 2002-11-20 2004-05-27 Stmicroelectronics S.A. Determination of a definition score of a digital image
CN106530286A (en) * 2016-10-24 2017-03-22 乐视控股(北京)有限公司 Method and device for determining definition level
CN107958455A (en) * 2017-12-06 2018-04-24 百度在线网络技术(北京)有限公司 Image definition appraisal procedure, device, computer equipment and storage medium
CN109005368A (en) * 2018-10-15 2018-12-14 Oppo广东移动通信有限公司 A kind of generation method of high dynamic range images, mobile terminal and storage medium
CN110473189A (en) * 2019-08-02 2019-11-19 南通使爱智能科技有限公司 A kind of definition of text images judgment method and system
CN110930363A (en) * 2019-10-29 2020-03-27 北京临近空间飞行器系统工程研究所 Method and device for determining sharpness evaluation value of curved-surface blurred image and storage medium
CN111260642A (en) * 2020-02-12 2020-06-09 上海联影医疗科技有限公司 Image positioning method, device, equipment and storage medium
CN111899243A (en) * 2020-07-28 2020-11-06 阳光保险集团股份有限公司 Image definition evaluation method and device and computer readable storage medium

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
LIU Yamei: "Image sharpness evaluation based on gradient edge maximum", Journal of Graphics (《图学学报》) *

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114820614A (en) * 2022-06-29 2022-07-29 上海闪马智能科技有限公司 Image type determination method and device, storage medium and electronic device

Also Published As

Publication number Publication date
CN113052815B (en) 2022-06-24

Similar Documents

Publication Title
CN108229369B (en) Image shooting method and device, storage medium and electronic equipment
US10534957B2 (en) Eyeball movement analysis method and device, and storage medium
CN108830892B (en) Face image processing method and device, electronic equipment and computer readable storage medium
US20110148868A1 (en) Apparatus and method for reconstructing three-dimensional face avatar through stereo vision and face detection
CN107771391B (en) Method and apparatus for determining exposure time of image frame
CN110781770B (en) Living body detection method, device and equipment based on face recognition
CN111967319B (en) Living body detection method, device, equipment and storage medium based on infrared and visible light
US11948280B2 (en) System and method for multi-frame contextual attention for multi-frame image and video processing using deep neural networks
CN113052815B (en) Image definition determining method and device, storage medium and electronic equipment
CN112446254A (en) Face tracking method and related device
CN111860057A (en) Face image blurring and living body detection method and device, storage medium and equipment
CN108010009B (en) Method and device for removing interference image
CN113658065A (en) Image noise reduction method and device, computer readable medium and electronic equipment
CN111783677B (en) Face recognition method, device, server and computer readable medium
CN117392404A (en) Method and system for improving image detection speed
CN112597911A (en) Buffing processing method and device, mobile terminal and storage medium
CN106611417B (en) Method and device for classifying visual elements into foreground or background
CN111385481A (en) Image processing method and device, electronic device and storage medium
CN115471413A (en) Image processing method and device, computer readable storage medium and electronic device
CN110097622B (en) Method and device for rendering image, electronic equipment and computer readable storage medium
CN113920023A (en) Image processing method and device, computer readable medium and electronic device
CN113538268A (en) Image processing method and device, computer readable storage medium and electronic device
CN114565962A (en) Face image processing method and device, electronic equipment and storage medium
CN112950641A (en) Image processing method and device, computer readable storage medium and electronic device
CN113283319A (en) Method and device for evaluating face ambiguity, medium and electronic equipment

Legal Events

Code Title
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant