CN108460065B - Photo cleaning method and device and terminal equipment

Info

Publication number: CN108460065B
Application number: CN201710707329.5A
Authority: CN (China)
Prior art keywords: photo, photos, score, value, gray
Legal status: Active
Other versions: CN108460065A (en)
Inventor: 刘灿尧
Current Assignee: Tencent Technology Shenzhen Co Ltd
Original Assignee: Tencent Technology Shenzhen Co Ltd
Application filed by Tencent Technology Shenzhen Co Ltd
Priority to CN201710707329.5A
Publication of CN108460065A
Application granted
Publication of CN108460065B

Classifications

    • G06F16/162 File or folder operations — delete operations
    • G06F16/5838 Retrieval of still image data using metadata automatically derived from the content, using colour
    • G06T7/0002 Image analysis — inspection of images, e.g. flaw detection
    • G06T2207/10004 Image acquisition modality — still image; photographic image
    • G06T2207/20032 Special algorithmic details — median filtering
    • G06T2207/30168 Subject of image — image quality inspection


Abstract

The application provides a photo cleaning method, a photo cleaning apparatus and a terminal device. The photo cleaning method includes: acquiring the photos stored on a user device; calculating a blur score, a noise score and a shake score of each photo, and taking a weighted average of the blur score, the noise score and the shake score to obtain the score of the photo; displaying, to the user of the user device, the stored photos whose scores are greater than or equal to a preset threshold; and deleting the photos the user selects among the displayed photos. The method can distinguish useless photos more intelligently: besides blurred photos in burst shooting, it can also identify photos blurred by defocus, excessive noise, shake and the like in daily shooting. After the useless photos are selected intelligently, they are displayed for the user to select and clean, freeing up precious storage space for the user when the storage space of the user device runs short.

Description

Photo cleaning method and device and terminal equipment
Technical Field
The present application relates to the field of image processing technologies, and in particular, to a photo cleaning method and apparatus, and a terminal device.
Background
In the prior art, the storage space of user devices keeps growing, but as a device is used, a large amount of system garbage, application data and multimedia data is inevitably generated, leaving the user short of usable space. Multimedia data occupies the largest share of the user's space, and photos occupy the largest share of the multimedia data. Cleaning useless photos therefore effectively releases resources and saves the user's space. While taking pictures, a user inevitably produces many blurred photos. The causes of these blurred photos may include loss of focus, excessive noise in night shots, and/or shaking during shooting. Such photos can be judged useless to the user.
The prior art provides an automatic cleaning scheme based on similar-photo recognition: through similarity recognition over consecutive photos, the user selects redundant or useless photos in a set of similar photos for deletion.
Disclosure of Invention
To overcome the problems in the related art, the present application provides a photo cleaning method, a photo cleaning apparatus and a terminal device.
In order to achieve the above purpose, the embodiment of the present application adopts the following technical solutions:
In a first aspect, an embodiment of the present application provides a photo cleaning method, including: acquiring the photos stored on a user device; calculating a blur score, a noise score and a shake score of each photo, and taking a weighted average of the blur score, the noise score and the shake score to obtain the score of the photo; displaying, to the user of the user device, the stored photos whose scores are greater than or equal to a preset threshold; and deleting the photos the user selects among the displayed photos.
In this photo cleaning method, after the photos stored on the user device are acquired, the blur score, the noise score and the shake score of each photo are calculated and their weighted average is taken to obtain the photo's score. The stored photos whose scores are greater than or equal to a preset threshold are displayed to the user of the device, and the photos the user selects among them are deleted. Useless photos can thus be distinguished intelligently: besides blurred photos in burst shooting, photos blurred by loss of focus, excessive noise and/or shake in daily shooting can also be identified. After the useless photos are selected intelligently, they are displayed for the user to select and clean, freeing up precious storage space for the user when the storage space of the user device runs short.
In a second aspect, an embodiment of the present application provides a photo cleaning apparatus, including: an acquiring module configured to acquire the photos stored on a user device; a calculating module configured to calculate the blur score, the noise score and the shake score of each photo acquired by the acquiring module, and to take a weighted average of the blur score, the noise score and the shake score to obtain the score of the photo; a display module configured to display, to the user of the user device, the stored photos whose scores are greater than or equal to a preset threshold; and a deleting module configured to delete the photos the user selects among the photos displayed by the display module.
In this photo cleaning apparatus, after the acquiring module acquires the photos stored on the user device, the calculating module calculates the blur score, the noise score and the shake score of each photo and takes their weighted average to obtain the photo's score. The display module displays, to the user of the device, the stored photos whose scores are greater than or equal to a preset threshold, and the deleting module deletes the photos the user selects among the displayed photos. Useless photos can thus be distinguished more intelligently: besides blurred photos in burst shooting, photos blurred by loss of focus, excessive noise, shake and the like in daily shooting can also be recognized. After the useless photos are selected intelligently, they are displayed for the user to select and clean, so that precious storage space is freed up for the user when the storage space of the user device runs short.
In a third aspect, an embodiment of the present application provides a terminal device including a memory, a processor, and a computer program stored in the memory and executable on the processor; when the processor executes the computer program, the method described above is implemented.
In a fourth aspect, embodiments of the present application provide a non-transitory computer-readable storage medium having stored thereon a computer program, which when executed by a processor, implements the method as described above.
In a fifth aspect, the present application provides a computer program product, wherein when the instructions of the computer program product are executed by a processor, the method as described above is performed.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the application.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present application and together with the description, serve to explain the principles of the application.
FIG. 1 is a flowchart of an embodiment of a photo cleaning method of the present application;
FIG. 2 is a flowchart of another embodiment of a photo cleaning method of the present application;
FIG. 3 is a flowchart of still another embodiment of a photo cleaning method of the present application;
FIG. 4 is a flowchart of still another embodiment of a photo cleaning method of the present application;
FIG. 5 is a flowchart of a further embodiment of a photo cleaning method of the present application;
FIG. 6 is a schematic structural diagram of an embodiment of a photo cleaning apparatus of the present application;
FIG. 7 is a schematic structural diagram of an embodiment of a terminal device of the present application;
FIG. 8 is a schematic structural diagram of an embodiment of the internal structure of the mobile phone 10 of the present application.
With the above figures, there are shown specific embodiments of the present application, which will be described in more detail below. These drawings and written description are not intended to limit the scope of the inventive concepts in any manner, but rather to illustrate the inventive concepts to those skilled in the art by reference to specific embodiments.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The embodiments described in the following exemplary embodiments do not represent all embodiments consistent with the present application. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the present application, as detailed in the appended claims.
Fig. 1 is a flowchart of an embodiment of the photo cleaning method of the present application. As shown in Fig. 1, the photo cleaning method may include:
step 101, acquiring a photo stored in user equipment.
In particular, the photos saved in the user device may be acquired by auto-scanning.
Step 102, calculating the blur score, the noise score and the shake score of the photo, and taking a weighted average of the blur score, the noise score and the shake score of the photo to obtain the score of the photo.
Specifically, the weighted average of the blur score, the noise score and the shake score of the photo may be taken as follows:
the score of the photo is calculated according to formula (1):
M=(a1M1+a2M2+a3M3)/3; (1)
In formula (1), M is the score of the photo, M1 is the blur score of the photo, M2 is the noise score of the photo, and M3 is the shake score of the photo; a1 is the weight of M1, a2 is the weight of M2, and a3 is the weight of M3.
The values of a1, a2 and a3 can be set according to system performance and/or implementation requirements, and this embodiment does not limit them. Some of a1, a2 and a3 may be set to 0 so that only part of the sub-scores contribute, and the corresponding scoring algorithms can then be cut off according to system and device performance.
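As a minimal sketch of formula (1) (the function and argument names are illustrative, not from the patent), the photo score can be computed as:

```python
def photo_score(m1, m2, m3, a1=1.0, a2=1.0, a3=1.0):
    """Score of a photo per formula (1): M = (a1*M1 + a2*M2 + a3*M3) / 3.

    m1, m2, m3: blur, noise and shake scores; a1, a2, a3: their weights.
    A weight of 0 drops the corresponding sub-score, as the text allows.
    """
    return (a1 * m1 + a2 * m2 + a3 * m3) / 3

# equal weights over the three sub-scores
print(round(photo_score(0.9, 0.6, 0.3), 2))  # 0.6
```

Setting a weight to 0 (e.g. a2=0) lets the corresponding scoring algorithm be skipped entirely on low-end devices.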
Step 103, displaying, to the user of the user device, the stored photos whose scores are greater than or equal to a preset threshold.
Step 104, deleting the photos selected by the user among the displayed photos.
In this embodiment, the photos whose scores are greater than or equal to the preset threshold among the photos stored on the user device may be output to a photo list for the user to select from; after the user ticks the photos to be cleaned, the ticked photos are cleaned with one tap.
In this photo cleaning method, after the photos stored on the user device are acquired, the blur score, the noise score and the shake score of each photo are calculated and their weighted average is taken to obtain the photo's score. The stored photos whose scores are greater than or equal to a preset threshold are displayed to the user of the device, and the photos the user selects among them are deleted. Useless photos can thus be distinguished intelligently: besides blurred photos in burst shooting, photos blurred by loss of focus, excessive noise and/or shake in daily shooting can also be identified. After the useless photos are selected intelligently, they are displayed for the user to select and clean, freeing up precious storage space for the user when the storage space of the user device runs short.
Fig. 2 is a flowchart of another embodiment of the photo cleaning method of the present application. As shown in Fig. 2, on the basis of the embodiment shown in Fig. 1, calculating the blur score of the photo in step 102 may include:
step 201, performing gray scale processing on the photo to obtain a gray scale image of the photo.
Step 202, calculating the maximum vertical gradient and the maximum horizontal gradient of the photo according to the gray scale map of the photo, and calculating the gray scale range of the photo.
Specifically, the step of calculating the maximum vertical gradient and the maximum horizontal gradient of the photograph from the grayscale map of the photograph may be: calculating the difference value of the gray values of any two adjacent pixel points in the vertical direction of the gray image of the photo according to the gray image of the photo, and taking the maximum value in the calculated difference values as the maximum vertical gradient of the photo; and calculating the difference value of the gray values of any two adjacent pixel points in the horizontal direction of the gray map of the photo according to the gray map of the photo, and taking the maximum value in the calculated difference value as the maximum horizontal gradient of the photo.
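The adjacent-pixel differences described above can be sketched with NumPy as follows (a minimal illustration; the function name is assumed, not from the patent):

```python
import numpy as np

def max_gradients(gray):
    """Maximum absolute difference between vertically / horizontally adjacent pixels."""
    gray = np.asarray(gray, dtype=np.int32)  # widen from uint8 to avoid wrap-around
    max_vert = int(np.abs(np.diff(gray, axis=0)).max())  # adjacent rows
    max_hori = int(np.abs(np.diff(gray, axis=1)).max())  # adjacent columns
    return max_vert, max_hori

# toy 3x3 gray map
g = np.array([[0, 10, 10],
              [5, 10, 40],
              [5, 60, 40]])
print(max_gradients(g))  # (50, 55)
```

The cast to a signed type matters: on a raw uint8 gray map, subtracting adjacent pixels would wrap around instead of producing negative differences.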
Specifically, the calculating of the gray scale range of the photograph according to the gray scale map of the photograph may be: counting the number of pixel points corresponding to each gray value in the gray map of the photo according to the gray map of the photo; selecting a gray value with a first preset proportion from gray values with the number of the corresponding pixel points not being 0 according to the sequence of the gray values from large to small, and carrying out weighted average on the gray value with the first preset proportion and the number of the pixel points corresponding to the gray value with the first preset proportion to obtain the maximum value of the gray value of the photo; selecting a gray value with a second preset proportion from gray values with the number of the corresponding pixel points not being 0 according to the sequence of the gray values from small to large, and carrying out weighted average on the gray value with the second preset proportion and the number of the pixel points corresponding to the gray value with the second preset proportion to obtain the minimum value of the gray value of the photo; and calculating to obtain the gray scale range of the photo according to the maximum value of the gray scale value of the photo and the minimum value of the gray scale value of the photo.
The first predetermined proportion and the second predetermined proportion may be equal or different, and their values can be set according to system performance and/or implementation requirements; this embodiment does not limit them. For example, both the first predetermined proportion and the second predetermined proportion may be 5%.
The manner in which the gray scale range of the photograph is calculated is described in detail below:
1) counting, for each of the 256 gray values, the number of pixels in the gray map of the photo having that gray value, generating a gray value array hist[256] of length 256, where each value in the array is the number of pixels with the corresponding gray value in the gray map of the photo;
2) calculating darkestValue and whitestValue: darkestValue is the minimum gray value of the photo, obtained by selecting gray values of the second predetermined proportion, in order from small to large, from the gray values whose corresponding pixel counts are not 0, and taking the weighted average of those gray values with the numbers of pixels corresponding to them; whitestValue is the maximum gray value of the photo, obtained by selecting gray values of the first predetermined proportion, in order from large to small, from the gray values whose corresponding pixel counts are not 0, and taking the weighted average of those gray values with the numbers of pixels corresponding to them.
Pseudo code for the manner of calculation of darkestValue and whitestValue may be as follows:
sum←0
sum1←0
sum2←0
thresh←0.05
for k←0 to 255
sum1←sum1+hist[k]*k
sum2←sum2+hist[k]
if((sum2/imSize)>thresh)break
if(sum2>0)
darkestValue←(int)((float)sum1/sum2)
sum1←0
sum2←0
for k←255 to 0
sum1←sum1+hist[k]*k
sum2←sum2+hist[k]
if((sum2/imSize)>thresh)break
if(sum2>0)
whitestValue←(int)((float)sum1/sum2)
3) calculating the gray scale range of the photo from the maximum gray value and the minimum gray value of the photo, as shown in formula (2):
grayRange=max(whitestValue-darkestValue,0); (2)
In formula (2), grayRange is the gray scale range of the photo.
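The pseudo code and formula (2) above can be ported to Python roughly as follows (names are illustrative; `hist` is the 256-bin histogram from step 1 and `im_size` the total pixel count):

```python
def gray_extremes(hist, im_size, thresh=0.05):
    """Weighted-average gray value of roughly the darkest / brightest 5% of pixels,
    following the pseudo code above."""
    def tail_mean(order):
        sum1 = sum2 = 0
        for k in order:
            sum1 += hist[k] * k   # gray value weighted by its pixel count
            sum2 += hist[k]
            if sum2 / im_size > thresh:
                break
        return int(sum1 / sum2) if sum2 > 0 else 0

    darkest_value = tail_mean(range(256))           # scan from gray 0 upward
    whitest_value = tail_mean(range(255, -1, -1))   # scan from gray 255 downward
    return darkest_value, whitest_value

# toy histogram: half the pixels at gray 10, half at gray 200
hist = [0] * 256
hist[10] = 50
hist[200] = 50
darkest, whitest = gray_extremes(hist, 100)
gray_range = max(whitest - darkest, 0)   # formula (2)
print(darkest, whitest, gray_range)      # 10 200 190
```

A photo whose gray range is small (a washed-out or very dark image) offers little contrast for the gradient-based blur measure, which is presumably why the range enters the blur score.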
Step 203, calculating the blur score of the photo from the maximum vertical gradient of the photo, the maximum horizontal gradient of the photo, and the gray scale range of the photo.
Specifically, the blur score of the photo may be calculated from the maximum vertical gradient, the maximum horizontal gradient and the gray scale range of the photo as follows:
the blur score of the photo is calculated according to formula (3):
[Formula (3) appears as an image in the original publication and is not reproduced here.]
In formula (3), Score is the blur score of the photo, maxHoriGradient is the maximum horizontal gradient of the photo, maxVertGradient is the maximum vertical gradient of the photo, and grayRange is the gray scale range of the photo.
Fig. 3 is a flowchart of still another embodiment of the photo cleaning method of the present application. As shown in Fig. 3, on the basis of the embodiment shown in Fig. 1, calculating the noise score of the photo in step 102 may include:
step 301, processing the photo by using a median filter to generate a contrast photo.
Step 302, calculating the mean square error of the contrast photo and the photo, and calculating the peak signal-to-noise ratio of the photo according to the mean square error.
Specifically, the peak signal-to-noise ratio of the photo may be calculated from the mean square error according to formula (4):
psnr=10*log10(255^2/mse); (4)
In formula (4), psnr is the peak signal-to-noise ratio of the photo, and mse is the mean square error between the comparison photo and the photo.
Step 303, mapping the peak signal-to-noise ratio of the photo to the noise score of the photo.
Specifically, the peak signal-to-noise ratio of the photo may be mapped to the noise score of the photo according to formula (5).
[Formula (5) appears as an image in the original publication and is not reproduced here.]
In formula (5), score' is the noise score of the photograph.
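Steps 301 and 302 can be sketched as follows. This is a minimal illustration, not the patent's implementation: a hand-rolled 3x3 median filter stands in for the median filter, and the standard 8-bit peak signal-to-noise ratio form is assumed for formula (4):

```python
import numpy as np

def median3x3(img):
    """3x3 median filter with edge replication (stand-in for the patent's median filter)."""
    padded = np.pad(img, 1, mode="edge")
    h, w = img.shape
    # stack the 9 shifted views so np.median works over each 3x3 neighborhood
    windows = np.stack([padded[dy:dy + h, dx:dx + w]
                        for dy in range(3) for dx in range(3)])
    return np.median(windows, axis=0)

def noise_psnr(img):
    """PSNR of img against its median-filtered comparison photo (assumed 8-bit form of formula (4))."""
    img = img.astype(np.float64)
    mse = np.mean((img - median3x3(img)) ** 2)
    if mse == 0:
        return float("inf")  # identical to the filtered version: no detectable noise
    return 10 * np.log10(255.0 ** 2 / mse)

# a flat image with one hot pixel: the median filter removes the spike
img = np.full((5, 5), 100.0)
img[2, 2] = 255.0
print(round(float(noise_psnr(img)), 1))  # 18.3
```

Because the median filter suppresses impulse noise while leaving smooth regions unchanged, a low PSNR against the filtered version indicates a noisy photo.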
Fig. 4 is a flowchart of still another embodiment of the photo cleaning method of the present application. As shown in Fig. 4, on the basis of the embodiment shown in Fig. 1, calculating the shake score of the photo in step 102 may include:
step 401, processing the photo by using bilateral filtering and impact filtering to generate a filtered picture.
Step 402, calculating the gradient amplitude image of the picture and the gradient amplitude image of the filtering picture.
Step 403, calculating a fuzzy core pre-estimation value according to the gradient magnitude image of the picture and the gradient magnitude image of the filtering picture.
Specifically, the blur kernel estimate can be calculated according to formula (6):
[Formula (6) appears as an image in the original publication and is not reproduced here.]
In formula (6), F(·) and F⁻¹(·) denote the Fourier transform and the inverse Fourier transform, an overbar denotes complex conjugation, the two gradient terms are the gradient magnitude image of the photo and the gradient magnitude image of the filtered picture, and k is the blur kernel estimate.
Step 404, normalizing the blur kernel estimate, and calculating the norm of the normalized blur kernel estimate as the shake score of the photo.
Specifically, after the blur kernel estimate is normalized, its 2-norm may be calculated as the shake score of the photo.
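Formula (6) is rendered as an image in the original, but its description (Fourier transforms, complex conjugation, and the two gradient magnitude images) matches a standard frequency-domain kernel estimate of the form k = F⁻¹(conj(F(∇I))·F(∇B)/(|F(∇I)|² + γ)). Steps 403 and 404 can be sketched under that assumption (names and the regularizer gamma are illustrative, not from the patent):

```python
import numpy as np

def shake_score(grad_photo, grad_filtered, gamma=1e-3):
    """Sketch of steps 403-404 under the assumed form of formula (6).

    grad_photo:    gradient magnitude image of the photo
    grad_filtered: gradient magnitude image of the bilateral/shock-filtered picture
    gamma:         small regularizer (assumed; avoids division by ~0 frequencies)
    """
    B = np.fft.fft2(grad_photo)
    I = np.fft.fft2(grad_filtered)
    # assumed estimate: k = F^-1( conj(F(grad_filtered)) * F(grad_photo) / (|F(grad_filtered)|^2 + gamma) )
    k = np.real(np.fft.ifft2(np.conj(I) * B / (np.abs(I) ** 2 + gamma)))
    k = np.clip(k, 0.0, None)        # keep the non-negative kernel part
    k /= k.sum()                     # step 404: normalize the kernel estimate
    return float(np.linalg.norm(k))  # 2-norm of the normalized kernel as the shake score

rng = np.random.default_rng(0)
g = rng.random((32, 32))
# identical inputs give a near-delta kernel, so the score stays in (0, 1]
print(0.0 < shake_score(g, g) <= 1.0)
```

Intuitively, a sharp photo yields a kernel concentrated near a single point (2-norm close to 1), while a shaken photo spreads the kernel along the motion path, lowering the norm.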
Fig. 5 is a flowchart of a further embodiment of the photo cleaning method of the present application. As shown in Fig. 5, the embodiment shown in Fig. 1 may further include, after step 104:
step 501, if the user equipment performs photo cleaning for the first time, after deleting the photos selected by the user from the displayed photos, reporting the number of the photos finally cleaned for the first time and the total number of the photos stored in the user equipment to a background server.
In this embodiment, the initial value of the preset threshold is determined by the background server according to an empirical value.
After the number of photos finally cleaned in the first cleaning and the total number of photos stored on the user device are reported to the background server, the background server calculates the cleaning effect for the user device from these two numbers and adjusts the initial value of the preset threshold according to that effect; the final value of the preset threshold is the threshold corresponding to the best cleaning effect.
That is, to maximize the cleaning effect, the scan should find as many useless photos as possible, and the user should select as many of the displayed photos as possible for final cleaning. In other words, both the ratio of photos finally cleaned in the first cleaning to photos displayed, and the ratio of photos displayed to photos stored on the user device, should be as large as possible. Multiplying these two ratios, the cleaning effect can be defined as the number of photos finally cleaned in the first cleaning divided by the total number of photos stored on the user device; the larger this ratio, the better the cleaning effect.
After the background server calculates the cleaning effect for the user device from the number of photos finally cleaned in the first cleaning and the total number of photos stored on the device, it can adjust the initial value of the preset threshold according to that cleaning effect, then observe the first-cleaning effect of the adjusted threshold on other user devices, and finally select the threshold corresponding to the best cleaning effect as the final value of the preset threshold.
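The cleaning-effect ratio defined above can be sketched as follows (the function name and the example numbers are illustrative, not from the patent):

```python
def cleaning_effect(cleaned_count, total_photos):
    """Cleaning effect as defined above: photos finally cleaned in the first cleaning
    divided by the total photos stored on the device (larger is better)."""
    if total_photos == 0:
        return 0.0  # no stored photos: nothing to clean
    return cleaned_count / total_photos

# e.g. 120 photos cleaned out of 3000 stored
print(cleaning_effect(120, 3000))  # 0.04
```

The background server can compare this ratio across candidate thresholds and keep the threshold with the highest value.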
The photo cleaning method of the present application can distinguish useless photos more intelligently: besides blurred photos in burst shooting, it can also identify photos blurred by defocus, excessive noise, shake and the like in daily shooting. After the useless photos are selected intelligently, they are displayed for the user to select and clean, so that precious storage space can be saved for the user when the storage space of the user device runs short.
Fig. 6 is a schematic structural diagram of an embodiment of the photo cleaning apparatus of the present application. The photo cleaning apparatus in this embodiment may be used as an application installed in a terminal, for example an album manager, to implement the photo cleaning method provided by the embodiments of the present application. As shown in Fig. 6, the photo cleaning apparatus may include: an acquiring module 61, a calculating module 62, a display module 63 and a deleting module 64;
the acquiring module 61 is configured to acquire a photo stored in user equipment; specifically, the obtaining module 61 may obtain the photos saved in the user equipment through automatic scanning.
a calculating module 62, configured to calculate the blur score, the noise score and the shake score of the photo acquired by the acquiring module 61, and to take a weighted average of the blur score, the noise score and the shake score of the photo to obtain the score of the photo; specifically, the calculating module 62 may take the weighted average according to formula (1) to obtain the score of the photo.
A display module 63, configured to display, to a user using the user equipment, a photo with a score greater than or equal to a preset threshold in photos stored by the user equipment;
a deleting module 64, configured to delete the photos selected by the user from the photos displayed by the displaying module 63.
In this embodiment, the display module 63 may output the photos whose scores are greater than or equal to the preset threshold among the photos stored on the user device to a photo list for the user to select from; after the user ticks the photos to be cleaned, the deleting module 64 cleans the ticked photos with one tap.
In this embodiment, the calculating module 62 is specifically configured to perform gray scale processing on the photo to obtain a gray scale map of the photo, calculate a maximum vertical gradient and a maximum horizontal gradient of the photo according to the gray scale map of the photo, calculate a gray scale range of the photo, and calculate a blur level score of the photo according to the maximum vertical gradient and the maximum horizontal gradient of the photo, and the gray scale range of the photo.
In a specific implementation, the calculating module 62 may calculate a difference value between gray values of any two adjacent pixel points in a vertical direction of the gray map of the photo according to the gray map of the photo, so as to calculate a maximum value of the obtained difference values as a maximum vertical gradient of the photo; and calculating the difference value of the gray values of any two adjacent pixel points in the horizontal direction of the gray map of the photo according to the gray map of the photo, and taking the maximum value in the calculated difference value as the maximum horizontal gradient of the photo.
The calculating module 62 may count, from the gray map of the photo, the number of pixels corresponding to each gray value; select a first predetermined proportion of the gray values whose pixel counts are not 0, in descending order of gray value, and take the weighted average of those gray values with their pixel counts to obtain the maximum gray value of the photo; select a second predetermined proportion of the gray values whose pixel counts are not 0, in ascending order of gray value, and take the weighted average of those gray values with their pixel counts to obtain the minimum gray value of the photo; and calculate the gray-scale range of the photo from the maximum gray value and the minimum gray value of the photo.
The first predetermined proportion and the second predetermined proportion may be equal or different, and their values may be set according to system performance and/or implementation requirements, which is not limited in this embodiment; for example, both may be 5%.
The manner in which the calculating module 62 calculates the gray scale range of the photograph will be described in detail below:
1) counting, over the 256 possible gray values, the number of pixels corresponding to each gray value in the gray map of the photo, to generate an array hist[256] of length 256, where each entry is the number of pixels in the gray map having that gray value;
2) calculating darkestValue and whitestValue: darkestValue is the minimum gray value of the photo, obtained by selecting a second predetermined proportion of the gray values whose pixel counts are not 0, in ascending order of gray value, and taking the weighted average of those gray values with their pixel counts; whitestValue is the maximum gray value of the photo, obtained by selecting a first predetermined proportion of the gray values whose pixel counts are not 0, in descending order of gray value, and taking the weighted average of those gray values with their pixel counts.
Pseudocode for calculating darkestValue and whitestValue may be as follows, where hist is the histogram array from step 1), imSize is the total number of pixels in the photo, and thresh is the predetermined proportion:
sum1 ← 0
sum2 ← 0
thresh ← 0.05
for k ← 0 to 255
    sum1 ← sum1 + hist[k] * k
    sum2 ← sum2 + hist[k]
    if (sum2 / imSize) > thresh then break
if sum2 > 0 then
    darkestValue ← (int)((float)sum1 / sum2)
sum1 ← 0
sum2 ← 0
for k ← 255 downto 0
    sum1 ← sum1 + hist[k] * k
    sum2 ← sum2 + hist[k]
    if (sum2 / imSize) > thresh then break
if sum2 > 0 then
    whitestValue ← (int)((float)sum1 / sum2)
3) calculating the gray-scale range of the photo from the maximum gray value and the minimum gray value of the photo, as shown in formula (2).
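The pseudocode above can be rendered as runnable Python; the final subtraction assumes that formula (2) is simply whitestValue − darkestValue, which is not confirmed by this text:

```python
def gray_range(hist, im_size, thresh=0.05):
    """Gray-scale range of a photo from its 256-bin histogram.

    darkestValue / whitestValue are the weighted averages of the
    darkest / brightest `thresh` fraction of pixels, mirroring the
    pseudocode; the range is assumed to be their difference.
    """
    def tail_average(indices):
        s1 = s2 = 0
        for k in indices:
            s1 += hist[k] * k   # gray value weighted by pixel count
            s2 += hist[k]
            if s2 / im_size > thresh:
                break
        return s1 // s2 if s2 > 0 else 0

    darkest = tail_average(range(256))           # ascending gray values
    whitest = tail_average(range(255, -1, -1))   # descending gray values
    return whitest - darkest
```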
In particular implementations, the calculation module 62 may calculate the blur level score of the photograph according to equation (3).
In this embodiment, the calculating module 62 is specifically configured to process the photo with a median filter to generate a comparison photo, calculate the mean square error between the comparison photo and the photo, calculate the peak signal-to-noise ratio of the photo from the mean square error, and map the peak signal-to-noise ratio of the photo to the noise score of the photo.
In a specific implementation, the calculating module 62 may calculate the peak signal-to-noise ratio of the photo from the mean square error according to equation (4), and map the peak signal-to-noise ratio of the photo to the noise score of the photo according to equation (5).
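Equation (4) is assumed to be the standard peak signal-to-noise ratio for 8-bit images; the PSNR-to-score mapping of equation (5) is not reproduced here, so the linear clamp below is purely illustrative (a noisier photo, i.e. a lower PSNR, gets a higher score):

```python
import math

def peak_snr(mse, max_value=255):
    """Standard PSNR; assumed form of the patent's equation (4)."""
    if mse == 0:
        return float("inf")   # identical images: no measurable noise
    return 10 * math.log10(max_value ** 2 / mse)

def noise_score(psnr, low=20.0, high=50.0):
    """Illustrative stand-in for equation (5): clamp PSNR into
    [low, high] and map it linearly onto a 0..1 noise score."""
    if psnr <= low:
        return 1.0
    if psnr >= high:
        return 0.0
    return (high - psnr) / (high - low)
```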
In this embodiment, the calculating module 62 is specifically configured to process the photo using bilateral filtering and shock filtering to generate a filtered picture; to calculate the gradient magnitude image of the photo and the gradient magnitude image of the filtered picture; to calculate a blur kernel estimate from these two gradient magnitude images; and to normalize the blur kernel estimate and calculate the norm of the normalized blur kernel estimate as the shake score of the photo.
In a specific implementation, the calculating module 62 may calculate the blur kernel estimate according to equation (6), normalize it, and calculate the 2-norm of the normalized blur kernel estimate as the shake score of the photo.
Further, the photo cleaning apparatus may further include:
a reporting module 65, configured to report, to a background server, when the user equipment performs photo cleaning for the first time, the number of photos finally cleaned and the total number of photos stored in the user equipment, after the photos selected by the user are deleted from the displayed photos.
In this embodiment, the initial value of the preset threshold is determined by the background server according to an empirical value. After the number of photos cleaned in the first cleaning and the total number of photos stored in the user equipment are reported, the background server calculates the cleaning effect corresponding to the user equipment from these two values, adjusts the initial value of the preset threshold according to that cleaning effect, and finally takes the threshold corresponding to the best cleaning effect as the final value of the preset threshold.
That is, to maximize the cleaning effect, as many useless photos as possible should be scanned and displayed, and the user should select as many of the displayed photos as possible for final cleaning. In other words, both the ratio (photos finally cleaned / photos displayed) and the ratio (photos displayed / photos stored in the user equipment) should be as large as possible. Multiplying these two ratios defines the cleaning effect as (photos finally cleaned in the first cleaning / total photos stored in the user equipment); a larger ratio indicates a better cleaning effect.
After calculating the cleaning effect corresponding to the user equipment from the number of photos cleaned in the first cleaning and the total number of photos stored in the user equipment, the background server may adjust the initial value of the preset threshold according to that cleaning effect, observe the first-time photo cleaning effect of other user equipments that use the adjusted preset threshold, and finally select the threshold corresponding to the best cleaning effect as the final value of the preset threshold.
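The server-side bookkeeping described above amounts to computing one ratio per device and keeping the threshold whose reported ratios are best. A minimal sketch, with an assumed data layout (a map from candidate threshold to the (cleaned, stored) pairs reported by devices using it):

```python
def cleaning_effect(cleaned_first_time, total_stored):
    """First-time cleaning effect: the fraction of all stored
    photos that the user actually removed in the first cleaning."""
    if total_stored == 0:
        return 0.0
    return cleaned_first_time / total_stored

def best_threshold(reports):
    """Pick the candidate threshold with the best average effect.

    `reports` maps a threshold to a list of (cleaned, stored)
    pairs; this structure is an assumption for illustration, not
    the patent's protocol.
    """
    def avg_effect(pairs):
        return sum(cleaning_effect(c, s) for c, s in pairs) / len(pairs)
    return max(reports, key=lambda t: avg_effect(reports[t]))
```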
In the photo cleaning apparatus, after the obtaining module 61 obtains the photos stored in the user equipment, the calculating module 62 calculates the blur degree score, the noise score, and the shake score of each photo, and takes a weighted average of the three to obtain the score of the photo. The display module 63 then displays, to the user of the user equipment, the photos whose score is greater than or equal to the preset threshold, and the deletion module 64 finally deletes the photos the user selects from the displayed ones. Useless photos can thus be distinguished more intelligently: both the blurred photos produced by continuous shooting and the photos blurred by defocus, excessive noise, and/or shake in daily shooting can be recognized, and after being selected automatically they are displayed to the user for confirmation and cleaning, freeing precious storage space when the storage space of the user equipment runs short.
Fig. 7 is a schematic structural diagram of an embodiment of a terminal device according to the present application, where the terminal device in the present embodiment may include a memory, a processor, and a computer program that is stored in the memory and is executable on the processor, and when the processor executes the computer program, the photo clearing method according to the present application may be implemented.
The terminal device may be an intelligent terminal device such as a smart phone, a tablet computer, or a smart watch, and the form of the terminal device is not limited in this embodiment.
In this embodiment, the terminal device is taken as a smart phone as an example for explanation.
It should be understood that the handset 10 shown in fig. 7 is merely one example of the terminal device described above, and that the handset 10 may have more or fewer components than shown in fig. 7, may combine two or more components, or may have a different configuration of components. The various components shown in fig. 7 may be implemented in hardware, software, or a combination of hardware and software, including one or more signal processing and/or application specific integrated circuits.
The mobile phone 10 will be specifically described as an example. As shown in fig. 7, the mobile phone 10 may include a memory 11, a Central Processing Unit (CPU) 12, a peripheral interface 13, a Radio Frequency (RF) circuit 14, an audio circuit 15, a speaker 16, a power supply system 17, an input/output (I/O) subsystem 18, other input/control devices 19, and an external port 20, which communicate via one or more communication buses or signal lines 21.
The following describes the mobile phone provided in this embodiment in detail.
The memory 11: the memory 11 may be accessed by the CPU 12, the peripheral interface 13, and the like. The memory 11 may include high-speed random access memory, and may also include non-volatile memory, such as one or more magnetic disk storage devices, flash memory devices, or other non-volatile solid-state storage devices.
A peripheral interface 13 which may connect input and output peripherals of the handset 10 to the CPU12 and the memory 11.
I/O subsystem 18: the I/O subsystem 18 may connect input and output peripherals on the handset 10, such as a touch screen 22 and other input/control devices 19, to the peripheral interface 13. The I/O subsystem 18 may include a display controller 181 and one or more input controllers 182 for controlling other input/control devices 19. Where one or more input controllers 182 receive electrical signals from or transmit electrical signals to other input/control devices 19, the other input/control devices 19 may include physical buttons (e.g., push buttons or rocker buttons, etc.), dials, slide switches, joysticks, or click wheels. It is noted that the input controller 182 may be connected to any of: a keyboard, an infrared port, a USB interface, and a pointing device such as a mouse.
The touch screen 22: the touch screen 22 is an input and output interface between the handset 10 and the user that displays visual output to the user, which may include graphics, text, icons, video, and the like.
The display controller 181 in the I/O subsystem 18 receives electrical signals from the touch screen 22 or transmits electrical signals to the touch screen 22. The touch screen 22 detects a contact on the touch screen, and the display controller 181 converts the detected contact into an interaction with a user interface object displayed on the touch screen 22, i.e., implements a human-computer interaction, which may be an icon for running a game, an icon networked to a corresponding network, etc., displayed on the touch screen 22. It is noted that the handset 10 may also include a light mouse, which is a touch sensitive surface that does not display visual output, or an extension of the touch sensitive surface formed by a touch screen.
The RF circuit 14 is mainly used to establish communication between the mobile phone 10 and a wireless network (i.e., a network side), so as to implement data transmission and reception between the mobile phone 10 and the wireless network. Such as sending and receiving short messages, e-mails, etc. Specifically, the RF circuitry 14 receives and transmits RF signals, also referred to as electromagnetic signals, by which the RF circuitry 14 converts electrical signals to or from electromagnetic signals and communicates with communication networks and other devices. The RF circuitry 14 may include known circuitry for performing these functions including, but not limited to, an antenna system, an RF transceiver, one or more amplifiers, a tuner, one or more oscillators, a digital signal processor, a CODEC (CODEC) chipset, a Subscriber Identity Module (SIM), and so forth.
The audio circuit 15 is mainly used to receive audio data from the peripheral interface 13, convert the audio data into an electric signal, and transmit the electric signal to the speaker 16.
A speaker 16 for reproducing the voice signal received by the handset 10 from the wireless network through the RF circuit 14 into sound and playing the sound to the user.
And the power supply system 17 is used for supplying power and managing the power for the hardware connected with the CPU12, the I/O subsystem 18 and the peripheral interface 13. The power system 17 may include a power management system, one or more power sources (e.g., batteries or ac), a recharging system, power failure detection circuitry, a power converter or inverter, a power status indicator (e.g., light emitting diodes), and any other components associated with power generation, management, and distribution in the portable device.
Fig. 8 is a schematic structural diagram of an embodiment of an internal portion of the mobile phone 10 according to the present application. In the present embodiment, the software components stored in the memory 11 may include an operating system 1001, a communication module 1002, a contact/movement module 1003, a graphic module 1004, and a function module 1005.
The operating system 1001 (e.g., Darwin, RTXC, LINUX, UNIX, OS X, WINDOWS, or an embedded operating system such as VxWorks) includes various software components and/or drivers for controlling and managing general system tasks (e.g., memory management, storage device control, power management, etc.) and facilitates communication between various hardware and software components.
The communications module 1002 facilitates communications with other devices through one or more external ports 20, and also includes various software components for processing data received by the RF circuitry 14 and/or the external ports 20.
The contact/movement module 1003 may detect contact with the touch screen 22 (in conjunction with the display controller 181) and with other touch-sensitive devices (e.g., a touchpad or a physical click wheel). The contact/movement module 1003 includes various software components for performing operations related to detecting contact, such as determining whether contact has occurred, determining whether the contact has moved and tracking the movement across the touch screen 22, and determining whether the contact has been broken (i.e., whether contact has ceased). Determining movement of the point of contact may include determining speed (magnitude), velocity (magnitude and direction), and/or acceleration (a change in magnitude and/or direction) of the point of contact. These operations may be applied to a single contact (e.g., one-finger contact) or to multiple simultaneous contacts (e.g., "multi-touch"/multi-finger contacts). In some embodiments, the contact/movement module 1003 and the display controller 181 also detect contact on a touchpad.
The graphics module 1004 includes various known software components for displaying graphics on the touch screen 22, including components for changing the intensity of the displayed graphics. For example, a graphical user interface of various software is displayed on the touch screen 22 in response to an instruction from the CPU 12.
The function module 1005 executes various functional applications and data processing, for example, implementing the photo clearing method provided in the present application, by executing the program stored in the memory 11.
The RF circuit 14 receives a message sent by a network side or other devices, where the message includes an email or a short message or an instant message, and the message may specifically be the message in the embodiments shown in fig. 1 to fig. 5. It is understood that the message received by the RF circuit 14 may be other types of messages, and is not limited in the embodiments of the present application. Those skilled in the art will appreciate that the received message may carry data of a variety of data types. There may be only one data type of data, or there may be two or more data types of data.
The CPU 12 implements the photo cleaning method provided by the embodiments of the present application when executing the program stored in the memory 11. In the above embodiment, the CPU 12 may be embodied as, for example, a Pentium-series or Itanium processor manufactured by Intel Corporation.
Embodiments of the present application further provide a non-transitory computer-readable storage medium, where computer-executable instructions in the storage medium are executed by a computer processor to perform the photo cleaning method provided by the embodiments of the present application.
The non-transitory computer readable storage medium described above may take any combination of one or more computer readable media. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a Read Only Memory (ROM), an Erasable Programmable Read Only Memory (EPROM) or flash Memory, an optical fiber, a portable compact disc Read Only Memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
A computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Computer program code for carrying out operations for aspects of the present application may be written in any combination of one or more programming languages, including an object-oriented programming language such as Java, Smalltalk, or C++, as well as conventional procedural programming languages such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
The embodiment of the present application further provides a computer program product, and when instructions in the computer program product are executed by a processor, the photo cleaning method provided by the embodiment of the present application is executed.
It should be noted that, in the description of the present application, the terms "first", "second", etc. are used for descriptive purposes only and are not to be construed as indicating or implying relative importance. In addition, in the description of the present application, "a plurality" means two or more unless otherwise specified.
Any process or method descriptions in flow charts or otherwise described herein may be understood as representing modules, segments, or portions of code which include one or more executable instructions for implementing specific logical functions or steps of the process, and the scope of the preferred embodiments of the present application includes other implementations in which functions may be executed out of order from that shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved, as would be understood by those reasonably skilled in the art of the present application.
It should be understood that portions of the present application may be implemented in hardware, software, firmware, or a combination thereof. In the above embodiments, the various steps or methods may be implemented in software or firmware stored in memory and executed by a suitable instruction execution system. For example, if implemented in hardware, as in another embodiment, any one or a combination of the following techniques known in the art may be used: discrete logic circuits having logic gates for implementing logic functions on data signals, application-specific integrated circuits (ASICs) having suitable combinational logic gates, programmable gate arrays (PGAs), field-programmable gate arrays (FPGAs), and the like.
It will be understood by those skilled in the art that all or part of the steps carried by the method for implementing the above embodiments may be implemented by hardware related to instructions of a program, which may be stored in a computer readable storage medium, and when the program is executed, the program includes one or a combination of the steps of the method embodiments.
In addition, functional modules in the embodiments of the present application may be integrated into one processing module, or each module may exist alone physically, or two or more modules are integrated into one module. The integrated module can be realized in a hardware mode, and can also be realized in a software functional module mode. The integrated module, if implemented in the form of a software functional module and sold or used as a stand-alone product, may also be stored in a computer readable storage medium.
The storage medium mentioned above may be a read-only memory, a magnetic or optical disk, etc.
In the description herein, reference to the description of the term "one embodiment," "some embodiments," "an example," "a specific example," or "some examples," etc., means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the application. In this specification, the schematic representations of the terms used above do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
Although embodiments of the present application have been shown and described above, it is understood that the above embodiments are exemplary and should not be construed as limiting the present application, and that variations, modifications, substitutions and alterations may be made to the above embodiments by those of ordinary skill in the art within the scope of the present application.

Claims (10)

1. A method of cleaning a photograph, comprising:
acquiring a photo stored in user equipment;
calculating a blur score for the photograph according to the following formula:
[formula rendered as an image in the original; formula (3) of the description]
wherein Score is the blur degree score of the photo, maxHoriGradient is the maximum horizontal gradient of the photo, maxVertGradient is the maximum vertical gradient of the photo, and grayRange is the gray-scale range of the photo;
calculating the peak signal-to-noise ratio of the photograph according to the following formula:
psnr = 10 × log₁₀(255² / mse)
wherein psnr is the peak signal-to-noise ratio of the photo, and mse is the mean square error between the photo and the comparison photo;
mapping the peak signal-to-noise ratio of the photograph to a noise score of the photograph according to the following formula:
[formula rendered as an image in the original; formula (5) of the description]
wherein score' is the noise score of the photograph and psnr is the peak signal-to-noise ratio of the photograph;
calculating a blur kernel estimate of the photo according to the following formula:
[formula rendered as an image in the original; formula (6) of the description]
wherein F(·) and F⁻¹(·) denote the Fourier transform and the inverse Fourier transform, an overline denotes the complex conjugate, the two gradient symbols in the formula are the gradient magnitude image of the photo and the gradient magnitude image of the filtered picture of the photo, respectively, and k is the blur kernel estimate;
calculating a shake score of the photo according to the blur kernel estimate of the photo;
taking a weighted average of the blur degree score, the noise score, and the shake score of the photo to obtain the score of the photo;
displaying photos with scores larger than or equal to a preset threshold value in the photos stored by the user equipment to a user using the user equipment, and deleting photos selected by the user in the displayed photos;
sending, when photo cleaning is performed for the first time on the user equipment, the number of the deleted photos and the total number of photos stored in the user equipment to a background server, so that the background server performs the following operations:
calculating the ratio of the number of photos cleaned in the first cleaning on the user equipment to the total number of photos stored in the user equipment, and determining the ratio as the first-time photo cleaning effect corresponding to the user equipment, wherein a larger ratio represents a better first-time photo cleaning effect;
and adjusting the preset threshold according to the cleaning effect corresponding to the user equipment, and selecting, according to the first-time photo cleaning effect of other user equipments that use the adjusted preset threshold, the threshold corresponding to the best cleaning effect as the final value of the preset threshold.
2. The method of claim 1, wherein prior to calculating the blur level score for the photograph according to the following formula, the method further comprises:
carrying out gray level processing on the photo to obtain a gray level image of the photo;
and calculating the maximum vertical gradient of the photo and the maximum horizontal gradient of the photo according to the gray level map of the photo, and calculating the gray level range of the photo.
3. The method of claim 2, wherein calculating the maximum vertical gradient of the photograph and the maximum horizontal gradient of the photograph from the grayscale map of the photograph comprises:
calculating the difference value of the gray values of any two adjacent pixel points in the vertical direction of the gray image of the photo according to the gray image of the photo, and taking the maximum value in the calculated difference values as the maximum vertical gradient of the photo; and the number of the first and second groups,
and calculating the difference value of the gray values of any two adjacent pixel points in the horizontal direction of the gray image of the photo according to the gray image of the photo, and taking the maximum value of the calculated difference values as the maximum horizontal gradient of the photo.
4. The method of claim 2, wherein calculating the grayscale range of the photograph from the grayscale map of the photograph comprises:
counting the number of pixel points corresponding to each gray value in the gray map of the photo according to the gray map of the photo;
selecting a gray value with a first preset proportion from gray values with the number of the corresponding pixel points not being 0 according to the sequence of the gray values from large to small, and carrying out weighted average on the gray value with the first preset proportion and the number of the pixel points corresponding to the gray value with the first preset proportion to obtain the maximum value of the gray value of the photo;
selecting a gray value with a second preset proportion from gray values with the number of the corresponding pixel points not being 0 according to the sequence of the gray values from small to large, and carrying out weighted average on the gray value with the second preset proportion and the number of the pixel points corresponding to the gray value with the second preset proportion to obtain the minimum value of the gray value of the photo;
and calculating to obtain the gray scale range of the photo according to the maximum value of the gray scale value of the photo and the minimum value of the gray scale value of the photo.
5. The method of claim 1, wherein prior to said calculating a peak signal-to-noise ratio of said photograph according to the following formula, said method further comprises:
and processing the photo by using a median filter to generate the comparison photo.
6. The method of claim 1, wherein before calculating the blur kernel estimate for the photograph according to the following formula, the method further comprises:
processing the photo by using bilateral filtering and shock filtering to generate a filtered picture;
calculating a gradient magnitude image of the picture and a gradient magnitude image of the filtered picture;
the calculating the jitter score of the photo according to the fuzzy core estimated value of the photo comprises the following steps:
and carrying out normalization processing on the fuzzy core predicted value, and calculating the norm of the fuzzy core predicted value obtained by the normalization processing to be used as the jitter score of the photo.
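The final step of claim 6 can be sketched as below. The claim says "the norm" without naming one, so the L2 norm and the sum-to-one normalization are assumptions:

```python
import numpy as np

def jitter_score(kernel):
    """Jitter score from a blur kernel estimate.

    The kernel is normalized to sum to 1, and the L2 norm of the normalized
    kernel is taken as the score: a near-delta kernel (little shake) scores
    near 1, while a kernel whose energy is spread along a shake trajectory
    scores lower.
    """
    k = np.clip(np.asarray(kernel, dtype=np.float64), 0.0, None)
    k /= k.sum()                        # normalization step from the claim
    return float(np.linalg.norm(k))     # L2 norm as the jitter score
```

The intuition: camera shake smears the blur kernel over many pixels, so its normalized norm directly reflects how concentrated (sharp) or spread (shaken) the estimated kernel is.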
7. A photograph cleaning apparatus, comprising:
an acquisition module configured to acquire the photos stored in user equipment;
a calculation module configured for:
calculating a blur score of the photo according to the following formula:
Figure FDA0002594108700000031
wherein Score is the blur score of the photo, maxHoriGradient is the maximum horizontal gradient of the photo, maxVertGradient is the maximum vertical gradient of the photo, and grayRange is the grayscale range of the photo;
calculating the peak signal-to-noise ratio of the photograph according to the following formula:
psnr = 10 · log10(255² / mse)
wherein psnr is the peak signal-to-noise ratio of the photo, and mse is the mean square error between the photo and the comparison photo;
mapping the peak signal-to-noise ratio of the photograph to a noise score of the photograph according to the following formula:
Figure FDA0002594108700000033
wherein score' is the noise score of the photograph and psnr is the peak signal-to-noise ratio of the photograph;
calculating a blur kernel estimate of the photo according to the following formula:
Figure FDA0002594108700000034
wherein F(·) and F⁻¹(·) are the Fourier transform and the inverse Fourier transform, respectively, an overbar denotes the complex conjugate, the two gradient operands are the gradient magnitude image of the photo and the gradient magnitude image of the filtered picture of the photo, and k is the blur kernel estimate;
calculating the jitter score of the photo from the blur kernel estimate of the photo;
taking the weighted average of the blur score, the noise score and the jitter score of the photo as the score of the photo;
a display module configured to display, to the user of the user equipment, those stored photos whose score is greater than or equal to a preset threshold;
a deleting module configured to delete the photos the user selects from the photos displayed by the display module;
a reporting module configured to, when photos are cleaned in the user equipment for the first time, send the number of deleted photos and the number of photos kept in the user equipment to a background server, so that the background server performs the following operations:
calculating the ratio of the number of photos cleaned for the first time in the user equipment to the number of photos stored in the user equipment, and taking the ratio as the first photo-cleaning effect for the user equipment, a larger ratio indicating a better first cleaning effect;
and adjusting the preset threshold according to the cleaning effect for the user equipment, and selecting, according to the first photo-cleaning effect of other user equipments that use the adjusted preset threshold, the threshold with the best first cleaning effect as the final value of the preset threshold.
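The scoring and display logic of claim 7 reduces to a weighted average plus a threshold test; the equal weights and the 0.6 threshold below are illustrative assumptions, not values from the patent:

```python
def photo_score(blur, noise, jitter, weights=(1.0, 1.0, 1.0)):
    """Weighted average of the three sub-scores (the weights are assumptions)."""
    w_blur, w_noise, w_jitter = weights
    total = w_blur + w_noise + w_jitter
    return (w_blur * blur + w_noise * noise + w_jitter * jitter) / total

def photos_to_clean(scores, threshold=0.6):
    """Indices of photos whose score is >= the preset threshold, i.e. the
    candidates the display module would show to the user for deletion."""
    return [i for i, s in enumerate(scores) if s >= threshold]
```

The reporting module's feedback loop then amounts to tuning `threshold` so that the fraction of stored photos deleted on first cleaning is maximized across devices.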
8. A terminal device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, the processor implementing the method according to any one of claims 1-6 when executing the computer program.
9. A non-transitory computer-readable storage medium having stored thereon a computer program, wherein the computer program, when executed by a processor, implements the method of any one of claims 1-6.
10. A computer program product, characterized in that instructions in the computer program product, when executed by a processor, perform the method according to any of claims 1-6.
CN201710707329.5A 2017-08-17 2017-08-17 Photo cleaning method and device and terminal equipment Active CN108460065B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710707329.5A CN108460065B (en) 2017-08-17 2017-08-17 Photo cleaning method and device and terminal equipment

Publications (2)

Publication Number Publication Date
CN108460065A CN108460065A (en) 2018-08-28
CN108460065B true CN108460065B (en) 2020-10-16

Family

ID=63221010

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710707329.5A Active CN108460065B (en) 2017-08-17 2017-08-17 Photo cleaning method and device and terminal equipment

Country Status (1)

Country Link
CN (1) CN108460065B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111480158A (en) * 2018-10-12 2020-07-31 华为技术有限公司 File management method and electronic equipment
CN110727810B (en) * 2019-10-15 2023-05-02 联想(北京)有限公司 Image processing method, device, electronic equipment and storage medium
CN110795579B (en) * 2019-10-29 2022-11-18 Oppo广东移动通信有限公司 Picture cleaning method and device, terminal and storage medium
CN113360600A (en) * 2021-06-03 2021-09-07 中国科学院计算机网络信息中心 Method and system for screening enterprise performance prediction indexes based on signal attenuation

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104408061A (en) * 2014-10-29 2015-03-11 深圳市中兴移动通信有限公司 Photo album management method and device
US9147262B1 (en) * 2014-08-25 2015-09-29 Xerox Corporation Methods and systems for image processing
CN105512671A (en) * 2015-11-02 2016-04-20 北京蓝数科技有限公司 Picture management method based on blurred picture recognition
CN106155593A (en) * 2016-08-01 2016-11-23 惠州Tcl移动通信有限公司 A kind of method and system deleting photo based on shooting quality

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102509285A (en) * 2011-09-28 2012-06-20 宇龙计算机通信科技(深圳)有限公司 Processing method and system for shot fuzzy picture and shooting equipment

Similar Documents

Publication Publication Date Title
CN108460065B (en) Photo cleaning method and device and terminal equipment
CN110059744B (en) Method for training neural network, method and equipment for processing image and storage medium
CN111726533B (en) Image processing method, image processing device, mobile terminal and computer readable storage medium
EP2793451A2 (en) Apparatus and method for displaying unchecked messages in a terminal
CN109040523B (en) Artifact eliminating method and device, storage medium and terminal
US20190034720A1 (en) Method and device for searching stripe set
AU2012302448A1 (en) Method of providing a user interface in portable terminal and apparatus thereof
CN107562539B (en) Application program processing method and device, computer equipment and storage medium
CN109547335B (en) Session message processing method and device
WO2021179856A1 (en) Content recognition method and apparatus, electronic device, and storage medium
CN104349080A (en) Image processing method and electronic equipment
JP2018531564A (en) Method, apparatus and system for obtaining video data and computer-readable storage medium
CN111818050A (en) Target access behavior detection method, system, device, equipment and storage medium
EP2584451A2 (en) Method for adjusting video image compression using gesture
CN110166696B (en) Photographing method, photographing device, terminal equipment and computer-readable storage medium
CN113421211B (en) Method for blurring light spots, terminal equipment and storage medium
CN110365906A (en) Image pickup method and mobile terminal
CN105513098B (en) Image processing method and device
CN104933688B (en) Data processing method and electronic equipment
CN114650361A (en) Shooting mode determining method and device, electronic equipment and storage medium
CN109672829B (en) Image brightness adjusting method and device, storage medium and terminal
CN109218620B (en) Photographing method and device based on ambient brightness, storage medium and mobile terminal
CN108549702B (en) Method for cleaning picture library of mobile terminal and mobile terminal
CN113204293B (en) Touch sensing processing method, touch sensing processing device, medium and electronic equipment
CN109600409B (en) Resource management method and terminal for application

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant