CN111480158A - File management method and electronic equipment - Google Patents

File management method and electronic equipment

Info

Publication number
CN111480158A
Authority
CN
China
Prior art keywords
picture
pictures
score
feature
user
Prior art date
Legal status
Pending
Application number
CN201880081385.5A
Other languages
Chinese (zh)
Inventor
郭颂
刘宗超
高华江
宋海东
Current Assignee
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd filed Critical Huawei Technologies Co Ltd
Publication of CN111480158A

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/50Information retrieval; Database structures therefor; File system structures therefor of still image data
    • G06F16/54Browsing; Visualisation therefor

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Databases & Information Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A picture management method and an electronic device can improve the efficiency of searching for pictures. The method includes: obtaining scene features of each of at least two pictures stored in the electronic device and/or in a cloud album; determining, according to the scene features, a first feature and its scoring criterion and a second feature and its scoring criterion for each picture; calculating the score of each picture according to the value of the first feature and the value of the second feature, where the first feature has a greater influence on the score of the picture than the second feature; detecting a first operation of a user; and, in response to the first operation, displaying S second pictures of the at least two pictures, where S is an integer greater than or equal to 1 and the scores of the S second pictures are higher than the scores of the pictures, among the at least two pictures, other than the S second pictures.

Description

File management method and electronic equipment
Technical Field
The present application relates to the field of electronic devices, and in particular, to a file management method and an electronic device.
Background
At present, the storage capacity of electronic devices is increasing, and the number of files (such as pictures, audio, video, etc.) that can be stored is also increasing, so that the files need to be managed effectively.
An electronic device manages pictures by, among other things, displaying and deleting them. In the prior art, pictures can be scored and then displayed or deleted according to their scores. However, the prior-art scoring mechanism is unreasonable: the resulting picture scores often differ from the user's psychological expectation, so the user cannot quickly find a picture, or a picture the user does not want to delete is deleted, which affects operation efficiency.
Disclosure of Invention
The embodiments of the present application provide a picture management method and an electronic device that make the scoring mechanism more reasonable, so that the obtained picture scores better match the user's psychological expectation. This allows the user to quickly find pictures, or improves the accuracy of picture deletion, thereby improving operation efficiency.
In a first aspect, an embodiment of the present application provides a picture management method performed by an electronic device. The method may include: obtaining scene features of each of at least two pictures stored in the electronic device and/or in a cloud album; determining, according to the scene features of each picture, a first feature, a second feature, a scoring criterion of the first feature, and a scoring criterion of the second feature for each picture; calculating the score of each picture according to the value of the first feature and the value of the second feature of the picture, where the first feature has a greater influence on the score of each picture than the second feature; detecting a first operation of a user; and, in response to the first operation, displaying S second pictures of the at least two pictures, where S is an integer greater than or equal to 1 and the scores of the S second pictures are higher than the scores of the pictures, among the at least two pictures, other than the S second pictures.
According to this technical solution, the electronic device can determine, from the scene features, the first feature and its scoring criterion and the second feature and its scoring criterion for each picture stored in the electronic device and/or in the cloud album, calculate the score of each picture according to the value of the first feature and the value of the second feature, and display the S highest-scoring second pictures. The score of a picture indicates how much the user likes it: the higher the score, the higher the degree of preference. Displaying the high-scoring second pictures therefore lets the user view favorite pictures more easily and improves the efficiency of searching for pictures.
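For illustration only, a minimal sketch of such a scoring-and-display step is shown below; it is not the claimed implementation, and the feature names, weights, and values are assumptions. The first feature (here assumed to be "favorited") is given a value larger than any possible weighted sum of second features, so it dominates the ranking.

```python
from dataclasses import dataclass, field

@dataclass
class Picture:
    name: str
    first_feature: int                                   # e.g. 1 if favorited, 0 otherwise (assumed)
    second_features: dict = field(default_factory=dict)  # e.g. {"views": 0.6, "aesthetic": 0.8}

SECOND_WEIGHTS = {"views": 0.5, "aesthetic": 0.5}   # assumed weights, summing to 1
FIRST_VALUE = 10.0                                  # chosen larger than any possible weighted sum

def score(p: Picture) -> float:
    weighted_sum = sum(SECOND_WEIGHTS.get(k, 0.0) * v for k, v in p.second_features.items())
    return FIRST_VALUE * p.first_feature + weighted_sum

def top_s(pictures, s):
    # The S highest-scoring pictures are the "second pictures" to display.
    return sorted(pictures, key=score, reverse=True)[:s]

pics = [
    Picture("a.jpg", 1, {"views": 0.2, "aesthetic": 0.4}),
    Picture("b.jpg", 0, {"views": 0.9, "aesthetic": 0.9}),
    Picture("c.jpg", 1, {"views": 0.7, "aesthetic": 0.1}),
]
print([p.name for p in top_s(pics, 2)])  # favorited pictures always outrank non-favorited ones
```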
In a possible implementation manner, the scene features include one or any combination of the following: whether the picture is favorited, whether the picture has a remark, whether the picture is associated, whether the picture has been uploaded to the cloud, and the storage location of the picture. The first feature may include one or any combination of the following: whether the picture is favorited, and whether it has a remark. The second feature may include one or any combination of the following: whether the picture has been shared, the shooting mode, the picture content category, the number of times the picture has been browsed, the shooting time, the last browsing time, the picture size, and an aesthetic score.
According to this technical solution, whether the user is likely to like a certain picture can be judged preliminarily from the scene features, and the features of the picture are then extracted according to the judgment result for calculation. Different judgment results correspond to different features participating in the score calculation and to different scoring criteria for those features, so the score of each picture can be calculated more accurately and better matches the user's psychological expectation, reducing user operations and improving operation efficiency.
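A minimal sketch of how scene features might select which features, and which scoring criteria, participate in the calculation; the rule, the feature names, and the criterion values below are hypothetical and only illustrate the idea of scene-dependent feature selection.

```python
def select_features(scene: dict):
    # scene: e.g. {"favorited": True, "remarked": False, "associated": False,
    #              "uploaded_to_cloud": True, "storage_location": "camera"}
    if scene.get("favorited") or scene.get("remarked"):
        # The user has already signalled preference: treat these as dominant first features.
        first = ["favorited", "remarked"]
        second = ["shared", "shoot_mode", "view_count", "aesthetic_score"]
    else:
        # No explicit signal: rely only on indirect second features.
        first = []
        second = ["shared", "shoot_mode", "view_count", "last_viewed", "aesthetic_score"]
    # Scoring criterion: first features weigh far more than second features (assumed values).
    criteria = {name: (10.0 if name in first else 1.0) for name in first + second}
    return first, second, criteria

print(select_features({"favorited": True, "uploaded_to_cloud": True}))
```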
In a possible implementation manner, before the detecting of the first operation of the user, the method further includes: displaying, by the electronic device, a status bar, a navigation bar, a time widget icon, and icons of one or more applications, where the icon of the camera application belongs to the icons of the one or more applications, and the first operation is an operation of the user on the icon of the album application. After the detecting of the first operation of the user, the method further includes: in response to the first operation, displaying the pictures, among the at least two pictures, other than the S second pictures, where the scores of the S second pictures are higher than the scores of those other pictures; the S second pictures are displayed before the other pictures, or the S second pictures carry special marks so that they are displayed in a way that distinguishes them from the other pictures.
According to this technical solution, the high-scoring second pictures can be displayed before the other, lower-scoring pictures, or given special marks that distinguish them from the lower-scoring pictures, so that the user can more easily view favorite pictures and the efficiency of searching for pictures is improved.
In a possible implementation manner, the S second pictures are arranged in order from high to low according to the score, and the other pictures except the S second pictures are arranged in order from high to low according to the score.
According to this technical solution, the pictures are arranged in order of their scores, so that the pictures the user is likely to like are placed at the front; the user can more easily view favorite pictures, and the efficiency of searching for pictures is improved.
In a possible implementation manner, the manner in which the S second pictures are specially marked includes one or any combination of the following: enlarged display, increased frame display, increased mark display, special color display, and special transparency display.
According to this technical solution, a special mark is added to the high-scoring second pictures, so that the user can more easily view favorite pictures and the efficiency of searching for pictures is improved.
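A sketch, under assumed inputs, of the display ordering and marking described above: the S highest-scoring second pictures come first and receive a special mark, and both groups stay in descending score order.

```python
def arrange_for_display(pictures, scores, s):
    # `scores` maps each picture to its computed score; the "special mark" is a simple flag here.
    ranked = sorted(pictures, key=lambda p: scores[p], reverse=True)
    second_pictures, others = ranked[:s], ranked[s:]
    # Each entry is (picture, specially_marked); both groups keep descending score order.
    return [(p, True) for p in second_pictures] + [(p, False) for p in others]
```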
In a possible implementation manner, after the detecting the first operation of the user, the method further includes: responding to the first operation, displaying the folders according to the classification, and displaying a search control, a first menu control, a second menu control and a third menu control; the first operation is an operation of the third menu control by a user, and the classification mode includes one or any combination of the following: location, time, people; each folder comprises one or more pictures, and the one or more pictures belong to the at least two pictures; after the folders are displayed according to the categories and the search control, the first menu control, the second menu control and the third menu control are displayed, the method further comprises the following steps: and responding to a second operation of the user on the search control, and displaying a search bar, folders displayed according to the classification and the S second pictures.
According to this technical solution, the high-scoring second pictures are displayed together with the folders displayed by category, so that the user can more easily view his or her favorite pictures and the efficiency of searching for pictures is improved.
In a possible implementation manner, before calculating the score of each picture according to the value of the first feature and the value of the second feature of each picture, the method further includes: determining that an optimization condition is met, where the optimization condition includes one or any combination of the following: the remaining storage space of the electronic device is lower than a first set value, a set time is reached, the remaining battery level of the electronic device is lower than a second set value, the electronic device is being charged, and the electronic device is in a screen-off state.
According to the technical scheme provided by the embodiment of the application, the score of each picture can be calculated under the condition that the optimization condition is met. The technical scheme provided by the embodiment of the application can ensure that the normal use of the user is not influenced in the calculation process of the picture score.
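A hedged sketch of the optimization-condition check described above; the thresholds and the device-state helpers are hypothetical placeholders rather than an actual API.

```python
def optimization_condition_met(device, first_set_value, second_set_value, set_time_reached):
    # Any one of the conditions (or any combination) is sufficient to trigger score calculation.
    return (
        device.free_storage() < first_set_value       # remaining storage below the first set value
        or set_time_reached                           # the set time has been reached
        or device.battery_level() < second_set_value  # remaining battery below the second set value
        or device.is_charging()
        or device.is_screen_off()
    )
```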
In a possible implementation manner, after the S second pictures of the at least two pictures are displayed, the method further includes: receiving a third operation in which the user cancels the special mark of at least one of the S second pictures, and recalculating the score of that at least one second picture in response to the third operation; or receiving a fourth operation in which the user adds a special mark to at least one of the pictures other than the S second pictures, and recalculating the score of that at least one picture in response to the fourth operation; or receiving a fifth operation in which the user moves at least one of the S second pictures so that it is displayed after at least one of the other pictures, and recalculating the score of that at least one second picture in response to the fifth operation; or receiving a sixth operation in which the user moves at least one of the other pictures so that it is displayed before at least one of the S second pictures, and recalculating the score of that at least one other picture in response to the sixth operation.
According to this technical solution, the scores of some pictures can be recalculated according to the user's feedback on those pictures, so that the calculated result better matches the user's psychological expectation, the displayed result is more accurate, and the efficiency of searching for pictures is further improved.
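One possible way to fold such feedback into a recalculated score is sketched below; the bias term is an assumption used only to illustrate the direction of the adjustment, not the application's concrete formula.

```python
def recalculate_after_feedback(base_score: float, positive_feedback: bool, bias: float = 5.0) -> float:
    # Positive feedback (adding a mark, moving a picture forward) suggests the computed score
    # was lower than the user expected; negative feedback suggests it was higher.
    return base_score + bias if positive_feedback else base_score - bias
```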
In a second aspect, an embodiment of the present application provides a picture management method, which is executed by an electronic device, and includes: the method comprises the steps of obtaining scene characteristics of each of at least two pictures stored in the electronic equipment and/or stored in a cloud album; determining a third feature, a fourth feature, a scoring criterion of the third feature and a scoring criterion of the fourth feature of each picture according to the scene features of each picture; calculating the score of each picture according to the value of the third characteristic of each picture and the value of the fourth characteristic of each picture; the influence of the third characteristic on the score of each picture is greater than the influence of the fourth characteristic on the score of each picture; detecting a first operation of a user; responding to the first operation, and displaying a first folder; the first folder comprises M first pictures in at least two pictures; and M is an integer greater than or equal to 1, and the scores of the M first pictures are lower than the scores of other pictures except the M first pictures in the at least two pictures.
According to this technical solution, the electronic device can determine, from the scene features, the third feature and its scoring criterion and the fourth feature and its scoring criterion for each picture stored in the electronic device and/or in the cloud album, calculate the score of each picture according to the value of the third feature and the value of the fourth feature, and display a folder containing the M lowest-scoring first pictures. The score of a picture indicates how much the user dislikes it: the lower the score, the higher the degree of dislike. Placing the M low-scoring first pictures in the first folder saves the user from selecting unwanted pictures one by one, reducing user operations and improving operation efficiency.
In a possible implementation manner, after displaying the first folder, the method further includes: detecting a second operation of the user; and in response to the second operation, deleting the M first pictures.
According to this technical solution, the user can delete all M first pictures in the folder with a single operation, without having to select and delete the unwanted pictures one by one, which reduces the operations needed to delete pictures and improves deletion efficiency.
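A sketch, under assumed inputs, of this second-aspect flow: collect the M lowest-scoring pictures into the first folder and delete them all with a single confirming operation. The score map and the delete callback are placeholders.

```python
def build_first_folder(pictures, scores, m):
    # The M lowest-scoring pictures become the contents of the first folder.
    return sorted(pictures, key=lambda p: scores[p])[:m]

def delete_first_folder(first_folder, delete_fn):
    # One user operation (the second operation) deletes every picture collected in the folder.
    for picture in first_folder:
        delete_fn(picture)
```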
In a possible implementation manner, the scene features include one or any combination of the following: whether the picture is favorited, whether the picture has a remark, whether the picture is associated, whether the picture has been uploaded to the cloud, and the storage location of the picture. The third feature may include one or any combination of the following: whether the picture has been placed in the trash. The fourth feature may include one or any combination of the following: the shooting mode, the picture content category, the number of times the picture has been browsed, the shooting time, the last browsing time, the picture size, and an aesthetic score.
According to this technical solution, whether the user is likely to like a certain picture can be judged preliminarily from the scene features, and the features of the picture are then extracted according to the judgment result for calculation. Different judgment results correspond to different features participating in the score calculation and to different scoring criteria for those features, so the score of each picture can be calculated more accurately and better matches the user's psychological expectation, reducing user operations and improving operation efficiency.
In a possible implementation manner, before calculating the score of each picture according to the value of the third feature and the value of the fourth feature of each picture, the method further includes: determining that an optimization condition is met, where the optimization condition includes one or any combination of the following: the remaining storage space of the electronic device is lower than a first set value, a set time is reached, the remaining battery level of the electronic device is lower than a second set value, the electronic device is being charged, and the electronic device is in a screen-off state.
According to the technical scheme provided by the embodiment of the application, the score of each picture can be calculated under the condition that the optimization condition is met. The technical scheme provided by the embodiment of the application can ensure that the normal use of the user is not influenced in the calculation process of the picture score.
In a possible implementation manner, after displaying the first folder, the method further includes: receiving a third operation in which the user moves at least one of the M first pictures out of the first folder, and recalculating the score of that at least one first picture in response to the third operation; or receiving a fourth operation in which the user moves at least one of the pictures other than the M first pictures into the first folder, and recalculating the score of that at least one picture in response to the fourth operation.
According to the technical scheme provided by the embodiment of the application, the scores of the partial pictures can be recalculated according to the feedback behaviors of the user to the partial pictures, so that the calculated result is more in line with the psychological expectation of the user, the displayed result is more accurate, and the picture deleting efficiency of the user is further improved.
In a third aspect, an embodiment of the present application provides a picture management method performed by an electronic device, including: obtaining scene features of each of at least two pictures stored in the electronic device and/or in a cloud album; determining, according to the scene features of each picture, a third feature and a scoring criterion of the third feature, and a fourth feature and a scoring criterion of the fourth feature, for each picture; calculating the score of each picture according to the value of the third feature and the value of the fourth feature of the picture, where the third feature has a greater influence on the score of each picture than the fourth feature; and deleting M first pictures of the at least two pictures, where M is an integer greater than or equal to 1 and the scores of the M first pictures are lower than the scores of the pictures, among the at least two pictures, other than the M first pictures.
According to this technical solution, the electronic device can determine, from the scene features, the third feature and its scoring criterion and the fourth feature and its scoring criterion for each picture stored in the electronic device and/or in the cloud album, calculate the score of each picture according to the value of the third feature and the value of the fourth feature, and delete the low-scoring first pictures according to the scores. The score of a picture indicates how much the user dislikes it: the lower the score, the higher the degree of dislike. Deleting the low-scoring first pictures therefore reduces the operations the user needs to perform to delete pictures and improves deletion efficiency.
In a possible implementation manner, the scene features include one or any combination of the following: whether the picture is favorited, whether the picture has a remark, whether the picture is associated, whether the picture has been uploaded to the cloud, and the storage location of the picture. The third feature includes: whether the picture has been placed in the trash. The fourth feature may include one or any combination of the following: the shooting mode, the picture content category, the number of times the picture has been browsed, the shooting time, the last browsing time, the picture size, and an aesthetic score.
According to this technical solution, whether the user is likely to like a certain picture can be judged preliminarily from the scene features, and the features of the picture are then extracted according to the judgment result for calculation. Different judgment results correspond to different features participating in the score calculation and to different scoring criteria for those features, so the score of each picture can be calculated more accurately and better matches the user's psychological expectation, reducing user operations and improving operation efficiency.
In a possible implementation manner, before the calculating the score of each picture according to the value of the third feature of each picture and the value of the fourth feature of each picture, the method further includes: judging that an optimization condition is met, wherein the optimization condition comprises one or any combination of the following: the remaining storage space of the electronic equipment is lower than a first set value, the set time is reached, the remaining electric quantity of the electronic equipment is lower than a second set value, the electronic equipment is being charged, and the electronic equipment is in a screen-off state.
According to the technical scheme provided by the embodiment of the application, the score of each picture can be calculated under the condition that the optimization condition is met. The technical scheme provided by the embodiment of the application can ensure that the normal use of the user is not influenced in the calculation process of the picture score.
In a possible implementation manner, after deleting M first pictures of the at least two pictures, the method further includes: receiving a first operation of downloading at least one first picture in the M first pictures by a user, and recalculating the score of the at least one first picture in response to the first operation; or receiving a second operation of deleting at least one picture in the other pictures except the M first pictures by the user, and recalculating the score of the at least one picture in response to the second operation.
According to the technical scheme provided by the embodiment of the application, the scores of the partial pictures can be recalculated according to the feedback behaviors of the user to the partial pictures, so that the calculated result is more in line with the psychological expectation of the user, the displayed result is more accurate, and the picture searching efficiency of the user is further improved.
In a fourth aspect, an embodiment of the present application provides a picture score calculation method, performed by an electronic device and applied to the picture management method provided in the first aspect or any implementation manner of the first aspect. The score calculation method includes: obtaining scene features of at least two pictures stored in the electronic device and/or in a cloud album; determining, according to the scene features of each picture, a first feature and a scoring criterion of the first feature, and a second feature and a scoring criterion of the second feature, for each picture, where the first feature corresponds to a first value and a second value, the first value is greater than the second value, and the second feature corresponds to two or more values; and calculating the score of each picture according to the value of the first feature and the value of the second feature of the picture. The score is proportional to the value of the first feature and proportional to the weighted sum of the values of the second features; the first value of the first feature is greater than the weighted sum obtained when the second features take their maximum values, and the score is obtained by a weighted calculation over the values of the first feature and the second features.
According to this technical solution, the score of a picture is proportional to the value of the first feature and proportional to the weighted sum of the second features, and the first value of the first feature is greater than the weighted sum obtained when the second features take their maximum values. This guarantees that the score is mainly influenced by the value of the first feature and only secondarily by the weighted sum of the second features, highlighting the importance of the first feature, making the scoring mechanism more reasonable, and better matching the user's psychological expectation.
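The dominance constraint can be checked with a small worked example; the weights and feature values below are illustrative assumptions, not values from the application.

```python
weights = [0.4, 0.3, 0.3]           # second-feature weights (assumed)
max_values = [1.0, 1.0, 1.0]        # maximum value of each second feature (assumed)
max_weighted_sum = sum(w * v for w, v in zip(weights, max_values))   # = 1.0

first_value, second_value = 2.0, 0.0    # first feature: 2.0 when present, 0.0 otherwise (assumed)
assert first_value > max_weighted_sum   # the fourth-aspect constraint

def score(first, seconds):
    return first + sum(w * v for w, v in zip(weights, seconds))

# A picture having the first feature but the worst second features still outranks a picture
# lacking the first feature but having the best possible second features:
assert score(first_value, [0, 0, 0]) > score(second_value, max_values)
```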
In a fifth aspect, an embodiment of the present application provides a picture score calculation method, performed by an electronic device and applied to the picture management method provided in the second aspect or any implementation manner of the second aspect, or in the third aspect or any implementation manner of the third aspect. The picture score calculation method includes: obtaining scene features of at least two pictures stored in the electronic device and/or in a cloud album; determining, according to the scene features of each picture, a third feature and a scoring criterion of the third feature, and a fourth feature and a scoring criterion of the fourth feature, for each picture, where the third feature corresponds to a first value and a second value, the first value is greater than the second value, and the fourth feature corresponds to two or more values; and calculating the score of each picture according to the value of the third feature and the value of the fourth feature of the picture. The score is inversely proportional to the value of the third feature and proportional to the weighted sum of the fourth features; the weighted sum of the fourth features is negative, and the first value of the third feature is greater than the absolute value of the weighted sum obtained when the fourth features take their minimum values.
According to the technical scheme provided by the embodiment of the application, the score of the picture is inversely proportional to the value of the third characteristic, and the score of the picture is proportional to the value of the weighted sum of the fourth characteristic. Under the condition that the weighted sum value of the fourth feature is negative, the first value of the third feature is larger than the absolute value of the weighted sum value when the fourth feature takes the minimum value, so that the score of the picture is mainly influenced by the value of the third feature and is secondarily influenced by the weighted sum value of the fourth feature, the importance of the third feature is highlighted, a scoring mechanism is more reasonable, and the psychological expectation of a user is better met.
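For this negative-weighted-sum case, the analogous dominance check can be sketched as follows; all numbers are illustrative assumptions, and the sixth aspect's non-negative case is not reproduced here.

```python
weights = [0.5, 0.5]                # fourth-feature weights (assumed)
min_values = [-1.0, -1.0]           # fourth features take values in [-1, 0] (assumed)
min_weighted_sum = sum(w * v for w, v in zip(weights, min_values))   # = -1.0

third_first_value, third_second_value = 2.0, 0.0   # e.g. 2.0 when the picture is in the trash
assert third_first_value > abs(min_weighted_sum)   # the fifth-aspect constraint

def deletion_score(third, fourths):
    # Inversely proportional to the third feature, proportional to the weighted sum of the fourths.
    return -third + sum(w * v for w, v in zip(weights, fourths))

# A picture in the trash with the best possible fourth features still scores lower than a picture
# not in the trash with the worst possible fourth features, so it is the deletion candidate:
assert deletion_score(third_first_value, [0, 0]) < deletion_score(third_second_value, min_values)
```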
In a sixth aspect, an embodiment of the present application provides a picture score calculation method, performed by an electronic device and applied to the picture management method provided in the second aspect or any implementation manner of the second aspect, or in the third aspect or any implementation manner of the third aspect. The picture score calculation method includes: obtaining scene features of at least two pictures stored in the electronic device and/or in a cloud album; determining, according to the scene features of each picture, a third feature and a scoring criterion of the third feature, and a fourth feature and a scoring criterion of the fourth feature, for each picture, where the third feature corresponds to a first value and a second value, the first value is greater than the second value, and the fourth feature corresponds to two or more values; and calculating the score of each picture according to the value of the third feature and the value of the fourth feature of the picture. The score is inversely proportional to the value of the third feature and proportional to the weighted sum of the fourth features; the weighted sum of the fourth features is non-negative, and the first value of the third feature is less than the reciprocal of the weighted sum obtained when the fourth features take their maximum values.
According to this technical solution, the score of a picture is inversely proportional to the value of the third feature and proportional to the weighted sum of the fourth features. When the weighted sum of the fourth features is non-negative, keeping the first value of the third feature smaller than the reciprocal of the weighted sum at the maximum of the fourth features ensures that the score is mainly influenced by the value of the third feature and only secondarily by the weighted sum of the fourth features, highlighting the importance of the third feature, making the scoring mechanism more reasonable, and better matching the user's psychological expectation.
In a seventh aspect, an embodiment of the present application provides an electronic device, including: one or more processors, a memory, a display screen, a wireless communication module, and a mobile communication module. The memory, the display screen, the wireless communication module, and the mobile communication module are coupled to the one or more processors; the memory is configured to store computer program code including computer instructions; and when the one or more processors execute the computer instructions, the electronic device performs the picture management method provided in the first aspect or any implementation manner of the first aspect, the second aspect or any implementation manner of the second aspect, or the third aspect or any implementation manner of the third aspect.
In an eighth aspect, an embodiment of the present application provides an electronic device, including: one or more processors, memory; the above memory is coupled with one or more processors and is configured to store computer program code, where the computer program code includes computer instructions, and when the computer instructions are executed by the one or more processors, the electronic device performs the picture score calculation method as provided by any one of the fourth aspect or the fourth aspect, or any one of the fifth aspect or the fifth aspect, or any one of the sixth aspect or the sixth aspect.
In a ninth aspect, an embodiment of the present application provides a computer storage medium, which includes computer instructions, and when the computer instructions are executed on an electronic device, the electronic device is caused to execute the method for managing pictures provided in the first aspect or any one of the implementations of the first aspect, or any one of the second aspect, or the third aspect.
In a tenth aspect, an embodiment of the present application provides a computer storage medium, which includes computer instructions, and when the computer instructions are executed on an electronic device, the electronic device executes the picture score calculation method provided in any one of the fourth aspect or the fourth aspect, or any one of the fifth aspect or the fifth aspect, or any one of the sixth aspect or the sixth aspect.
In an eleventh aspect, an embodiment of the present application provides a computer program product, which when run on a computer, causes the computer to execute the picture management method provided in the first aspect or any one of the implementation manners of the first aspect, or any one of the implementation manners of the second aspect or the second aspect, or any one of the implementation manners of the third aspect or the third aspect.
In a twelfth aspect, an embodiment of the present application provides a computer program product, which when run on a computer, causes the computer to execute the picture score calculating method according to any one of the fourth aspect or the implementation manner of the fourth aspect, or according to any one of the fifth aspect or the fifth aspect, or according to any one of the sixth aspect or the sixth aspect.
It is to be understood that the electronic device of the seventh aspect, the computer storage medium of the ninth aspect, or the computer program product of the eleventh aspect is provided for executing the picture management method of any of the first, second, or third aspects. Therefore, the beneficial effects achieved by the method can refer to the beneficial effects in the corresponding method, and are not described herein again.
It is to be understood that the electronic device of the eighth aspect, the computer storage medium of the tenth aspect, or the computer program product of the twelfth aspect provided above are all configured to execute the picture score calculating method of any one of the fourth aspect, the fifth aspect, or the sixth aspect. Therefore, the beneficial effects achieved by the method can refer to the beneficial effects in the corresponding method, and are not described herein again.
Drawings
Fig. 1 is a schematic structural diagram of an electronic device according to an embodiment of the present disclosure;
fig. 2 is a block diagram of a software structure of an electronic device according to an embodiment of the present disclosure;
fig. 3 is a schematic flowchart of a picture management method according to an embodiment of the present application;
FIG. 4A is a first human-computer interaction interface diagram provided in an embodiment of the present application;
FIG. 4B is a second human-computer interaction interface diagram provided in an embodiment of the present application;
fig. 5A is a first schematic view of an interface of an electronic device according to an embodiment of the present disclosure;
fig. 5B is a second schematic view of an interface of an electronic device according to an embodiment of the present application;
fig. 6A is a first picture management scene diagram provided in the embodiment of the present application;
FIG. 6B is a third human-computer interaction interface diagram provided in an embodiment of the present application;
FIG. 7 is a fourth illustration of a human-computer interaction interface provided by an embodiment of the present application;
FIG. 8 is a first comparison diagram before and after optimization provided in an embodiment of the present application;
FIG. 9 is a second comparison diagram before and after optimization provided in an embodiment of the present application;
fig. 10 is a schematic diagram illustrating a score sorting of pictures in a display scenario according to an embodiment of the present application;
fig. 11A is a first diagram illustrating a picture display provided in an embodiment of the present application;
fig. 11B is a second diagram illustrating a picture display provided in the embodiment of the present application;
fig. 11C is a schematic diagram of a third picture display provided in the embodiment of the present application;
fig. 12 is a fourth schematic view of a picture display provided in the embodiment of the present application;
fig. 13 is a fifth diagram illustrating a picture display provided in an embodiment of the present application;
fig. 14 is a schematic diagram illustrating a score sorting of pictures in a deletion scene according to an embodiment of the present application;
fig. 15 is a schematic diagram of a picture deletion provided in an embodiment of the present application;
FIG. 16 is a first schematic diagram illustrating user feedback provided in an embodiment of the present application;
fig. 17 is a second schematic diagram illustrating user feedback provided in an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described below with reference to the accompanying drawings.
For ease of understanding, examples are given in part to illustrate concepts related to embodiments of the present application. As shown below:
A first picture: one or more pictures with the lowest scores after the pictures have been scored according to the scoring algorithm model. The scoring algorithm model receives the features of an input picture and outputs a score for that picture. The pictures are pictures stored in the memory of the electronic device (including the internal memory and an external memory card). The number of first pictures can be derived from a deletion condition, which determines how many pictures need to be deleted. In one possible embodiment, the deletion condition is that the remaining storage capacity of the electronic device should reach X1. If the current remaining storage capacity of the electronic device is X2, where X2 < X1, the storage capacity to be freed by deleting pictures is ΔX = X1 - X2. If the number of first pictures is M, the storage capacity occupied by the M lowest-scoring pictures is greater than or equal to ΔX, and the storage capacity occupied by the M-1 lowest-scoring pictures is less than ΔX.
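A worked example of this deletion condition, with illustrative numbers only: determine how many lowest-scoring first pictures must be deleted so that the remaining storage capacity reaches X1.

```python
X1 = 2_000          # required remaining storage capacity, in MB (assumed)
X2 = 1_750          # current remaining storage capacity, in MB (assumed)
delta_x = X1 - X2   # 250 MB must be freed

# pictures as (score, size_in_MB), considered from the lowest score upward
pictures = sorted([(0.2, 120), (0.1, 90), (0.5, 300), (0.3, 60)])

freed, m = 0, 0
for score, size in pictures:
    if freed >= delta_x:
        break
    freed += size
    m += 1

print(m, freed)  # M = 3: the 3 lowest-scoring pictures free 270 MB >= 250 MB,
                 # while the 2 lowest-scoring pictures free only 210 MB < 250 MB
```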
In one possible embodiment, the number of second pictures can be an absolute value or a relative value (a proportion), and can be recommended by the device manufacturer at the factory or set by the user.
A first feature: a feature that can directly characterize how much the user likes a picture and that has a relatively major influence on the picture's scoring result. The first feature may have at least two states, different states representing different degrees of the user's liking of the picture, and each state corresponding to a value.
A second feature: a feature that can indirectly characterize how much the user likes a picture and that has a relatively minor influence on the picture's scoring result. The second feature may have at least two states, different states representing different degrees of the user's liking of the picture, each state corresponding to a value, and a state with a higher value representing a higher degree of liking. Each picture can have one or more first features and one or more second features; when the states of the first features of two pictures are the same, the degrees to which the user likes the two pictures are distinguished by their second features.
A third feature: a feature that can directly characterize how much the user dislikes a picture and that has a relatively major influence on the picture's scoring result. The third feature may have at least two states, different states representing different degrees of the user's dislike of the picture, and each state corresponding to a value.
A fourth feature: a feature that can indirectly characterize how much the user dislikes a picture and that has a relatively minor influence on the picture's scoring result. The fourth feature may have at least two states, different states representing different degrees of the user's dislike of the picture, each state corresponding to a value, and a state with a lower value representing a higher degree of dislike. Each picture can have one or more third features and one or more fourth features; when the states of the third features of two pictures are the same, the degrees to which the user dislikes the two pictures are distinguished by their fourth features.
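An illustrative example of this tie-breaking: two pictures share the same first-feature state (both favorited), so their second features decide which one the user likely prefers. The feature names, values, and weighting below are assumptions.

```python
pic_a = {"favorited": 1, "views": 12, "aesthetic": 0.9}
pic_b = {"favorited": 1, "views": 3,  "aesthetic": 0.4}

def tie_break(p):
    # Hypothetical weighting of second features, used only when first-feature states match.
    return 0.5 * min(p["views"], 20) / 20 + 0.5 * p["aesthetic"]

preferred = max((pic_a, pic_b), key=tie_break)   # pic_a wins on its second features
```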
Forward feedback behavior: an operation performed by the user on a managed picture that may indicate that the score calculated by the scoring algorithm model is lower than the user's psychological expectation.
Reverse feedback behavior: an operation performed by the user on a managed picture that may indicate that the score calculated by the scoring algorithm model is higher than the user's psychological expectation.
The file management method provided by the embodiment of the application can be applied to the management of pictures, audio files, video files, documents, application programs and the like by electronic equipment, and the management of the pictures is taken as an example in the following embodiments.
The electronic device related in the embodiments of the present application may be a mobile phone, a tablet computer, a desktop computer, a laptop computer, a notebook computer, an ultra-mobile personal computer (UMPC), a handheld computer, a netbook, a personal digital assistant (PDA), a wearable electronic device, a virtual reality device, and the like.
Referring to fig. 1, a schematic structural diagram of an electronic device 10 is shown.
The electronic device 10 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charge management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, a key 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a subscriber identification module (SIM) card interface 195, and the like. The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, a barometric pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It is to be understood that the illustrated structure of the embodiment of the present application does not specifically limit the electronic device 10. In other embodiments of the present application, the electronic device 10 may include more or fewer components than shown, or some components may be combined, some components may be split, or a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 110 may include one or more processing units, such as: the processor 110 may include an Application Processor (AP), a modem processor, a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a controller, a memory, a video codec, a Digital Signal Processor (DSP), a baseband processor, and/or a neural-Network Processing Unit (NPU), etc. The different processing units may be separate devices or may be integrated into one or more processors.
The controller may be, among other things, a neural center and a command center of the electronic device 10. The controller can generate an operation control signal according to the instruction operation code and the timing signal to complete the control of instruction fetching and instruction execution.
A memory may also be provided in processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that have just been used or recycled by the processor 110.
In some embodiments, processor 110 may include one or more interfaces. The interface may include an integrated circuit (I2C) interface, an integrated circuit built-in audio (I2S) interface, a Pulse Code Modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a Mobile Industry Processor Interface (MIPI), a general-purpose input/output (GPIO) interface, a Subscriber Identity Module (SIM) interface, and/or a Universal Serial Bus (USB) interface, etc.
It should be understood that the interfacing relationship between the modules illustrated in the embodiments of the present application is only an exemplary illustration, and does not limit the structure of the electronic device 10. In other embodiments of the present application, the electronic device 10 may also adopt different interface connection manners or a combination of multiple interface connection manners in the above embodiments.
The charging management module 140 is configured to receive charging input from a charger. The charging management module 140 may also supply power to the terminal through the power management module 141 while charging the battery 142.
The power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140 and provides power to the processor 110, the internal memory 121, the external memory, the display 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may also be used to monitor parameters such as battery capacity, battery cycle count, battery state of health (leakage, impedance), etc. In some other embodiments, the power management module 141 may also be disposed in the processor 110. In other embodiments, the power management module 141 and the charging management module 140 may be disposed in the same device.
The wireless communication function of the electronic device 10 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. For example, in the embodiment of the present application, the antenna 1 and the antenna 2 may be configured to send data to a cloud server, so as to backup a picture stored in a memory of the electronic device 10 to a cloud. The antenna 1 and the antenna 2 may also be configured to send a download request to the cloud server, where the download request is used to obtain a picture backed up at the cloud end. The antenna 1 and the antenna 2 may also be used to receive data transmitted by the cloud server in response to a download request transmitted by the electronic device 10.
The mobile communication module 150 may provide a solution for wireless communication including 2G/3G/4G/5G applied to the electronic device 10. The mobile communication module 150 may include at least one filter, a switch, a power amplifier, a low noise amplifier (LNA), and the like. The mobile communication module 150 may receive an electromagnetic wave from the antenna 1, filter and amplify the received electromagnetic wave, and transmit the processed electromagnetic wave to the modem processor for demodulation.
The modem processor may include a modulator and a demodulator. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 150 or other functional modules, independent of the processor 110.
The wireless communication module 160 may provide wireless communication solutions applied to the electronic device 10, including wireless local area network (WLAN) (e.g., a wireless fidelity (Wi-Fi) network), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR), and the like. The wireless communication module 160 may be one or more devices integrating at least one communication processing module. The wireless communication module 160 may receive electromagnetic waves via the antenna 2, perform frequency modulation and filtering on the electromagnetic wave signals, and send the processed signals to the processor 110. The wireless communication module 160 may also receive signals to be transmitted from the processor 110, perform frequency modulation and amplification on them, and convert them into electromagnetic waves radiated via the antenna 2.
In some embodiments, the antenna 1 of the electronic device 10 is coupled to the mobile communication module 150 and the antenna 2 is coupled to the wireless communication module 160, so that the electronic device 10 can communicate with a network and other devices via wireless communication technologies. The wireless communication technologies may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), GNSS, WLAN, NFC, FM, and/or IR technologies. The GNSS may include a global positioning system (GPS), a global navigation satellite system (GLONASS), a BeiDou navigation satellite system (BDS), a quasi-zenith satellite system (QZSS), and/or a satellite based augmentation system (SBAS).
The electronic device 10 implements display functions via the GPU, the display screen 194, and the application processor. The GPU is a microprocessor for image processing, and is connected to the display screen 194 and an application processor. The processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
The display panel may be a liquid crystal display (LCD), an organic light-emitting diode (OLED) display, an active-matrix organic light-emitting diode (AMOLED) display, a flexible light-emitting diode (FLED) display, a Mini-LED display, a Micro-LED display, a Micro-OLED display, a quantum dot light-emitting diode (QLED) display, or the like.
The electronic device 10 may implement a shooting function through the ISP, the camera 193, the video codec, the GPU, the display 194, the application processor, and the like.
The ISP is used to process the data fed back by the camera 193. For example, when a photo is taken, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electrical signal, and the camera photosensitive element transmits the electrical signal to the ISP for processing and converting into an image visible to naked eyes. The ISP can also carry out algorithm optimization on the noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in camera 193.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image to the photosensitive element. The photosensitive element may be a Charge Coupled Device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The light sensing element converts the optical signal into an electrical signal, which is then passed to the ISP where it is converted into a digital image signal. And the ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into image signal in standard RGB, YUV and other formats. In some embodiments, the electronic device 10 may include 1 or N cameras 193, N being a positive integer greater than 1.
The digital signal processor is used for processing digital signals, and can process digital image signals and other digital signals. For example, when the electronic device 10 is in frequency bin selection, the digital signal processor is used to perform fourier transform or the like on the frequency bin energy.
Video codecs are used to compress or decompress digital video. The electronic device 10 may support one or more video codecs. In this way, the electronic device 10 may play or record video in a variety of encoding formats, such as: moving Picture Experts Group (MPEG) 1, MPEG2, MPEG3, MPEG4, and the like.
The NPU is a neural-network (NN) computing processor that processes input information quickly by using a biological neural network structure, for example, by using a transfer mode between neurons of a human brain, and can also learn by itself continuously. Applications such as intelligent recognition of the electronic device 10 can be realized by the NPU, for example: image recognition, face recognition, speech recognition, text understanding, and the like.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to extend the storage capability of the electronic device 10. The external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function. For example, in the embodiment of the present application, pictures may be saved in the external memory card, and the processor 110 of the electronic device 10 may obtain the pictures saved in the external memory card through the external memory interface 120.
The internal memory 121 may be used to store computer-executable program code, which includes instructions. The processor 110 executes various functional applications of the electronic device 10 and data processing by executing instructions stored in the internal memory 121. The internal memory 121 may include a program storage area and a data storage area. The storage program area may store an operating system, an application program (such as a sound playing function, an image playing function, etc.) required by at least one function, and the like. The data storage area may store data created during use of the electronic device 10 (e.g., audio data, phone books, pictures, etc.), and the like. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (UFS), and the like. For example, in the embodiment of the present application, the internal memory 121 may be configured to store multiple pictures, where the multiple pictures are obtained by the electronic device 10 through shooting by the camera 193, or obtained by the electronic device 10 through receiving and downloading from other applications (e.g., wechat, microblog, Facebook, etc.) by the antenna 1 and the antenna 2.
The electronic device 10 may implement audio functions via the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the headphone interface 170D, and the application processor. Such as music playing, recording, etc.
The pressure sensor 180A is used for sensing a pressure signal, and converting the pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194.
The proximity light sensor 180G may include, for example, a light emitting diode (LED) and a light detector such as a photodiode. The electronic device 10 may use the proximity light sensor 180G to detect that the user is holding the electronic device 10 close to the ear, and automatically turn off the screen to save power. The proximity light sensor 180G may also be used in a holster mode or a pocket mode to automatically unlock and lock the screen. The ambient light sensor 180L may be used to sense ambient light, and may also be used to automatically adjust the white balance when taking a picture.
The fingerprint sensor 180H is used to collect a fingerprint. The electronic device 10 may utilize the collected fingerprint characteristics to implement fingerprint unlocking, access an application lock, fingerprint photographing, fingerprint incoming call answering, and the like.
The temperature sensor 180J is used to detect temperature.
The touch sensor 180K is also referred to as a "touch panel". The touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 together form a touchscreen. The touch sensor 180K is used to detect a touch operation applied on or near it. The touch sensor can pass the detected touch operation to the application processor to determine the type of touch event. Visual output associated with the touch operation may be provided through the display screen 194. In other embodiments, the touch sensor 180K may be disposed on a surface of the electronic device 10 at a position different from that of the display screen 194. For example, in the embodiment of the present application, the touch sensor 180K may be configured to detect a touch operation performed by the user on a first picture contained in an album and pass the detected touch operation to the application processor, so that a second picture corresponding to the first picture is displayed. The size of the first picture is smaller than that of the second picture, and the number of pixels contained in the first picture is smaller than that contained in the second picture.
The bone conduction sensor 180M may acquire a vibration signal. The bone conduction sensor 180M may also contact the human pulse to receive the blood pressure pulsation signal.
The keys 190 include a power-on key, a volume key, and the like. The keys 190 may be mechanical keys or touch keys. The electronic device 10 may receive key inputs and generate key signal inputs related to user settings and function control of the electronic device 10.
The motor 191 may generate a vibration cue. The motor 191 may be used for incoming call vibration cues, as well as for touch vibration feedback.
Indicator 192 may be an indicator light that may be used to indicate a state of charge, a change in charge, or a message, missed call, notification, etc.
The SIM card interface 195 is used to connect a SIM card. The SIM card may be attached to and detached from the electronic device 10 by being inserted into the SIM card interface 195 or being pulled out of the SIM card interface 195. The electronic device 10 may support 1 or N SIM card interfaces, N being a positive integer greater than 1. The electronic device 10 interacts with the network through the SIM card to implement functions such as communication and data communication.
The software system of the electronic device 10 may employ a hierarchical architecture, an event-driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture. In the embodiment of the present application, a software structure of the electronic device 10 is exemplarily described by taking an Android system with a layered architecture as an example.
Fig. 2 is a block diagram of a software configuration of the electronic device 10 according to the embodiment of the present application.
The layered architecture divides the software into a plurality of layers, and the layers communicate with each other through software interfaces. In some embodiments, the Android system is divided into four layers, an application layer, an application framework layer, an Android runtime (Android runtime) and system library, and a kernel layer from top to bottom.
The application layer may include a series of application packages.
As shown in fig. 2, the application package may include applications such as short messages, Facebook, QQ, maps, albums, calendars, WLAN, Twitter, music players, Amazon, and the like.
The application framework layer provides an Application Programming Interface (API) and a programming framework for the application program of the application layer. The application framework layer includes a number of predefined functions.
As shown in FIG. 2, the application framework layers may include a window manager, content provider, view system, phone manager, resource manager, notification manager, and the like.
The window manager is used for managing window programs. The window manager can obtain the size of the display screen, judge whether a status bar exists, lock the screen, intercept the screen and the like.
The content provider is used to store and retrieve data and make it accessible to applications. The data may include video, images, audio, calls made and received, browsing history and bookmarks, phone books, etc.
The view system includes visual controls such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views.
The phone manager is used to provide communication functions for the electronic device 10. Such as management of call status (including on, off, etc.).
The resource manager provides various resources for the application, such as localized strings, icons, pictures, layout files, video files, and the like.
The notification manager enables an application to display notification information in the status bar. It can be used to convey notification-type messages, which can disappear automatically after a short stay without requiring user interaction.
The Android Runtime comprises a core library and a virtual machine. The Android runtime is responsible for scheduling and managing an Android system.
The core library comprises two parts: one part is the functions that the java language needs to call, and the other part is the core library of Android.
The application layer and the application framework layer run in the virtual machine. The virtual machine executes the java files of the application layer and the application framework layer as binary files. The virtual machine is used to perform functions such as object lifecycle management, stack management, thread management, security and exception management, and garbage collection.
The system library may include a plurality of functional modules, such as a surface manager (surface manager), media libraries (Media Libraries), a three-dimensional graphics processing library (e.g., OpenGL ES), a 2D graphics engine (e.g., SGL), and the like.
The surface manager is used to manage the display subsystem and provide fusion of 2D and 3D layers for multiple applications.
The media library supports a variety of commonly used audio, video format playback and recording, and still image files, among others. The media library may support a variety of audio-video encoding formats, such as: MPEG4, H.264, MP3, AAC, AMR, JPG, PNG, etc.
The three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, synthesis, layer processing and the like.
The 2D graphics engine is a drawing engine for 2D drawing.
The kernel layer is a layer between hardware and software. The inner core layer at least comprises a display driver, a camera driver, an audio driver and a sensor driver.
For example, the technical solutions involved in the following embodiments may be implemented in the electronic device 10 having the above hardware architecture and software architecture. The following describes a file management method provided by an embodiment of the present application in detail with reference to the accompanying drawings and application scenarios. The process of managing pictures will be described in connection with fig. 3-8. The picture management in the embodiment of the application may include displaying and deleting of pictures. In the following embodiments, the electronic device 10 is a mobile phone as an example.
First, please refer to fig. 3. Fig. 3 is a schematic diagram of a picture management method according to an embodiment of the present application. As shown in fig. 3, the picture management method may include at least the following steps:
s301: the touch sensor 180K of the electronic device 10 detects a first operation by the user.
Optionally, the first operation is an operation performed by the user on a shooting button of the camera application displayed on the display screen 194. As shown in fig. 4A, an interface 20 is displayed on the display screen 194. The interface 20 includes a status bar 204, a navigation bar 205, a time component icon and a weather component icon, and icons of a plurality of applications such as a camera icon 201, a WeChat icon 202, a settings icon 203, a photo album icon, a microblog icon, an Alipay icon, and the like. The status bar 204 may include the name of an operator (for example, China Mobile), the time, a WiFi icon, the signal strength, and the current remaining power. The navigation bar 205 may include a return control, a home screen control, a control for displaying a task window, and the like. When the electronic device 10 detects a click operation by the user on the camera icon 201 in the interface 20, the display screen 194 displays a shooting interface a1. As shown in fig. 4B, the shooting interface a1 may include at least a viewfinder frame b1 that displays the picture to be shot and a shooting button c1. The first operation may be a single click, or may be a double click, a long press, or the like. In addition, the shooting interface a1 may further include a control d1 for opening the album and a control e1 for switching cameras.
S302: in response to the first operation, the camera 193 of the electronic device 10 obtains a picture and stores the picture in the internal memory 121.
Specifically, after the shooting interface a1 or the shooting interface a2 is entered, the camera 193 is turned on to obtain the picture to be shot in real time. After the touch sensor 180K receives a first operation of the user, in response to the first operation, the camera 193 captures the image in the viewfinder frame, converts it into a picture, and stores the picture in the internal memory 121.
In addition, the picture may also be obtained by the electronic device 10 through the antenna 1 or the antenna 2 by downloading from an application server of another application, for example, the electronic device 10 may be obtained by downloading from an application server of a microblog through the antenna 1, or may also be obtained by downloading from an application server of a google browser through the antenna 2. The electronic device 10 may save pictures downloaded from an application server of another application into the internal memory 121.
Further, the picture may also be a picture stored in an external memory card connected to the electronic device 10, and the electronic device 10 may read the picture stored in the external memory card through the external memory interface 120.
S303: the touch sensor 180K of the electronic device 10 detects the second operation by the user.
Specifically, the second operation is an operation of the user on a setting item in the system setting application or the album application of the electronic device 10. The setting item is used for setting the on and off of the optimization mode. After the optimization mode is started, the processor 110 of the electronic device 10 may determine whether an optimization condition is satisfied, and if the optimization condition is satisfied, obtain the pictures stored in the internal memory 121 and the pictures of the cloud album corresponding to the electronic device 10, calculate the score of each picture, and optimally display or propose deletion of the picture according to the score.
Next, the system setting application is used as an example. When the electronic device 10 detects an operation performed by the user on the settings icon 203, the display screen 194 of the electronic device 10 may display the system setting interface 40. As shown in fig. 5A, the system setting interface 40 may include setting entries of a plurality of applications and components. When the electronic device 10 detects an operation performed by the user on the setting entry of an application, the display screen 194 of the electronic device 10 may display the setting interface of that application, and the setting interface may include various setting items related to the application. The setting entries of the applications and components shown in fig. 5A include: an album setting entry 401, a microblog setting entry, an Alipay setting entry, a WeChat setting entry, a camera setting entry, a telephone setting entry, a short message setting entry, an address book setting entry, a weather setting entry, and the like. The setting interface 40 may also include setting entries of other applications or components, and the display screen 194 may display setting entries of more applications or components when the electronic device 10 detects a sliding operation by the user. For example, when the electronic device 10 detects an operation performed by the user on the setting entry 401 of the album, the display screen 194 of the electronic device 10 may display the setting interface 50 of the album, as shown in fig. 5B. The setting interface 50 may include a cloud account setting item 501, a cloud album setting item 502, a setting item 503 for other albums that need to be synchronized, an optimization mode setting item 504, a shooting time setting item 505, and a shooting location setting item 506. The cloud account setting item 501 is used to set a cloud account; the electronic device 10 may upload pictures stored in the internal memory 121 to the album corresponding to the cloud account for backup, and may also download pictures from the album corresponding to the cloud account (referred to as a cloud album) and store them in the internal memory 121. The cloud album setting item 502 may be used to turn the cloud album on or off. When the cloud album function is turned on, the electronic device 10 may perform data interaction with the cloud album, including: uploading the pictures stored in the internal memory 121 to the cloud album for backup, and downloading pictures from the cloud album to be saved in the internal memory 121. When the cloud album is turned off, the electronic device 10 cannot perform data interaction with the cloud album. The setting item 503 for other albums that need to be synchronized is used to set which albums need to be uploaded to the cloud album. The optimization mode setting item 504 is used to turn the optimization mode on or off; when the optimization mode is turned on, the processor 110 of the electronic device 10 executes the picture displaying and deleting methods of the embodiments of the present application (refer to the detailed description below). Specifically, the optimization mode can be turned on or off by sliding the slide button in the control 5041 included in the optimization mode setting item 504.
When the slide button is slid from left to right, the optimization mode is switched from off to on; when the slide button is slid from right to left, the optimization mode is switched from on to off. Turning the optimization mode on and off is not limited to the slide button, and other forms may also be used, which is not limited in this embodiment of the present application. The shooting time setting item 505 is used to display the shooting time on a picture when the picture in the album is displayed on the display screen 194. The shooting location setting item 506 is used to display the shooting location on a picture when the picture in the album is displayed on the display screen 194. The setting items included in the setting interface 50 shown in fig. 5B are merely exemplary; other setting items may be included in a specific implementation, which is not limited in this embodiment of the present application.
S304: in response to the second operation described above, the processor 110 of the electronic device 10 turns on the optimization mode.
Specifically, after the optimization mode is started, the processor 110 of the electronic device 10 starts to determine whether the optimization condition is satisfied, and if the optimization condition is satisfied, the processor acquires the picture stored in the internal memory 121, acquires the picture stored in the external memory card and the feature of the picture stored in the cloud album corresponding to the electronic device 10 through the external memory interface 120, calculates the score of each picture, and displays the picture in an optimized manner or proposes to delete the picture according to the score.
S305: the processor 110 of the electronic device 10 determines whether the optimization condition is satisfied, and if so, executes step S306, and if not, continues to execute step S305.
In a specific embodiment, the optimization condition is satisfied when the remaining storage space in the internal memory 121 of the electronic device 10 is insufficient. For example, the optimization condition is satisfied when the remaining storage space in the internal memory 121 of the electronic device 10 is less than 200 MB. For another example, the optimization condition is satisfied when the remaining storage space in the internal memory 121 of the electronic device 10 is less than 10% of the total storage space. The threshold on the remaining storage space (200 MB) and the ratio (10%) of the remaining storage space to the total storage space are merely exemplary; other values may be used in a specific implementation, which is not limited in this embodiment of the application.
In another specific embodiment, when the processor 110 of the electronic device 10 detects that the current time is 22:00, it determines whether the optimization condition is satisfied; if not, the check is delayed by 1 hour to 23:00 and the determination is made again, repeating until the optimization condition is satisfied, after which step S306 is executed. The optimization condition may include one or any combination of the following: the remaining power in the battery 142 of the electronic device 10 is greater than 20% of the total capacity, the charging management module 140 of the electronic device 10 is receiving a charging input from the charger, the display screen 194 of the electronic device 10 is in an off state, and the electronic device 10 has access to a WiFi network through the wireless communication module 160. The starting time (22:00) for determining whether the optimization condition is satisfied, the ratio (20%) of the remaining power to the total capacity, and the delay time (1 hour) are exemplary; other values may be used in a practical implementation, which is not limited in this embodiment of the present application.
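For illustration only, the following Python sketch shows one possible way to express the optimization-condition checks described above (the 200 MB / 10% storage thresholds and the battery, charging, screen, and WiFi states). The names used here (for example, DeviceState) are hypothetical and do not correspond to any interface of the disclosed implementation.

    # Illustrative sketch only; DeviceState and its fields are hypothetical names.
    from dataclasses import dataclass

    @dataclass
    class DeviceState:
        free_storage_bytes: int
        total_storage_bytes: int
        battery_percent: float
        is_charging: bool
        screen_off: bool
        on_wifi: bool

    def storage_low(state: DeviceState) -> bool:
        # Absolute threshold (e.g. 200 MB) or relative threshold (e.g. 10% of total).
        return (state.free_storage_bytes < 200 * 1024 * 1024
                or state.free_storage_bytes < 0.10 * state.total_storage_bytes)

    def optimization_condition_met(state: DeviceState) -> bool:
        # The embodiment allows any one or any combination of these checks;
        # this sketch simply requires all of them.
        return (state.battery_percent > 20
                and state.is_charging
                and state.screen_off
                and state.on_wifi)

Under this sketch, the nightly check at 22:00 would call optimization_condition_met (or storage_low) and, if it returns False, retry an hour later.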
Illustratively, as shown in fig. 6A, when the optimization condition is satisfied, optimization is started (i.e., subsequent steps S306-S308). In addition, after it is determined that the optimization condition is satisfied, a prompt box 701 may be displayed to prompt the user to "start optimization", and a cancel optimization control 702 may be provided for the user to input an instruction to cancel optimization within a period of time, as shown in fig. 6B. The period of time may be displayed in the cancellation optimization control 702 in the form of a countdown, and when the user's cancellation optimization instruction is not received after the countdown is finished, the optimization is started. The period of time may be, for example, 10s, 15s, 30s, etc.
In addition, after the optimization mode is turned on by the processor 110 of the electronic device 10, it is not necessary to perform step S305 to determine whether the optimization condition is satisfied, and actually, steps S306 to S308 may be performed once on the picture stored in the internal memory 121, the picture stored in the external memory card, and the picture stored in the cloud album corresponding to the electronic device 10 directly after the optimization mode is turned on.
S306: the processor 110 of the electronic device 10 obtains the features of the picture stored in the internal memory 121, the picture stored in the external memory card, and the picture stored in the cloud album corresponding to the electronic device 10, inputs the features into the scoring algorithm model, and calculates the score of the picture.
Specifically, each picture may contain many features, and the features contained in the pictures may be classified into the following four categories: the feature of the use habit record, the user portrait feature, the feature of the picture and the storage condition of the electronic equipment.
Features recorded from usage habits may include, but are not limited to: the number of clicks, the number of times browsed, the number of times zoomed, whether the picture has been edited, whether it has been shared, whether it is associated with wallpaper (for example, associated with a phone contact, or set as the home screen wallpaper of the electronic device 10), whether it has been searched for or the number of times it has been searched for, whether it has been placed in a trash box (manually moved by the user into a "trash box" folder while still being stored in the internal memory 121), whether it has been collected, whether it has been remarked (for example, the user may choose to add a remark in a menu option of a picture to record the mood when the picture was taken or the content of the picture), whether it has been marked (for example, if a picture includes a plurality of fruits, a mark may be added to each fruit in the picture to indicate the name of the fruit), whether it has been downloaded (saved in the cloud album and downloaded into the internal memory 121 of the electronic device 10), the last browsing time, and the like. The features such as the number of clicks, the number of times browsed, the number of times zoomed, whether the picture has been edited, shared, searched for (or the number of searches), placed in a trash box, collected, remarked, marked, or downloaded, and the last browsing time can all be obtained by recording the user's operations over a period of time. For example, if the touch sensor 180K detects that the user's operation is a single click at coordinate position (x, y), and the processor 110 determines that the interface currently displayed on the display screen 194 is the interface 60 shown in fig. 7, the processor 110 may determine that the event corresponding to the single-click operation at coordinate (x, y) on the currently displayed interface is browsing P1, and the processor 110 causes the display screen 194 to display the display interface 70 of P1 so that the user can browse P1. The processor 110 may save the "browse" event for P1 to the internal memory 121, and may also save the browsing time to the internal memory 121. When the processor 110 detects a "browse" event for P1 again, the "browse" event and the browsing time can again be saved in the internal memory 121. When the processor 110 determines that the optimization condition is satisfied, the processor 110 may acquire the number of "browse" events for P1 from the internal memory 121 and acquire the last recorded browsing time (the last browsing time). In another possible embodiment, only the number of times P1 has been browsed and the most recent browsing time may be recorded in the internal memory. For example, after the "browse" event for P1 is detected for the m-th time, only the browsing count m and the browsing time are saved in the internal memory 121; the next time a "browse" event for P1 is detected, the browsing count in the internal memory 121 only needs to be updated from m to m+1 and the browsing time updated. When the processor 110 determines that the optimization condition is satisfied, the processor 110 may obtain the browsing count and the browsing time (the last browsing time) of P1 from the internal memory 121. It should be noted that the occurrence of the "browse" event illustrated in fig. 7 is only an example; in practice, a "browse" event may also be triggered by a left-sliding or right-sliding operation of the user on the display interface of a picture, or by entering the album from a third-party application (e.g., WeChat, QQ, a microblog) and selecting a picture to share.
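As a minimal sketch of the per-picture usage-habit record described above (browsing count plus last browsing time), the following Python fragment is illustrative only; the class and function names are hypothetical.

    # Minimal illustrative sketch; UsageRecord and record_browse are hypothetical names.
    import time

    class UsageRecord:
        def __init__(self):
            self.browse_count = 0         # number of "browse" events detected so far
            self.last_browse_time = None  # timestamp of the most recent "browse" event

        def on_browse(self):
            # Only the counter and the latest timestamp are kept, as in the
            # second embodiment described above (m is updated to m + 1).
            self.browse_count += 1
            self.last_browse_time = time.time()

    usage_records = {}  # picture identifier -> UsageRecord

    def record_browse(picture_id: str):
        usage_records.setdefault(picture_id, UsageRecord()).on_browse()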
User profile features may include, but are not limited to, user classification and user preference. For example, the user classification may be the gender, age, and the like of the user. The user preference may be whether the user is a food photography enthusiast or a landscape photography enthusiast, or the like. In one possible embodiment, for the user classification, the electronic device 10 may acquire the gender, date of birth, and the like filled in by the user from the personal information of a system account (e.g., the Huawei account center of a Huawei device, the Apple account center (Apple ID) of an Apple device, and the like). In another possible embodiment, for the user classification, the electronic device 10 may call a data access interface provided with access rights by a third-party application (e.g., QQ, WeChat, YouTube, etc.) and obtain the gender, date of birth, and the like of the user from a server of the third-party application. The processor 110 of the electronic device 10 may calculate the age of the user from the obtained date of birth, thereby determining the age bracket of the user. The above manners of obtaining the user classification are merely exemplary; other manners may be used in a specific implementation, which is not limited in this embodiment of the present application. In one possible embodiment, the user preference may be obtained by analyzing a plurality of pictures stored in the internal memory 121 and in the external memory card accessed through the external memory interface 120. For example, if 70 of 100 pictures are of food, the electronic device 10 considers the user to be a food photography enthusiast. In another possible embodiment, the electronic device 10 may call a data access interface provided with access rights by a third-party application (e.g., the Google browser, Google, Baidu forums, etc.), obtain the browsing records of the user from a server of the third-party application, and determine that the content frequently browsed by the user is related to food photography, in which case the electronic device 10 considers the user to be a food photography enthusiast. The above manners of obtaining the user classification and the user preference are merely exemplary; other manners may be used in a specific implementation, which is not limited in this embodiment of the present application.
The features of the picture itself may include, but are not limited to: the shooting time, the name of the picture, the shooting mode, the format of the picture, the geographical location, the size of the picture, the resolution, the type of device, the storage location, the shooting technique (for example, whether the picture was taken looking down or looking up), the color, the composition, the aesthetic score, the classification of the picture content, the similarity to other pictures, the degree of blur, and the like. The storage location is the location where the picture is stored when the processor 110 obtains it. The shooting time, the name of the picture, the shooting mode, the format of the picture, the geographical location, the size of the picture, the resolution, the type of device, and the like are parameters of the picture itself, which may be stored in the internal memory 121 or the external memory card together with the picture, and the processor 110 may acquire these features from the internal memory 121 or the external memory card when acquiring the picture. The shooting technique, color, composition, aesthetic score, classification of the picture content, similarity to other pictures, degree of blur, and the like can be obtained by the processor 110 of the electronic device 10 by analyzing the content of the picture or the parameters used during shooting.
Electronic device storage cases may include, but are not limited to: the total storage capacity of the electronic equipment, the residual available storage capacity and the occupied storage capacity of the pictures. The above-mentioned electronic device storage status can be obtained by the processor 110 querying the state of the internal memory 121.
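For illustration, the four feature categories described above could be grouped in a structure such as the following Python sketch; the field names are hypothetical and only indicate where each category of feature would be stored.

    # Illustrative grouping of the four feature categories; all names are hypothetical.
    from dataclasses import dataclass, field

    @dataclass
    class PictureFeatures:
        usage_habits: dict = field(default_factory=dict)    # clicks, browses, shared, collected, ...
        user_profile: dict = field(default_factory=dict)    # user classification, user preference
        picture_itself: dict = field(default_factory=dict)  # shooting time, size, aesthetic score, ...
        device_storage: dict = field(default_factory=dict)  # total, remaining, space occupied by pictures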
Specifically, the above process of calculating the picture score may be performed at most once a day, and the condition for specifically triggering the score calculation may refer to the description in step S305, and may be set to determine whether the optimization condition is satisfied at 22:00 pm, and if not, delay 1 hour to 23:00 to determine whether the optimization condition is satisfied again until the optimization condition is satisfied, and trigger the score calculation. After the score is calculated, the score of each picture may be stored in the internal memory 121, and each score is associated with the corresponding picture, specifically, the picture and the score corresponding to the picture may be associated by the identifier of the picture. In the case where the score of each picture is already stored in the internal memory 121, the score of each picture may be updated after the score calculation is completed. The picture identifier may be automatically generated when the electronic device 10 captures the picture through the camera 193. The identification of the picture may also be carried when the picture is downloaded from a server of another application, or when the processor 110 obtains the picture from an external memory card through the external memory interface 120. There is a unique identifier for each picture that allows the electronic device 10 to identify the picture. The frequency of calculating the score (at most once per day) is merely an example, and the frequency of calculating the score may be higher or lower in a specific implementation, and the embodiment of the present application may not be limited.
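The association of each score with the unique picture identifier, and the daily update of stored scores, might look like the following sketch; this is an assumption made for illustration and not the disclosed storage format.

    # Illustrative sketch; the score table and score_fn are hypothetical.
    scores = {}  # picture identifier -> most recently calculated score

    def update_scores(pictures: dict, score_fn) -> None:
        # pictures maps each picture identifier to its extracted features.
        # Called at most once per day, when the optimization condition is met;
        # an existing entry is simply overwritten with the new score.
        for picture_id, features in pictures.items():
            scores[picture_id] = score_fn(features)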
S307: the processor 110 of the electronic device 10 presents the picture according to the score.
Specifically, the second picture is preferentially displayed or enlarged according to the score of each picture. As shown in fig. 8, the interface 80 is a display interface of an album, and the interface may include three menu controls (photos, albums, and findings), where the pictures in the three menus are displayed in different manners. The currently selected menu category shown in fig. 8 is "photo". When the electronic device 10 detects a user operation on the "photo" menu control, the interface 80 may present multiple pictures, and the electronic device 10 may receive a sliding operation of the user in the interface 80 to browse more pictures, and the "photo" menu control may be referred to as a first menu control. When the electronic device 10 detects a user operation with respect to the "album" menu control, the interface 80 may present one or more folders (collections of files), each of which may contain multiple pictures having a common characteristic, and the "album" menu control may be referred to as a second menu control. For example, a picture in the same shooting mode (e.g., panorama mode, HDR mode, etc.) may be assigned to folder one, a picture from the same source (e.g., microblog, wechat, QQ, Facebook, etc.) may be assigned to folder two, and a user may define folder three and assign a plurality of pictures to folder three. When the electronic device 10 detects a user operation with respect to the "find" menu control, the interface 80 may present multiple folders in different categories, each category containing one or more folders and each folder containing one or more pictures, and the "find" menu control may be referred to as a third menu control. For example, multiple folders may be exposed by category by location and time. Under the place classification, the pictures can be classified into different folders according to the shooting places (such as Beijing, Shanghai, New York, Tokyo, etc.). In the time classification, pictures can be classified into different folders according to shooting time (for example, 2018, 2017, 2016, and the like).
The photos in the left diagram of fig. 8 may be arranged in order of shooting time, and this order does not take into account the score calculated in the embodiment of the present application. As shown in the left diagram of fig. 8, the low-scoring picture P1 is at the forefront of all pictures, and the high-scoring picture P16 is towards the back. After optimization, as shown in the right diagram of fig. 8, the high-scoring P16 has been moved to the front of all pictures for priority presentation, and the low-scoring P1 has been moved backwards.
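A minimal sketch of the reordering described above, assuming the scores are already available per picture identifier, is shown below; it is illustrative only.

    # Illustrative sketch: order picture identifiers by descending score for display.
    def display_order(scores: dict) -> list:
        # Higher-scoring pictures (e.g. P16) move to the front for priority
        # presentation; lower-scoring pictures (e.g. P1) move towards the back.
        return sorted(scores, key=scores.get, reverse=True)

    # Example: display_order({"P1": 3.0, "P16": 80.0}) returns ["P16", "P1"].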
S308: the processor 110 of the electronic device 10 deletes the picture according to the score.
Optionally, a first picture set is displayed according to the score of each picture, so that the user can delete a plurality of first pictures with one key. As shown in fig. 9, after optimization, the low-scoring pictures P1, P9, P12, and P20 are all placed into a first folder. By clicking on the first folder, the user can view the first pictures contained in it, and the electronic device 10 can delete all the first pictures in the first folder when the user operates the control 901.
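The grouping of low-scoring first pictures into a folder for one-key deletion could be sketched as follows; the score threshold used here is an assumption for illustration only.

    # Illustrative sketch; the threshold is an assumed parameter, not a disclosed value.
    def suggest_deletion(scores: dict, threshold: float) -> list:
        # Collect identifiers of low-scoring (first) pictures into the first folder.
        return [pid for pid, score in scores.items() if score < threshold]

    def delete_all(first_folder: list, stored_pictures: dict) -> None:
        # One-key deletion of every picture in the first folder.
        for pid in first_folder:
            stored_pictures.pop(pid, None)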
Optionally, the first picture may be displayed differently according to the score of each picture, for example, the chroma value of the first picture may be decreased, or an outer frame may be added, or a specific mark may be marked to distinguish the first picture from other pictures.
Specifically, the process of deleting pictures according to the score may be performed at most once per week. For example, it may be determined at 22:00 every day whether the optimization condition described in step S305 is satisfied; if not, the check is delayed by 1 hour to 23:00 and the determination is made again, repeating until the optimization condition is satisfied, at which point the deletion process (performed at most once per week) is triggered. This ensures that the user's normal use of the device is not affected during picture deletion, thereby improving deletion efficiency and user experience.
The above deletion frequency (at most once per week) is merely an exemplary illustration, and in a specific implementation, the deletion frequency may be higher or lower, which is not limited in the embodiments of the present application.
Further, in addition to the above-described automatic deletion process in a case where the optimization condition is satisfied, it is also possible to actually delete the picture therein in accordance with the user's operation on the first folder. The frequency of deleting pictures at this time is not limited to at most once per week as described above.
In addition, S307 and S308 may be two independent steps, and the sequence of the two steps is not limited in the embodiment of the present invention.
S309: the touch sensor 180K of the electronic device 10 detects the third operation by the user.
Specifically, the third operation may be an operation performed by the user on an optimized picture and detected by the touch sensor 180K. The operation may be, for example, cancelling the enlarged display of a picture that is displayed enlarged; it may also be, for example, moving a first picture out of the "suggested to delete" folder so that it is displayed in the interface 80 under the "photo" menu, or classifying the first picture into another folder under the album menu according to its characteristics.
S310: in response to the third operation, the processor 110 of the electronic device 10 adjusts the scoring algorithm model.
Specifically, whether the picture score is lower or higher than the user psychological expectation is judged according to the third operation, and the scoring algorithm model is adjusted according to the judgment result, so that the picture score is close to the user psychological expectation, the picture display and deletion result is more in line with the user intention, and the accuracy of the picture display and deletion result is improved.
In addition, feedback data of different users for pictures in albums of different electronic devices 10 can be collected in the Beta user test, so that the algorithm model can be adjusted according to the feedback data.
Further, user big data can be collected, and an Artificial Intelligence (AI) algorithm is introduced to adjust the scoring algorithm model. By adjusting the scoring algorithm model, the score of the picture output by the scoring algorithm model can be closer to the psychological expectation of the user, so that the picture display and deletion result can better accord with the intention of the user, and the accuracy of the picture display and deletion is improved.
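Purely as an illustration of the feedback loop described above, the sketch below nudges the weights of the second features when the third operation indicates that a score was lower or higher than the user's expectation. The update rule and step size are assumptions and not the adjustment algorithm of this embodiment.

    # Illustrative weight adjustment; the rule and the step size are assumptions.
    def adjust_weights(weights: dict, feature_values: dict,
                       score_too_low: bool, step: float = 0.1) -> dict:
        # If the user restores a picture that was proposed for deletion, its score
        # was lower than expected: slightly increase the weights of the features
        # that picture exhibits. If the user demotes a promoted picture, decrease them.
        direction = 1.0 if score_too_low else -1.0
        for name, value in feature_values.items():
            if value:  # adjust only the weights of features present in the picture
                weights[name] = max(0.0, weights.get(name, 1.0) + direction * step)
        return weights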
The method for displaying and deleting pictures provided by the embodiment of the application is described in detail with reference to specific examples.
Specifically, the processor 110 of the electronic device 10 may obtain all pictures in the internal memory 121 and the external memory card thereof and pictures stored in the cloud album corresponding to the electronic device 10. The processor 110 of the electronic device 10 may also obtain characteristics of the respective pictures. After obtaining the features of each picture, it may first be determined, according to at least one feature of the picture, that a scene available for the picture is a presentation scene or a deletion scene. The at least one feature is used to preliminarily determine whether a picture is likely to be the first picture or likely to be the second picture. If the picture is judged to be possibly the first picture according to the at least one characteristic, determining that the picture can be used for deleting the scene; and if the picture is judged to be possibly the second picture according to the at least one characteristic, determining that the picture can be used for displaying the scene. At least one feature used herein to determine a scene for which a picture is available may be referred to as a scene feature. The purpose of determining the scenes in which a picture is available is to determine which features of the picture to extract and the scoring criteria for those features to use in calculating the score for the picture. The score of the picture is calculated according to different characteristics and scoring standards of the characteristics, so that the final score can better accord with the psychological expectation of the user, the accuracy of the score is improved, the picture is displayed or deleted more accurately, the operation of the user is reduced, and the operation efficiency is improved.
And if the available scene of the picture is a display scene, extracting the first characteristic and the second characteristic of the picture. And calculating the score of each picture according to the first characteristic and the second characteristic. And displaying the picture according to the score. And if the available scene of the picture is a deleted scene, extracting a third feature and a fourth feature of the picture. And calculating the score of each picture according to the third characteristic and the fourth characteristic. And deleting the picture according to the score. The following embodiments respectively describe a picture presentation method and a picture deletion method according to available scenes.
In particular, the score S_i of a picture P_i is calculated by the following formula:

S_i = (∏_{k=1}^{m} y_k) × (∑_{j=1}^{n} ω_j · f(x_j))    (1)

In the display scenario, in formula (1), k, m, j, and n are positive integers. m is the number of first features, k = 1, ..., m, and y_k is the value of the first feature corresponding to the value k. n is the number of second features, j = 1, ..., n, and x_j is the value of the second feature corresponding to the value j. ω_j is the weight of the second feature corresponding to the value j. f(x_j) is the feature function of the second feature corresponding to the value j and is used to normalize the value of the second feature. In this embodiment of the application, the weights ω_j of the second features in the initial algorithm are all the same and are set to 1; the weights of certain features may subsequently be adjusted according to test feedback, and a function for configuring the weights may be provided to the user. In a practical implementation, the weights ω_j of the second features may differ from one another, and the value of ω_j is not limited to 1. This is not limited in the embodiments of the present application.
The first characteristic corresponding to the k value can be obtained by looking up a mapping relation table. A mapping table may be stored in the internal memory 121 for indicating the mapping of the k value to the first characteristic. For example, when the k value is 1, the corresponding first feature may be a favorite, and when the k value is 2, the corresponding first feature may be a remark, and the like. Similarly, the second feature corresponding to the j value can also be obtained by looking up the mapping relation table.
In the deletion scenario, in formula (1), k, m, j, and n are positive integers. m is the number of third features, k = 1, ..., m, and y_k is the value of the third feature corresponding to the value k. n is the number of fourth features, j = 1, ..., n, and x_j is the value of the fourth feature corresponding to the value j. ω_j is the weight of the fourth feature corresponding to the value j. f(x_j) is the feature function of the fourth feature corresponding to the value j and is used to normalize the value of the fourth feature. In this embodiment of the application, the weights ω_j of the fourth features in the initial algorithm are all the same and are set to 1; the weights of certain features may subsequently be adjusted according to test feedback, and a function for configuring the weights may be provided to the user. In a practical implementation, the weights ω_j of the fourth features may differ from one another, and the value of ω_j is not limited to 1. This is not limited in the embodiments of the present application.
Similar to the display scenario, the third feature corresponding to the k value and the fourth feature corresponding to the j value can be obtained by looking up the mapping relationship table, which is not described in detail herein.
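Formula (1) as set out above can be read as the product of the first-feature (or third-feature) values multiplied by the weighted sum of the normalized second-feature (or fourth-feature) values. The following Python sketch is only an illustrative rendering of that calculation; the function names are hypothetical.

    # Illustrative rendering of formula (1); names are hypothetical.
    def picture_score(first_values, second_values, weights, f):
        # first_values:  [y_1, ..., y_m]   values of the first (or third) features
        # second_values: [x_1, ..., x_n]   values of the second (or fourth) features
        # weights:       [w_1, ..., w_n]   weights of the second (or fourth) features
        # f:             feature function used to normalize each x_j
        product = 1.0
        for y in first_values:
            product *= y
        weighted_sum = sum(w * f(x) for w, x in zip(weights, second_values))
        return product * weighted_sum

    # Example with the display-scene values of Tables 2 and 3: a collected and
    # remarked picture has first_values [10, 10], so its weighted sum is multiplied by 100.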
Next, with reference to table 1, states corresponding to respective features included in 20 pictures (P1-P20) are exemplarily listed, and the following embodiments will be explained based on the 20 pictures.
TABLE 1 characteristic state tables P1-P20
(The contents of Table 1 are provided as images PCTCN2018110070-APPB-000002 and PCTCN2018110070-APPB-000003 in the original publication; the table lists the state of each of the features described below for pictures P1-P20.)
Table 1 lists P1-P20, the status of each feature contained in each picture. The first column in table 1 is used to indicate the number of pictures. The first row in table 1 is used to represent the respective features of the picture.
The meaning of each feature and the meaning of the state corresponding to each feature in table 1 are described next.
The "upload cloud" refers to whether a picture stored in the internal memory 121 has been uploaded to the cloud. If a certain picture is uploaded to the cloud, determining that the user probably does not like the picture; if a certain picture is not uploaded to the cloud, it is determined that the user may like the picture. If the information is uploaded to the cloud, the state of uploading the cloud is yes; if the data is not uploaded to the cloud, the state of uploading the data to the cloud is no.
If the processor 110 obtains a certain picture from the internal memory 121, the storage location of the picture is built-in, and it is determined that the user may dislike the picture, and the status of the "storage location" is built-in; if the processor 110 obtains a picture from the external memory card through the external memory interface 120, the storage location of the picture is external, and it is determined that the user may like the picture, the state of the "storage location" is external.
If a certain picture is associated with the wallpaper, determining that the user may like the picture, wherein the state of the associated wallpaper is yes; if a picture is not associated with wallpaper, then a determination is made as to whether the user may dislike the "associated wallpaper" status of the picture.
If a certain picture is collected, determining that the user may like the picture, wherein the collected state is yes; if a certain picture is not collected, determining that the user probably does not like the picture, and the state of collecting is negative.
If a certain picture is remarked, determining that the user may like the picture, wherein the state of the remark is yes; if a picture is not remarked, determining that the user probably does not like the picture, and setting the state of the remark to be negative.
If a certain picture is shared, determining that the user may like the picture, wherein the sharing state is yes; if a picture is not shared, determining that the user may not like the picture, and the "sharing" state is no.
If a certain picture is taken in a certain shooting mode, determining that the user may like the picture, wherein the shooting mode is in a positive state; if a picture is not taken in any of the shooting modes, it is determined that the user may dislike the picture, and the status of "shooting mode" is no.
If the content of a certain picture belongs to a certain category, determining that the user may like the picture, wherein the state of the picture content category is yes; if the content of a certain picture does not belong to any category, determining that the user may not like the picture, and determining that the status of the "picture content category" is no.
If the number of times that a certain picture is browsed is larger than a certain threshold value, determining that the user probably likes the picture; if the number of times a picture is viewed is not greater than the threshold, it is determined that the user may not like the picture. The threshold may be, for example, 5. If the browsing times of a certain picture are more than 5, the state of the browsing times is more than 5; if the browsing times of a certain picture are not more than 5, the state of the browsing times is not more than 5.
If the shooting time of a certain picture is greater than a certain threshold value, determining that the user probably does not like the picture; and if the shooting time of a certain picture is not more than the threshold value, determining that the user probably likes the picture. The threshold may be 30, for example. If the shooting time of a certain picture is more than 30 days, the shooting time is more than 30; if the shooting time of a certain picture is not more than 30 days, the shooting time is not more than 30.
The "last browsing time" refers to the time when the picture was last browsed. Whether a user likes a picture can be determined by the last time the picture was viewed. If the last time that a certain picture is browsed is larger than a certain threshold value, determining that the user probably does not like the picture; and if the last browsing time of a certain picture is not greater than the threshold value, determining that the user is probably fond of the picture. The threshold may be 30, for example. If the last browsing time of a certain picture is more than 30 days, the state of the 'last browsing time' is more than 30; if the last browsing time of a certain picture is not more than 30 days, the state of the last browsing time is not more than 30.
"picture size" refers to the storage space occupied by a picture. Whether the user likes a certain picture can be determined by the size of the storage space occupied by the picture. If the storage space occupied by a certain picture is larger than a certain threshold value, determining that the user probably does not like the picture; and if the storage space occupied by a certain picture is not larger than the threshold value, determining that the user probably likes the picture. The threshold may be, for example, 5 megabits (M). If the storage space occupied by a certain picture is more than 5M, the state of the picture size is more than 5; if the storage space occupied by a certain picture is not more than 5M, the state of the picture size is not more than 5.
"aesthetic score" refers to a score calculated from the structure, color, etc. of the picture. Whether a user likes a picture can be determined by the aesthetic score of the picture. If the aesthetic score of a certain picture is larger than a certain threshold value, determining that the user probably likes the picture; if the aesthetic score of a picture is not greater than the threshold, it is determined that the user may not like the picture. The threshold may be, for example, 5 points. If the aesthetic score of a certain picture is more than 5 points, the state of the aesthetic score is more than 5; if the aesthetic score of a picture is not more than 5 points, the state of the aesthetic score is less than or equal to 5.
"trash" refers to a category of pictures, which may be in the form of a folder, and all the pictures contained in the "trash" folder are unwanted pictures that the user wants to delete, and at this time, the pictures belonging to the "trash" folder are still stored in the internal memory 121, and when the pictures are cleared or deleted from the "trash" folder, the pictures are deleted from the internal memory 121. If a certain picture belongs to the 'dustbin' folder, the 'dustbin' state is yes; if a picture is not attributed to the "trash" folder, the "trash" status is no. In addition, the folder to which the picture that the user dislikes and wants to delete belongs is not limited to be named "trash box", and may be "recycle bin", "recently deleted", and the like, which is not limited in the embodiment of the present application.
Specifically, a picture that may be a second picture can be used in the display scene, and a picture that may be a first picture can be used in the deletion scene. The at least one feature used to determine whether the scene available for a picture is the display scene or the deletion scene indicates whether the picture may be a first picture or a second picture, and may include: collection, remark, associated wallpaper, storage location, and uploaded to the cloud. If a picture is not collected, not remarked, not associated with wallpaper, its storage location is built-in, and it has been uploaded to the cloud, it is determined that the picture may be a first picture and can be used in the deletion scene. If a picture is collected, this indicates that the user probably likes it, so it may be a second picture and can be used in the display scene; if the storage location of a picture is external, the user probably likes it, so it may be a second picture and can be used in the display scene; if a picture is associated with wallpaper, the user may like it, so it may be a second picture and can be used in the display scene; if a picture is remarked, the user may like it, so it may be a second picture and can be used in the display scene; if a picture has not been uploaded to the cloud, the user may like it, so it may be a second picture and can be used in the display scene. That is, if a picture is collected, or remarked, or associated with wallpaper, or its storage location is external, or it has not been uploaded to the cloud, the picture may be a second picture and can be used in the display scene. The features used to determine the usage scene are not limited to the five listed above and may also include other features, such as whether the picture has been shared; if it has been shared, it may be a second picture and can be used in the display scene. In a specific implementation, the features used to determine the usage scene may be any combination of the features listed above and may also include other features, as long as they can be used to determine whether a picture may be a first picture or a second picture and thus the usage scene of each picture; this is not limited in the embodiments of the present application.
Therefore, as can be seen from the states of the features (uploading cloud, storage location, associated wallpaper, collection, remarks) in the second to sixth columns in table 1, P1, P4, P6, P7, P8, P10, P11, P13, P15, P18, and P19 may be used to show a scene, and P2, P3, P5, P9, P12, P14, P16, P17, and P20 may be used to delete a scene.
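The scene determination described above can be summarized, for illustration only, by the following sketch; the field names are hypothetical and the rule simply restates the five scene features.

    # Illustrative sketch of the scene determination; field names are hypothetical.
    def available_scene(p: dict) -> str:
        if (not p["collected"] and not p["remarked"] and not p["associated_wallpaper"]
                and p["storage_location"] == "built-in" and p["uploaded_to_cloud"]):
            return "deletion scene"   # the picture may be a first picture
        return "display scene"        # otherwise the picture may be a second picture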
Next, the selection and values of the first feature (or the third feature) and the second feature (or the fourth feature) in different scenarios are described. The presentation scenario is introduced first, and then the deletion scenario is introduced.
In the display scene:
the first characteristic and the second characteristic are used for representing the user's preference degree of the picture. The first feature may include, for example: and (4) storing and remarking. If the picture is collected or remarked, determining that the user likes the picture, and increasing the score of the picture through the value of the first characteristic. With reference to the description of table 1, if it is determined that the user may like the picture according to the state of the first feature, the value of the feature is a larger value; if it is determined that the user may dislike the picture based on the state of the first feature, the value of the feature is a smaller value. For the values corresponding to the different states of the first feature, as can be seen in table 2 for example, the larger value is 10 and the smaller value is 1. The second feature may include, for example: sharing, shooting mode, picture content classification, browsing times, shooting time, last browsing time, picture size, and aesthetic score. In conjunction with the description of table 1, if it is determined that the user may like the picture according to the state of the second feature, the value of the feature is a larger value; if it is determined that the user may dislike the picture based on the state of the second feature, the value of the feature is a smaller value. For the values corresponding to the different states of the second feature, as can be seen in table 3 for example, the larger value is 1 and the smaller value is 0.
In particular, sharing may include sharing to a third-party platform through third-party software, which may be, for example and without limitation, WeChat, Weibo, Tencent QQ, Tencent Weibo, Facebook, and so on. Sharing may also include sharing to other electronic devices via short-range wireless communication. If a picture has been shared, the value of the second feature "sharing" is 1; otherwise it is 0. If a picture was taken in a specific shooting mode, the value of the second feature "shooting mode" is 1; otherwise it is 0. If the content of a picture belongs to a content classification, the value of the second feature "picture content classification" is 1; otherwise it is 0. If a picture has been browsed more than 5 times, the value of the second feature "browsing times" is 1; otherwise it is 0. If a picture was shot more than 30 days ago, the value of the second feature "shooting time" is 0; otherwise it is 1. If a picture was last browsed more than 30 days ago, the value of the second feature "last browsing time" is 0; otherwise it is 1. If the size of a picture exceeds 5 MB, the value of the second feature "picture size" is 0; otherwise it is 1. If the aesthetic score of a picture exceeds 5 points, the value of the second feature "aesthetic score" is 1; otherwise it is 0.
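The binary assignment of the second features described above may be sketched as follows; the thresholds (5 views, 30 days, 5 MB, an aesthetic score of 5) follow the text, while the function and field names are illustrative assumptions.

    from datetime import datetime, timedelta

    def second_feature_values(pic: dict, now: datetime) -> dict:
        """Map picture metadata to the binary second-feature values (0 or 1)
        used in the presentation scene. All field names are illustrative."""
        shot_at = pic.get("shot_at", datetime.min)
        last_viewed_at = pic.get("last_viewed_at", datetime.min)
        return {
            "sharing":            1 if pic.get("shared") else 0,
            "shooting_mode":      1 if pic.get("shooting_mode") else 0,
            "content_class":      1 if pic.get("content_class") else 0,
            "browsing_times":     1 if pic.get("view_count", 0) > 5 else 0,
            "shooting_time":      1 if now - shot_at <= timedelta(days=30) else 0,
            "last_browsing_time": 1 if now - last_viewed_at <= timedelta(days=30) else 0,
            "picture_size":       1 if pic.get("size_mb", 0) <= 5 else 0,
            "aesthetic_score":    1 if pic.get("aesthetic", 0) > 5 else 0,
        }

    # Example: a recently shot, unshared 3 MB picture viewed twice
    vals = second_feature_values({"shot_at": datetime(2018, 10, 1), "size_mb": 3,
                                  "view_count": 2}, now=datetime(2018, 10, 10))
    print(sum(vals.values()))   # -> 2 (shooting_time and picture_size)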
Specifically, if a picture is collected, the value of the first feature "collection" is 10, which amplifies the weighted sum of the second features tenfold, greatly increasing the score of the picture and ensuring that pictures the user likes score ahead of the others; if a picture is not collected, the value of the first feature "collection" is 1, and the weighted sum of the second features is unchanged. Likewise, if a picture is remarked, the value of the first feature "remark" is 10, which amplifies the weighted sum of the second features tenfold, greatly increasing the score of the picture and ensuring that pictures the user likes score ahead of the others; if a picture is not remarked, the value of the first feature "remark" is 1, and the weighted sum of the second features is unchanged. In short, the score of the picture is boosted by the larger value of the first feature.
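Assuming, as the examples in this section suggest, that formula (1) multiplies the product of the first-feature values by the weighted sum of the second-feature values, the presentation-scene score might be sketched as follows; the function name and argument layout are illustrative.

    def picture_score(first_values, second_values, weights=None) -> float:
        """Sketch of a presentation-scene score, assuming
        S_i = (product of first-feature values) * (weighted sum of second-feature values).
        first_values: e.g. [10, 1]; second_values: e.g. eight 0/1 values."""
        if weights is None:
            weights = [1.0] * len(second_values)
        multiplier = 1.0
        for g in first_values:          # collection, remark -> 10 or 1
            multiplier *= g
        weighted_sum = sum(w * f for w, f in zip(weights, second_values))
        return multiplier * weighted_sum

    # A collected picture with three second features set (cf. P8 in Table 6):
    print(picture_score([10, 1], [1, 0, 1, 0, 0, 1, 0, 0]))   # -> 30.0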
Table 2  Selection and values of the first features in the presentation scene

First feature | Collection | Remark
Yes           | 10         | 10
No            | 1          | 1
Table 3  Selection and values of the second features in the presentation scene

Second feature                 | Larger value | Smaller value
Sharing                        | 1            | 0
Shooting mode                  | 1            | 0
Picture content classification | 1            | 0
Browsing times                 | 1            | 0
Shooting time                  | 1            | 0
Last browsing time             | 1            | 0
Picture size                   | 1            | 0
Aesthetic score                | 1            | 0
As can be seen from table 2 and table 3, m is 2 and n is 8 in formula (1). An exemplary mapping table of the k value and the first feature is shown in table 4. An exemplary mapping table of the j value and the second feature is shown in table 5.
Table 4  Mapping of k values to first features

k value | First feature
k = 1   | Collection
k = 2   | Remark
Table 5  Mapping of j values to second features

j value | Second feature
j = 1   | Sharing
j = 2   | Shooting mode
j = 3   | Picture content classification
j = 4   | Browsing times
j = 5   | Shooting time
j = 6   | Last browsing time
j = 7   | Picture size
j = 8   | Aesthetic score
The above mapping relationship between the k value and the first feature and the mapping relationship between the j value and the second feature are merely exemplary, and other mapping relationships may exist in practice, which is not limited in the embodiments of the present application.
Specifically, the selection of each of the first features and the selection of the second features are not limited to the selections shown in tables 2 and 3, and other selections may be made during the actual use process, and the first features and the second features may be selected subsequently according to the feedback of the user, or the first features and the second features may be manually selected by the user. In addition, the larger value and the smaller value of each of the first feature and the second feature are not limited to the values shown in table 2 and table 3, and other options may be available in the actual use process, which is not limited in the embodiment of the present application.
In some embodiments, the value of a second feature is not limited to 0 or 1 as listed in Table 3. In one possible implementation, it may also be 0 or 2, 0 or 10, 1 or 10, and so on. In another possible implementation, the value of a second feature may be continuous. For example, rather than taking the value 1 when the browsing count exceeds 5 and 0 otherwise, the value of the second feature "browsing times" may increase linearly with the number of views: 0 views gives a value of 0, 1 view gives 0.1, 2 views gives 0.2, and 10 or more views gives 1. The same approach applies to the second features "sharing", "shooting time", "last browsing time", "picture size", "aesthetic score", and so on; their specific assignment can follow the assignment of "browsing times" and is not repeated here.
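A continuous assignment of the "browsing times" value, as described above, might be sketched as follows (illustrative only):

    def view_count_value(views: int) -> float:
        """Continuous alternative to the binary 'browsing times' value: increases
        linearly by 0.1 per view and saturates at 1 for 10 or more views."""
        return min(views, 10) / 10.0

    print(view_count_value(0), view_count_value(2), view_count_value(15))   # 0.0 0.2 1.0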
In some embodiments, the larger value of a first feature depends on the score S'_i obtained by the weighted summation of the second features:

S'_i = Σ_{j=1}^{n} ω_j · f(x_j)    (2)

The meanings of j, n, ω_j, and f(x_j) in formula (2) are the same as in formula (1) and are not repeated here. As can be seen from formula (2), the score S'_i calculated from the second features depends on the number of second features and the value of each second feature. If the score of the picture is to be boosted by the larger value of the first feature, so that the first feature plays the decisive role in the score of the picture, the larger value of the first feature needs to exceed the maximum possible value of the weighted sum of the second features, which is the score S'_i obtained when every second feature takes its larger value. For example, if there are 8 second features, each with a larger value of 1 and a weight ω_j of 1, the maximum possible weighted sum of the second features is 8, and the larger value of the first feature needs to be greater than 8. Similarly, if there are 10 second features, each with a larger value of 2 and a weight ω_j of 1, the maximum possible weighted sum is 20, and the larger value of the first feature needs to be greater than 20.
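The lower bound on the larger value of a first feature described above might be computed as in the following sketch; the function name and defaults are illustrative.

    def min_first_feature_value(n_second: int, larger_second: float, weights=None) -> float:
        """Lower bound for the 'larger value' of a first feature: it must exceed
        the maximum possible weighted sum of the second features, i.e. the value
        of S'_i when every second feature takes its larger value."""
        if weights is None:
            weights = [1.0] * n_second
        return sum(w * larger_second for w in weights)

    print(min_first_feature_value(8, 1))    # 8.0  -> the larger value must exceed 8
    print(min_first_feature_value(10, 2))   # 20.0 -> the larger value must exceed 20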
For example, suppose the second features of picture A all take their larger values but its first feature takes the smaller value, while the first feature of picture B takes the larger value but its second features do not all take their larger values; the score of picture B is then necessarily higher than the score of picture A. As this example shows, when the first feature takes its larger value, the weighted sum of the second features is directly amplified several times, so the score of the picture is greatly increased, its scoring advantage is reflected, and the user's degree of preference for the picture is thereby reflected in the score.
Specifically, if some pictures are collected or remarked and others are not, the value of the first feature separates their scores, reflecting the difference in the user's degree of preference for them. If two pictures are both collected or remarked, or neither is collected nor remarked, the difference in their scores is separated by the values of the second features, reflecting the difference in the user's degree of preference for the two pictures.
Combining Table 1, Table 2, and Table 3 gives the feature values of the pictures suitable for the presentation scene, and the score S_i of each picture can then be calculated according to formula (1), as shown in Table 6.
Table 6  Feature values and scores of the pictures in the presentation scene
In Table 6, the first column gives the number of each picture, and the first row gives each feature together with S'_i and S_i.
Specifically, from the score calculation results in Table 6, the pictures can be ranked (from high score to low score) by the score S'_i obtained from the weighted summation of the second features alone, and also by the score S_i obtained from the first and second features together; a comparison of the two rankings is shown in Fig. 10. As can be seen from Table 6 and Fig. 10, P11 ranks 6th by the second-feature score alone but rises to 1st after its score is boosted by the larger value of the first feature; P15 ranks 7th by the second-feature score alone and rises to 5th after the boost; P7 ranks 8th by the second-feature score alone and rises to 6th after the boost. It can be seen that the larger value of the first feature directly amplifies the weighted sum of the second features several times, greatly increasing the score of the picture and reflecting its scoring advantage, so that the user's degree of preference for the picture is reflected in the score.
Specifically, after calculating the score of each picture, the electronic device 10 may display the picture according to the score.
In one possible embodiment, the electronic device 10 may prioritize the high-scoring pictures in the interface 80 under the "photos" menu. In a specific implementation, the highest-scoring picture may be placed first, with the pictures arranged from left to right and from top to bottom in order of score from high to low; the arrangement may refer to the right diagram of Fig. 8 and is not described here again.
In another possible embodiment, the electronic device 10 may enlarge the area of the second picture in the interface 80 under the "photos" menu. For example, the second picture may be the single highest-scoring picture, here P11; its area in the interface 80 is increased so that it is at least larger than the area of the other, non-second pictures. As shown in Fig. 11A, the area of the second picture P11 is at least 4 times that of the other, non-second pictures.
In another possible embodiment, the electronic device 10 may frame a second picture in the interface 80 under the "photos" menu, as shown in FIG. 11B.
In another possible embodiment, electronic device 10 may also display a second picture, as shown in FIG. 11C, with a star in interface 80 under the "photos" menu.
In a specific implementation, the second picture may also be displayed in other ways, for example in a special color or with a special transparency, which is not limited in the embodiments of the present application. The pictures shown in the interface 80 under the "photos" menu are all the pictures in the internal memory 121 and the external memory card.
In another possible embodiment, the electronic device 10 may collectively present the second pictures, as shown in fig. 12 and 13. In fig. 12, when the electronic device 10 detects the operation of the user on the "find" menu control, the interface 80 may include a first category 8021 in addition to a location category and a time category. The location category includes a plurality of folders, each of which may include a plurality of pictures, and the shooting locations of the plurality of pictures included in the folders are the same, and may be, for example, beijing, shanghai, New York, Tokyo, or the like. The time classification includes a plurality of folders, each of which may include a plurality of pictures, and the plurality of pictures included in the folder have the same shooting time, and may be, for example, 2018, 2017, 2016, 2015, or the like. The first category 8021 may include a plurality of second pictures. The "first category" may also be displayed as "guess you like" or "Favorite" or "Fav" in the interface 80, but is not limited thereto, and may also have other category names, which is not limited by the embodiment of the present application. The classification modes may be other classification modes besides the classification according to location and time, for example, the classification according to people, and the specific classification mode is not limited in the embodiment of the present application. A search control 804 may also be included in the interface 80, and when the touch sensor 180K of the electronic device 10 detects an operation of the search control 804 by the user, the display screen 194 of the electronic device 10 displays a search interface, as shown in fig. 13, and the search interface 90 at least includes: a search bar 901, a presentation interface 902, and a status bar. Wherein the status bar is similar to the status bar 204 listed in fig. 4A and is not described in detail herein. The search bar 901 is used for receiving a search instruction of a user, and searching pictures from the internal memory 121, the external memory card, and the cloud album of the electronic device 10. When the touch sensor 180K of the electronic device 10 detects that the user operates the search bar 901, the input method interface 9022 is displayed in the presentation interface 902, and the user can input a picture to be searched, such as "blue sky", in the input method interface 9022, and then the electronic device 10 can search for a picture with "blue sky" content from the internal memory 121, the external memory card, and the cloud album. The display interface 902 includes a plurality of different classifications, where fig. 13 shows a location classification and a second classification 9021, folders included in the location classification are similar to folders included in the location classification in fig. 12, and the second classification 9021 includes a second picture, which is similar to the second picture included in the first classification 8021 in fig. 12 and is not described herein again.
The above presentation form of the set of second pictures is not limited to the above listed classification form, and other presentation forms may also be available in practical implementations, and this is not limited by the embodiment of the present application.
In the embodiments of the present application, the applicable scene of each picture is determined from some of its features, and the first and second features then act together so that the score calculated by the algorithm model provided in the embodiments better matches the user's expectations, allowing the user to find pictures quickly.
In the deletion scene:
The third feature and the fourth feature are used to represent the user's degree of dislike for the picture. The third feature may include, for example: trash bin. Here, a picture with the "trash bin" feature is a picture that the user has actively deleted; it is currently classified in the "trash bin" folder but is still saved in the internal memory 121. If a picture is classified in the "trash bin" folder, it is determined that the user dislikes the picture, and the score of the picture is reduced through the value of the third feature. With reference to the description of Table 1, if the state of the third feature indicates that the user may dislike the picture, the feature takes a larger value; if it indicates that the user may like the picture, the feature takes a smaller value. The values corresponding to the different states of the third feature are shown in Table 7, where, for example, the larger value is 10 and the smaller value is 1. The fourth feature may include, for example: shooting mode, picture content classification, browsing times, shooting time, last browsing time, picture size, and aesthetic score. With reference to the description of Table 1, if the state of a fourth feature indicates that the user may dislike the picture, the feature takes a smaller value; if it indicates that the user may like the picture, the feature takes a larger value. The values corresponding to the different states of the fourth features are shown in Table 8.
Specifically, if a picture is classified in the "trash bin" folder, it is determined that the user probably dislikes it, and the larger value of the third feature "trash bin" is 10. Since the weighted sum of the fourth features may be negative in the deletion scene, setting the larger value of the third feature to 10 amplifies that negative sum tenfold, greatly reducing the score of the picture and ensuring that pictures the user dislikes score behind the others. If a picture is not classified in the "trash bin" folder, it is determined that the user may like it, the smaller value of the third feature "trash bin" is 1, and the weighted sum of the fourth features is unchanged. In a possible embodiment, if the weighted sum of the fourth features in the deletion scene is not negative, then when the state of the third feature indicates that the user may dislike the picture, the third feature instead takes a smaller value, which may be a fraction, 0, or a negative number. In either case, the score of the picture is reduced through the value of the third feature. The embodiments of the present application are mainly described for the case in which the weighted sum of the fourth features may be negative in the deletion scene.
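Assuming the deletion-scene score has the same structure as formula (1), with the third-feature value multiplying the weighted sum of the fourth-feature values, and assuming for illustration that a fourth feature takes the value 1 or -1 (Table 8 itself is not reproduced here), the calculation might be sketched as follows:

    def deletion_score(in_trash: bool, fourth_values, weights=None) -> float:
        """Sketch of a deletion-scene score: the third-feature value (10 if the
        picture is in the trash bin, 1 otherwise) multiplies the weighted sum of
        the fourth-feature values, which may be negative."""
        if weights is None:
            weights = [1.0] * len(fourth_values)
        multiplier = 10.0 if in_trash else 1.0
        return multiplier * sum(w * f for w, f in zip(weights, fourth_values))

    # A picture in the trash bin whose fourth features sum to -1 scores -10 (cf. P3):
    print(deletion_score(True, [-1, 1, 1, -1, -1, -1, 1]))   # -> -10.0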
Table 7  Selection and values of the third feature in the deletion scene

Third feature | Trash bin
Yes           | 10
No            | 1
Table 8  Selection and values of the fourth features in the deletion scene
As can be seen by combining Table 7 and Table 8, in formula (1) m is 1 and n is 7. An exemplary mapping of k values to the third feature is shown in Table 9. An exemplary mapping of j values to the fourth features is shown in Table 10.
Table 9  Mapping of k values to the third feature

k value | Third feature
k = 1   | Trash bin
Table 10  Mapping of j values to fourth features

j value | Fourth feature
j = 1   | Shooting mode
j = 2   | Picture content classification
j = 3   | Browsing times
j = 4   | Shooting time
j = 5   | Last browsing time
j = 6   | Picture size
j = 7   | Aesthetic score
The above mapping relationships between the k value and the third feature and between the j value and the fourth features are merely exemplary; other mappings may exist in practice, which is not limited in the embodiments of the present application.
Specifically, the selection of each of the third features and the selection of the fourth features are not limited to the selections shown in tables 7 and 8, and other selections may be made in the actual use process, and the third features and the fourth features may be selected subsequently according to the feedback of the user, or the third features and the fourth features may be manually selected by the user. In addition, the larger values and the smaller values of the third features and the fourth features are not limited to the assignment values shown in tables 7 and 8, and other options may be available in the actual use process, which is not limited in the embodiment of the present application.
In some embodiments, the values of the fourth features are not limited to those listed in Table 8. In one possible implementation, both the smaller and the larger value of a fourth feature may be positive; in another possible implementation, both may be negative. In a further possible implementation, the assignment may be continuous. For example, the value of the fourth feature "shooting time" may decrease linearly as the picture gets older: more than 30 days gives -1; more than 20 days and at most 30 days gives -0.5; more than 10 days and at most 20 days gives 0; more than 5 days and at most 10 days gives 0.5; at most 5 days gives 1. The same approach applies to the fourth features "browsing times", "last browsing time", "picture size", "aesthetic score", and so on; their specific assignment can follow the assignment of "shooting time" and is not repeated here.
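The piecewise assignment of the "shooting time" value described above might be sketched as follows (illustrative only):

    def shooting_time_value(days_since_shot: int) -> float:
        """Piecewise value of the fourth feature 'shooting time', decreasing as
        the picture gets older, following the example in the text."""
        if days_since_shot > 30:
            return -1.0
        if days_since_shot > 20:
            return -0.5
        if days_since_shot > 10:
            return 0.0
        if days_since_shot > 5:
            return 0.5
        return 1.0

    print(shooting_time_value(3), shooting_time_value(25), shooting_time_value(40))
    # 1.0 -0.5 -1.0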
In some embodiments, the larger value of the third feature depends on the score S'_i obtained by the weighted summation of the fourth features. As can be seen from formula (2), the score S'_i calculated from the fourth features depends on the number of fourth features and the value of each fourth feature. If the score of the picture is to be reduced through the value of the third feature, so that the third feature plays the decisive role in the score of the picture, the following conditions apply.
When S'_i is negative, the larger value of the third feature needs to be greater than the absolute value of the minimum possible weighted sum of the fourth features, which is the score S'_i obtained when every fourth feature takes its smaller value. For example, if there are 8 fourth features, each with a smaller value of -1 and a weight ω_j of 1, the minimum possible weighted sum is -8, and the larger value of the third feature needs to be greater than 8. Similarly, if there are 10 fourth features, each with a smaller value of -2 and a weight ω_j of 1, the minimum possible weighted sum is -20, and the larger value of the third feature needs to be greater than 20.
When S'_i is non-negative, the smaller value of the third feature needs to be less than the reciprocal of the maximum possible weighted sum of the fourth features, which is the score S'_i obtained when every fourth feature takes its larger value. For example, if there are 8 fourth features, each with a larger value of 1 and a weight ω_j of 1, the maximum possible weighted sum is 8, and the smaller value of the third feature needs to be less than 1/8. Similarly, if there are 10 fourth features, each with a larger value of 2 and a weight ω_j of 1, the maximum possible weighted sum is 20, and the smaller value of the third feature needs to be less than 1/20.
In short, whether S'_i is negative or non-negative, when the state of the third feature indicates that the user dislikes a picture, the score of that picture is reduced through the value of the third feature. This ensures that a picture with a higher S'_i whose third feature indicates the user dislikes it still scores lower than a picture with a lower S'_i whose third feature indicates the user may like it.
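One way to pick the third-feature value so that these conditions hold might look like the following sketch; it is an illustrative assumption rather than the embodiment's rule.

    def third_feature_value(in_trash: bool, s_prime_can_be_negative: bool,
                            max_abs_weighted_sum: float) -> float:
        """Sketch of choosing the third-feature value so that a picture in the
        trash bin always ends up scoring lower, under the two cases discussed
        above (weighted sum of fourth features negative or not)."""
        if s_prime_can_be_negative:
            # Amplify the (negative) weighted sum: the value must exceed its largest magnitude.
            return max_abs_weighted_sum + 1 if in_trash else 1.0
        # Non-negative weighted sum: shrink it to below 1 instead.
        return 1.0 / (max_abs_weighted_sum + 1) if in_trash else 1.0

    print(third_feature_value(True, True, 8))    # 9 (greater than 8)
    print(third_feature_value(True, False, 8))   # about 0.111 (less than 1/8)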
For example, suppose the fourth features of picture A all take their smaller values but the state of its third feature indicates that the user may like it, while the third feature of picture B indicates that the user dislikes it but its fourth features do not all take their smaller values; the score of picture B is then necessarily lower than the score of picture A. As this example shows, when the state of the third feature indicates that the user dislikes a picture, the weighted sum of the fourth features is directly amplified several times in the negative direction (or shrunk several times), greatly reducing the score of the picture and reflecting its scoring disadvantage, so that the user's degree of dislike for the picture is reflected in the score.
Combining Table 1, Table 7, and Table 8 gives the feature values of the pictures suitable for the deletion scene, and the score S_i of each picture can then be calculated according to formula (1), as shown in Table 11.
Table 11  Feature values and scores of the pictures in the deletion scene
In Table 11, the first column gives the number of each picture, and the first row gives each feature together with S'_i and S_i.
Specifically, from the score calculation results in Table 11, the pictures can be ranked (from high score to low score) by the score S'_i obtained from the weighted summation of the fourth features alone, and also by the score S_i obtained from the third and fourth features together; a comparison of the two rankings is shown in Fig. 14. As can be seen from Table 11 and Fig. 14, P3 ranks 2nd by the fourth-feature score alone but drops to 6th after its score is reduced by the larger value of the third feature; P12 ranks 4th by the fourth-feature score alone and drops to 7th; P20 ranks 6th and drops to 8th; P14 ranks 8th and drops to 9th. It can be seen that the larger value of the third feature directly amplifies the weighted sum of the fourth features several times in the negative direction, greatly reducing the score of the picture and reflecting its scoring disadvantage, so that the user's degree of dislike for the picture is reflected in the score.
Specifically, after calculating the score of each picture, the electronic device 10 may, according to the scores and a deletion condition, prompt the user to delete the first pictures or delete them automatically. The deletion condition is one of the bases on which the electronic device 10 determines the first pictures. For example, the deletion condition may be that the remaining available capacity of the internal memory 121 of the electronic device 10 is not lower than a certain threshold; the electronic device 10 may then determine the first pictures to be deleted according to the deletion condition and the score of each picture, so that after the first pictures are deleted the remaining available capacity of the internal memory 121 is not lower than the threshold. It should be noted that "delete" in the embodiments of the present application is different from classifying a picture into the "trash bin" folder. "Prompting the user to delete" means that the picture is still stored in the internal memory 121 or the external memory card and is only deleted from there once a deletion instruction from the user is received; "deleting directly" means deleting the picture from the internal memory 121 or the external memory card to free up the storage capacity of the electronic device 10.
Specifically, the first picture may be determined according to the deletion condition and the score of each picture, and the user is prompted to delete the first picture, or the first picture is directly and automatically deleted.
In a possible embodiment, the first pictures may be determined by calculation from the total storage capacity Q of the user's handset, the available storage capacity Q_left, and the number N of stored pictures.
Specifically, the target remaining storage capacity is:

Q_left.threshold = min(2G, Q × 10%)    (3)

That is, when the total storage capacity Q is greater than 20G, the target remaining storage capacity Q_left.threshold is 2G; when Q is not more than 20G, Q_left.threshold is 10% of Q. In other words, the target remaining storage capacity is at most 2G.
The deletable score threshold S_t to be sought is obtained from equation (4):

Σ_{i: S_i ≤ S_t} C_i ≥ Q_left.threshold − Q_left    (4)

where S_i is the score of picture i and C_i is its capacity; that is, the total capacity of the pictures whose scores are less than or equal to the deletable score threshold S_t is greater than or equal to the difference between the target remaining storage capacity Q_left.threshold and the current remaining capacity Q_left. Suppose this difference is 200M, i.e. 200M of storage space is to be released; the lowest-scoring pictures are then selected one by one as pictures to be deleted until the total capacity of the selected pictures is just equal to or greater than 200M, and the score of the highest-scoring picture among them is the deletable score threshold S_t.
In addition, the total capacity Q_delete.threshold of each deletion can be set as:

Q_delete.threshold = max(100M, (Q_left.threshold − Q_left))    (5)

That is, when the difference between the target remaining storage capacity Q_left.threshold and the current remaining capacity Q_left is less than 100M, the total capacity Q_delete.threshold of each deletion is 100M; when the difference is not less than 100M, Q_delete.threshold is that difference. In other words, the total capacity of each deletion is at least 100M. For example, if the deletable score threshold S_t calculated according to equation (4) indicates that the pictures to be deleted amount to only 80M, a few more of the lowest-scoring pictures can be deleted until the deleted pictures amount to at least 100M.
In addition, the number threshold N_threshold of each deletion may be set to at most 10% of the total number N of pictures, i.e.

N_threshold ≤ N × 10%    (6)

If the deletable score threshold S_t calculated according to equation (4) indicates that 20 pictures are to be deleted, but the total number of pictures stored in the electronic device 10 is 150, then at most 15 pictures are finally deleted, and the first pictures are the 15 lowest-scoring pictures.
Specifically, the maximum value of the target remaining storage capacity is not limited to 2G listed above, the minimum value of the total capacity of each deleted picture is not limited to 100M listed above, the threshold value of the number of pictures deleted is not limited to 10% of the total number N listed above, and other values may also be used in the actual implementation process, and the user may also manually set the above parameters.
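Under the stated assumptions (equations (3) to (6) with the default parameters 2G, 100M, and 10%), the deletion policy might be sketched as follows; the function and variable names, and the use of a simple (score, size) list, are illustrative.

    def pictures_to_delete(pictures, total_capacity_gb, remaining_gb):
        """Sketch of the deletion policy in equations (3) to (6). `pictures` is a
        list of (score, size_mb) tuples; names and units are illustrative."""
        target_remaining_gb = min(2.0, total_capacity_gb * 0.10)           # eq. (3)
        to_free_mb = max(0.0, (target_remaining_gb - remaining_gb) * 1024)
        delete_total_mb = max(100.0, to_free_mb)                           # eq. (5)
        max_count = int(len(pictures) * 0.10)                              # eq. (6)

        selected, freed = [], 0.0
        for score, size_mb in sorted(pictures):                            # lowest score first
            if freed >= delete_total_mb or len(selected) >= max_count:
                break
            selected.append((score, size_mb))
            freed += size_mb
        return selected

    # 150 pictures of 10 MB each on a 32G phone whose remaining capacity is 1G:
    pics = [(i, 10.0) for i in range(150)]
    print(len(pictures_to_delete(pics, 32.0, 1.0)))                        # -> 15 (capped at 10% of 150)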
Illustratively, if the current remaining capacity of the internal memory 121 of the electronic device 10 is 10% of the total capacity and the target remaining capacity is 20% of the total capacity, the 5 lowest-scoring pictures listed in Table 11 are determined as first pictures according to the deletion policy, i.e., P17, P3, P12, P20, and P14. When the touch sensor 180K of the electronic device 10 detects an operation of the user on the "album" menu control, the display interface 80 of the album may display a first folder 8022 in addition to the "Weibo", "WeChat", and "Facebook" folders, as shown in Fig. 15. The first folder 8022 may include a plurality of first pictures. The first folder may also be displayed in the interface 80 under a more intuitive name such as "suggested deletion"; it is not limited thereto and may have other names, which is not limited in the embodiments of the present application. After the touch sensor 180K of the electronic device 10 detects an operation of the user on the first folder 8022, the display screen 194 of the electronic device 10 may display an interface 100, which may include the first pictures, a control 901, and a status bar. The status bar is similar to the status bar 204 in Fig. 4A and is not described here. When the touch sensor 180K of the electronic device 10 detects an operation of the user on the control 901 to delete all the first pictures, the storage capacity of the internal memory 121 of the electronic device 10 can be released so that 20% of the total capacity remains. The deletion operation is easy and convenient for the user, and because the first pictures are determined comprehensively from the third and fourth features, the accuracy of the deleted pictures is high, the probability that the user downloads the pictures from the cloud again is reduced, and the user experience is improved. The manner of collectively displaying the first pictures is not limited to placing them in the first folder 8022; other manners may also be used in an actual implementation, which is not limited in the embodiments of the present application.
In addition, after the electronic device 10 confirms the first picture, the first picture can be directly deleted without being manually deleted by the user, so that the user operation is further reduced.
In a possible embodiment, the displaying of the picture and the deleting of the picture may be performed separately. That is, in the picture management flow shown in fig. 3, only S301 to S307 may be executed to display pictures, or only S301 to S306 and S308 may be executed to delete pictures. For example, the displaying of the picture may be that the electronic device 10 calculates a score once a day for the picture suitable for the displaying scene, when the electronic device 10 receives an instruction from the user to enter the album application, the processor 110 of the electronic device 10 may update the displaying of the picture according to the score, and the deleting of the picture may be that the electronic device 10 calculates a score once a week for the picture suitable for the deleting scene, displays the first picture according to the score and the deletion policy set to prompt the user to delete the first picture, or automatically deletes the first picture. For another example, the electronic device 10 may perform score calculation once a day on all the pictures stored in the internal memory 121 and/or stored in the cloud album, then update the presentation of the pictures according to the scores, and perform weekly collective presentation of the first pictures according to the scores and the deletion policy to prompt the user to delete the first pictures, or automatically delete the first pictures.
In the embodiments of the present application, the applicable scene of each picture is determined from some of its features, and the third and fourth features then act together so that the score calculated by the algorithm model provided in the embodiments better matches the user's expectations, allowing pictures to be deleted quickly and accurately.
Further, after the scores of all the pictures stored in the internal memory 121 and/or stored in the cloud album are calculated in the above-described exhibition scenario and deletion scenario, the scores of all the pictures may be arranged in order from high to low as a whole and displayed in the right drawing of fig. 8 in order from left to right and from top to bottom.
The above is a process of calculating the score of the picture through the algorithm model and managing the picture in different scenes. Next, a process of performing optimization adjustment on the algorithm module through user feedback after the electronic device 10 manages the pictures will be described.
In particular, the user feedback may include positive feedback behavior and negative feedback behavior of the user.
Specifically, the positive feedback behavior may include: moving the later picture in the pictures displayed in the interface 80 under the "photos" menu forward, displaying the normally displayed pictures in an enlarged manner, displaying the normally displayed pictures in a frame, displaying the normally displayed pictures in a star, and moving the pictures classified in the first folder 8022 out of the interface 80 under the "photos" menu. If the above-mentioned feedback behavior of the user occurs, it can be determined that the score of the picture should be higher.
Specifically, the negative feedback behavior may include: moving a front picture back among the pictures displayed in the interface 80 under the "photos" menu, cancelling the enlarged display of an enlarged picture, cancelling the framed display of a framed picture, cancelling the starred display of a starred picture, and classifying a picture into the "trash bin" folder. If any of these feedback behaviors occurs, it is determined that the score of the picture should be lower.
Next, how to optimize the algorithm model based on the two feedback behaviors is described with reference to Figs. 16 and 17: Fig. 16 introduces optimizing the algorithm model according to negative feedback behavior, and Fig. 17 according to positive feedback behavior.
As can be seen from Fig. 16, when the user moves the picture P8, located 7th in the picture display queue, back to the 11th position, it is determined that the user wants the score of P8 to be reduced. The score of P8 can be reduced in several ways, for example:
1. Decrease the larger value of the feature, among the first features of P8, whose state indicates that the user may like the picture.
2. Decrease the weights of the features, among the second features of P8, whose states indicate that the user may like the picture, and increase the weights of the features whose states indicate that the user may dislike it.
Next, how to reduce the score of P8 in the above two ways will be described in detail.
1. As can be seen from Table 6, the first feature of P8 whose state indicates that the user may like the picture is "collection", with a larger value of 10. This value can be reduced, for example to 5, in which case the score S_i of P8 drops from 30 to 15. Reducing the larger value of "collection" to 5 is merely an example and is not limited in the embodiments of the present application.
2. As can be seen from Table 6, the second features of P8 whose states indicate that the user may like the picture are "sharing", "picture content classification", and "last browsing time", and those whose states indicate that the user may dislike it are "shooting mode", "browsing times", "shooting time", "picture size", and "aesthetic score"; as described for formula (1), the weight of each second feature is 1. To keep the sum of the second-feature weights of every picture at a fixed value, so that the scoring standard is the same for all pictures and their scores remain comparable, the weights of "sharing", "picture content classification", and "last browsing time" can be reduced from 1 to 0.5, and the weights of "shooting mode", "browsing times", and "shooting time" can be increased to 1.5; the score S_i of P8 then drops from 30 to 15. The magnitudes of these weight changes, and the choice of which features have their weights reduced or increased, are merely examples and are not limited in the embodiments of the present application.
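The two adjustment options above might be sketched as follows; all names are illustrative, and the redistribution of weight is only one way of keeping the weight sum fixed.

    def apply_negative_feedback(first_values: dict, second_weights: dict,
                                liked_first: list, liked_second: list,
                                disliked_second: list):
        """Sketch of the two adjustments for a picture the user moved backwards:
        halve the larger first-feature value, and shift weight from 'liked' to
        'disliked' second features while keeping the weight sum constant."""
        # Option 1: reduce the larger value of the liked first feature (e.g. 10 -> 5).
        for name in liked_first:
            first_values[name] = first_values[name] / 2

        # Option 2: move weight from liked to disliked second features.
        for name in liked_second:
            second_weights[name] -= 0.5
        shift = 0.5 * len(liked_second) / max(len(disliked_second), 1)
        for name in disliked_second:
            second_weights[name] += shift
        return first_values, second_weights

    weights = {name: 1.0 for name in ["sharing", "content_class", "last_viewed",
                                      "shooting_mode", "view_count", "shot_time"]}
    firsts = {"collection": 10.0, "remark": 1.0}
    apply_negative_feedback(firsts, weights,
                            liked_first=["collection"],
                            liked_second=["sharing", "content_class", "last_viewed"],
                            disliked_second=["shooting_mode", "view_count", "shot_time"])
    print(firsts["collection"], weights["sharing"], weights["shooting_mode"])   # 5.0 0.5 1.5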
In another possible embodiment, the electronic device 10 may collect the user's negative feedback behavior on multiple pictures in the internal memory 121 and the external memory card over a period of time (e.g., a week or a month) and extract the features that are common to those pictures: among the first features, the features indicating that the user may like a picture; and among the second features, the features indicating that the user may like a picture and those indicating that the user may dislike it. "Common" here does not mean that a feature appears in every picture in the strict sense, but only that it consistently appears in most of the pictures. For example, if the electronic device 10 collects the user's negative feedback on 100 pictures within one month, and for 60 of them the first feature indicating that the user may like the picture is "collection", then "collection" can be taken as the common first feature of the 100 pictures; the same applies to the second features. Collecting the common features extracted from the user's negative feedback on multiple pictures allows the algorithm model to be optimized more accurately.
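Extracting such common features from a batch of feedback might be sketched as follows; the 0.5 majority threshold and the field names are illustrative assumptions.

    from collections import Counter

    def common_liked_features(feedback_pictures, threshold=0.5):
        """Sketch of extracting the features 'common' to pictures receiving the
        same feedback: a feature counts as common if it indicates liking in more
        than `threshold` of the pictures. Names are illustrative."""
        counts = Counter()
        for pic in feedback_pictures:
            for name in pic.get("liked_features", []):
                counts[name] += 1
        n = len(feedback_pictures)
        return [name for name, c in counts.items() if c / n > threshold]

    pics = [{"liked_features": ["collection"]}] * 60 + [{"liked_features": ["remark"]}] * 40
    print(common_liked_features(pics))   # -> ['collection'] (60 of 100 pictures)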
As can be seen from Fig. 17, when the touch sensor 180K of the electronic device 10 detects a user operation on P3 in the interface 100, the display screen 194 of the electronic device 10 may display an interface 200, which may include a picture display area 2001 and a restore control 2002. The picture display area 2001 is used to display P3, and the restore control 2002 is used to receive a restore instruction from the user and move P3 out of the first folder 8022. After P3 has been moved out of the first folder 8022, when the electronic device 10 again receives the user's instruction to view the "suggested deletion" folder 8022, P3 is no longer displayed in the interface 100; and when the electronic device 10 again receives a user operation on the "album" menu control, P3 may be displayed in the interface 80 under the "photos" menu. It is thus determined that the user wants the score of P3 to be increased, which can be done in several ways, for example:
1. Decrease the larger value of the feature, among the third features of P3, whose state indicates that the user may dislike the picture.
2. Decrease the weights of the features, among the fourth features of P3, whose states indicate that the user may dislike the picture, and increase the weights of the features whose states indicate that the user may like it.
Next, how to increase the score of P3 in the above two ways will be described in detail.
1. As can be seen from Table 7, the larger value of the third feature "trash bin" of P3 is 10. This value can be reduced, for example to 5, in which case the score S_i of P3 increases from -10 to -5. Reducing the larger value of "trash bin" to 5 is merely an example and is not limited in the embodiments of the present application.
2. As can be seen from Table 11, the fourth features of P3 whose states indicate that the user may dislike the picture are "shooting mode", "shooting time", "last browsing time", and "picture size", and those whose states indicate that the user may like it are "picture content classification", "browsing times", and "aesthetic score"; as described for formula (1), the weight of each fourth feature is 1. To keep the sum of the fourth-feature weights of every picture at a fixed value, so that the scoring standard is the same for all pictures and their scores remain comparable, the weights of "shooting mode", "shooting time", "last browsing time", and "picture size" can be reduced from 1 to 0.5, and the weights of "picture content classification", "browsing times", and "aesthetic score" can be increased to 1.5; the score S_i of P3 then increases from -10 to 15. The magnitudes of these weight changes, and the choice of which features have their weights reduced or increased, are merely examples and are not limited in the embodiments of the present application.
In another possible embodiment, the electronic device 10 may collect the user's positive feedback behavior on multiple pictures over a period of time (e.g., a week or a month) and extract the features that are common to those pictures: among the third features, the feature indicating that the user may dislike a picture; and among the fourth features, the features indicating that the user may like a picture and those indicating that the user may dislike it. "Common" here does not mean that a feature appears in every picture in the strict sense, but only that it consistently appears in most of the pictures. For example, if the electronic device 10 collects the user's positive feedback on 100 pictures within one month, and for 60 of them the third feature "trash bin" indicates that the user may dislike the picture, then "trash bin" can be taken as the common third feature of the 100 pictures; the same applies to the fourth features. Collecting the common features extracted from the user's positive feedback on multiple pictures allows the algorithm model to be optimized more accurately.
In addition, when the algorithm model is optimized according to the two feedback behaviors, the optimization can also be achieved by adjusting the first feature (or third feature) or the second feature (or fourth feature) themselves. Specifically, a first feature (third feature) may be changed into a second feature (fourth feature), or a second feature (fourth feature) into a first feature (third feature); first features (third features) may be added or removed; and second features (fourth features) may be added or removed. This is not described in detail in the embodiments of the present application.
In the embodiment of the present application, the selection of the first feature (third feature), the adjustment of the larger value and the smaller value of the first feature (third feature), the selection of each second feature (fourth feature), and the adjustment of the larger value and the smaller value of the second feature (fourth feature) may also be manually selected or manually input by a user, and the manually selected feature by the user and the manually input value of each feature may more accurately represent the intention of the user. The embodiments of the present application are not limited to the selection of each feature and the value of each feature.
The algorithm model involved in the embodiment of the present application is not limited to the formula (1) set forth above, and may actually be an AI machine learning algorithm model, such as naive bayes, support vector machines, deep neural networks, and the like. The initial training samples of the AI machine learning algorithm model may be scores of a variety of users for a large number of pictures that may include the various features listed in the above embodiments, and are not listed here. After the model training is completed, after each feature data of the newly input picture i is received, the score of the picture can be calculated according to the feature data, and finally the score of the picture i is output. The electronic device 10 may manage the picture according to the output score. In addition, the algorithm model can be continuously optimized, and the accuracy of the output score is improved.
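As one hedged illustration of such a learned scorer, the following sketch trains a naive Bayes classifier (one of the model families mentioned above) on random stand-in data and uses the predicted probability of "liked" as the score; the data, features, and setup are assumptions for illustration only, not the embodiment's model.

    # Illustrative sketch only: the training data here is random and the setup is
    # an assumption, not the embodiment's model.
    import numpy as np
    from sklearn.naive_bayes import GaussianNB

    rng = np.random.default_rng(0)
    X_train = rng.integers(0, 2, size=(200, 10)).astype(float)   # 10 feature values per picture
    y_train = (X_train.sum(axis=1) > 5).astype(int)              # stand-in for user ratings

    model = GaussianNB().fit(X_train, y_train)
    new_picture = rng.integers(0, 2, size=(1, 10)).astype(float)
    score = model.predict_proba(new_picture)[0, 1]               # P(liked) used as the score
    print(round(float(score), 3))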
Embodiments of the present application also provide a computer-readable storage medium having stored therein instructions, which when executed on a computer or processor, cause the computer or processor to perform one or more steps of any one of the methods described above.
The embodiment of the application also provides a computer program product containing instructions. The computer program product, when run on a computer or processor, causes the computer or processor to perform one or more steps of any of the methods described above.
In the above embodiments, the implementation may be wholly or partially realized by software, hardware, firmware, or any combination thereof. When implemented in software, may be implemented in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. When loaded and executed on a computer, cause the processes or functions described in accordance with the embodiments of the application to occur, in whole or in part. The computer may be a general purpose computer, a special purpose computer, a network of computers, or other programmable device. The computer instructions may be stored in or transmitted over a computer-readable storage medium. The computer instructions may be transmitted from one website site, computer, server, or data center to another website site, computer, server, or data center via wired (e.g., coaxial cable, fiber optics, digital subscriber line) or wireless (e.g., infrared, wireless, microwave, etc.). The computer-readable storage medium can be any available medium that can be accessed by a computer or a data storage device, such as a server, a data center, etc., that incorporates one or more of the available media. The usable medium may be a magnetic medium (e.g., floppy disk, hard disk, magnetic tape), an optical medium (e.g., DVD), or a semiconductor medium (e.g., Solid State Disk (SSD)), among others.
One of ordinary skill in the art will appreciate that all or part of the processes in the methods of the above embodiments may be implemented by hardware related to instructions of a computer program, which may be stored in a computer-readable storage medium, and when executed, may include the processes of the above method embodiments. And the aforementioned storage medium includes: various media capable of storing program codes, such as ROM or RAM, magnetic or optical disks, etc.
The above description is only a specific implementation of the embodiments of the present application, but the scope of the embodiments of the present application is not limited thereto, and any changes or substitutions within the technical scope disclosed in the embodiments of the present application should be covered by the scope of the embodiments of the present application. Therefore, the protection scope of the embodiments of the present application shall be subject to the protection scope of the claims.

Claims (19)

  1. A picture management method, executed by an electronic device, comprising:
    acquiring scene characteristics of each of at least two pictures stored in the electronic equipment and/or stored in a cloud album;
    determining a first characteristic, a second characteristic, a scoring criterion of the first characteristic and a scoring criterion of the second characteristic of each picture according to the scene characteristic of each picture;
    calculating the score of each picture according to the value of the first characteristic of each picture and the value of the second characteristic of each picture; wherein the influence of the first characteristic on the score of each picture is greater than the influence of the second characteristic on the score of each picture;
    detecting a first operation of a user;
    responding to the first operation, and displaying S second pictures in the at least two pictures; and S is an integer greater than or equal to 1, and the scores of the S second pictures are higher than the scores of other pictures except the S second pictures in the at least two pictures.
  2. The method of claim 1, wherein the scene features comprise one or any combination of the following: whether the picture is favorited, whether the picture has a remark, whether the picture is associated, whether the picture has been uploaded to a cloud, and a storage location of the picture; the first feature comprises one or any combination of the following: whether the picture is favorited, and whether the picture has a remark; the second feature comprises one or any combination of the following: whether the picture is shared, a shooting mode, picture content classification, browsing times, shooting time, last browsing time, picture size, and aesthetic score.
  3. The method of claim 1 or 2,
    before the detecting the first operation of the user, the method further includes: the electronic device displays a status bar, a navigation bar, a time widget icon, and icons of one or more application programs, wherein the icon of the camera application belongs to the icons of the one or more application programs, and the first operation is an operation performed by the user on the icon of the album application;
    after the detecting the first operation of the user, the method further includes: responding to the first operation, and displaying other pictures except the S second pictures in the at least two pictures; wherein the S second pictures are displayed before the other pictures except the S second pictures, or the S second pictures are specially marked to be displayed differently from the other pictures except the S second pictures.
  4. The method of claim 3, wherein the S second pictures are arranged in order of scores from high to low, and the other pictures except the S second pictures are arranged in order of scores from high to low.
  5. The method according to claim 3 or 4, wherein the manner in which the S second pictures are specially marked comprises one or any combination of the following: enlarged display, display with an added border, display with an added mark, display in a special color, and display with special transparency.
  6. The method of claim 1 or 2, wherein after detecting the first operation by the user, the method further comprises:
    responding to the first operation, displaying the folders according to the classification, and displaying a search control, a first menu control, a second menu control and a third menu control; the first operation is an operation of the third menu control by a user, and the classification mode includes one or any combination of the following: location, time, people; each folder comprises one or more pictures, and the one or more pictures belong to the at least two pictures;
    after the folders are displayed according to the classification and the search control, the first menu control, the second menu control, and the third menu control are displayed, the method further comprises: in response to a second operation of the user on the search control, displaying a search bar, the folders displayed according to the classification, and the S second pictures.
  7. The method of any one of claims 1 to 6, wherein before calculating the score of each picture according to the value of the first feature of each picture and the value of the second feature of each picture, the method further comprises:
    determining that an optimization condition is met; wherein the optimization condition comprises one or any combination of the following: the remaining storage space of the electronic device is lower than a first set value, a set time is reached, the remaining battery level of the electronic device is lower than a second set value, the electronic device is being charged, and the electronic device is in a screen-off state.
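As an illustration of the optimization condition recited in claim 7 (and repeated in claims 12 and 16), a minimal sketch follows. The DeviceState fields and the threshold values are assumptions, the "set time is reached" condition is omitted for brevity, and triggering on any single condition is one reading of the "one or any combination" wording.

from dataclasses import dataclass

@dataclass
class DeviceState:
    free_storage_mb: int     # remaining storage space
    battery_percent: int     # remaining battery level
    is_charging: bool
    screen_off: bool

def optimization_condition_met(state: DeviceState,
                               storage_threshold_mb: int = 1024,
                               battery_threshold: int = 20) -> bool:
    """True when at least one of the listed optimization conditions holds."""
    return (state.free_storage_mb < storage_threshold_mb
            or state.battery_percent < battery_threshold
            or state.is_charging
            or state.screen_off)

# Scores would typically be computed while the device is idle, for example
# charging overnight with the screen off.
print(optimization_condition_met(DeviceState(512, 80, True, True)))  # True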
  8. The method according to any one of claims 3 to 5, wherein after displaying the S second pictures, the method further comprises:
    receiving a third operation that the user cancels the special mark of at least one second picture in the S second pictures, and recalculating the score of the at least one second picture in response to the third operation; or
    Receiving a fourth operation that a user adds a special mark to at least one of the other pictures except the S second pictures, and recalculating the score of the at least one picture in response to the fourth operation; or
    receiving a fifth operation in which the user moves at least one of the S second pictures to be displayed after at least one of the other pictures except the S second pictures, and recalculating the score of the at least one of the S second pictures in response to the fifth operation; or
    receiving a sixth operation in which the user moves at least one of the other pictures except the S second pictures to be displayed before at least one of the S second pictures, and recalculating the score of the at least one of the other pictures except the S second pictures in response to the sixth operation.
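Claim 8 treats these user corrections as a trigger for recalculating the score of the affected picture. The following minimal sketch shows one way such feedback could be folded back into the score; the adjustment weight and the clamping of the score to the range [0, 1] are assumptions, since the claim only requires that the score be recalculated.

def recalculate_score(base_score: float, user_marked: bool,
                      feedback_weight: float = 0.2) -> float:
    """Nudge a picture's score after an explicit user correction.

    user_marked=True  -> the user added a special mark or moved the picture forward
    user_marked=False -> the user cancelled the mark or moved the picture back
    """
    adjustment = feedback_weight if user_marked else -feedback_weight
    return min(1.0, max(0.0, base_score + adjustment))

# A picture whose special mark was cancelled drops in score and is less likely
# to appear among the S second pictures the next time.
print(round(recalculate_score(0.76, user_marked=False), 2))  # 0.56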
  9. A picture management method, executed by an electronic device, comprising:
    acquiring scene features of each of at least two pictures stored in the electronic device and/or stored in a cloud album;
    determining, according to the scene features of each picture, a third feature of each picture, a fourth feature of each picture, a scoring criterion of the third feature, and a scoring criterion of the fourth feature;
    calculating a score of each picture according to the value of the third feature of each picture and the value of the fourth feature of each picture; wherein the influence of the third feature on the score of each picture is greater than the influence of the fourth feature on the score of each picture;
    detecting a first operation of a user;
    displaying a first folder in response to the first operation; wherein the first folder comprises M first pictures in the at least two pictures; and M is an integer greater than or equal to 1, and the scores of the M first pictures are lower than the scores of other pictures except the M first pictures in the at least two pictures.
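As an illustration of gathering the M lowest-scoring pictures into the first folder recited in claim 9, a minimal sketch follows. The Picture type, the in-memory representation of the folder, and the example scores are assumptions.

from typing import List, NamedTuple

class Picture(NamedTuple):
    path: str
    score: float

def build_first_folder(pictures: List[Picture], m: int) -> List[Picture]:
    """Return the M pictures with the lowest scores as cleanup candidates."""
    if not 1 <= m < len(pictures):
        raise ValueError("M must be at least 1 and smaller than the picture count")
    return sorted(pictures, key=lambda p: p.score)[:m]

pics = [Picture("a.jpg", 0.91), Picture("b.jpg", 0.10), Picture("c.jpg", 0.35)]
print([p.path for p in build_first_folder(pics, 2)])  # ['b.jpg', 'c.jpg']

Sorting once and slicing keeps the selection simple; a real gallery would more likely keep scores in a database and query the bottom M directly.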
  10. The method of claim 9, wherein after displaying the first folder, further comprising:
    detecting a second operation of the user;
    and in response to the second operation, deleting the M first pictures.
  11. The method of claim 9 or 10, performed by an electronic device, wherein the scene features comprise one or any combination of the following: whether the picture is favorited, whether the picture has a remark, whether the picture is associated, whether the picture has been uploaded to a cloud, and a storage location of the picture; the third feature comprises: whether the picture has been placed in a recycle bin; the fourth feature comprises one or any combination of the following: shooting mode, picture content classification, browsing times, shooting time, last browsing time, picture size, and aesthetic score.
  12. The method according to any one of claims 9 to 11, before calculating the score of each picture according to the value of the third feature of each picture and the value of the fourth feature of each picture, further comprising:
    determining that an optimization condition is met; wherein the optimization condition comprises one or any combination of the following: the remaining storage space of the electronic device is lower than a first set value, a set time is reached, the remaining battery level of the electronic device is lower than a second set value, the electronic device is being charged, and the electronic device is in a screen-off state.
  13. The method of any of claims 9 to 12, wherein after displaying the first folder, further comprising:
    receiving a third operation of moving at least one first picture in the M first pictures out of the first folder by a user, and recalculating the score of the at least one first picture in response to the third operation; or
    Receiving a fourth operation that the user moves at least one picture in the other pictures except the M first pictures into the first folder, and recalculating the score of the at least one picture in response to the fourth operation.
  14. A picture management method, executed by an electronic device, comprising:
    acquiring scene features of each of at least two pictures stored in the electronic device and/or stored in a cloud album;
    determining, according to the scene features of each picture, a third feature of each picture, a fourth feature of each picture, a scoring criterion of the third feature, and a scoring criterion of the fourth feature;
    calculating a score of each picture according to the value of the third feature of each picture and the value of the fourth feature of each picture; wherein the influence of the third feature on the score of each picture is greater than the influence of the fourth feature on the score of each picture;
    deleting M first pictures in the at least two pictures; and M is an integer greater than or equal to 1, and the scores of the M first pictures are lower than those of the other pictures except the M first pictures in the at least two pictures.
  15. The method of claim 14, wherein the scene features comprise one or any combination of the following: whether the picture is favorited, whether the picture has a remark, whether the picture is associated, whether the picture has been uploaded to a cloud, and a storage location of the picture; the third feature comprises: whether the picture has been placed in a recycle bin; the fourth feature comprises one or any combination of the following: shooting mode, picture content classification, browsing times, shooting time, last browsing time, picture size, and aesthetic score.
  16. The method of claim 14 or 15, wherein before calculating the score of each picture according to the value of the third feature of each picture and the value of the fourth feature of each picture, the method further comprises: determining that an optimization condition is met; wherein the optimization condition comprises one or any combination of the following: the remaining storage space of the electronic device is lower than a first set value, a set time is reached, the remaining battery level of the electronic device is lower than a second set value, the electronic device is being charged, and the electronic device is in a screen-off state.
  17. The method according to any one of claims 14 to 16, wherein after deleting the M first pictures of the at least two pictures, further comprising:
    receiving a first operation of downloading at least one first picture in the M first pictures by a user, and recalculating the score of the at least one first picture in response to the first operation; or
    And receiving a second operation of deleting at least one picture in the other pictures except the M first pictures by the user, and recalculating the score of the at least one picture in response to the second operation.
  18. An electronic device, comprising: one or more processors, a memory, a display screen, a wireless communication module, and a mobile communication module;
    the memory, the display screen, the wireless communication module, and the mobile communication module are coupled with the one or more processors, the memory for storing computer program code, the computer program code comprising computer instructions that, when executed by the one or more processors, cause the electronic device to perform the picture management method of any of claims 1-17.
  19. A computer storage medium comprising computer instructions that, when run on an electronic device, cause the electronic device to perform the picture management method of any of claims 1-17.
CN201880081385.5A 2018-10-12 2018-10-12 File management method and electronic equipment Pending CN111480158A (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2018/110070 WO2020073317A1 (en) 2018-10-12 2018-10-12 File management method and electronic device

Publications (1)

Publication Number Publication Date
CN111480158A 2020-07-31

Family

Family ID: 70164365

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201880081385.5A Pending CN111480158A (en) 2018-10-12 2018-10-12 File management method and electronic equipment

Country Status (2)

Country Link
CN (1) CN111480158A (en)
WO (1) WO2020073317A1 (en)


Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115098449B (en) * 2022-08-26 2023-07-07 荣耀终端有限公司 File cleaning method and electronic equipment


Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104750688A (en) * 2013-12-25 2015-07-01 鸿富锦精密工业(深圳)有限公司 Photo management method and system
CN105808100A (en) * 2016-02-29 2016-07-27 北京金山安全软件有限公司 Picture sorting method and device and electronic equipment
CN106777214B (en) * 2016-12-24 2020-08-04 河南省地质科学研究所 Photo album picture ordering method and mobile terminal

Patent Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080137964A1 (en) * 2006-12-07 2008-06-12 Canon Kabushiki Kaisha Image processing apparatus and image processing method
CN102033937A (en) * 2010-12-20 2011-04-27 百度在线网络技术(北京)有限公司 Method and system for displaying video search result
US20150066957A1 (en) * 2012-03-29 2015-03-05 Rakuten, Inc. Image search device, image search method, program, and computer-readable storage medium
JP2014191701A (en) * 2013-03-28 2014-10-06 Fujifilm Corp Image retrieval device, operation control method thereof, and image retrieval server
CN104408077A (en) * 2014-11-03 2015-03-11 宇龙计算机通信科技(深圳)有限公司 Method and system for displaying pictures and terminal
CN107291781A (en) * 2016-04-12 2017-10-24 中兴通讯股份有限公司 A kind of image management method and device
US20170364303A1 (en) * 2016-06-17 2017-12-21 Microsoft Technology Licensing, Llc. Suggesting image files for deletion based on image file parameters
CN106155592A (en) * 2016-07-26 2016-11-23 深圳天珑无线科技有限公司 A kind of photo processing method and terminal
CN106570110A (en) * 2016-10-25 2017-04-19 北京小米移动软件有限公司 De-overlapping processing method and apparatus of image
CN106570155A (en) * 2016-10-28 2017-04-19 努比亚技术有限公司 Image management device and method
CN106528879A (en) * 2016-12-14 2017-03-22 北京小米移动软件有限公司 Picture processing method and device
CN107480176A (en) * 2017-07-01 2017-12-15 珠海格力电器股份有限公司 A kind of management method of picture, device and terminal device
CN108460065A (en) * 2017-08-17 2018-08-28 腾讯科技(深圳)有限公司 Photo method for cleaning, device and terminal device

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112965776A (en) * 2021-03-11 2021-06-15 广州酷狗计算机科技有限公司 Display method and device of screen saver picture, terminal and computer readable storage medium
CN115525610A (en) * 2022-02-17 2022-12-27 荣耀终端有限公司 File deletion method, electronic device and computer-readable storage medium
CN115525610B (en) * 2022-02-17 2023-05-09 荣耀终端有限公司 File deletion method, electronic device and computer readable storage medium

Also Published As

Publication number Publication date
WO2020073317A1 (en) 2020-04-16

Similar Documents

Publication Publication Date Title
WO2020238356A1 (en) Interface display method and apparatus, terminal, and storage medium
WO2021017932A1 (en) Image display method and electronic device
CN108235765B (en) Method and device for displaying story photo album
CN112783379B (en) Picture selection method and electronic equipment
WO2021000841A1 (en) Method for generating user profile photo, and electronic device
WO2020259554A1 (en) Learning-based keyword search method, and electronic device
US20220343648A1 (en) Image selection method and electronic device
JP2019531561A (en) Image processing method and apparatus, electronic device, and graphical user interface
CN111480158A (en) File management method and electronic equipment
US20240053868A1 (en) Feedback method, apparatus, and system
CN112740148A (en) Method for inputting information into input box and electronic equipment
CN114691276B (en) Application processing method, intelligent terminal and storage medium
CN115525783B (en) Picture display method and electronic equipment
WO2021196980A1 (en) Multi-screen interaction method, electronic device, and computer-readable storage medium
CN116527805A (en) Card display method, electronic device, and computer-readable storage medium
CN114594882A (en) Feedback method, device and system
WO2023246666A1 (en) Search method and electronic device
CN114244951B (en) Method for opening page by application program, medium and electronic equipment thereof
WO2024027570A1 (en) Interface display method and related apparatus
WO2024078120A1 (en) File management method, and device and storage medium
WO2022228010A1 (en) Method for generating cover, and electronic device
US20240126808A1 (en) Search result feedback method and apparatus, and storage medium
CN116414280A (en) Picture display method and electronic equipment
CN114579006A (en) Application classification method, electronic equipment and chip system
CN116798418A (en) Control method and device based on voice assistant

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination