CN108268322B - Memory optimization method and device and computer readable storage medium - Google Patents


Info

Publication number
CN108268322B
Authority
CN
China
Prior art keywords
apps
app
memory
behavior data
probability
Prior art date
Legal status
Active
Application number
CN201810159902.8A
Other languages
Chinese (zh)
Other versions
CN108268322A (en)
Inventor
刘任
李加佳
Current Assignee
Beijing Xiaomi Mobile Software Co Ltd
Original Assignee
Beijing Xiaomi Mobile Software Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Xiaomi Mobile Software Co Ltd
Priority to CN201810159902.8A
Publication of CN108268322A
Application granted
Publication of CN108268322B
Legal status: Active

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00 Arrangements for program control, e.g. control units
    • G06F 9/06 Arrangements for program control using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/46 Multiprogramming arrangements
    • G06F 9/50 Allocation of resources, e.g. of the central processing unit [CPU]
    • G06F 9/5005 Allocation of resources to service a request
    • G06F 9/5011 Allocation of resources, the resources being hardware resources other than CPUs, servers and terminals
    • G06F 9/5016 Allocation of resources, the resource being the memory
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 Computing arrangements based on biological models
    • G06N 3/02 Neural networks
    • G06N 3/04 Architecture, e.g. interconnection topology
    • G06N 3/044 Recurrent networks, e.g. Hopfield networks
    • G06N 3/045 Combinations of networks

Landscapes

  • Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)
  • Stored Programmes (AREA)

Abstract

The disclosure relates to a memory optimization method, a memory optimization device, and a computer-readable storage medium, in the technical field of terminals. The method comprises the following steps: acquiring package names and historical behavior data of N first applications (APPs) used consecutively before the current time, where N is a positive integer greater than 1; predicting, according to the package names and the historical behavior data of the N first APPs, the usage probabilities of a plurality of second APPs to be used after the current time through a specified prediction model, the specified prediction model being trained on package names and historical behavior data of APPs collected from terminals owned by a plurality of users; and optimizing the memory according to the usage probabilities of the plurality of second APPs. The method and the device can predict the usage probability of an APP before the user uses it and can automatically optimize the memory according to the predicted probabilities, improving the accuracy of memory optimization without requiring manual operation by the user.

Description

Memory optimization method and device and computer readable storage medium
Technical Field
The present disclosure relates to the field of terminal technologies, and in particular, to a method and an apparatus for optimizing a memory, and a computer-readable storage medium.
Background
With the continuous development of terminal technology, a large number of APPs (applications) with different functions continue to emerge, so the number of APPs installed on a terminal is also large. However, as the number of APPs on the terminal grows, running multiple APPs at the same time leaves the terminal severely short of memory and causes it to stutter; optimizing the memory is therefore very important.
At present, the memory is optimized mainly in a manual manner: when the terminal stutters, the user manually cleans the terminal's memory so as to optimize it.
Disclosure of Invention
To overcome the problems in the related art, the present disclosure provides a method, an apparatus, and a computer-readable storage medium for memory optimization.
In a first aspect, a method for optimizing a memory is provided, which is applied to a terminal, and the method includes:
acquiring package names and historical behavior data of N first applications (APPs) used consecutively before the current time, where N is a positive integer greater than 1;
predicting, according to the package names and the historical behavior data of the N first APPs, the usage probabilities of a plurality of second APPs that may be used after the current time through a specified prediction model, wherein the specified prediction model is trained on package names and historical behavior data of APPs collected from terminals owned by a plurality of users;
and optimizing the memory according to the usage probabilities of the plurality of second APPs.
Optionally, the performing memory optimization according to the usage probabilities of the plurality of second APPs includes:
selecting at least one second APP from the plurality of second APPs according to a preset rule according to the using probabilities of the plurality of second APPs;
and pre-loading the at least one second APP into the memory.
Optionally, before the pre-loading the at least one second APP into the memory, the method further includes:
detecting the loaded APPs in the memory to determine second APPs which are not loaded in the memory from the at least one second APP;
and pre-loading the second APP which is not loaded in the memory into the memory.
Optionally, the selecting, according to the usage probabilities of the plurality of second APPs and according to a preset rule, at least one second APP from the plurality of second APPs includes:
and selecting, from the plurality of second APPs, at least one second APP, each of whose usage probabilities is greater than a first preset probability and whose usage probabilities together sum to more than a second preset probability, the first preset probability being smaller than the second preset probability.
Optionally, the performing memory optimization according to the usage probabilities of the plurality of second APPs includes:
selecting a second APP with a use probability smaller than a third preset probability from the plurality of second APPs;
and clearing from the memory the selected second APP whose usage probability is smaller than the third preset probability.
Optionally, the historical behavior data of the N first APPs includes at least a historical usage order.
Optionally, the historical behavior data of the N first APPs further includes a historical usage place and a historical usage date, where the historical usage date indicates whether the historical usage time falls on a holiday or a working day.
Optionally, the specified prediction model is a long short-term memory (LSTM) network model.
In a second aspect, an apparatus for memory optimization is provided, the apparatus comprising:
the acquisition module is used for acquiring package names and historical behavior data of N first applications (APPs) used consecutively before the current time, where N is a positive integer greater than 1;
the prediction module is used for predicting, according to the package names and the historical behavior data of the N first APPs, the usage probabilities of a plurality of second APPs to be used after the current time through a specified prediction model, wherein the specified prediction model is trained on package names and historical behavior data of APPs collected from terminals owned by a plurality of users;
and the optimization module is used for optimizing the memory according to the use probabilities of the plurality of second APPs.
Optionally, the optimization module comprises:
the first selection submodule is used for selecting at least one second APP from the plurality of second APPs according to a preset rule and the use probability of the plurality of second APPs;
the first loading submodule is used for pre-loading the at least one second APP into the memory.
Optionally, the optimization module further comprises:
the detection submodule is used for detecting the loaded APPs in the memory so as to determine the second APPs which are not loaded in the memory from the at least one second APP;
and the second loading submodule is used for pre-loading the second APP which is not loaded in the memory into the memory.
Optionally, the first selecting submodule includes:
the selecting unit is used for selecting, from the plurality of second APPs, at least one second APP, each of whose usage probabilities is greater than a first preset probability and whose usage probabilities together sum to more than a second preset probability, the first preset probability being smaller than the second preset probability.
Optionally, the optimization module comprises:
the second selection submodule is used for selecting a second APP with the use probability smaller than a third preset probability from the plurality of second APPs;
and the cleaning submodule is used for clearing from the memory the second APP whose usage probability is smaller than the third preset probability.
Optionally, the historical behavior data of the N first APPs includes at least a historical usage order.
Optionally, the historical behavior data of the N first APPs further includes a historical usage place and a historical usage date, where the historical usage date indicates whether the historical usage time falls on a holiday or a working day.
Optionally, the specified prediction model is a long short-term memory (LSTM) network model.
In a third aspect, an apparatus for optimizing memory is provided, where the apparatus includes:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to perform the steps of any of the methods of the first aspect described above.
In a fourth aspect, a computer-readable storage medium is provided, having instructions stored thereon, which when executed by a processor, implement the steps of any of the methods of the first aspect described above.
In a fifth aspect, there is provided a computer program product comprising instructions which, when run on a computer, cause the computer to perform the steps of the method of any of the first aspects above.
The technical scheme provided by the embodiment of the disclosure can have the following beneficial effects:
in the embodiments of the present disclosure, package names and historical behavior data of N first APPs used consecutively before the current time are obtained; according to the package names and the historical behavior data of the N first APPs, the usage probabilities of a plurality of second APPs to be used after the current time are predicted by a specified prediction model; and memory optimization is performed according to the usage probabilities of the plurality of second APPs. That is, from the N first APPs used consecutively before the current time and the specified prediction model, the usage probabilities of the second APPs the user will use after the current time are predicted. The usage probability of an APP can thus be predicted before the user uses it, and the memory can be optimized automatically according to the predicted probabilities, improving the accuracy of memory optimization without requiring manual operation by the user.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and together with the description, serve to explain the principles of the disclosure.
FIG. 1 is a flow diagram illustrating a method for memory optimization in accordance with an example embodiment.
FIG. 2 is a flow diagram illustrating a method for memory optimization in accordance with an example embodiment.
FIG. 3 is a flow diagram illustrating a method for memory optimization in accordance with an example embodiment.
FIG. 4 is a block diagram illustrating an apparatus for memory optimization in accordance with an example embodiment.
FIG. 5 is a block diagram illustrating an apparatus in accordance with an example embodiment.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The implementations described in the exemplary embodiments below are not intended to represent all implementations consistent with the present disclosure. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the present disclosure, as detailed in the appended claims.
Before explaining the embodiments of the present disclosure in detail, an application scenario of the embodiments of the present disclosure is introduced:
with the increasing popularity of terminals, APPs have become an indispensable part of them. To obtain richer functionality, users often install multiple feature-rich APPs on a terminal. The more powerful an APP is, the more terminal memory it occupies, so several feature-rich APPs can consume a large amount of the terminal's memory, leaving memory resources tight and the terminal running sluggishly. Moreover, every APP occupies terminal memory while it runs, so when the terminal runs multiple APPs at the same time, they all contend for memory at once; the memory supply falls short, contention intensifies, and the phone stutters. A method for optimizing the terminal memory is therefore needed.
At present, the memory is optimized manually: when the terminal stutters, the user manually cleans the terminal's memory. This optimization process generally takes a long time, during which the user can only wait before using the terminal again; moreover, the user must operate manually, which is cumbersome and degrades the user experience. The embodiments of the present disclosure therefore provide a memory optimization method that requires no manual operation and improves the accuracy of memory optimization.
Fig. 1 is a flowchart illustrating a method for optimizing a memory according to an exemplary embodiment, which is applied to a terminal, and as shown in fig. 1, the method includes the following steps.
In step 101, package names and historical behavior data of N first applications (APPs) used consecutively before the current time are obtained, where N is a positive integer greater than 1.
In step 102, according to the package names and the historical behavior data of the N first APPs, the usage probabilities of a plurality of second APPs to be used after the current time are predicted by a specified prediction model, which is trained on package names and historical behavior data of APPs collected from terminals owned by a plurality of users.
In step 103, memory optimization is performed according to the usage probabilities of the plurality of second APPs.
In the embodiments of the present disclosure, package names and historical behavior data of N first APPs used consecutively before the current time are obtained, where N is a positive integer greater than 1; according to the package names and the historical behavior data of the N first APPs, the usage probabilities of a plurality of second APPs to be used after the current time are predicted through a specified prediction model, the specified prediction model being trained on package names and historical behavior data of APPs collected from terminals owned by a plurality of users; and the memory is optimized according to the usage probabilities of the plurality of second APPs. That is, from the N first APPs used consecutively before the current time and the specified prediction model, the usage probabilities of the second APPs the user will use after the current time are predicted, so the usage probability of an APP can be predicted before the user uses it, and memory optimization can be performed automatically according to the predicted probabilities, improving its accuracy without requiring manual operation by the user.
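The three steps above can be sketched in Python. This is a minimal illustration only: the function names, package names, and the stand-in model are hypothetical, and the real specified prediction model would be a trained LSTM rather than a hard-coded table.

```python
from typing import Callable

def optimize_memory(
    recent_apps: list[tuple[str, dict]],  # (package name, historical behavior data)
    predict: Callable[[list[tuple[str, dict]]], dict[str, float]],
    preload_threshold: float = 0.2,       # hypothetical cutoff for step 103
) -> list[str]:
    """Sketch of steps 101-103: predict usage probabilities of the second
    APPs from the N first APPs, then pick preload candidates."""
    assert len(recent_apps) > 1, "N must be a positive integer greater than 1"
    probabilities = predict(recent_apps)  # step 102: specified prediction model
    # step 103: keep the likely-next APPs as candidates for pre-loading
    return [pkg for pkg, p in probabilities.items() if p > preload_threshold]

# Stand-in for the trained prediction model (illustrative only):
def fake_model(recent):
    return {"com.example.chat": 0.55, "com.example.video": 0.30, "com.example.game": 0.15}

candidates = optimize_memory([("com.example.mail", {}), ("com.example.chat", {})], fake_model)
print(candidates)  # ['com.example.chat', 'com.example.video']
```

The returned list would then drive the pre-loading or cleaning described in the two embodiments below.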
Optionally, performing memory optimization according to the usage probabilities of the plurality of second APPs includes:
selecting at least one second APP from the plurality of second APPs according to a preset rule according to the using probabilities of the plurality of second APPs;
and pre-loading the at least one second APP into the memory.
Optionally, before the at least one second APP is pre-loaded into the memory, the method further includes:
detecting the loaded APPs in the memory to determine second APPs which are not loaded in the memory from at least one second APP;
and pre-loading the second APP which is not loaded in the memory into the memory.
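The detect-then-preload logic above amounts to a set difference against the APPs already loaded in memory; a minimal sketch, with hypothetical package names:

```python
def apps_to_preload(selected: list[str], loaded: set[str]) -> list[str]:
    """Determine which of the selected second APPs are not yet loaded in
    memory, so only those are pre-loaded (avoiding redundant loads)."""
    return [pkg for pkg in selected if pkg not in loaded]

selected = ["com.example.chat", "com.example.video", "com.example.news"]
already_loaded = {"com.example.video"}
print(apps_to_preload(selected, already_loaded))
# ['com.example.chat', 'com.example.news']
```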
Optionally, according to the usage probabilities of the plurality of second APPs, selecting at least one second APP from the plurality of second APPs according to a preset rule, including:
at least one second APP, each of whose usage probabilities is greater than a first preset probability and whose usage probabilities together sum to more than a second preset probability, is selected from the plurality of second APPs, the first preset probability being smaller than the second preset probability.
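One way to read the preset rule is as a two-stage filter: keep each APP whose probability exceeds the first threshold, then accept the selection only if the kept probabilities together exceed the second threshold. The sketch below uses the APP1..APP5 probabilities given as an example later in the description; the threshold values are illustrative.

```python
def select_second_apps(probs: dict[str, float], p1: float, p2: float) -> list[str]:
    """Select second APPs whose individual usage probability exceeds p1 and
    whose probabilities together sum to more than p2 (with p1 < p2).
    Returns an empty list when the rule cannot be satisfied."""
    assert p1 < p2, "first preset probability must be smaller than the second"
    chosen = {pkg: p for pkg, p in probs.items() if p > p1}
    return list(chosen) if sum(chosen.values()) > p2 else []

probs = {"APP1": 0.35, "APP2": 0.35, "APP3": 0.25, "APP4": 0.035, "APP5": 0.015}
print(select_second_apps(probs, p1=0.2, p2=0.8))  # ['APP1', 'APP2', 'APP3']
```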
Optionally, performing memory optimization according to the usage probabilities of the plurality of second APPs includes:
selecting a second APP with the use probability smaller than a third preset probability from the plurality of second APPs;
and clearing from the memory the selected second APP whose usage probability is smaller than the third preset probability.
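The cleaning branch is the mirror image of pre-loading: evict loaded APPs whose predicted usage probability falls below the third threshold. A sketch under assumed names, with the threshold value chosen for illustration:

```python
def apps_to_clean(probs: dict[str, float], loaded: set[str], p3: float) -> list[str]:
    """Pick loaded second APPs whose predicted usage probability is below the
    third preset probability p3; these can be cleared from memory."""
    return [pkg for pkg in loaded if probs.get(pkg, 0.0) < p3]

probs = {"APP4": 0.035, "APP5": 0.015, "APP1": 0.35}
loaded = {"APP4", "APP5", "APP1"}
print(sorted(apps_to_clean(probs, loaded, p3=0.05)))  # ['APP4', 'APP5']
```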
Optionally, the historical behavior data of the N first APPs includes at least a historical usage order.
Optionally, the historical behavior data of the N first APPs further includes a historical usage place and a historical usage date, where the historical usage date indicates whether the historical usage time falls on a holiday or a working day.
Optionally, the specified prediction model is a long short-term memory (LSTM) network model.
All of the above optional technical solutions may be combined in any manner to form optional embodiments of the present disclosure, which are not described in detail again here.
When memory optimization is performed according to the usage probabilities of the plurality of second APPs, on one hand, second APPs the user is likely to use can be pre-loaded into the memory before the user opens them, so that they start quickly; on the other hand, second APPs with low usage probability can be cleared from the memory, raising the effective memory occupancy. The two optimization manners are described in the following two embodiments.
Fig. 2 is a flowchart of a memory optimization method according to an exemplary embodiment, applied to a terminal. This embodiment explains in detail the optimization manner of loading second APPs with a higher usage probability into the memory. As shown in Fig. 2, the method includes the following steps.
In step 201, package names and historical behavior data of N first APPs used consecutively before the current time are obtained, where N is a positive integer greater than 1.
In today's information age, historical data is of considerable importance: a large amount of historical data can reflect the characteristics of an event and its future trend. Therefore, to capture the characteristics and trends of the user's APP usage, the embodiments of the present disclosure obtain the user's historical APP usage data, i.e., the package names and historical behavior data of the N first APPs used consecutively before the current time.
It should be noted that the terminal may store a usage record in a history record each time the user uses an APP; because the records are written as the user uses the APPs, they carry a definite usage order. Historical behavior data of the N first APPs used consecutively before the current time can therefore be obtained from the history record, and this data includes at least the historical usage order.
In practical implementation, the historical behavior data of the N first APPs may further include a historical usage place, reflecting where the user used each APP, and a historical usage date, reflecting when it was used. The historical usage date indicates whether the usage time fell on a holiday or a working day.
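A usage record combining the fields described above might be structured as follows; the field names, package names, and timestamps are illustrative, not taken from the patent:

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class UsageRecord:
    package: str         # package name: uniquely identifies the APP
    opened_at: datetime  # timestamp, preserving the historical usage order
    location: str        # historical usage place, e.g. "home", "office"
    is_holiday: bool     # historical usage date: holiday vs. working day

history = [
    UsageRecord("com.example.mail", datetime(2018, 2, 26, 9, 0), "office", False),
    UsageRecord("com.example.chat", datetime(2018, 2, 26, 9, 5), "office", False),
]
# Sorting by timestamp recovers the usage order of the N first APPs:
ordered = sorted(history, key=lambda r: r.opened_at)
print([r.package for r in ordered])  # ['com.example.mail', 'com.example.chat']
```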
It should be further noted that, to distinguish different APPs, the package names of the APPs may also be stored in the history record; different APPs have different package names, and a package name uniquely identifies an APP. Of course, an APP may also be uniquely identified in other ways, which is not limited by the embodiments of the present disclosure.
In step 202, according to the package names and the historical behavior data of the N first APPs, the usage probabilities of a plurality of second APPs to be used after the current time are predicted by a specified prediction model, which is trained on package names and historical behavior data of APPs collected from terminals owned by a plurality of users.
As described above, historical data can reflect the future trend of an event, and a prediction model can predict that trend. Therefore, after the package names and historical behavior data of the N first APPs used consecutively before the current time are obtained, the usage probabilities of a plurality of second APPs to be used after the current time can be predicted from them through the specified prediction model. In practical implementation, the acquired package names and historical behavior data of the N first APPs are input into the specified prediction model, which then predicts the usage probabilities of the second APPs the user may use after the current time.
The specified prediction model is trained on package names and historical behavior data of APPs collected from terminals owned by a plurality of users, and each APP has a unique package name; therefore, when the package names and historical behavior data of the N first APPs are input into the specified prediction model, it can predict the probability of each APP the user may use next. For example, if the user's historical usage order of first APPs is APP1, APP2, APP3, APP1, APP3, then inputting this sequence into the specified prediction model may predict that the second APPs the user will use after the current time are APP4, APP5, APP2, and so on; other second APPs may of course be predicted, which is not limited by the embodiments of the present disclosure.
To make the specified prediction model predict more accurately, the historical usage places and historical usage dates of the APPs may also be input into it. For example, suppose APP1 was used at the office on a working day, APP2 at home on the Mid-Autumn Festival, APP3 at a scenic spot during the National Day holiday, APP1 again at the office on a working day, and APP3 at the office on a weekend; inputting these historical usage places and dates into the specified prediction model may predict that the second APPs the user will use after the current time are APP6, APP7, APP3. Other second APPs may also be predicted, which is not limited by the present disclosure.
It should be noted that, after the current time, the user may use any APP on the terminal; each APP has a different usage probability, but the probabilities of all APPs sum to 100%. The prediction model can therefore predict the usage probability of every APP on the terminal, i.e., the usage probabilities of the plurality of second APPs the user may use after the current time.
For example, the specified prediction model may predict that the user will use APP1, APP2, APP3, APP4, or APP5 after the current time, with usage probabilities of 35%, 35%, 25%, 3.5%, and 1.5% respectively; the probabilities of APP1 through APP5 sum to 100%.
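The patent does not specify how the model normalizes its outputs, but a softmax output layer is the standard way a neural network produces probabilities that sum to 100%. As an assumed illustration, the raw scores below are chosen so the result roughly matches the APP1..APP5 example:

```python
import math

def softmax(scores: list[float]) -> list[float]:
    """Convert raw model scores into usage probabilities that sum to 1."""
    m = max(scores)  # subtract the max for numerical stability
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

probs = softmax([2.0, 2.0, 1.66, -0.3, -1.15])  # arbitrary scores for APP1..APP5
print(round(sum(probs), 6))  # 1.0
```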
In addition, the specified prediction model is trained on package names and historical behavior data of APPs collected from terminals owned by a plurality of users. In practical implementation, a server may collect and aggregate the relevant APP data from those terminals, extract the required features from it, and then train the specified prediction model on the extracted features. The data collected from each user's terminal includes at least the package names and historical behavior data of the APPs.
It should be noted that, after the features are extracted, the initialized network model may be trained on them, and after training is complete the specified prediction model is obtained. In practical implementation, the extracted features are input into the initialized network model for training, yielding a plurality of parameters; those parameters are then applied to the initialized network model, finally producing the specified prediction model.
The initialized network model may be chosen from a plurality of network models. Because the APPs a user may use are strongly correlated with time, the embodiments of the present disclosure choose an LSTM (Long Short-Term Memory) network model as the initialized network model: a recurrent neural network over time, suited to processing and predicting important events with relatively long intervals and delays in a time sequence. Other initialized network models may of course be chosen, which is not limited by the embodiments of the present disclosure.
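As a rough illustration of the LSTM mechanism the patent relies on, the following computes one timestep of a single-unit LSTM cell with scalar weights. The weight values are arbitrary and this is not the patent's trained model; it only shows the standard gate equations that let the cell carry information across a long app-usage sequence.

```python
import math

def sigmoid(x: float) -> float:
    return 1.0 / (1.0 + math.exp(-x))

def lstm_step(x, h_prev, c_prev, w):
    """One timestep of a single-unit LSTM cell (scalar weights for brevity)."""
    f = sigmoid(w["wf"] * x + w["uf"] * h_prev + w["bf"])    # forget gate
    i = sigmoid(w["wi"] * x + w["ui"] * h_prev + w["bi"])    # input gate
    o = sigmoid(w["wo"] * x + w["uo"] * h_prev + w["bo"])    # output gate
    g = math.tanh(w["wg"] * x + w["ug"] * h_prev + w["bg"])  # candidate state
    c = f * c_prev + i * g   # new cell state: part remembered, part written
    h = o * math.tanh(c)     # new hidden state, exposed to the next layer
    return h, c

w = {k: 0.5 for k in ("wf", "uf", "bf", "wi", "ui", "bi", "wo", "uo", "bo", "wg", "ug", "bg")}
h, c = 0.0, 0.0
for x in (1.0, 0.5, -0.5):   # a toy 3-step input sequence
    h, c = lstm_step(x, h, c, w)
print(-1.0 < h < 1.0)  # True: the hidden state stays bounded by tanh
```

In a real deployment the hidden state of the final timestep would feed a classification layer over the package-name vocabulary.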
Because the specified prediction model must predict the usage probabilities of the second APPs the user may use after the current time (that is, when the acquired package names and historical behavior data of the N first APPs are input into the model, it must output those probabilities), the features extracted for training must at least enable the specified prediction model to make this prediction successfully.
In actual implementation, the features to be extracted can be chosen from the aggregated data by weighing implementation difficulty against actual demand. In one possible implementation, the package names of the APPs in the terminals owned by a plurality of users, together with the foreground opening time and background opening time of each APP in the historical behavior data, may be extracted. To predict the usage probabilities of the second APPs more accurately, in another possible implementation the usage place and usage date of each APP may be extracted as well. For ease of understanding, the two possible implementations of feature extraction are described below:
a first possible implementation: and extracting the packet name of the APP, the opening time of the APP at the front end and the opening time of the APP at the rear end.
Under the possible implementation mode, three characteristics of the packet name of the APP, the opening time of the APP at the front end and the opening time of the APP at the back end are extracted, and the specified prediction model required by the embodiment of the disclosure can be trained and obtained through the three characteristics.
A second possible implementation: extracting the package name of the APP, the opening time of the APP at the front end, the opening time of the APP at the rear end, the using place of the APP and the using date of the APP.
In this implementation, the use place of the APP and the use date of the APP are added to the features extracted in the first implementation. The use place reflects where the user uses the APP, and the use date reflects when the user uses it; in particular, the use date indicates whether the time of use falls on a holiday or a workday.
As the two implementations show, the first extracts fewer features than the second. The features of the first implementation can therefore be used to train the specified prediction model when the technology is constrained to some extent, while more features can be added, for example those of the second implementation, when the technology allows and a more accurate specified prediction model is desired.
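The two feature sets can be sketched as a small extraction step. This is an illustrative sketch, not the patent's implementation: the record field names (`package_name`, `foreground_open_time`, and so on) and the holiday table are assumptions.

```python
from datetime import datetime

# Hypothetical holiday table keyed by (month, day); a real system would use
# a locale-aware holiday calendar instead of a hard-coded set.
HOLIDAYS = {(1, 1), (5, 1), (10, 1)}

def is_workday(dt: datetime) -> bool:
    """True when the date is a weekday and not in the holiday table."""
    return dt.weekday() < 5 and (dt.month, dt.day) not in HOLIDAYS

def extract_features(record: dict, extended: bool = False) -> dict:
    """Map one raw usage record to the first (three-feature) or second
    (five-feature) feature set described above.  The record field names
    are assumptions, not the patent's schema."""
    features = {
        "package_name": record["package_name"],
        "foreground_open_time": record["foreground_open_time"],
        "background_open_time": record["background_open_time"],
    }
    if extended:  # the second possible implementation adds place and date
        opened = datetime.fromtimestamp(record["foreground_open_time"])
        features["use_place"] = record["use_place"]
        features["use_date_is_workday"] = is_workday(opened)
    return features
```

Records shaped this way can then be fed, batched per user, into whatever model training pipeline the server side runs.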
Because the specified prediction model is obtained by collecting statistics on the package names and historical behavior data of APPs in the terminals owned by a plurality of users, it is universal and can be used by all typical user terminals. The more users are counted, the better trained the specified prediction model is and the more accurate its predictions are. Therefore, in the embodiment of the present disclosure, the number of users counted may be in the millions or tens of millions; of course, other numbers are possible, which is not limited in the embodiment of the present disclosure.
It should be further noted that, after the specified prediction model is trained on the server side, it may be integrated into the terminal, and the usage probabilities of the plurality of second APPs that the user may use after the current time are then predicted on the terminal through the model.
In step 203, at least one second APP is selected from the plurality of second APPs according to a preset rule and the usage probabilities of the plurality of second APPs.
Because the usage probabilities that the specified prediction model predicts for the second APPs the user may use differ, some being higher and some lower, at least one second APP can be selected from the plurality of second APPs according to the preset rule.
It should be noted that at least one second APP whose usage probabilities are all greater than a first preset probability and whose usage probabilities sum to more than a second preset probability may be selected from the plurality of second APPs, where the first preset probability is smaller than the second preset probability.
For example, assume the first preset probability is 30% and the second preset probability is 50%, and, as in the example above, the usage probabilities of APP1, APP2, APP3, APP4, and APP5 are 35%, 35%, 25%, 3.5%, and 1.5% respectively. The APPs with usage probabilities above 30% are APP1 and APP2, and their usage probabilities sum to 70%, which exceeds 50%. Therefore, APP1 and APP2 may be selected from APP1 to APP5.
In addition, the usage probabilities of all the second APPs may be averaged, and the second APPs whose usage probabilities are greater than the average may be selected. Continuing the example, the usage probabilities of APP1 to APP5 sum to 100%, so the average is 20%; the usage probabilities of APP1, APP2, and APP3 all exceed the average, so APP1, APP2, and APP3 may be selected from APP1 to APP5.
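Both preset rules can be sketched in a few lines. This is a sketch under the stated assumptions, not the patent's implementation; probabilities are expressed as fractions and the function names are hypothetical.

```python
def select_by_thresholds(probs: dict, first_preset: float, second_preset: float) -> list:
    """First rule: keep the second APPs whose usage probability exceeds the
    first preset probability, provided those probabilities together exceed
    the second preset probability (first_preset < second_preset)."""
    chosen = {app: p for app, p in probs.items() if p > first_preset}
    return sorted(chosen) if sum(chosen.values()) > second_preset else []

def select_above_average(probs: dict) -> list:
    """Second rule: keep the second APPs whose usage probability exceeds
    the average usage probability."""
    average = sum(probs.values()) / len(probs)
    return sorted(app for app, p in probs.items() if p > average)

probs = {"APP1": 0.35, "APP2": 0.35, "APP3": 0.25, "APP4": 0.035, "APP5": 0.015}
select_by_thresholds(probs, 0.30, 0.50)  # returns ['APP1', 'APP2']
select_above_average(probs)              # returns ['APP1', 'APP2', 'APP3']
```

Either rule yields the candidate set that the preloading step below operates on.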
Of course, at least one second APP may be selected from the plurality of second APPs according to other preset rules, which is not limited in the embodiment of the present disclosure.
In step 204, the at least one second APP is preloaded into the memory.
In the related art, when a user cold starts an APP, the user can truly enter the APP only after its opening interface finishes. Therefore, in order to turn a cold start into a hot start before the user uses these APPs, after at least one second APP is selected from the plurality of second APPs according to their usage probabilities and the preset rule, the at least one second APP may be preloaded into the memory, thereby enabling a hot start of the at least one second APP.
It should be noted that when a user opens an APP that is not loaded in the memory of the terminal, an opening interface appears first, and the user must wait for the opening interface to close before truly entering the APP; this opening mode is called a cold start. Correspondingly, when a user opens an APP that is already loaded in the memory of the terminal, no opening interface appears, and the user enters the APP directly; this opening mode is called a hot start.
Since some of the at least one second APP may already reside in the memory before preloading, the loaded APPs in the memory may be detected first, the second APPs not yet loaded are determined from the at least one second APP, and only those second APPs are preloaded into the memory.
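The detect-then-preload step can be sketched as follows; `load_fn` is a hypothetical stand-in for the platform call that starts an APP's process without bringing up its interface.

```python
def preload_missing(selected, loaded, load_fn):
    """Preload only those selected second APPs that are not already
    resident in memory.  `selected` is the output of the preset-rule
    selection; `loaded` is the set of APPs detected in memory."""
    to_load = [app for app in selected if app not in loaded]
    for app in to_load:
        load_fn(app)     # hypothetical: spawn the APP's process headlessly
        loaded.add(app)  # track it so it is not loaded twice
    return to_load
```

Skipping already-resident APPs avoids redundant loads and keeps the memory footprint of the optimization itself small.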
In this way, preloading the at least one second APP into the memory enables a hot start of the at least one second APP, so the user no longer has to wait for the opening interface of the APP to finish before truly entering it.
In the embodiment of the disclosure, package names and historical behavior data of N first APPs continuously used before the current time are acquired, where N is a positive integer greater than 1; according to the package names and the historical behavior data of the N first APPs, the usage probabilities of a plurality of second APPs to be used after the current time are predicted through a specified prediction model, the specified prediction model being obtained by collecting statistics on package names and historical behavior data of APPs in terminals owned by a plurality of users; and the memory is optimized according to the usage probabilities of the plurality of second APPs. That is, the usage probabilities of the second APPs that the user will use after the current time are predicted from the N first APPs continuously used before the current time and the specified prediction model. The usage probability of an APP can therefore be predicted before the user uses it, and memory optimization can be performed automatically according to the predicted probabilities, which improves the accuracy of memory optimization while requiring no manual operation by the user.
Fig. 3 is a flowchart illustrating another memory optimization method according to an exemplary embodiment. This embodiment explains in detail the optimization manner of clearing from the memory the second APPs with lower usage probabilities. As shown in fig. 3, the method includes the following steps.
In step 301, package names and historical behavior data of N first applications APP continuously used before the current time are acquired, where N is a positive integer greater than 1.
The implementation of this step is the same as that of step 201 and is therefore not repeated here.
In step 302, according to the package names and the historical behavior data of the N first APPs, the usage probabilities of a plurality of second APPs to be used after the current time are predicted through a specified prediction model, where the specified prediction model is obtained by collecting statistics on package names and historical behavior data of APPs in terminals owned by a plurality of users.
The implementation of this step is the same as that of step 202 and is therefore not repeated here.
In step 303, a second APP with a probability of use less than a third preset probability is selected from the plurality of second APPs.
Because the usage probabilities that the specified prediction model predicts for the plurality of second APPs differ, at least one second APP may be selected from them through step 203 on the one hand, and a second APP with a usage probability smaller than a third preset probability may be selected from them through this step on the other.
It should be noted that a fixed value may serve as the third preset probability. Assume the third preset probability is 5% and, referring to the example in step 202, the usage probabilities of APP1, APP2, APP3, APP4, and APP5 are 35%, 35%, 25%, 3.5%, and 1.5%. The usage probabilities of APP4 and APP5 are both below 5%, so APP4 and APP5 may be selected from APP1 to APP5.
Alternatively, the usage probabilities of all the second APPs may be averaged and the average used as the third preset probability. In the example above, the usage probabilities of APP1 to APP5 sum to 100%, so the average is 20%; the usage probabilities of APP4 and APP5 are both below the average, so APP4 and APP5 may be selected from APP1 to APP5. Of course, other values may also serve as the third preset probability, which is not limited in the embodiment of the present disclosure.
In step 304, the second APP with the usage probability smaller than the third predetermined probability is cleared.
Among the usage probabilities that the specified prediction model predicts for the plurality of second APPs, some are small. If the second APPs with small usage probabilities still occupy the memory, they take memory resources away from the second APPs with larger usage probabilities, which may then run less smoothly, degrading the user experience. Therefore, to avoid this problem, after the second APPs with usage probabilities below the third preset probability are selected from the plurality of second APPs, those selected second APPs may be cleared from the memory.
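Steps 303 and 304 can be sketched together. This is an illustrative sketch, not the patent's implementation: `resident` stands in for the set of APPs currently loaded in memory, and the fallback to the average implements the alternative choice of the third preset probability.

```python
def clear_low_probability(probs: dict, resident: set, third_preset=None) -> list:
    """Select the second APPs whose predicted usage probability is below
    the third preset probability and clear them from memory.  When no
    fixed value is given, the average probability is used instead."""
    if third_preset is None:
        third_preset = sum(probs.values()) / len(probs)
    victims = [app for app in sorted(resident) if probs.get(app, 0.0) < third_preset]
    for app in victims:
        resident.discard(app)  # stand-in for killing the APP's process
    return victims
```

With the running example (35%, 35%, 25%, 3.5%, 1.5%) and a fixed 5% threshold, only APP4 and APP5 are evicted, leaving memory for the APPs the user is likely to open next.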
Currently, memory is cleared based on an LRU (Least Recently Used) policy: the recently used APPs are retained, and the least recently used APPs are cleared first. However, in many cases a user who has just used an APP may not use it again for a long time, or the next APP the user opens may be one that has not been used for a long time, so an LRU-based policy cannot effectively fit the actual situation in which the user uses APPs. The embodiment of the present disclosure therefore proposes to clear the memory according to the usage probability of the APPs: after the usage probabilities of the APPs that the user will use after the current time are predicted, the memory is cleared according to those probabilities, which fits the user's actual usage and improves the efficiency of memory cleaning.
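The divergence between the two policies can be shown with a small example. The data here is invented for illustration, not from the source: the most recently used APP is exactly the one the model predicts is least likely to be used next.

```python
# Illustrative data only: recency order (oldest first) and the predicted
# usage probabilities for the same three APPs.
recency = ["APP1", "APP2", "APP5"]                 # APP5 was used most recently
probs = {"APP1": 0.40, "APP2": 0.35, "APP5": 0.02}

lru_victim = recency[0]                # LRU evicts the oldest entry: APP1
prob_victim = min(probs, key=probs.get)  # probability policy evicts APP5
```

LRU would keep APP5 because it was just used and evict APP1, while the probability-based policy evicts APP5 and keeps APP1, the APP the user is most likely to open next.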
In the embodiment of the disclosure, package names and historical behavior data of N first APPs continuously used before the current time are acquired, where N is a positive integer greater than 1; according to the package names and the historical behavior data of the N first APPs, the usage probabilities of a plurality of second APPs to be used after the current time are predicted through a specified prediction model, the specified prediction model being obtained by collecting statistics on package names and historical behavior data of APPs in terminals owned by a plurality of users; and the memory is optimized according to the usage probabilities of the plurality of second APPs. That is, the usage probabilities of the second APPs that the user will use after the current time are predicted from the N first APPs continuously used before the current time and the specified prediction model. The usage probability of an APP can therefore be predicted before the user uses it, and memory optimization can be performed automatically according to the predicted probabilities, which improves the accuracy of memory optimization while requiring no manual operation by the user.
Fig. 4 is a block diagram illustrating an apparatus 400 for memory optimization in accordance with an example embodiment. As shown in fig. 4, the apparatus 400 includes an obtaining module 401, a predicting module 402, and an optimizing module 403.
An obtaining module 401, configured to obtain package names and historical behavior data of N first application programs APP that are continuously used before the current time, where N is a positive integer greater than 1;
a prediction module 402, configured to predict, according to the package names and the historical behavior data of the N first APPs, usage probabilities of a plurality of second APPs that will be used after the current time through a specified prediction model, where the specified prediction model is obtained by collecting statistics on package names and historical behavior data of APPs in terminals owned by a plurality of users;
an optimizing module 403, configured to perform memory optimization according to the usage probabilities of the plurality of second APPs.
Optionally, the optimization module comprises:
the first selection submodule is used for selecting at least one second APP from the plurality of second APPs according to a preset rule and the use probability of the plurality of second APPs;
the first loading submodule is used for pre-loading the at least one second APP into the memory.
Optionally, the optimization module further comprises:
the detection submodule is used for detecting the loaded APPs in the memory so as to determine the second APPs which are not loaded in the memory from the at least one second APP;
and the second loading submodule is used for pre-loading the second APP which is not loaded in the memory into the memory.
Optionally, the first selecting submodule includes:
the selecting unit is used for selecting, from the plurality of second APPs, at least one second APP whose usage probabilities are all greater than a first preset probability and whose usage probabilities sum to more than a second preset probability, the first preset probability being smaller than the second preset probability.
Optionally, the optimization module comprises:
the second selection submodule is used for selecting a second APP with the use probability smaller than a third preset probability from the plurality of second APPs;
and the cleaning submodule is used for cleaning the second APP with the use probability smaller than the third preset probability in the memory.
Optionally, the historical behavior data of the N first APPs includes at least a historical usage order.
Optionally, the historical behavior data of the N first APPs further includes a historical use place and a historical use date, where the historical use date indicates whether the historical use time falls on a holiday or a working day.
Optionally, the specified prediction model is a long-short term memory LSTM network model.
In the embodiment of the disclosure, package names and historical behavior data of N first APPs continuously used before the current time are acquired, where N is a positive integer greater than 1; according to the package names and the historical behavior data of the N first APPs, the usage probabilities of a plurality of second APPs to be used after the current time are predicted through a specified prediction model, the specified prediction model being obtained by collecting statistics on package names and historical behavior data of APPs in terminals owned by a plurality of users; and the memory is optimized according to the usage probabilities of the plurality of second APPs. That is, the usage probabilities of the second APPs that the user will use after the current time are predicted from the N first APPs continuously used before the current time and the specified prediction model. The usage probability of an APP can therefore be predicted before the user uses it, and memory optimization can be performed automatically according to the predicted probabilities, which improves the accuracy of memory optimization while requiring no manual operation by the user.
With regard to the apparatus in the above-described embodiment, the specific manner in which each module performs the operation has been described in detail in the embodiment related to the method, and will not be elaborated here.
Fig. 5 is a block diagram illustrating an apparatus 500 for memory optimization in accordance with an example embodiment. For example, the apparatus 500 may be a mobile phone, a computer, a digital broadcast terminal, a messaging device, a game console, a tablet device, a medical device, an exercise device, a personal digital assistant, and the like.
Referring to fig. 5, the apparatus 500 may include one or more of the following components: processing component 502, memory 504, power component 506, multimedia component 508, audio component 510, input/output (I/O) interface 512, sensor component 514, and communication component 516.
The processing component 502 generally controls overall operation of the device 500, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations. The processing components 502 may include one or more processors 520 to execute instructions to perform all or a portion of the steps of the methods described above. Further, the processing component 502 can include one or more modules that facilitate interaction between the processing component 502 and other components. For example, the processing component 502 can include a multimedia module to facilitate interaction between the multimedia component 508 and the processing component 502.
The memory 504 is configured to store various types of data to support operations at the apparatus 500. Examples of such data include instructions for any application or method operating on device 500, contact data, phonebook data, messages, pictures, videos, and so forth. The memory 504 may be implemented by any type or combination of volatile or non-volatile memory devices such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disks.
The power supply component 506 provides power to the various components of the apparatus 500. The power components 506 may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power for the apparatus 500.
The multimedia component 508 includes a screen that provides an output interface between the device 500 and the user. In some embodiments, the screen may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive an input signal from a user. The touch panel includes one or more touch sensors to sense touch, slide, and gestures on the touch panel. The touch sensor may not only sense the boundary of a touch or slide action, but also detect the duration and pressure associated with the touch or slide operation. In some embodiments, the multimedia component 508 includes a front facing camera and/or a rear facing camera. The front camera and/or the rear camera may receive external multimedia data when the device 500 is in an operating mode, such as a shooting mode or a video mode. Each front camera and rear camera may be a fixed optical lens system or have a focal length and optical zoom capability.
The audio component 510 is configured to output and/or input audio signals. For example, audio component 510 includes a Microphone (MIC) configured to receive external audio signals when apparatus 500 is in an operating mode, such as a call mode, a recording mode, and a voice recognition mode. The received audio signals may further be stored in the memory 504 or transmitted via the communication component 516. In some embodiments, audio component 510 further includes a speaker for outputting audio signals.
The I/O interface 512 provides an interface between the processing component 502 and peripheral interface modules, which may be keyboards, click wheels, buttons, etc. These buttons may include, but are not limited to: a home button, a volume button, a start button, and a lock button.
The sensor assembly 514 includes one or more sensors for providing various aspects of status assessment for the device 500. For example, the sensor assembly 514 may detect the open/closed status of the device 500, the relative positioning of the components, such as the display and keypad of the device 500, the change in position of the device 500 or a component of the device 500, the presence or absence of user contact with the device 500, the orientation or acceleration/deceleration of the device 500, and the change in temperature of the device 500. The sensor assembly 514 may include a proximity sensor configured to detect the presence of a nearby object in the absence of any physical contact. The sensor assembly 514 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor assembly 514 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 516 is configured to facilitate communication between the apparatus 500 and other devices in a wired or wireless manner. The apparatus 500 may access a wireless network based on a communication standard, such as WiFi, 2G or 3G, or a combination thereof. In an exemplary embodiment, the communication component 516 receives a broadcast signal or broadcast related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communication component 516 further includes a Near Field Communication (NFC) module to facilitate short-range communications. For example, the NFC module may be implemented based on Radio Frequency Identification (RFID) technology, infrared data association (IrDA) technology, Ultra Wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the apparatus 500 may be implemented by one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), controllers, micro-controllers, microprocessors or other electronic components for performing the methods provided by the embodiments shown in fig. 1, 2 or 3 and described above.
In an exemplary embodiment, a non-transitory computer-readable storage medium comprising instructions, such as the memory 504 comprising instructions, executable by the processor 520 of the apparatus 500 to perform the above-described method is also provided. For example, the non-transitory computer readable storage medium may be a ROM, a Random Access Memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like.
A non-transitory computer readable storage medium having instructions therein which, when executed by a processor of a mobile terminal, enable the mobile terminal to perform a method of memory optimization.
A computer program product comprising instructions which, when executed by a processor of a mobile terminal device, enable the mobile terminal device to perform the above-described memory optimization method of fig. 1, 2 and 3.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.

Claims (8)

1. A method for optimizing a memory is applied to a terminal, and the method comprises the following steps:
acquiring package names and historical behavior data of N first applications APP continuously used before the current time, wherein the historical behavior data of the N first APPs at least comprise a historical use sequence, a historical use place and a historical use date, the historical use date indicating whether the historical use time falls on a holiday or a working day, and N is a positive integer greater than 1;
according to the package names and the historical behavior data of the N first APPs, predicting the usage probabilities of a plurality of second APPs to be used after the current time through a specified prediction model, wherein the specified prediction model is obtained by collecting statistics on package names and historical behavior data of APPs in terminals owned by a plurality of users;
selecting at least one second APP from the plurality of second APPs, wherein the using probabilities of the at least one second APP are all larger than a first preset probability, the sum of the using probabilities is larger than a second preset probability, the first preset probability is smaller than the second preset probability, and the at least one second APP is pre-loaded into the memory; selecting a second APP with the use probability smaller than a third preset probability from the plurality of second APPs, and cleaning the second APP with the use probability smaller than the third preset probability selected from the memory;
before predicting the use probabilities of a plurality of second APPs that will be used after the current time by specifying a prediction model according to the package names and the historical behavior data of the N first APPs, the method further includes:
collecting and counting related data of APPs in terminals owned by the plurality of users, wherein the related data comprises package names and historical behavior data of the APPs;
extracting features from the related data, wherein the features comprise the package names of the APPs in the terminals owned by the plurality of users and, from the historical behavior data, the foreground opening time of the APPs, the background opening time of the APPs, the use places of the APPs and the use dates of the APPs;
inputting the characteristics into an initialization network model for training to obtain the specified prediction model.
2. The method of claim 1, wherein prior to said preloading said at least one second APP into said memory, further comprising:
detecting the loaded APPs in the memory to determine second APPs which are not loaded in the memory from the at least one second APP;
and pre-loading the second APP which is not loaded in the memory into the memory.
3. The method of claim 1, wherein the specified predictive model is a long-short term memory (LSTM) network model.
4. An apparatus for optimizing memory, applied to a terminal, the apparatus comprising:
an acquisition module, configured to acquire package names and historical behavior data of N first applications APP continuously used before the current time, wherein the historical behavior data of the N first APPs at least comprise a historical use sequence, a historical use place and a historical use date, the historical use date indicating whether the historical use time falls on a holiday or a working day, and N is a positive integer greater than 1;
a prediction module, configured to predict, according to the package names and the historical behavior data of the N first APPs, the usage probabilities of a plurality of second APPs to be used after the current time through a specified prediction model, wherein the specified prediction model is obtained by collecting statistics on package names and historical behavior data of APPs in terminals owned by a plurality of users;
an optimization module, configured to select, from the plurality of second APPs, at least one second APP whose usage probabilities are all greater than a first preset probability and whose usage probabilities sum to more than a second preset probability; preload the at least one second APP into the memory; select, from the plurality of second APPs, a second APP with a usage probability smaller than a third preset probability; and clear from the memory the selected second APP with the usage probability smaller than the third preset probability;
the apparatus also includes means for:
collecting and counting related data of APPs in terminals owned by the plurality of users, wherein the related data comprises package names and historical behavior data of the APPs;
extracting features from the related data, wherein the features comprise the package names of the APPs in the terminals owned by the plurality of users and, from the historical behavior data, the foreground opening time of the APPs, the background opening time of the APPs, the use places of the APPs and the use dates of the APPs;
and inputting the characteristics into an initialization network model for training to obtain the specified prediction model.
5. The apparatus of claim 4, wherein the optimization module further comprises:
the detection submodule is used for detecting the loaded APPs in the memory so as to determine the second APPs which are not loaded in the memory from the at least one second APP;
and the second loading submodule is used for pre-loading the second APP which is not loaded in the memory into the memory.
6. The apparatus of claim 4, in which the specified predictive model is a long-short term memory (LSTM) network model.
7. An apparatus for memory optimization, the apparatus comprising:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to perform the steps of the method of any one of claims 1-3.
8. A computer-readable storage medium having instructions stored thereon, wherein the instructions, when executed by a processor, implement the steps of the method of any of claims 1-3.
CN201810159902.8A 2018-02-26 2018-02-26 Memory optimization method and device and computer readable storage medium Active CN108268322B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810159902.8A CN108268322B (en) 2018-02-26 2018-02-26 Memory optimization method and device and computer readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810159902.8A CN108268322B (en) 2018-02-26 2018-02-26 Memory optimization method and device and computer readable storage medium

Publications (2)

Publication Number Publication Date
CN108268322A CN108268322A (en) 2018-07-10
CN108268322B true CN108268322B (en) 2022-07-01

Family

ID=62774364

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810159902.8A Active CN108268322B (en) 2018-02-26 2018-02-26 Memory optimization method and device and computer readable storage medium

Country Status (1)

Country Link
CN (1) CN108268322B (en)

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109725702A (en) * 2018-12-28 2019-05-07 三星电子(中国)研发中心 A kind of intelligent terminal power-economizing method and equipment based on AI prediction
CN109656722B (en) * 2019-01-04 2021-05-11 Oppo广东移动通信有限公司 Memory optimization method and device, mobile terminal and storage medium
WO2020154902A1 (en) * 2019-01-29 2020-08-06 深圳市欢太科技有限公司 Application processing method and apparatus, and storage medium, server and electronic device
CN109976908B (en) * 2019-03-15 2021-08-06 北京工业大学 RNN time sequence prediction-based dynamic server cluster expansion method
CN111768329B (en) * 2019-04-01 2024-03-15 维塔科技(北京)有限公司 Method and device for collecting execution time of kernel, storage medium and electronic equipment
CN110647294B (en) * 2019-09-09 2022-03-25 Oppo广东移动通信有限公司 Storage block recovery method and device, storage medium and electronic equipment
CN110737523A (en) * 2019-10-18 2020-01-31 湖南快乐阳光互动娱乐传媒有限公司 method and terminal for improving application starting speed through memory cleaning
CN111078405B (en) * 2019-12-10 2022-07-15 Oppo(重庆)智能科技有限公司 Memory allocation method and device, storage medium and electronic equipment
CN116627534B (en) * 2021-11-19 2024-04-05 荣耀终端有限公司 Application processing method and device
CN114675551A (en) * 2022-02-23 2022-06-28 青岛海尔科技有限公司 Method and device for determining operation behavior, storage medium and electronic device

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104679382B (en) * 2013-11-29 2018-09-07 华为技术有限公司 Application program display methods and device
CN105528659A (en) * 2016-01-27 2016-04-27 浙江大学 Mobile terminal APP usage prediction method combining with time-context based on sequence mode
CN107249074A (en) * 2017-05-16 2017-10-13 努比亚技术有限公司 Application program quick start method, mobile terminal and computer-readable recording medium
CN107133094B (en) * 2017-06-05 2021-11-02 努比亚技术有限公司 Application management method, mobile terminal and computer readable storage medium
CN107728874A (en) * 2017-09-06 2018-02-23 阿里巴巴集团控股有限公司 The method, apparatus and equipment of user prompt operation are provided
CN107544898A (en) * 2017-09-08 2018-01-05 北京小米移动软件有限公司 Data capture method, device, equipment and storage medium

Also Published As

Publication number Publication date
CN108268322A (en) 2018-07-10

Similar Documents

Publication Publication Date Title
CN108268322B (en) Memory optimization method and device and computer readable storage medium
US20180088764A1 (en) Method, apparatus, and storage medium for sharing content
EP3188066A1 (en) A method and an apparatus for managing an application
EP3057048A1 (en) Transfer method and apparatus
US10409684B2 (en) Method, device and storage medium for cleaning memory
CN107193653B (en) Bandwidth resource allocation method, device and storage medium
EP3176709A1 (en) Video categorization method and apparatus, computer program and recording medium
CN106919629B (en) Method and device for realizing information screening in group chat
EP3173963A1 (en) Unlocking method and apparatus, computer program and recording medium
CN109246184B (en) Time information acquisition method and device and readable storage medium
CN108427618B (en) Method and device for determining stuck state and computer readable storage medium
EP3798936A1 (en) Method and device for managing information
CN107025421B (en) Fingerprint identification method and device
CN109062625B (en) Application program loading method and device and readable storage medium
CN106528247B (en) Data refreshing method and device
CN105786561B (en) Method and device for calling process
EP3171326A1 (en) Contact managing method and apparatus, computer program and recording medium
CN107203279B (en) Keyword prompting method and device
US10671827B2 (en) Method and device for fingerprint verification
CN107341000B (en) Method and device for displaying fingerprint input image and terminal
CN112883314B (en) Request processing method and device
CN114124866A (en) Session processing method, device, electronic equipment and storage medium
CN109144587B (en) Terminal control method, device, equipment and storage medium
CN108427582B (en) Method and device for determining stuck state and computer readable storage medium
CN113064739A (en) Inter-thread communication method and device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant