CN107797831B - Background application cleaning method and device, storage medium and electronic equipment
- Publication number
- CN107797831B (application number CN201711123657.7A)
- Authority
- CN
- China
- Prior art keywords
- model
- training
- samples
- level
- application
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F9/00—Arrangements for program control, e.g. control units
- G06F9/06—Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
- G06F9/44—Arrangements for executing specific programs
- G06F9/445—Program loading or initiating
- G06F9/44594—Unloading
Abstract
Embodiments of the present application disclose a background application cleaning method and apparatus, a storage medium, and an electronic device.
Description
Technical Field
The present application relates to the field of communication technologies, and in particular to a background application cleaning method and apparatus, a storage medium, and an electronic device.
Background
With the development of technology, electronic devices such as smart phones and tablet computers have become increasingly powerful; multitasking in particular has further improved the user experience. With multitasking, an electronic device on which multiple applications are installed can run several of them simultaneously: one application runs in the foreground while others continue running in the background without being exited or closed. However, when many applications accumulate in the background, or when background applications are not cleaned for a long time, they occupy a large amount of system resources. The available memory of the electronic device shrinks, the occupancy rate of the Central Processing Unit (CPU) becomes too high, and the device suffers from slow running speed, stuttering, excessive power consumption, and similar problems.
Disclosure of Invention
In view of this, embodiments of the present application provide a background application cleaning method and apparatus, a storage medium, and an electronic device, which can improve the smoothness of operation of the electronic device and reduce power consumption.
The embodiment of the application provides a background application cleaning method, which comprises the following steps:
collecting a plurality of pieces of feature information of an application as samples, and constructing a sample set of the application;
obtaining model information of a preset training model, and screening samples in the sample set according to the model information of the preset training model to determine training samples of the preset training model;
storing the training samples of the preset training model into a cache module;
acquiring a training sample of a preset training model from the cache module, and training the preset training model by using the acquired training sample to obtain a prediction model;
and when the application enters the background, predicting the current feature information of the application by using the prediction model, and determining whether the application can be cleaned according to a prediction result.
An embodiment of the present application further provides a background application cleaning device, including:
the acquisition module is used for collecting a plurality of pieces of feature information of an application as samples and constructing a sample set of the application;
the screening module is used for obtaining model information of a preset training model and screening the samples in the sample set according to the model information of the preset training model so as to determine training samples of the preset training model;
the storage module is used for storing the training samples of the preset training model into the cache module;
the training module is used for acquiring training samples of a preset training model from the cache module and training the preset training model by using the acquired training samples to obtain a prediction model;
and the prediction module is used for predicting the current feature information of the application by using the prediction model when the application enters the background, and determining whether the application can be cleaned according to the prediction result.
An embodiment of the present application further provides a storage medium on which a computer program is stored; when the computer program runs on a computer, it causes the computer to execute any one of the above background application cleaning methods.
An embodiment of the present application further provides an electronic device, which includes a processor and a memory, where the memory has a computer program, and the processor is configured to execute any one of the foregoing background application cleaning methods by calling the computer program.
In the embodiments of the present application, a plurality of pieces of feature information of an application are collected as samples to construct a sample set of the application; model information of a preset training model is then obtained, and the samples in the sample set are screened according to the model information to determine training samples of the preset training model. The training samples of the preset training model are stored in a cache module; the training samples are then obtained from the cache module and used to train the preset training model to obtain a prediction model. When the application enters the background, the current feature information of the application is predicted with the prediction model, and whether the application can be cleaned is determined according to the prediction result. Automatic cleaning of background applications is thus realized, which improves the smoothness of operation of the electronic device and reduces power consumption.
Drawings
In order to illustrate the technical solutions in the embodiments of the present application more clearly, the drawings required for describing the embodiments are briefly introduced below. The drawings in the following description show only some embodiments of the present application; those skilled in the art can derive other drawings from them without creative effort.
Fig. 1 is a schematic view of an application scenario of a background application cleaning method according to an embodiment of the present application.
Fig. 2 is a schematic flow chart of a background application cleaning method according to an embodiment of the present application.
Fig. 3 is a schematic diagram of a sample graph in the background application cleaning method according to the embodiment of the present application.
Fig. 4 is a schematic flowchart of a process of training a preset training model by using an acquired training sample in the background application cleaning method according to the embodiment of the present application.
Fig. 5 is a schematic structural diagram of a background application cleaning apparatus according to an embodiment of the present application.
Fig. 6 is another schematic structural diagram of a background application cleaning apparatus according to an embodiment of the present application.
Fig. 7 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Fig. 8 is another schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
Referring to the drawings, wherein like reference numbers refer to like elements, the principles of the present application are illustrated as being implemented in a suitable computing environment. The following description is based on illustrated embodiments of the application and should not be taken as limiting the application with respect to other embodiments that are not detailed herein.
In the description that follows, specific embodiments of the present application are described with reference to steps and symbols executed by one or more computers, unless otherwise indicated. These steps and operations are therefore at times referred to as being computer-executed: the computer's processing unit operates on electrical signals that represent data in a structured form. This operation transforms the data or maintains it at locations in the computer's memory system, which may be reconfigured or otherwise altered in a manner well known to those skilled in the art. The data maintains a data structure, that is, a physical location in memory with particular characteristics defined by the data format. However, while the principles of the application are described in the specific language above, this is not intended as a limitation to the specific forms set forth herein, and those of ordinary skill in the art will recognize that various of the steps and operations described below may also be implemented in hardware.
The term module, as used herein, may be considered a software object executing on the computing system. The various components, modules, engines, and services described herein may be viewed as objects implemented on the computing system. The apparatus and method described herein may be implemented in software, but may also be implemented in hardware, and are within the scope of the present application.
The terms "first", "second", and "third", etc. in this application are used to distinguish between different objects and not to describe a particular order. Furthermore, the terms "include" and "have," as well as any variations thereof, are intended to cover non-exclusive inclusions. For example, a process, method, system, article, or apparatus that comprises a list of steps or modules is not limited to only those steps or modules listed, but rather, some embodiments may include other steps or modules not listed or inherent to such process, method, article, or apparatus.
Reference herein to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the application. The appearances of the phrase in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. It is explicitly and implicitly understood by one skilled in the art that the embodiments described herein can be combined with other embodiments.
The embodiments of the present application provide a background application cleaning method. The method may be executed by the background application cleaning apparatus provided in the embodiments of the present application, or by an electronic device integrated with the background application cleaning apparatus, where the apparatus may be implemented in hardware or software. The electronic device may be a smart phone, a tablet computer, a palm computer, a notebook computer, or a desktop computer.
Referring to fig. 1, fig. 1 is a schematic view of an application scenario of the background application cleaning method provided in an embodiment of the present application, taking the case where the background application cleaning apparatus is integrated in an electronic device. The electronic device may collect a plurality of pieces of feature information of an application as samples to construct a sample set of the application, then obtain model information of a preset training model and screen the samples in the sample set according to the model information to determine training samples of the preset training model, and store the training samples in a cache module. When background application cleaning is subsequently performed using the preset training model, the training samples can be obtained directly from the cache module and used to train the preset training model to obtain a prediction model. When the application enters the background, the current feature information of the application is predicted with the prediction model, and whether the application can be cleaned is determined according to the prediction result. Automatic cleaning of background applications is thus realized, which improves the smoothness of operation of the electronic device and reduces power consumption; storing the training samples in the cache module avoids reloading them every time they are needed, which improves operation efficiency.
As will be described in detail below.
Referring to fig. 2, fig. 2 is a schematic flowchart of a background application cleaning method provided in the embodiment of the present application, which specifically includes the following steps:
201. Collect a plurality of pieces of feature information of an application as samples to construct a sample set of the application.
The application may be any application installed on the electronic device, for example an instant messaging application, a multimedia application, a game application, an information application, a shopping application, a navigation application, or a photographing application. The plurality of pieces of feature information may include feature information related to the application itself, such as: the duration for which the application has stayed in the background; the screen-off duration of the electronic device after the application was switched to the background; the number of times the application has entered the foreground; the time the application has spent in the foreground; the way the application entered the background, for example switched out by the home key, by the return key, or by another application; and the type of the application, such as level one (common applications) or level two (other applications). The plurality of pieces of feature information may further include feature information related to the electronic device on which the application runs, such as: the screen-off time, screen-on time, and current battery level of the electronic device; the wireless network connection state of the electronic device; and whether the electronic device is in a charging state. These examples of feature information do not limit the present application.
The sample set of the application may include a plurality of samples collected at a preset frequency during a historical time period. The historical time period may be, for example, the past 7 or 10 days; the preset frequency may be, for example, once every 10 minutes or once every half hour. It should be understood that one piece of feature information of the application acquired at one time constitutes one sample, and a plurality of samples constitute the sample set of the application.
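To make the sample notion concrete, the following is a minimal Python sketch of this collection step, under the reading that one piece of feature information acquired at one time is one sample. The feature names, their numeric encodings, and the reader functions are illustrative assumptions rather than part of the patent; on a real device, platform APIs would supply the values.

```python
import random
import time
from typing import NamedTuple

class Sample(NamedTuple):
    # In this scheme one sample is a single piece of feature information,
    # encoded as a numeric value (see the sample map description below).
    feature: str   # which piece of feature information this sample carries
    value: float   # its numeric encoding, e.g. 0/1 for the charging state
    t: float       # collection timestamp

# Stand-ins for the platform APIs that read the real feature information;
# the feature names are illustrative assumptions.
FEATURE_READERS = {
    "charging_state": lambda: float(random.random() > 0.8),   # 0 = not charging
    "battery_level": lambda: float(random.randint(0, 5)),     # 5 graded levels
    "background_duration_min": lambda: random.uniform(0, 120),
    "foreground_entries": lambda: float(random.randint(0, 50)),
}

def collect_samples() -> list[Sample]:
    """One collection pass: each piece of feature information becomes one sample."""
    now = time.time()
    return [Sample(name, read(), now) for name, read in FEATURE_READERS.items()]

# A sample set is many such passes at a preset frequency (e.g. every 10 minutes
# over the past 7 days); the sleep between passes is elided here.
sample_set: list[Sample] = [s for _ in range(144) for s in collect_samples()]
```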
202. Obtain model information of a preset training model, and screen the samples in the sample set according to the model information to determine training samples of the preset training model.
The preset training model may be, for example, a Bayesian model, a hidden Markov model, a convolutional neural network model, or a logistic regression model; the present application is not limited in this respect. The model information of the preset training model may be, for example, the number of inputs of the model: when the preset training model has ten inputs, ten samples may be screened from the sample set as its training samples. The screening may be done in various ways, for example at random, or by selecting feature information related to the application itself, or by selecting feature information related to the electronic device, and the screened samples are used as the training samples of the preset training model.
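A minimal sketch of this screening step, continuing the hypothetical Sample representation above and assuming the model information is simply the number of model inputs; the three strategies are the ones named in the text (random, application-related, device-related).

```python
import random

def screen_training_samples(sample_set: list[Sample], num_inputs: int,
                            strategy: str = "random") -> list[Sample]:
    """Screen samples from the sample set according to the model information
    (here: the number of inputs of the preset training model)."""
    app_features = {"background_duration_min", "foreground_entries"}
    if strategy == "random":
        candidates = sample_set
    elif strategy == "app":    # feature information related to the application itself
        candidates = [s for s in sample_set if s.feature in app_features]
    else:                      # feature information related to the electronic device
        candidates = [s for s in sample_set if s.feature not in app_features]
    return random.sample(candidates, num_inputs)

training_samples = screen_training_samples(sample_set, num_inputs=10)
```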
203. Store the training samples of the preset training model in a cache module.
204. Obtain the training samples of the preset training model from the cache module, and train the preset training model with the obtained training samples to obtain a prediction model.
Before the preset training model is used for cleaning the application, it is trained with the training samples to obtain its model parameters. The preset training model is thereby determined, yielding the prediction model; that is, the prediction model is the preset training model with its model parameters determined.
205. When the application enters the background, predict the current feature information of the application with the prediction model, and determine whether the application can be cleaned according to the prediction result.
The current feature information of the application is acquired and prediction is performed with the prediction model; the prediction result is either cleanable or uncleanable. When the result is cleanable, the application is determined to be cleanable; when the result is uncleanable, the application is determined to be uncleanable. Automatic cleaning of the application is thus realized, which can improve the smoothness of operation of the electronic device and reduce power consumption. Different preset training models predict in different specific ways; the prediction manner can be determined according to the preset training model actually chosen.
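How the prediction result drives cleaning can be pictured as a hook that fires when the application enters the background. This is a hypothetical sketch: on_app_backgrounded, predict, and kill_process are assumed names, and collect_samples is the collection helper sketched earlier.

```python
def on_app_backgrounded(app_id: str, prediction_model, kill_process) -> None:
    # Hypothetical hook: invoked by the system when the application enters
    # the background. collect_samples() reads the current feature information.
    current_features = collect_samples()
    if prediction_model.predict(current_features):  # True means "cleanable"
        kill_process(app_id)   # clean the background application
    # otherwise the application is uncleanable and is left running
```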
In this embodiment, the samples required by the preset training model are screened out in advance and stored in the cache module. This avoids the preset training model having to traverse all samples to screen out the required ones on every run, and avoids reloading the samples into the cache module on every run, which effectively improves operation efficiency.
The following takes a convolutional neural network model as an example to further explain how the embodiments of the present application are applied.
Specifically, obtaining the training samples of the preset training model from the cache module here means obtaining the training samples of the convolutional neural network model from the cache module. Further, the training samples may be assembled into a two-dimensional sample map. Each training sample may be represented by a numerical value; for example, when a training sample is the charging state of the electronic device, 0 and 1 may represent not charging and charging respectively. Likewise, 0-100 may represent the remaining battery level, or the battery level may be divided into 5 grades, with 0-5 representing the different grades. The sample map may be a map of 12 x 12 pixels, where each pixel corresponds to one training sample, i.e., one piece of feature information. Of course, the number of pixels in the sample map may be adjusted as needed, for example to 10 x 10, 16 x 16, or 12 x 16; the larger the data size, the more accurate the subsequent prediction results. It should be noted that a pixel point may be represented by a single value, or by a value within (0, 1). The acquired feature information is thus stored as a two-dimensional numerical image: similar to a grayscale image, different feature values are recorded at the pixel points (x, y).
As shown in fig. 3, fig. 3 is a schematic diagram of a sample map provided in an embodiment of the present application. The training samples can be divided into several categories according to the type of feature information, for example into four categories: application usage information, state information of the electronic device, time information, and position information. Each category of training samples is formed into one sub-sample map 3061, and the four sub-sample maps 3061 are then arranged as a matrix to form one large sample map 306. Each sub-sample map 3061 may be, for example, 6 x 6 pixels; if the training samples are not enough to fill a sub-sample map, the remaining positions are zero-padded. The four sub-sample maps 3061 thus form a 12 x 12 large sample map 306. Both the sample map 306 and the sub-sample maps 3061 are two-dimensional maps.
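The assembly of the 12 x 12 sample map from four zero-padded 6 x 6 sub-sample maps can be sketched directly; this is a minimal version assuming the four fixed categories from the example, with numpy used only for the array layout.

```python
import numpy as np

SUB = 6  # each sub-sample map is 6 x 6 pixels in this example

def build_sub_map(values: list[float]) -> np.ndarray:
    """Form one category of training samples into a 6 x 6 sub-sample map,
    zero-padding the positions the samples do not fill."""
    if len(values) > SUB * SUB:
        raise ValueError("more samples than pixels in the sub-sample map")
    flat = np.zeros(SUB * SUB)
    flat[:len(values)] = values
    return flat.reshape(SUB, SUB)

def build_sample_map(usage, device_state, time_info, position) -> np.ndarray:
    """Arrange the four 6 x 6 sub-sample maps as a matrix into a 12 x 12 map."""
    top = np.hstack([build_sub_map(usage), build_sub_map(device_state)])
    bottom = np.hstack([build_sub_map(time_info), build_sub_map(position)])
    return np.vstack([top, bottom])  # shape (12, 12)

sample_map = build_sample_map([1.0, 0.4, 3.0], [0.0, 5.0], [12.5], [0.3, 0.7])
assert sample_map.shape == (12, 12)
```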
Training the preset training model with the obtained training samples may specifically include training the convolutional neural network model with the obtained training samples. The convolutional neural network model comprises convolutional layers, fully-connected layers, and a classifier connected in sequence. Specifically, the convolutional neural network mainly consists of a network structure part and a network training part: the network structure part comprises the convolutional layers and fully-connected layers connected in sequence, and a pooling layer may also be included between the convolutional layers and the fully-connected layers.
Optionally, the network structure part of the convolutional neural network model may include a five-layer network: the first three layers are convolutional layers, with kernel sizes uniformly 3 x 3 and sliding strides uniformly 1; because the input dimensions are small, no pooling layer need be used. The last two layers are fully-connected layers, with 20 neurons and 2 neurons respectively.
It should be noted that the network structure part may instead include other numbers of convolutional layers, such as 3, 7, or 9 convolutional layers, and other numbers of fully-connected layers, such as 1 or 3 fully-connected layers, and a pooling layer may be added or omitted. The convolution kernels may take other sizes, such as 2 x 2, and different convolutional layers may use kernels of different sizes, for example a 3 x 3 kernel in the first convolutional layer and 2 x 2 kernels in the other layers. The sliding stride may uniformly be 2 or another value, or different strides may be used, for example a stride of 2 in the first layer and a stride of 1 in the other layers.
The network training part comprises a classifier which is a Softmax classifier.
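Under the stated architecture (three 3 x 3, stride-1 convolutional layers with no pooling, then fully-connected layers of 20 and 2 neurons, then a Softmax classifier), a sketch in PyTorch might look as follows; the channel counts and the ReLU activations are assumptions the description does not fix.

```python
import torch
import torch.nn as nn

class CleanupCNN(nn.Module):
    """Five-layer network from the description; channel counts are assumed."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 8, kernel_size=3, stride=1), nn.ReLU(),   # 12x12 -> 10x10
            nn.Conv2d(8, 16, kernel_size=3, stride=1), nn.ReLU(),  # 10x10 -> 8x8
            nn.Conv2d(16, 32, kernel_size=3, stride=1), nn.ReLU(), # 8x8  -> 6x6
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 6 * 6, 20), nn.ReLU(),  # fully-connected, 20 neurons
            nn.Linear(20, 2),                      # 2 neurons: cleanable / uncleanable
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        z = self.classifier(self.features(x))  # second intermediate values Z_k
        return torch.softmax(z, dim=1)         # Softmax classifier probabilities

model = CleanupCNN()
probs = model(torch.rand(1, 1, 12, 12))  # one 12 x 12 sample map
```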
Referring to fig. 4, fig. 4 is a schematic flow chart illustrating a process of training a preset training model by using an acquired training sample according to an embodiment of the present application, which may specifically include the following steps:
401. Input the sample map into the convolutional layers to obtain first intermediate values.

402. Input the first intermediate values into the fully-connected layers to obtain second intermediate values.

403. Input the second intermediate values into the classifier to obtain the probabilities corresponding to a plurality of prediction results.
In some embodiments, the probabilities of the prediction results may be obtained by inputting the second intermediate values into the classifier based on a first preset formula, where the first preset formula is:

$$p_k = \frac{e^{Z_k}}{\sum_{j=1}^{C} e^{Z_j}}$$

where $Z_k$ is the target second intermediate value, $C$ is the number of categories of the prediction results, and $Z_j$ is the j-th second intermediate value.
404. Obtain a loss value according to the plurality of prediction results and the probabilities corresponding to them.
In some embodiments, the loss value may be obtained from the plurality of prediction results and the corresponding probabilities based on a second preset formula, where the second preset formula is:

$$L = -\sum_{k=1}^{C} y_k \log p_k$$

where $C$ is the number of categories of the prediction results and $y_k$ are the true values.
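Read together, the first and second preset formulas are the standard softmax and cross-entropy pairing; a small numerical sketch under that reading:

```python
import numpy as np

def softmax(z: np.ndarray) -> np.ndarray:
    """First preset formula: p_k = exp(Z_k) / sum_j exp(Z_j)."""
    e = np.exp(z - z.max())  # shift for numerical stability
    return e / e.sum()

def cross_entropy(p: np.ndarray, y: np.ndarray) -> float:
    """Second preset formula: L = -sum_k y_k * log(p_k)."""
    return float(-(y * np.log(p)).sum())

z = np.array([1.2, -0.7])    # second intermediate values, C = 2 categories
p = softmax(z)               # probabilities of cleanable / uncleanable
loss = cross_entropy(p, np.array([1.0, 0.0]))  # true value: cleanable
```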
405. Train according to the loss value to obtain the optimized parameters.

Training can be performed based on the loss value using a stochastic gradient descent method; training may also be performed according to an ordinary gradient descent method.
Further, the loss value may be obtained from a plurality of groups of parameters based on a third preset formula, where each group of parameters includes a plurality of prediction results and the probabilities corresponding to them, and the third preset formula is:

$$L = E\!\left[-\sum_{k=1}^{C} y_k \log p_k\right]$$

where $C$ is the number of categories of the prediction results, $y_k$ are the true values, and $E$ denotes the mean value.
The optimized parameters can be obtained by training in mini-batches; if the batch size is 128, E in the third preset formula is the average of 128 loss values.
Specifically, a plurality of training samples may first be obtained and assembled into a plurality of two-dimensional sample maps; the sample maps are then input into the training model to obtain a plurality of loss values, and the average of these loss values is taken.
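Continuing the CleanupCNN sketch above, a minimal mini-batch training loop matching this description (stochastic gradient descent, batch size 128, loss averaged over the batch) might look as follows; the learning rate, epoch count, and random stand-in data are assumptions.

```python
import torch
import torch.nn as nn

model = CleanupCNN()
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = nn.NLLLoss()  # the model outputs probabilities, so feed log-probs

maps = torch.rand(1024, 1, 12, 12)     # stand-in two-dimensional sample maps
labels = torch.randint(0, 2, (1024,))  # 0 = uncleanable, 1 = cleanable

for epoch in range(10):
    for i in range(0, len(maps), 128):  # batch size 128
        probs = model(maps[i:i + 128])
        # NLLLoss averages over the batch: the mean of 128 loss values,
        # as the third preset formula describes.
        loss = loss_fn(torch.log(probs), labels[i:i + 128])
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
```

In practice one would let the network output the raw second intermediate values and use nn.CrossEntropyLoss, which fuses the softmax and the log-loss more stably; the split above simply mirrors the first and second preset formulas.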
The optimized parameters are the optimized parameters (i.e., the model parameters) of the convolutional neural network model, and the prediction model in the embodiments of the present application refers to the convolutional neural network model with these optimized parameters determined. After the application enters the background, its current feature information is acquired; there may be multiple pieces of current feature information, from which a two-dimensional feature map is generated. Prediction can then be performed with the prediction model and the feature map to produce a prediction result, and whether the application can be cleaned is determined according to that result.
It should be noted that the training of the training model may take place on the server side or on the electronic device side. When the training of the model and the prediction with the trained model are both completed on the server side, and the optimized training model needs to be used, the plurality of pieces of feature information of the application can be input to the server; after the server completes the actual prediction, it sends the prediction result to the electronic device, and the electronic device controls the preset background application according to the prediction result.

When the training of the model and the prediction with the trained model are both completed on the electronic device side, and the optimized training model needs to be used, the current feature information of the application can be input on the electronic device; after the electronic device completes the actual prediction, it controls the application according to the prediction result.
In some embodiments, after the training samples of the preset training model are stored in the cache module, the method may further include: sequencing the training samples of the preset training model according to their order of use. For example, the training samples used first are placed at the front of the group and the training samples used last at the back, so that the training samples can be obtained quickly when the preset training model is trained, which improves operation efficiency.
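A minimal sketch of such a cache: insertion order stands in for the order of use, so training can read the cache sequentially. The CacheModule name and its interface are assumptions.

```python
from collections import OrderedDict

class CacheModule:
    """Cache with entries kept in their order of use."""
    def __init__(self):
        self._store: "OrderedDict[str, list]" = OrderedDict()

    def put(self, key: str, samples: list) -> None:
        """Store samples under a key; insertion order is the order of use."""
        self._store[key] = samples

    def read_in_order(self):
        """Yield sample lists in the order they will be used during training."""
        yield from self._store.values()
```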
The preset training model may include at least two levels of calculation models, i.e., at least two calculation processes, and the training samples input to the different levels may differ. For example, the training samples screened from the sample set of the application in step 202 may be divided into at least two parts, which are input into the at least two levels of calculation models respectively. Sequencing the training samples of the preset training model according to their order of use may then include: marking the training samples corresponding to each level of calculation model to obtain a training sample set corresponding to each level of calculation model; and sequencing the training sample sets corresponding to the levels according to the level order of the calculation models, so that the training samples of the preset training model are sequenced according to their order of use.
The calculations of the multi-level calculation models are performed in sequence: after the calculation of the previous level is completed, the calculation of the next level begins, so the level order of the calculation models is their calculation order. Sequencing the training sample sets according to this calculation order places the sample set used first at the front and the sample set used last at the back, which improves calculation efficiency. Further, obtaining the training samples of the preset training model from the cache module and training the preset training model with the obtained samples may include: sequentially obtaining the training sample sets of the levels from the cache module according to their arrangement order, and inputting the training samples in each level's training sample set into the corresponding level of calculation model, where the calculation result output by the previous-level calculation model is input into the next-level calculation model as one of its training samples. The calculation result output by the last level of calculation model is then the training result of the preset training model, i.e., the model parameters of the preset training model.
In some embodiments, before the training sample sets of the levels are sequentially obtained from the cache module according to their arrangement order, the method may further include: sequencing the training samples within each level's training sample set according to their order of use. Each level of calculation model corresponds to one training sample set, so the training samples in the set corresponding to a level are sequenced by the order in which that level uses them: the sample used first is placed at the front and the sample used last at the back, which can further improve operation efficiency. Further, when a training sample set includes the calculation result output by the previous-level calculation model, then before the training samples in the set corresponding to the next-level calculation model are input, the method further includes: adding the calculation result output by the previous-level calculation model to the training sample set corresponding to the next-level calculation model to update that training sample set; and sequencing the training samples in the updated training sample set according to their order of use. All training samples corresponding to a given level of calculation model are thus sequenced according to their order of use, which helps improve operation efficiency.
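The cascading of levels can be sketched on top of the CacheModule above: each level consumes its ordered sample set, and the calculation result of the previous level is appended to the next level's set before that level runs. The two stage functions are purely hypothetical calculations.

```python
def run_levels(cache: "CacheModule", levels: list) -> float:
    result = None
    for level_fn, sample_set in zip(levels, cache.read_in_order()):
        if result is not None:
            sample_set = sample_set + [result]  # update the next level's sample set
        result = level_fn(sample_set)           # calculation of this level
    return result  # output of the last level: the model parameters

level1 = lambda xs: sum(xs) / len(xs)  # hypothetical first-level calculation
level2 = lambda xs: max(xs)            # hypothetical second-level calculation

cache = CacheModule()
cache.put("level1", [0.2, 0.5, 0.9])   # sample sets stored in order of use
cache.put("level2", [0.1, 0.4])
params = run_levels(cache, [level1, level2])
```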
Referring to fig. 5, an embodiment of the present application further provides a background application cleaning apparatus, which may be integrated in an electronic device such as a smart phone or a tablet computer. The background application cleaning apparatus includes an acquisition module 501, a screening module 502, a storage module 503, a training module 504, and a prediction module 505.
The acquisition module 501 is configured to collect a plurality of pieces of feature information of an application as samples and construct a sample set of the application. The application may be any application installed on the electronic device, for example an instant messaging application, a multimedia application, a game application, an information application, a shopping application, a navigation application, or a photographing application. The plurality of pieces of feature information may include feature information related to the application itself, and may also include feature information related to the electronic device on which the application runs.
The sample set of the application may include a plurality of samples collected at a preset frequency during a historical time period. The historical time period may be, for example, the past 7 or 10 days; the preset frequency may be, for example, once every 10 minutes or once every half hour. It should be understood that one piece of feature information of the application acquired at one time constitutes one sample, and a plurality of samples constitute the sample set of the application.
The screening module 502 is configured to obtain model information of a preset training model and screen the samples in the sample set according to the model information to determine training samples of the preset training model. The preset training model may be, for example, a Bayesian model, a hidden Markov model, a convolutional neural network model, or a logistic regression model; the present application is not limited in this respect. The model information of the preset training model may be, for example, the number of inputs of the model: when the preset training model has ten inputs, ten samples may be screened from the sample set as its training samples. The screening may be done in various ways, for example at random, or by selecting feature information related to the application itself, or by selecting feature information related to the electronic device, and the screened samples are used as the training samples of the preset training model.
The storage module 503 is configured to store the training samples of the preset training model in the cache module.
The training module 504 is configured to obtain the training samples of the preset training model from the cache module and train the preset training model with the obtained training samples to obtain a prediction model. Before the preset training model is used for cleaning the application, it is trained with the training samples to obtain its model parameters; the preset training model is thereby determined, yielding the prediction model, i.e., the preset training model with its model parameters determined.
The prediction module 505 is configured to predict the current feature information of the application with the prediction model when the application enters the background, and to determine whether the application can be cleaned according to the prediction result. The current feature information of the application is acquired and prediction is performed with the prediction model; the prediction result is either cleanable or uncleanable. When the result is cleanable, the application is determined to be cleanable; when the result is uncleanable, the application is determined to be uncleanable. Automatic cleaning of the application is thus realized, which can improve the smoothness of operation of the electronic device and reduce power consumption. Different preset training models predict in different specific ways; the prediction manner can be determined according to the preset training model actually chosen.
In this embodiment, the samples required by the preset training model are screened out in advance and stored in the cache module. This avoids the preset training model having to traverse all samples to screen out the required ones on every run, and avoids reloading the samples into the cache module on every run, which effectively improves operation efficiency.
In some embodiments, as shown in fig. 6, the background application cleaning apparatus may further include a sorting module 506. After the training samples of the preset training model are stored in the cache module, the sorting module 506 is configured to sequence the training samples according to their order of use; for example, the training samples used first are placed at the front of the group and the training samples used last at the back, so that the training samples can be obtained quickly when the preset training model is trained, which improves operation efficiency.
In some embodiments, the preset training model may include at least two levels of calculation models, i.e., at least two calculation processes, and the training samples input to the different levels may differ; for example, the training samples screened from the sample set of the application in step 202 may be divided into at least two parts, which are input into the at least two levels of calculation models respectively. The sorting module 506 may be specifically configured to mark the training samples corresponding to each level of calculation model to obtain a training sample set corresponding to each level, and then to sequence the training sample sets according to the level order of the calculation models, so that the training samples of the preset training model are sequenced according to their order of use. The calculations of the multi-level calculation models are performed in sequence: after the calculation of the previous level is completed, the calculation of the next level begins, so the level order of the calculation models is their calculation order. Sequencing the training sample sets according to this order places the sample set used first at the front and the sample set used last at the back, which improves calculation efficiency.
The training module 504 is specifically configured to sequentially obtain the training sample sets of the levels from the cache module according to their arrangement order, and to input the training samples in each level's training sample set into the corresponding level of calculation model, where the calculation result output by the previous-level calculation model is input into the next-level calculation model as one of its training samples. The calculation result output by the last level of calculation model is then the training result of the preset training model, i.e., the model parameters of the preset training model.
Before the training module 504 sequentially obtains the training sample sets of the levels from the cache module according to their arrangement order, the sorting module 506 is further configured to sequence the training samples in each level's training sample set according to their order of use. Each level of calculation model corresponds to one training sample set, so the training samples in the set corresponding to a level are sequenced by the order in which that level uses them: the sample used first is placed at the front and the sample used last at the back, which can further improve operation efficiency.
Further, when a training sample set includes the calculation result output by the previous-level calculation model, then before the training samples in the set corresponding to the next-level calculation model are input into that model, the sorting module 506 is further configured to add the calculation result output by the previous-level calculation model to the training sample set corresponding to the next-level calculation model to update that training sample set, and then to sequence the training samples in the updated training sample set according to their order of use. All training samples corresponding to a given level of calculation model are thus sequenced according to their order of use, which helps improve operation efficiency.
In specific implementation, the above modules may be implemented as independent entities, or may be combined arbitrarily to be implemented as the same or several entities, and specific implementation of the above modules may refer to the foregoing method embodiments, which are not described herein again.
An embodiment of the present application further provides an electronic device. Referring to fig. 7, the electronic device 700 includes a processor 701 and a memory 702; the processor 701 is electrically connected to the memory 702.
The processor 701 is the control center of the electronic device 700. It connects the various parts of the electronic device using various interfaces and lines, performs the various functions of the electronic device 700 by running or loading the computer program stored in the memory 702, and calls and processes the data stored in the memory 702, thereby monitoring the electronic device 700 as a whole.
The memory 702 may be used to store software programs and modules, and the processor 701 executes various functional applications and performs data processing by running the computer programs and modules stored in the memory 702. The memory 702 may mainly include a program storage area and a data storage area: the program storage area may store the operating system, computer programs required by at least one function (such as a sound playing function or an image playing function), and the like; the data storage area may store data created according to the use of the electronic device, and the like. In addition, the memory 702 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device. Accordingly, the memory 702 may also include a memory controller to provide the processor 701 with access to the memory 702.
In this embodiment, the processor 701 in the electronic device 700 loads instructions corresponding to the processes of one or more computer programs into the memory 702 and runs the computer programs stored in the memory 702, thereby implementing the following functions:
collecting a plurality of pieces of feature information of an application as samples to construct a sample set of the application; obtaining model information of a preset training model, and screening the samples in the sample set according to the model information to determine training samples of the preset training model; storing the training samples of the preset training model in a cache module; obtaining the training samples of the preset training model from the cache module, and training the preset training model with the obtained training samples to obtain a prediction model; and when the application enters the background, predicting the current feature information of the application with the prediction model and determining whether the application can be cleaned according to the prediction result.
After the training samples of the preset training model are stored in the cache module, the training samples of the preset training model can be sequenced according to their order of use.
The preset training model includes at least two levels of calculation models. The training samples corresponding to each level of calculation model can be marked to obtain a training sample set corresponding to each level, and the training sample sets can then be sequenced according to the level order of the calculation models, so that the training samples of the preset training model are sequenced according to their order of use.
When the preset training model is trained, the training sample sets of the levels can be sequentially obtained from the cache module according to their arrangement order, and the training samples in each level's training sample set are input into the corresponding level of calculation model, where the calculation result output by the previous-level calculation model is input into the next-level calculation model as one of its training samples.
Before the training samples in the training sample set corresponding to the next-level calculation model are input into the next-level calculation model, the calculation result output by the previous-level calculation model can be added to the training sample set corresponding to the next-level calculation model to update that training sample set, and the training samples in the updated training sample set are sequenced according to their order of use.
As can be seen from the above, the electronic device of the embodiments of the present application collects a plurality of pieces of feature information of an application as samples to construct a sample set of the application, then obtains model information of a preset training model and screens the samples in the sample set according to the model information to determine training samples of the preset training model, and stores the training samples in a cache module. The training samples of the preset training model are then obtained from the cache module and used to train the preset training model to obtain a prediction model. When the application enters the background, the current feature information of the application is predicted with the prediction model, and whether the application can be cleaned is determined according to the prediction result. Automatic cleaning of background applications is thus realized, the smoothness of operation of the electronic device is improved, and power consumption is reduced.
Referring to fig. 8, in some embodiments, the electronic device 700 may further include: a display 703, a radio circuit 704, an audio circuit 705, and a power supply 706. The display 703, the rf circuit 704, the audio circuit 705, and the power source 706 are electrically connected to the processor 701 respectively.
The display 703 may be used to display information entered by or provided to the user as well as various graphical user interfaces, which may be comprised of graphics, text, icons, video, and any combination thereof. The Display 703 may include a Display panel, which may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like, in some embodiments.
The radio frequency circuit 704 may be used to transmit and receive radio frequency signals, so as to establish wireless communication with a network device or other electronic devices and exchange signals with them.
The audio circuit 705 may be used to provide an audio interface between a user and an electronic device through a speaker, microphone.
The power supply 706 may be used to power various components of the electronic device 700. In some embodiments, the power supply 706 may be logically coupled to the processor 701 through a power management system, such that the power management system may perform functions of managing charging, discharging, and power consumption.
Although not shown in fig. 8, the electronic device 700 may further include a camera, a bluetooth module, and the like, which are not described in detail herein.
An embodiment of the present application further provides a storage medium storing a computer program; when the computer program runs on a computer, it causes the computer to execute the background application cleaning method of any one of the above embodiments, for example: collecting a plurality of pieces of feature information of an application as samples to construct a sample set of the application; obtaining model information of a preset training model, and screening the samples in the sample set according to the model information to determine training samples of the preset training model; storing the training samples of the preset training model in a cache module; obtaining the training samples of the preset training model from the cache module, and training the preset training model with the obtained training samples to obtain a prediction model; and when the application enters the background, predicting the current feature information of the application with the prediction model and determining whether the application can be cleaned according to the prediction result.
In the embodiment of the present application, the storage medium may be a magnetic disk, an optical disk, a Read Only Memory (ROM), a Random Access Memory (RAM), or the like.
In the foregoing embodiments, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
It should be noted that, for the background application cleaning method of the embodiments of the present application, a person of ordinary skill in the art can understand that all or part of the process of implementing the method can be completed by controlling the relevant hardware through a computer program. The computer program can be stored in a computer-readable storage medium, such as a memory of an electronic device, and executed by at least one processor in the electronic device, and the execution process can include the processes of the embodiments of the background application cleaning method. The storage medium may be a magnetic disk, an optical disk, a read-only memory, a random access memory, or the like.
For the background application cleaning device in the embodiment of the present application, each functional module may be integrated in one processing chip, or each module may exist alone physically, or two or more modules are integrated in one module. The integrated module can be realized in a hardware mode, and can also be realized in a software functional module mode. The integrated module, if implemented in the form of a software functional module and sold or used as a stand-alone product, may also be stored in a computer readable storage medium, such as a read-only memory, a magnetic or optical disk, or the like.
The background application cleaning method, the background application cleaning device, the storage medium and the electronic device provided by the embodiment of the application are introduced in detail, a specific example is applied in the description to explain the principle and the implementation manner of the application, and the description of the embodiment is only used for helping to understand the method and the core idea of the application; meanwhile, for those skilled in the art, according to the idea of the present application, there may be variations in the specific embodiments and the application scope, and in summary, the content of the present specification should not be construed as a limitation to the present application.
Claims (8)
1. A background application cleanup method, comprising:
collecting a plurality of pieces of feature information of the application as samples;
classifying the samples according to the types of the feature information to construct a classified sample set of the application;
obtaining model information of a preset training model, and screening samples in the classified sample set according to the model information of the preset training model to determine training samples of the preset training model;
storing the training samples of the preset training model into a cache module;
wherein the preset training model comprises at least two levels of calculation models, and training samples corresponding to each level of calculation model are marked to obtain a training sample set corresponding to each level of calculation model;
sequencing the training sample sets corresponding to the levels of calculation models according to the level order of the calculation models, so that the training samples of the preset training model are sequenced according to their order of use;
according to the arrangement order of the training sample sets, sequentially obtaining the training sample sets of the levels from the cache module, and inputting the training samples in each level's training sample set into the corresponding level of calculation model, wherein a calculation result output by a previous-level calculation model is input into the next-level calculation model as one of its training samples, to obtain a prediction model;
and when the application enters the background, predicting the current feature information of the application by using the prediction model, and determining whether the application can be cleaned according to a prediction result.
2. The method according to claim 1, wherein before the training sample sets of the levels are sequentially obtained from the cache module according to the arrangement order of the training sample sets, the method further comprises:

sequencing the training samples in the training sample set of each level according to their order of use.
3. The method according to claim 2, wherein before the training samples in the training sample set corresponding to the next-level calculation model are input into the next-level calculation model, the method further comprises:

adding the calculation result output by the previous-level calculation model to the training sample set corresponding to the next-level calculation model to update the training sample set corresponding to the next-level calculation model;

and sequencing the training samples in the updated training sample set of the next-level calculation model according to their order of use.
4. A background application cleaning apparatus, comprising:
the acquisition module is used for collecting a plurality of characteristic information of an application as samples, classifying the samples according to the types of the characteristic information, and constructing a classified sample set of the application;
the screening module is used for obtaining model information of a preset training model and screening the samples in the classified sample set according to the model information of the preset training model so as to determine training samples of the preset training model;
the storage module is used for storing the training samples of the preset training model into the cache module;
wherein the preset training model comprises at least two levels of calculation models, and the sorting module is used for marking the training samples corresponding to each level of calculation model to obtain a training sample set corresponding to each level of calculation model, and for arranging the training sample sets corresponding to the levels of calculation models in level order so that the training samples of the preset training model are ordered according to their order of use;
the training module is used for sequentially obtaining the training sample set of each level from the cache module according to the arrangement order of the training sample sets, inputting the training samples in the training sample set of each level into the calculation model of the corresponding level, and inputting a calculation result output by a previous-level calculation model into the next-level calculation model as a training sample of the next-level calculation model, so as to obtain a prediction model;
and the prediction module is used for predicting the current characteristic information of the application by using the prediction model when the application enters the background, and determining whether the application can be cleaned according to the prediction result.
5. The apparatus of claim 4, wherein the sorting module is further configured to sort the training samples in the training sample set of each level according to their order of use.
6. The apparatus of claim 5, wherein the sorting module is further configured to:
add the calculation result output by the previous-level calculation model to the training sample set corresponding to the next-level calculation model, so as to update the training sample set corresponding to the next-level calculation model;
and sort the training samples in the updated training sample set of the next-level calculation model according to their order of use.
7. A storage medium having stored thereon a computer program, characterized in that, when the computer program is run on a computer, it causes the computer to execute the background application cleaning method according to any one of claims 1 to 3.
8. An electronic device comprising a processor and a memory, the memory having a computer program, wherein the processor is configured to execute the background application cleaning method according to any one of claims 1 to 3 by calling the computer program.
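To make the claimed training flow easier to follow, the sketch below walks through claim 1 in Python. It is a minimal illustration under stated assumptions, not the patented implementation: the class and function names (`LevelModel`, `build_prediction_model`, `can_clean`), the frequency-count learner standing in for each level's calculation model, and the example characteristic types are all hypothetical, since the patent leaves them unspecified. What the sketch mirrors is only the claimed sequence: classify samples by characteristic type, screen them per level against the model information, cache the per-level sample sets in level order, train level by level with each level's output appended to the next level's samples, and finally predict whether a backgrounded application can be cleaned.

```python
from collections import OrderedDict

class LevelModel:
    """Toy stand-in for one level of the claimed multi-level calculation model.

    `wanted_types` plays the role of the "model information" used to screen
    the classified samples; the frequency-count learner is a placeholder for
    whatever real per-level model an implementation would choose.
    """

    def __init__(self, wanted_types):
        self.wanted_types = wanted_types
        self.weights = {}

    def fit(self, samples):
        # samples: list of (feature_type, value, label) tuples, label in {0, 1}
        for ftype, value, label in samples:
            key = (ftype, value)
            self.weights[key] = self.weights.get(key, 0) + (1 if label else -1)
        # The per-sample scores become the "calculation results" that the
        # claim feeds into the next level as training samples.
        return [(ftype, self.score([(ftype, value)]), label)
                for ftype, value, label in samples]

    def score(self, features):
        # features: list of (feature_type, value) pairs
        return sum(self.weights.get((f[0], f[1]), 0) for f in features)


def build_prediction_model(raw_samples, level_models):
    # Step 1: classify samples by the type of characteristic information.
    classified = {}
    for sample in raw_samples:
        classified.setdefault(sample[0], []).append(sample)

    # Steps 2-3: screen the classified set per level and cache the per-level
    # training sample sets in level order (the "cache module").
    cache = OrderedDict()
    for level, model in enumerate(level_models):
        cache[level] = [s for t in model.wanted_types for s in classified.get(t, [])]

    # Step 4: train level by level; each level's outputs are added to the
    # next level's training sample set before that level is trained.
    carried = []
    for level, model in enumerate(level_models):
        carried = model.fit(cache[level] + carried)
    return level_models


def can_clean(level_models, current_features, threshold=0):
    # When the application enters the background, predict on its current
    # characteristic information and decide whether it can be cleaned.
    total = sum(m.score(current_features) for m in level_models)
    return total < threshold
```

Under these assumptions, `build_prediction_model(samples, [LevelModel(["screen_off_time"]), LevelModel(["memory_usage"])])` would train a two-level cascade, and `can_clean(models, [("memory_usage", "high")])` would return a cleaning decision; the concrete characteristic types, per-level models, and decision threshold are all left open by the claims.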
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201711123657.7A CN107797831B (en) | 2017-11-14 | 2017-11-14 | Background application cleaning method and device, storage medium and electronic equipment |
Publications (2)
Publication Number | Publication Date |
---|---|
CN107797831A CN107797831A (en) | 2018-03-13 |
CN107797831B (en) | 2021-06-01 |
Family
ID=61535920
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201711123657.7A | Background application cleaning method and device, storage medium and electronic equipment | 2017-11-14 | 2017-11-14 |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN107797831B (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2020206696A1 (en) * | 2019-04-12 | 2020-10-15 | Shenzhen Heytap Technology Corp., Ltd. | Application cleaning method, apparatus, storage medium and electronic device |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102831442A (en) * | 2011-06-13 | 2012-12-19 | Sony Corporation | Abnormal behavior detection method and equipment and method and equipment for generating abnormal behavior detection equipment |
CN105389193A (en) * | 2015-12-25 | 2016-03-09 | Beijing Qihoo Technology Co., Ltd. | Accelerating processing method, device and system for application, and server |
CN106096538A (en) * | 2016-06-08 | 2016-11-09 | Institute of Automation, Chinese Academy of Sciences | Face identification method based on sequencing neural network model and device |
CN107133094A (en) * | 2017-06-05 | 2017-09-05 | Nubia Technology Co., Ltd. | Application management method, mobile terminal and computer-readable recording medium |
Also Published As
Publication number | Publication date |
---|---|
CN107797831A (en) | 2018-03-13 |
Similar Documents
Publication | Title |
---|---|
CN108337358B (en) | Application cleaning method and device, storage medium and electronic equipment |
CN111813532B (en) | Image management method and device based on multitask machine learning model |
WO2019062413A1 (en) | Method and apparatus for managing and controlling application program, storage medium, and electronic device |
CN113039562A (en) | Probabilistic neural network architecture generation |
CN107870810B (en) | Application cleaning method and device, storage medium and electronic equipment |
CN108108455B (en) | Destination pushing method and device, storage medium and electronic equipment |
CN107678531B (en) | Application cleaning method and device, storage medium and electronic equipment |
CN107885545B (en) | Application management method and device, storage medium and electronic equipment |
WO2019062418A1 (en) | Application cleaning method and apparatus, storage medium and electronic device |
CN110705646B (en) | Mobile equipment streaming data identification method based on model dynamic update |
CN107943582B (en) | Feature processing method, feature processing device, storage medium and electronic equipment |
CN107608778B (en) | Application program control method and device, storage medium and electronic equipment |
CN109992367A (en) | Application processing method and device, electronic equipment, computer readable storage medium |
CN111222557A (en) | Image classification method and device, storage medium and electronic equipment |
CN112906865B (en) | Neural network architecture searching method and device, electronic equipment and storage medium |
CN111797870A (en) | Optimization method and device of algorithm model, storage medium and electronic equipment |
WO2022035441A1 (en) | Dynamic dispatching with robustness for large-scale heterogeneous mining fleet via deep reinforcement learning |
CN115879508A (en) | Data processing method and related device |
CN107797831B (en) | Background application cleaning method and device, storage medium and electronic equipment |
WO2019062411A1 (en) | Method for managing and controlling background application program, storage medium, and electronic device |
CN108234758B (en) | Application display method and device, storage medium and electronic equipment |
CN108681480B (en) | Background application program control method and device, storage medium and electronic equipment |
CN107870809B (en) | Application closing method and device, storage medium and electronic equipment |
CN112948763B (en) | Piece quantity prediction method and device, electronic equipment and storage medium |
CN109961163 (en) | Gender prediction method, apparatus, storage medium and electronic equipment |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| CB02 | Change of applicant information | Address (before and after): No. 18 Wusha Beach Road, Chang'an Town, Dongguan, Guangdong 523860; Applicant (before and after): GUANGDONG OPPO MOBILE TELECOMMUNICATIONS Corp.,Ltd. |
| GR01 | Patent grant | |