WO2019128598A1 - Application processing method, electronic device and computer-readable storage medium - Google Patents


Info

Publication number
WO2019128598A1
WO2019128598A1 · PCT/CN2018/117694 · CN2018117694W
Authority
WO
WIPO (PCT)
Prior art keywords
feature
sample set
information gain
sample
classification
Prior art date
Application number
PCT/CN2018/117694
Other languages
English (en)
Chinese (zh)
Inventor
方攀
陈岩
Original Assignee
Oppo广东移动通信有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Oppo广东移动通信有限公司
Publication of WO2019128598A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F9/4418 Suspend and resume; Hibernate and awake
    • G06F9/48 Program initiating; Program switching, e.g. by interrupt
    • G06F9/485 Task life-cycle, e.g. stopping, restarting, resuming execution
    • G06F9/50 Allocation of resources, e.g. of the central processing unit [CPU]
    • G06F9/5016 Allocation of resources to service a request, the resource being the memory
    • G06F9/5027 Allocation of resources to service a request, the resource being a machine, e.g. CPUs, servers, terminals

Definitions

  • the present application relates to the field of data processing, and in particular, to an application processing method, an electronic device, and a computer readable storage medium.
  • when system resources are insufficient, the traditional method is to forcibly exit the corresponding background applications.
  • specifically, the traditional method considers statistics such as how long an application has stayed in the background and how frequently it is used, selects the background applications that have stayed in the background for a long time or whose usage frequency is low, and forcibly exits the selected background applications.
  • An application processing method, an electronic device, and a computer readable storage medium are provided according to various embodiments of the present application.
  • An application processing method includes: acquiring first feature data of each feature, the first feature data being the feature data of the corresponding feature at the predicted time; acquiring a decision tree model for predicting whether the user will use the target application within a preset duration, the starting time of the preset duration being the predicted time; using the first feature data as an input of the decision tree model and outputting a prediction result; and freezing the target application when the prediction result is that the target application will not be used within the preset duration.
  • An electronic device includes a memory and a processor, wherein the memory stores a computer program, and when the computer program is executed by the processor, the processor performs the following operations:
  • acquiring first feature data of each feature, the first feature data being the feature data of the corresponding feature at the predicted time; acquiring a decision tree model for predicting whether the user will use the target application within a preset duration, the starting time of the preset duration being the predicted time; using the first feature data as an input of the decision tree model and outputting a prediction result; and freezing the target application when the prediction result is that the target application will not be used within the preset duration.
  • a computer readable storage medium having stored thereon a computer program that is executed by a processor to:
  • acquiring first feature data of each feature, the first feature data being the feature data of the corresponding feature at the predicted time; acquiring a decision tree model for predicting whether the user will use the target application within a preset duration, the starting time of the preset duration being the predicted time; using the first feature data as an input of the decision tree model and outputting a prediction result; and freezing the target application when the prediction result is that the target application will not be used within the preset duration.
  • the decision tree model of the target application is set in advance. The first feature data of each feature and the decision tree model of the target application are acquired, and the first feature data is used as the input of the decision tree model to obtain the prediction result of whether the user will use the target application within the preset duration. When the prediction result is that the target application will not be used within the preset duration, the target application is frozen to limit its occupation of resources. This improves the accuracy of freezing the target application, which in turn improves the effectiveness of releasing system resources.
  • FIG. 1 is a schematic diagram showing the internal structure of an electronic device in an embodiment.
  • FIG. 2 is a partial block diagram of a system in an electronic device in an embodiment.
  • FIG. 3 is an application environment diagram of an application processing method in an embodiment.
  • FIG. 4 is a flow chart of an application processing method in one embodiment.
  • FIG. 5 is a flow chart, in one embodiment, of classifying the samples of a sample set according to the information gain rate of each feature for classifying the sample set, to generate a decision tree model for predicting whether a user will use a target application within a preset duration.
  • Figure 6A is a schematic diagram of a decision tree in one embodiment.
  • Figure 6B is a schematic diagram of a decision tree in another embodiment.
  • Figure 6C is a schematic diagram of a decision tree in yet another embodiment.
  • FIG. 7 is a flow diagram of obtaining an information gain rate for a feature classification for a target sample set in one embodiment.
  • Figure 8 is a flow chart of an application processing method in another embodiment.
  • Figure 9 is a block diagram showing the structure of an application processing device in an embodiment.
  • Figure 10 is a block diagram showing the structure of an application processing device in another embodiment.
  • Figure 11 is a block diagram showing the structure of an application processing device in still another embodiment.
  • Figure 12 is a block diagram showing a portion of the structure of a handset in an embodiment.
  • first, second and the like may be used to describe various elements, but these elements are not limited by these terms. These terms are only used to distinguish one element from another.
  • first feature data may be referred to as second feature data without departing from the scope of the present invention, and similarly, the second feature data may be referred to as first feature data.
  • Both the first feature data and the second feature data are feature data, but they are not the same feature data.
  • the internal structure of an electronic device, as shown in FIG. 1, includes a processor, a memory, and a display screen connected by a system bus.
  • the processor is used to provide computing and control capabilities to support the operation of the entire electronic device.
  • the memory is used to store data, programs, and/or instruction codes, etc., and the memory stores at least one computer program, which can be executed by the processor to implement an application processing method suitable for an electronic device provided in the embodiments of the present application.
  • the memory may include a non-volatile storage medium such as a magnetic disk, an optical disk, or a read-only memory (ROM), or a random access memory (RAM).
  • the memory includes a non-volatile storage medium and an internal memory.
  • Non-volatile storage media stores operating systems, databases, and computer programs.
  • the database stores data related to an application processing method provided by the above various embodiments, for example, information such as the name of each process or application may be stored.
  • the computer program can be executed by a processor for implementing an application processing method provided by various embodiments of the present application.
  • the internal memory provides a cached operating environment for operating systems, databases, and computer programs in non-volatile storage media.
  • the display screen can be a touch screen, such as a capacitive screen or an electronic screen, used to display the interface information of the application corresponding to the first process; it can also be used to detect touch operations applied to the display screen and generate corresponding instructions, such as instructions for switching between foreground and background applications.
  • the structure shown in FIG. 1 is only a block diagram of the part of the structure related to the solution of the present application, and does not constitute a limitation on the electronic devices to which the solution of the present application is applied; a specific electronic device may include more or fewer components than shown in the figure, combine some components, or arrange the components differently.
  • the electronic device further includes a network interface connected through a system bus, and the network interface may be an Ethernet card or a wireless network card, etc., for communicating with an external electronic device, for example, for communicating with a server.
  • the architecture of the electronic device includes a JAVA spatial layer 210, a local framework layer 220, and a kernel space layer 230.
  • the JAVA spatial layer 210 can include a freeze management application 212, through which the electronic device can implement a freeze policy for each application and perform freeze and thaw management operations on power-consuming background applications.
  • the resource priority and restriction management module 222 and the platform freeze management module 224 are included in the local framework layer 220.
  • through the resource priority and restriction management module 222, the electronic device can maintain different applications at different priorities and in different resource groups, and adjust an application's resource group according to the requirements of the upper layer, achieving optimized performance and power savings.
  • through the platform freeze management module 224, the electronic device can assign background tasks that can be frozen to frozen layers corresponding to preset levels, according to the length of time they have been eligible for freezing.
  • the frozen layers can include three levels: a CPU-limited sleep mode, a CPU freeze sleep mode, and a process deep freeze mode.
  • the CPU-limited sleep mode limits the CPU resources occupied by the related processes so that they occupy fewer CPU resources, and the freed CPU resources are shifted to other, unfrozen processes.
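The three freeze levels above can be sketched as a simple level-selection policy. This is an illustrative Python sketch, not the patent's implementation; the 5- and 30-minute thresholds are assumptions, since the text only says that levels are assigned according to the length of the freeze time.

```python
from enum import Enum

class FreezeLevel(Enum):
    CPU_LIMIT_SLEEP = 1   # throttle the process's CPU share; it keeps running slowly
    CPU_FREEZE_SLEEP = 2  # stop scheduling the process entirely
    DEEP_FREEZE = 3       # additionally reclaim the process's file-backed memory

def freeze_level_for(frozen_minutes: int) -> FreezeLevel:
    # The minute thresholds below are illustrative assumptions; the text
    # only says levels are assigned by the length of the freeze time.
    if frozen_minutes < 5:
        return FreezeLevel.CPU_LIMIT_SLEEP
    if frozen_minutes < 30:
        return FreezeLevel.CPU_FREEZE_SLEEP
    return FreezeLevel.DEEP_FREEZE
```

A task that has just become freezable is only throttled, while one frozen for a long time is deep-frozen, matching the escalation the text describes.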
  • the kernel space layer 230 includes a UID management module 231, a Cgroup module 232, a Binder management module 233, a process memory recovery module 234, and a freeze timeout exit module 235.
  • the UID management module 231 is configured to implement an application-based User Identifier (UID) to manage resources of a third-party application or freeze. Compared with the Process Identifier (PID) for process management and control, it is easier to uniformly manage the resources of a user's application through UID.
  • the Cgroup module 232 is used to provide a complete set of Central Processing Unit (CPU), CPUSET, memory, input/output (I/O), and Net related resource restriction mechanisms.
  • the Binder management module 233 is used to implement the priority control of the background binder communication.
  • the interface module of the local framework layer 220 includes a binder interface developed to the upper layer, and the upper layer framework or application sends a resource restriction or frozen instruction to the resource priority and restriction management module 222 and the platform freeze management module 224 through the provided binder interface.
  • the process memory recovery module 234 is configured to implement the process deep freeze mode, so that when a third-party application has been in a frozen state for a long time, the file-backed memory area of the process is released, thereby saving memory and speeding up the application's next startup.
  • the freeze timeout exit module 235 is configured to resolve an exception generated by the freeze timeout scenario.
  • an electronic device may collect, over different time periods, the feature data of preset features that embody the user's behavior habits as samples, forming a sample set, and classify the samples of the sample set according to the information gain rate of each feature for classifying the samples, to construct a decision tree model for predicting whether the target application will be used within a preset duration; the feature data of the preset features at the predicted time is then used as the input of the decision tree model to obtain a prediction result.
  • when the prediction result is that the target application will not be used within the preset duration, the target application is frozen; otherwise, it is not frozen.
  • an application processing method is provided.
  • the embodiment is applied to the electronic device shown in FIG. 1 as an example for description.
  • the method includes:
  • Operation 402 Acquire first feature data of each feature, where the first feature data is feature data of the corresponding feature at the predicted time.
  • the feature data is data for characterizing the operational habits of the user's application to the electronic device.
  • the feature may cover one or more dimensions among the hardware and software features of the electronic device, the device usage state, and the time and duration of user operations.
  • the electronic device can place instrumentation (buried) points in preset paths of the system and detect the feature data of each feature in real time at a preset sampling frequency.
  • the electronic device may also record the feature data of the relevant feature and the application identifier of the application at the startup time when the application is activated, and associate the application identifier with the recorded feature data.
  • the feature includes one or more of: the current time period when the corresponding data is recorded, the current date category, the application identifier of the previous foreground application, the application identifier of the foreground application before that, the current wireless fidelity (WiFi) connection status, the WiFi identifier of the connected WiFi, the duration for which the application stays in the background, the duration for which the application is paused in the background, the current headset plug-in status, the current charging status, the current battery level, the manner in which the application was switched, the category of the application, and the number of times the application has switched to the foreground.
  • the date category may include working days and rest days; the WiFi connection status includes WiFi not connected and WiFi connected; the WiFi identifier may be a Service Set Identifier (SSID), a Basic Service Set Identifier (BSSID), or any type of information that can uniquely identify the WiFi; and the categories of applications include social applications, payment applications, game applications, tool applications, and the like. The classification of applications may include multiple types and is not limited thereto.
  • the application switching means that the application switches from the foreground to the background or from the background to the foreground.
  • the switching manner may include application switching formed by directly opening an application from its startup icon, application switching formed by tapping an application notification message, application switching formed by directly exiting an application, and the like.
  • the previous foreground application indicates the application that was running in the foreground at the moment when the feature was recorded.
  • the previous application indicates the application that had most recently run in the foreground before that, at the time the feature was recorded.
  • the first feature data refers to the feature data of the corresponding feature at the predicted time.
  • for the target application, for example, the first feature data may include one or more of: the current time period at the predicted time, the current date category, the name of the previous foreground application, the name of the foreground application before that, the current WiFi connection status, the WiFi identifier of the connected WiFi, the duration for which the target application stays in the background, the duration for which the target application is paused in the background, the current headset plug-in status, the current charging status, the current battery level, the manner in which the target application was switched, the category of the target application, the number of times the target application has switched to the foreground, and the like.
  • the predicted time may be the current time.
  • the electronic device may initiate, through the freeze management application 212, a freeze instruction for one or more applications running in the background, and begin to acquire the first feature data of each feature according to the instruction.
  • the electronic device may trigger the application freeze instruction when detecting that the available resource is lower than a preset threshold, or may initiate a freeze instruction to the application when detecting that an application is switched to the background.
  • the application for which the freeze instruction is directed is the target application.
  • Operation 404 obtaining a decision tree model for predicting whether the user will use the target application within a preset duration; the starting time of the preset duration is the predicted time.
  • the decision tree model is a model for predicting whether the target application will be used within the preset duration.
  • the starting time of the preset duration is the predicted time, that is, whether the predicted target application will be used within the preset duration from the current time.
  • the preset duration can be any suitable length of time, and can be set according to an empirical value, such as 1 minute, 5 minutes, 10 minutes, or 30 minutes.
  • the target application is the application at which the prediction is directed, and it may be one or more of the applications in the electronic device that are currently running in the background.
  • the electronic device presets a decision tree model corresponding to the target application, and obtains a decision tree model corresponding to the target application to predict whether the target application will be used by the user within a preset duration.
  • different decision tree models can be set correspondingly.
  • a corresponding decision tree model can be set for each application, or a corresponding decision tree model can be set for each type of application.
  • the first feature data is used as an input of the decision tree model, and the prediction result is output.
  • the electronic device may use the collected first feature data of the respective features at the predicted time as the input of the decision tree model, and run the decision tree model to obtain the output of the decision tree; this output is the prediction result.
  • the prediction result includes that the target application will not be used within the preset duration or will be used within the preset duration.
  • the electronic device can freeze the target application.
  • the electronic device may send the prediction result to the platform freeze management module 224 shown in FIG. 2; when the platform freeze management module 224 receives a prediction result indicating that the target application will not be used within the preset duration, it initiates a freeze operation on the target application, thereby limiting the resources the target application can use, for example by freezing the target application in any one of the CPU-limited sleep mode, the CPU freeze sleep mode, or the process deep freeze mode.
  • in the application processing method described above, the decision tree model of the target application is set in advance; the first feature data of each feature and the decision tree model of the target application are acquired, the first feature data is used as the input of the decision tree model, and the model outputs the prediction result of whether the user will use the target application within the preset duration. When the prediction result is that the target application will not be used within the preset duration, the target application is frozen to limit its occupation of resources. Since freezing an application itself consumes a certain amount of system resources, predicting whether the target application will be used within the preset duration, and freezing it only when it will not be used, avoids unnecessary freeze operations in which a frozen application must be thawed because the user uses it within the preset duration; this improves the accuracy of freezing the target application and thus the effectiveness of releasing system resources.
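The predict-then-freeze flow above can be sketched as follows; `model` and `freeze` are hypothetical callables standing in for the decision tree model and the platform freeze module.

```python
def process_application(model, first_feature_data, freeze):
    """Run the decision tree on the feature data sampled at the predicted
    time, and freeze the target application only when the model predicts
    it will not be used within the preset duration.

    `model` and `freeze` are assumed callables, not part of the patent text.
    """
    will_be_used = model(first_feature_data)
    if not will_be_used:
        freeze()  # freeze only on a negative prediction, avoiding wasted freeze/thaw cycles
    return will_be_used
```

The point of the guard is exactly the one the text makes: the freeze operation itself costs resources, so it is skipped when the model expects the application to be used soon.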
  • in some embodiments, before operation 402, the method further includes: acquiring second feature data of each preset feature as samples and generating a sample set, where the second feature data is the feature data of the corresponding feature recorded when a reference application was started before the predicted time, and the reference applications include the target application; and, when the data volume of the sample set exceeds a preset threshold, classifying the samples of the sample set according to the information gain rate of each feature for classifying the sample set, to generate the decision tree model for predicting whether the user will use the target application within the preset duration.
  • the second feature data is the feature data of each feature recorded before the predicted time, and it is associated with the application identifier of the corresponding reference application; that is, before the predicted time, when a reference application is detected to start, the feature data of each feature (i.e., the second feature data) is recorded at the startup time and associated with the application identifier of that reference application.
  • the electronic device may collect the second feature data of each feature at a preset frequency over a historical time period, and use the second feature data recorded each time as one sample to form a sample set.
  • a decision tree model of each application requiring prediction is constructed based on the accumulated sample set.
  • the historical time period may be, for example, the past 7 days or 10 days;
  • the preset frequency may be, for example, once every 10 minutes or once every half hour. It can be understood that the multi-dimensional feature data collected at one time constitutes one sample, and the samples collected over multiple times constitute the sample set.
  • each sample in the sample set can be labeled according to the application identifier associated with it, to obtain the sample label of each sample.
  • the label of each sample is the associated application identifier. Since the method is directed at the target application, the sample labels can be divided into "target application" and "non-target application"; that is, the sample categories include "target application" and "non-target application".
  • when the target application is started, the sample label of the recorded second feature data of each feature is set to "target application"; when a non-target application is started, the sample label of the recorded second feature data is set to "non-target application".
  • for example, the value "1" may be used to indicate "target application" and the value "0" to indicate "non-target application", or vice versa.
  • the preset threshold may be any suitable preset value, for example 10000; that is, when the number of recorded samples exceeds 10000, construction of the decision tree model is started. Generally, the larger the number of samples, the more accurate the constructed decision tree model.
  • the electronic device may quantize feature information in the feature data that is not directly represented by a numeric value into specific values. For example, for the current WiFi connection state, the value 1 may be used to indicate that WiFi is connected and the value 0 that WiFi is not connected (or vice versa).
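The labeling and quantization steps above can be sketched as follows. The 1/0 encodings come from the text; the feature names and application identifiers are illustrative assumptions.

```python
def quantize(sample: dict) -> dict:
    # Encode non-numeric feature values as numbers, as the text describes for
    # the WiFi connection state (1 = connected, 0 = not connected). The
    # feature names used here are illustrative assumptions.
    q = dict(sample)
    q["wifi_connected"] = 1 if sample["wifi_connected"] else 0
    q["date_category"] = 1 if sample["date_category"] == "workday" else 0
    return q

def label(sample_app_id: str, target_app_id: str) -> int:
    # 1 = the recorded launch was the target application, 0 = it was not,
    # matching the "target application" / "non-target application" labels.
    return 1 if sample_app_id == target_app_id else 0
```

Each collected sample is quantized once and labeled against the target application's identifier before it enters the sample set.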
  • the embodiment of the present application may perform sample classification on the sample set based on the information gain rate of the feature classification for the sample set to construct a decision tree model for predicting whether the user will use the target application within a preset duration.
  • a decision tree model can be constructed based on the C4.5 algorithm.
  • the decision tree is a tree built on the basis of decision-making.
  • a decision tree is a predictive model that represents a mapping between object attributes and object values.
  • each node represents an object, each forked path in the tree represents a possible attribute value, and each leaf node corresponds to the value of the object represented by the path from the root node to that leaf node.
  • the decision tree has only a single output. If there are multiple outputs, separate decision trees can be created to handle different outputs.
  • the C4.5 algorithm is a decision tree algorithm, one of a series of algorithms used in classification problems in machine learning and data mining, and an important improvement on ID3. Its goal is supervised learning: given a data set in which each tuple is described by a set of attribute values and belongs to one of a set of mutually exclusive categories, C4.5 learns a mapping from attribute values to categories, and this mapping can be used to classify new entities of unknown category.
  • ID3 (Iterative Dichotomiser 3) is based on the Occam's razor principle, that is, accomplishing more with as little as possible. In information theory, the smaller the expected information, the greater the information gain and the higher the purity.
  • the core idea of the ID3 algorithm is to measure the choice of attributes with information gain, and select the attribute with the largest information gain after splitting to split. The algorithm uses a top-down greedy search to traverse possible decision spaces.
  • the information gain rate may be defined as the ratio of the information gain of a feature for the sample classification to the split information of that feature for the sample classification.
  • the specific information gain rate acquisition method is described below.
  • the information gain is defined with respect to a single feature: for a feature t, consider the amount of information in the system with the feature and without it; the difference between the two is the amount of information the feature brings to the system, that is, the information gain.
  • the split information is used to measure the breadth and uniformity of the feature split data (such as the sample set), and the split information can be the entropy of the feature.
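The information gain rate defined above (information gain divided by split information) can be computed as in this sketch, following the standard C4.5 definition; the list-of-dicts sample format is an assumption.

```python
import math
from collections import Counter

def entropy(labels):
    # Shannon entropy H(S) of a list of class labels.
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def gain_ratio(samples, labels, feature):
    # C4.5 information gain rate: the information gain from splitting on
    # `feature`, divided by the split information (the entropy of the
    # partition sizes). `samples` is a list of feature dicts and `labels`
    # the parallel class labels; this format is an assumption.
    n = len(samples)
    partitions = {}
    for s, y in zip(samples, labels):
        partitions.setdefault(s[feature], []).append(y)
    conditional = sum(len(p) / n * entropy(p) for p in partitions.values())
    gain = entropy(labels) - conditional
    split_info = -sum((len(p) / n) * math.log2(len(p) / n)
                      for p in partitions.values())
    return gain / split_info if split_info > 0 else 0.0
```

Normalizing by the split information is what distinguishes C4.5 from ID3: it penalizes features that fragment the sample set into many small partitions.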
  • the sample set is sample-classified according to the information gain rate of the feature classification for the sample set, and a decision tree model for predicting whether the user will use the target application within a preset duration is generated.
• the sample set is used as the node information of the root node of the decision tree, and the node information of the root node is determined as the target sample set to be classified currently.
• the classification of the sample set includes "target application" and "non-target application".
  • Operation 502 obtaining an information gain rate of the feature classification for the target sample set.
• Operation 503 selecting a feature as the division feature according to the information gain rate.
  • the partitioning feature is a feature selected from the features according to the information gain rate of each feature for the sample set classification, and is used to classify the sample set.
• there are various ways to select the division feature according to the information gain rate. For example, in order to improve the accuracy of the sample classification, the feature corresponding to the maximum information gain rate may be selected as the division feature.
  • a gain rate threshold may also be set; when the maximum information gain rate is greater than the threshold, the electronic device selects a feature corresponding to the information gain rate as a division feature.
• the maximum information gain rate may be selected as the target information gain rate; it is then determined whether the target information gain rate is greater than a preset threshold; and when the target information gain rate is greater than the preset threshold, the feature corresponding to the target information gain rate is selected as the current division feature.
• when the target information gain rate is not greater than the preset threshold, the current node may be used as a leaf node, and the sample category with the largest number of samples may be selected as the output of the leaf node.
• the sample category includes "target application" and "non-target application". When the output is "target application", it means that the target application will be used within the preset duration; when the output is "non-target application", it means that the target application will not be used within the preset duration.
  • the preset threshold can be set according to actual needs, such as 0.9, 0.8, and the like.
• for example, if the preset gain rate threshold is 0.8 and the maximum information gain rate is greater than this preset threshold, feature 1 can be used as the division feature.
• if the preset threshold is 1, then the maximum information gain rate is less than the preset threshold, and the current node may be used as a leaf node.
  • Operation 504 dividing the target sample set according to the dividing feature, and generating at least one sub-sample set.
  • the electronic device classifies the samples according to the division features in various ways.
  • the sample set may be divided based on the feature values of the divided features.
• the electronic device may also acquire the feature values of the division feature in the target sample set, and divide the target sample set according to the feature values.
  • an electronic device may divide a sample with the same feature value in a sample set into the same subsample set.
• for example, if the feature values of the division feature include 0, 1, and 2, the samples whose feature value is 0 can be classified into one class, the samples whose feature value is 1 into another class, and the samples whose feature value is 2 into a third class.
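• The division described above (putting samples that share a feature value into the same sub-sample set) can be sketched as follows; the feature name "wifi" and the values are illustrative:

```python
from collections import defaultdict

def split_by_feature(samples, feature):
    """Group samples into sub-sample sets keyed by the value of one feature."""
    subsets = defaultdict(list)
    for sample in samples:
        subsets[sample[feature]].append(sample)
    return dict(subsets)

samples = [{"wifi": 0, "label": "target application"},
           {"wifi": 1, "label": "non-target application"},
           {"wifi": 2, "label": "target application"},
           {"wifi": 1, "label": "target application"}]
subsets = split_by_feature(samples, "wifi")
print(sorted(subsets))   # [0, 1, 2] -> one sub-sample set per feature value
print(len(subsets[1]))   # 2
```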
• Operation 505 removing the division feature from the samples in each sub-sample set; generating child nodes of the current node, and using the sub-sample sets with the division feature removed as the node information of the child nodes.
• Operation 506 determining whether the child node satisfies the preset classification termination condition; when satisfied, performing operation 507; when not satisfied, updating the target sample set to the sub-sample set with the division feature removed, and returning to operation 502.
• Operation 507 using the child node as a leaf node, and setting the output of the leaf node according to the category of the sub-sample set with the division feature removed.
• the electronic device may use the child node as a leaf node, that is, stop classifying the sample set of the child node, and set the output of the leaf node based on the categories of the samples in the sub-sample set with the division feature removed.
• there are various ways to set the output of a leaf node based on the sample categories. For example, the category with the largest number of samples in the sub-sample set with the division feature removed can be used as the output of the leaf node.
  • the preset classification termination condition may be set according to actual requirements.
• when the child node satisfies the preset classification termination condition, the electronic device uses the current child node as a leaf node and stops classifying the sample set corresponding to the child node.
• the preset classification termination condition may include: whether the number of sample categories in the child node's sub-sample set, with the division feature removed, is a preset number; when it is, it is determined that the child node meets the preset classification termination condition.
  • a decision tree model of the target application can be constructed, so that it is predicted according to the decision tree model whether the target application will be used within a preset duration.
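• The recursive classification of operations 501-507 can be sketched as follows. This is only an illustrative Python sketch, not the claimed implementation: the features "charging" and "headset", the labels, and the gain-rate threshold of 0 are hypothetical, and the termination test here is simply a pure node or an empty feature list.

```python
import math
from collections import Counter

def entropy(values):
    n = len(values)
    return -sum((c / n) * math.log2(c / n) for c in Counter(values).values())

def gain_ratio(samples, feature):
    """C4.5-style gain ratio: information gain divided by split information."""
    labels = [s["label"] for s in samples]
    n = len(samples)
    conditional = 0.0
    for value in set(s[feature] for s in samples):
        subset = [s["label"] for s in samples if s[feature] == value]
        conditional += (len(subset) / n) * entropy(subset)
    gain = entropy(labels) - conditional
    split_info = entropy([s[feature] for s in samples])
    return gain / split_info if split_info > 0 else 0.0

def build_tree(samples, features, threshold=0.0):
    labels = [s["label"] for s in samples]
    majority = Counter(labels).most_common(1)[0][0]
    if len(set(labels)) == 1 or not features:   # termination: pure node or no features
        return majority
    best = max(features, key=lambda f: gain_ratio(samples, f))
    if gain_ratio(samples, best) <= threshold:  # gain rate not above threshold: leaf
        return majority
    node = {"feature": best, "children": {}}
    for value in set(s[best] for s in samples):
        subset = [s for s in samples if s[best] == value]
        remaining = [f for f in features if f != best]  # remove the division feature
        node["children"][value] = build_tree(subset, remaining, threshold)
    return node

samples = [{"charging": 1, "headset": 1, "label": "target application"},
           {"charging": 1, "headset": 0, "label": "target application"},
           {"charging": 0, "headset": 1, "label": "non-target application"},
           {"charging": 0, "headset": 0, "label": "non-target application"}]
tree = build_tree(samples, ["charging", "headset"])
print(tree["feature"])  # charging
```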
• for example, for the sample set D = {sample 1, sample 2, ..., sample i, ..., sample n}, each sample includes several features, denoted A.
  • the current node acts as a leaf node, and the sample category with the largest number of samples is selected as the output of the leaf node.
• the division feature A_g is removed from the sub-sample sets D_1 and D_2, that is, A − A_g.
• referring to FIG. 6A, the child nodes d_1 and d_2 of the root node d are generated, and the sub-sample set D_1 is taken as the node information of child node d_1 and the sub-sample set D_2 as the node information of child node d_2.
  • the above-mentioned sub-sample set corresponding to the child node is continued to be classified according to the information gain classification method.
• taking the child node d_2 as an example, the information gain rate g_R(D, A) of each feature relative to the sample set D_2 is calculated, and the maximum information gain rate g_R(D, A)max is selected.
• when the maximum information gain rate g_R(D, A)max is greater than the preset threshold ε, the feature corresponding to that information gain rate may be selected as the division feature A_g (such as feature A_i+1), and D_2 is divided into several sub-sample sets based on the division feature A_g, such as the sub-sample sets D_21, D_22, and D_23. The division feature A_g is then removed from the sub-sample sets D_21, D_22, and D_23, the child nodes d_21, d_22, and d_23 of the current node d_2 are generated, and the sample sets D_21, D_22, and D_23 with the division feature A_g removed are used as the node information of the child nodes d_21, d_22, and d_23, respectively.
• in this way, the above information-gain-rate-based classification can be used to construct a decision tree as shown in FIG. 6B, and the output of each leaf node of the decision tree indicates either that the target application will be used within the preset duration or that the target application will not be used within the preset duration.
• the output results "yes" and "no" can correspond to "target application" and "non-target application", which mean "the target application will be used within the preset duration" and "the target application will not be used within the preset duration", respectively.
• the feature values of the corresponding division features may be marked on the paths between the nodes. For example, in the above process based on information gain classification, the feature value of the corresponding division feature may be marked on the path between the current node and each of its child nodes.
• for example, if the feature values of the division feature A_g include 0 and 1, a 1 may be marked on the path between d and d_2 and a 0 on the path between d and d_1; likewise, after each division, the corresponding division feature value (such as 0 or 1) can be marked on the path between the current node and its child nodes, and the decision tree shown in FIG. 6C can be obtained.
• the output results "yes" and "no" can correspond to "target application" and "non-target application", which mean "the target application will be used within the preset duration" and "the target application will not be used within the preset duration", respectively.
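• Once the feature values are marked on the paths, prediction walks the tree from the root by following the edge that matches the sample's feature value. A minimal sketch (the tree literal below is hypothetical, mimicking a structure like FIG. 6C):

```python
def predict(tree, feature_data):
    """Follow the marked feature values from the root to a leaf output."""
    node = tree
    while isinstance(node, dict):          # internal node
        value = feature_data[node["feature"]]
        node = node["children"][value]     # follow the edge marked with this value
    return node                            # leaf output

tree = {"feature": "charging",
        "children": {1: "target application",
                     0: {"feature": "headset",
                         "children": {1: "target application",
                                      0: "non-target application"}}}}
print(predict(tree, {"charging": 0, "headset": 0}))  # non-target application
```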
• in one embodiment, acquiring the information gain rate of the feature classification for the target sample set includes:
  • Operation 702 obtaining an information gain of the feature classification for the target sample set.
• the information gain represents the degree to which a feature reduces the uncertainty about the class of a sample ("target application" or "non-target application").
  • Operation 704 acquiring split information of the feature classification for the target sample set.
  • the split information is used to measure the breadth and uniformity of the feature split data (such as the sample set), and the split information can be the entropy of the feature.
  • Operation 706 acquiring an information gain rate of the feature classification for the target sample set according to the information gain and the split information.
• the information gain rate can be the ratio of the information gain of the feature for the sample set classification to the split information of the feature for the sample set classification.
  • the information gain rate can be obtained by dividing the obtained information gain by the corresponding split information.
  • the information gain of the feature classification for the target sample set is also the difference between empirical entropy and conditional entropy.
  • the electronic device can obtain the empirical entropy of the target sample classification; acquire the conditional entropy of the feature for the classification result of the target sample set; and obtain the information gain of the feature for the target sample set classification according to the conditional entropy and the empirical entropy.
• the electronic device can obtain the first probability that a positive sample appears in the sample set and the second probability that a negative sample appears in the sample set, the positive sample being a sample whose category is "target application" and the negative sample being a sample whose category is "non-target application"; the empirical entropy of the sample set is obtained from the first probability and the second probability.
  • the sample includes a multi-dimensional feature, such as feature A.
• the information gain rate of feature A for the sample classification can be obtained by the following formula:
• g_R(D, A) = g(D, A) / H_A(D)
• where g_R(D, A) is the information gain rate of feature A for the sample set D classification, g(D, A) is the information gain of feature A for the sample set D classification, and H_A(D) is the split information of feature A, that is, the entropy of feature A.
• g(D, A) can be calculated as g(D, A) = H(D) − H(D|A), where H(D) is the empirical entropy of the sample set D classification and H(D|A) is the conditional entropy of feature A for the sample set D classification.
• for example, suppose the number of samples whose sample category is "target application" is j.
  • the information gain is the difference between the information of the decision tree before and after the attribute selection.
• the empirical entropy H(D) of the sample set classification is H(D) = −p_1·log2(p_1) − p_2·log2(p_2), where p_1 is the first probability (the probability that a positive sample appears in the sample set) and p_2 is the second probability (the probability that a negative sample appears in the sample set).
• the electronic device may divide the sample set into several sub-sample sets according to feature A, then obtain the information entropy of each sub-sample set's classification and the probability with which each feature value of feature A appears in the sample set; from the information entropies and the probabilities, the divided information entropy, that is, the conditional entropy of feature A for the sample set classification result, can be obtained.
• the conditional entropy of the sample feature A for the sample set D classification result can be calculated by the following formula:
• H(D|A) = Σ_i p_i·H(D|A = a_i), i = 1, 2, ..., n
• where n is the number of values of feature A, that is, the number of feature value types; p_i is the probability that a sample whose A feature value is the i-th value appears in the sample set D; and a_i is the i-th value of A. The A feature values of the samples in the sub-sample set D_i are all the i-th value.
• for example, when feature A takes three values, the conditional entropy of feature A for the classification result of sample set D is:
• H(D|A) = p_1·H(D|A = a_1) + p_2·H(D|A = a_2) + p_3·H(D|A = a_3)
• H(D|A = a_1) is the information entropy of the sub-sample set D_1 classification, that is, its empirical entropy, which can be calculated by the above formula for empirical entropy.
• the information gain of feature A for the sample set D classification can be calculated, for example, by the following formula:
• g(D, A) = H(D) − H(D|A)
• that is, the difference between the empirical entropy H(D) and the conditional entropy H(D|A).
  • the split information of the feature classification for the sample set is the entropy of the feature.
  • the probability of the distribution of the features can be obtained based on the probability of distribution of the samples in the target sample set.
• H_A(D) can be obtained by the following formula:
• H_A(D) = −Σ_i (|D_i| / |D|)·log2(|D_i| / |D|), i = 1, 2, ..., n
• where D_i is the subset of the sample set D in which feature A takes its i-th value, and |D| is the number of samples in D.
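• Putting the formulas above together, a worked numeric example (the sample counts are invented for illustration): suppose D has 10 samples, 6 of category "target application" and 4 of "non-target application", and a feature A whose three values split D into sub-sample sets of 4 (all positive), 3 (2 positive, 1 negative), and 3 (all negative):

```python
import math

def H(probs):
    """Entropy (in bits) of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

HD = H([6/10, 4/10])                  # empirical entropy H(D)
p = [4/10, 3/10, 3/10]                # p_i: how often each value of A appears in D
H_cond = (p[0] * H([1.0])             # H(D|A) = sum_i p_i * H(D|A = a_i)
          + p[1] * H([2/3, 1/3])
          + p[2] * H([1.0]))
g = HD - H_cond                       # information gain g(D, A) = H(D) - H(D|A)
HA = H(p)                             # split information H_A(D): entropy of A itself
gR = g / HA                           # gain rate g_R(D, A) = g(D, A) / H_A(D)
print(round(g, 3), round(gR, 3))      # 0.695 0.443
```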
• in one embodiment, before freezing the target application, the method further includes: detecting whether the target application belongs to the whitelist application; if so, exempting the target application from freezing; otherwise, freezing the target application.
  • detecting whether the target application belongs to the whitelist application may be performed in any process before the target application is frozen.
• the whitelist of freeze-exempt applications is preset in the electronic device, and the applications in the whitelist can be customized by the user or set by system default.
• the whitelist records application information of applications that are exempt from freezing; for example, the application identifiers of such applications may be recorded.
• referring to FIG. 8, another application processing method is provided, the method comprising:
  • Operation 801 Acquire second preset feature data of each feature as a sample, and generate a sample set.
• the acquired features may include a plurality of features; Table 1 below shows 14 feature dimensions acquired by the electronic device.
• the number of feature information items included in one sample may be more or less than the number shown in Table 1, and the specific feature information may also differ from that shown in Table 1; it is not specifically limited herein.
  • the electronic device may acquire the feature information of the plurality of features described above as a sample according to a preset frequency in the latest time period.
  • the multi-dimensional feature data collected at one time constitutes one sample, and multiple samples collected multiple times constitute a sample set.
• the electronic device may place an instrumentation point on a preset path of the system; when detecting that an application is launched, it records the feature data of the relevant features and the application identifier of the application at the launch time, and associates the application identifier with the recorded feature data.
• the electronic device may label each sample in the sample set to obtain a sample label for each sample; the sample label may be the application identifier of the corresponding application, or the application category to which the corresponding application belongs.
• relative to the target application to be detected, the sample label may be classified as "target application" or "non-target application", or as "same application type as the target application" or "different application type from the target application".
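• As a sketch of this labelling step (illustrative only; the feature names and package names are hypothetical), each snapshot of feature data becomes one labelled sample:

```python
def make_sample(feature_data, started_app, target_app):
    """Label one snapshot of feature data by the app started at collection time."""
    sample = dict(feature_data)
    sample["label"] = ("target application" if started_app == target_app
                       else "non-target application")
    return sample

sample = make_sample({"hour": 21, "charging": 1},
                     started_app="com.example.video",
                     target_app="com.example.video")
print(sample["label"])  # target application
```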
  • Operation 802 when the data volume of the sample set exceeds a preset threshold, using the sample set as the node information of the root node of the decision tree; determining the node information of the root node as the current target sample set to be classified.
• in one embodiment, the sample set of the root node is determined as the target sample set to be classified currently. For example, referring to FIG. 6A, for the sample set D = {sample 1, sample 2, ..., sample i, ..., sample n}, the electronic device can create the root node d of the decision tree and take the sample set D as the node information of the root node d.
  • Operation 803 obtaining an information gain rate of the feature classification for the target sample set, and determining a maximum information gain rate from the information gain rate.
• for example, the conditional entropy H(D|A) may first be calculated by the formula H(D|A) = p_1·H(D|A = a_1) + p_2·H(D|A = a_2) + p_3·H(D|A = a_3), after which the information gain and the information gain rate of each feature can be obtained as described above.
  • Operation 804 detecting whether the maximum information gain rate is greater than a preset threshold, and when the maximum information gain rate is greater than the preset threshold, performing operation 805; otherwise, performing operation 806.
• the electronic device can determine whether the maximum information gain rate g_R(D, A)max is greater than a preset threshold ε, which can be set according to actual needs.
• Operation 805 selecting the feature corresponding to the maximum information gain rate as the division feature, and dividing the target sample set according to the division feature to generate at least one sub-sample set.
• Operation 806 using the current node as a leaf node, and selecting the sample category with the largest number of samples as the output of the leaf node.
  • the electronic device may select the feature A g as the dividing feature.
  • the electronic device may divide the sample set into several sub-sample sets according to the number of feature values of the divided features, and the number of the sub-sample sets is the same as the number of feature values. For example, the electronic device may divide the samples in the sample set with the same feature value into the same subsample set.
• for example, if the feature values of the division feature include 0, 1, and 2, the samples whose feature value is 0 can be classified into one class, the samples whose feature value is 1 into another class, and the samples whose feature value is 2 into a third class.
• for example, the sample set D can be divided into D_1 = {sample 1, sample 2, ..., sample k} and D_2 = {sample k+1, ..., sample n}. Then the division feature A_g may be removed from the sub-sample sets D_1 and D_2, that is, A − A_g.
  • one subsample set corresponds to one child node.
• referring to FIG. 6A, the child nodes d_1 and d_2 of the root node d are generated, and the sub-sample set D_1 is used as the node information of child node d_1 and the sub-sample set D_2 as the node information of child node d_2.
• the electronic device may further mark the division feature value corresponding to each child node on the path between the child node and the current node, so as to facilitate the subsequent prediction of whether the application will be used; refer to FIG. 6C.
• Operation 808 determining whether the number of sample categories in the sub-sample set, with the division feature removed, corresponding to the child node is a preset number; if yes, performing operation 809; otherwise, updating the target sample set to the sub-sample set with the division feature removed and returning to operation 803.
• in one embodiment, the electronic device may continue to classify the sub-sample set corresponding to a child node by the above information gain rate classification method. Taking the child node d_2 as an example, the information gain rate g_R(D, A) of each feature relative to the sample set D_2 is calculated, and the maximum information gain rate g_R(D, A)max is selected. When the maximum information gain rate g_R(D, A)max is greater than the preset threshold ε, the feature corresponding to that information gain rate may be selected as the division feature A_g (such as feature A_i+1), and D_2 is divided into several sub-sample sets based on the division feature A_g, such as the sub-sample sets D_21, D_22, and D_23. The division feature A_g is then removed from the sub-sample sets D_21, D_22, and D_23, the child nodes d_21, d_22, and d_23 of the current node d_2 are generated, and the sample sets D_21, D_22, and D_23 with the division feature A_g removed are used as the node information of the child nodes d_21, d_22, and d_23, respectively.
• the electronic device may use the category of the samples in the sub-sample set as the output of the leaf node. If, after the division feature is removed, the sub-sample set contains only samples of the category "target application", the electronic device can use "target application" as the output of the leaf node.
• Operation 809 using the child node as a leaf node, and setting the output of the leaf node according to the category of the sub-sample set with the division feature removed.
• the electronic device may use the sample category with the largest number of samples in the sub-sample set D_1 as the output of the leaf node. For example, if the category with the largest number of samples is "target application", the electronic device may use "target application" as the output of leaf node d_1.
  • Operation 810 acquiring first feature data of each feature, where the first feature data is feature data of the corresponding feature at the predicted time.
• when the time comes at which it needs to be predicted whether the target application will be used, the electronic device can acquire the feature data of each feature at that predicted time.
  • Operation 811 obtaining a decision tree model for predicting whether the user will use the target application within a preset duration, and the starting time of the preset duration is the predicted time.
• Operation 812 using the first feature data as an input of the decision tree model, and outputting the prediction result.
• when the prediction result is that the target application will not be used within the preset duration, operation 813 is performed; otherwise, operation 814 is performed.
  • the electronic device can obtain a pre-built decision tree model, and use the first feature data as an input of the model to obtain a corresponding output result.
• Operation 813 detecting whether the target application belongs to the whitelist application; if yes, operation 814 is performed; if no, operation 815 is performed.
  • Operation 814 freeing the target application from freezing.
  • Operation 815 freezing the target application.
• the platform freeze management module 224 may initiate a freeze operation on the target application to impose resource restrictions on the target application; for example, the CPU-limited sleep mode, the CPU freeze sleep mode, or the process deep-freeze mode can be used to freeze the target application.
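• Operations 810-815 can be sketched as the following decision flow (an illustrative sketch: the predict function, the freeze callback, and the package names stand in for the decision tree model and the platform freeze management module):

```python
def process_application(target_app, feature_data, tree, whitelist, freeze, predict):
    """Freeze the target application unless it is predicted to be used soon
    or it belongs to the freeze-exempt whitelist."""
    will_be_used = predict(tree, feature_data) == "target application"
    if will_be_used or target_app in whitelist:
        return "exempt"    # operation 814: free the target application from freezing
    freeze(target_app)     # operation 815: freeze the target application
    return "frozen"

frozen = []
result = process_application(
    "com.example.game", {"charging": 0}, tree={},
    whitelist={"com.example.alarm"}, freeze=frozen.append,
    predict=lambda tree, data: "non-target application")
print(result, frozen)  # frozen ['com.example.game']
```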
• in the application processing method described above, a decision tree model for predicting whether the target application will be used within a preset duration is constructed; the first feature data of each feature and the decision tree model of the target application are acquired; the first feature data is used as an input of the decision tree model to obtain the prediction result, output by the decision tree model, of whether the user will use the target application within the preset duration; and when the prediction result is that the target application will not be used within the preset duration, the target application is frozen to limit its occupation of resources. Since freezing an application consumes a certain amount of system resources, predicting whether the target application will be used within the preset duration and freezing it only when it will not be used reduces the unnecessary freeze-then-thaw operations that occur when a user uses an application within the preset duration after it has been frozen; this improves the accuracy of freezing the target application and thus the effectiveness of releasing system resources.
  • an application processing apparatus which includes a feature data acquisition module 902, a decision tree model acquisition module 904, a prediction module 906, and an application processing module 908.
  • the feature data obtaining module 902 is configured to acquire first feature data of each feature, where the first feature data is feature data of the corresponding feature at the predicted time;
• the decision tree model obtaining module 904 is configured to obtain a decision tree model for predicting whether the user will use the target application within the preset duration, the starting time of the preset duration being the predicted time;
  • the prediction module 906 is configured to use the first feature data as an input of the decision tree model, and output the predicted result;
• the application processing module 908 is configured to freeze the target application when the prediction result is that the target application will not be used within the preset duration.
• in one embodiment, as shown in FIG. 10, another application processing apparatus is provided, the apparatus further comprising:
• the decision tree construction module 910 is configured to acquire the second feature data of each feature as a sample and generate a sample set, where the second feature data is the feature data of the corresponding feature when a reference application was started before the predicted time, and the reference application includes the target application; and, when the data volume of the sample set exceeds a preset threshold, to classify the samples of the sample set according to the information gain rate of the feature classification for the sample set and generate a decision tree model for predicting whether the user will use the target application within the preset duration.
• in one embodiment, the decision tree construction module 910 is further configured to: use the sample set as the node information of the root node of the decision tree; determine the node information of the root node as the target sample set to be classified currently; acquire the information gain rate of the feature classification for the target sample set; select a feature as the division feature according to the information gain rate; divide the target sample set according to the division feature to generate at least one sub-sample set; remove the division feature from each sub-sample set; generate child nodes of the current node and use the sub-sample sets with the division feature removed as the node information of the child nodes; determine whether a child node satisfies the preset classification termination condition; when the child node does not satisfy the preset classification termination condition, update the target sample set to the sub-sample set with the division feature removed and return to acquiring the information gain rate of the feature classification for the target sample set; and when the child node satisfies the preset classification termination condition, use the child node as a leaf node and set the output of the leaf node according to the category of the sub-sample set with the division feature removed.
• the decision tree construction module 910 is further configured to determine the maximum information gain rate from the information gain rates; when the maximum information gain rate is greater than a preset threshold, to select the feature corresponding to the maximum information gain rate as the division feature; and when the maximum information gain rate is not greater than the preset threshold, to use the current node as a leaf node and select the sample category with the largest number of samples as the output of the leaf node.
• the decision tree construction module 910 is further configured to determine whether the number of sample categories in the sub-sample set of a child node, with the division feature removed, is a preset number, and, when it is, to determine that the child node satisfies the preset classification termination condition.
• the decision tree construction module 910 is further configured to acquire the information gain of the feature classification for the target sample set; acquire the split information of the feature classification for the target sample set; and acquire the information gain rate of the feature classification for the target sample set according to the information gain and the split information.
• the decision tree construction module 910 is further configured to calculate the information gain rate of the feature classification for the target sample set by g_R(D, A) = g(D, A) / H_A(D), where D represents the sample set, g(D, A) is the information gain of feature A for the sample set D classification, H_A(D) is the split information of feature A, and g_R(D, A) is the information gain rate of feature A for the sample set D classification; g(D, A) is calculated by g(D, A) = H(D) − H(D|A), where H(D) is the empirical entropy of the sample set D classification and H(D|A) is the conditional entropy of feature A for the sample set D classification.
  • yet another application processing device is provided, the device further comprising:
  • the application detection module 912 is configured to detect whether the target application belongs to a whitelist application.
• the application processing module 908 is further configured to freeze the target application when the prediction result is that the target application will not be used within the preset duration, and to exempt the target application from freezing when the prediction result is that the target application will be used within the preset duration or when the target application belongs to the whitelist application.
• the division of the modules in the application processing apparatus above is for illustrative purposes only. In other embodiments, the application processing apparatus may be divided into different modules as needed to complete all or part of the functions of the application processing apparatus.
  • the various modules in the application processing device described above may be implemented in whole or in part by software, hardware, and combinations thereof. Each of the above modules may be embedded in or independent of the processor in the electronic device, or may be stored in a memory in the electronic device in a software format, so that the processor calls to perform operations corresponding to the above modules.
  • each module in the application processing apparatus may be in the form of a computer program.
  • the computer program can run on an electronic device such as a terminal or a server.
  • the program module of the computer program can be stored on a memory of the electronic device.
  • the computer program is executed by the processor, the operation of the application processing method described in the embodiment of the present application is implemented.
  • modules in the application processing device described above may be implemented in whole or in part by software, hardware, and combinations thereof.
• the above modules may be embedded in hardware in a processor of the server, or stored in software form in a memory of the server, so that the processor can call them to perform the operations corresponding to the above modules.
  • the terms "module" and the like are intended to mean a computer-related entity, which may be hardware, a combination of hardware and software, software, or software in execution.
  • a module can be, but is not limited to being, a process running on a processor, a processor, an object, an executable, a thread of execution, a program, and/or a computer.
  • both an application running on a server and the server itself can be modules.
  • One or more modules can reside within a process and/or a thread of execution, and a module can be located in a computer and/or distributed between two or more computers.
  • an electronic device is provided, including a memory, a processor, and a computer program stored on the memory and operable on the processor; when the processor executes the computer program, the operations of the application processing method provided by the above embodiments are implemented.
  • a computer readable storage medium is provided, having stored thereon a computer program that, when executed by a processor, performs the operations of the application processing method described in the various embodiments of the present application.
  • a computer program product comprising instructions is provided which, when executed on a computer, causes the computer to perform the application processing methods described in the various embodiments of the present application.
  • the embodiment of the present application also provides a computer device. As shown in FIG. 12, for convenience of description, only the parts related to the embodiments of the present application are shown; for specific technical details that are not disclosed, refer to the method portion of the embodiments of the present application.
  • the computer device may be any terminal device including a mobile phone, a tablet computer, a PDA (Personal Digital Assistant), a POS (Point of Sales) terminal, a vehicle-mounted computer, a wearable device, and the like. The following takes a mobile phone as an example:
  • FIG. 12 is a block diagram showing a part of a structure of a mobile phone related to a computer device according to an embodiment of the present application.
  • the mobile phone includes: a radio frequency (RF) circuit 1210, a memory 1220, an input unit 1230, a display unit 1240, a sensor 1250, an audio circuit 1260, a wireless fidelity (WiFi) module 1270, a processor 1280, a power supply 1290, and other components.
  • the RF circuit 1210 can be used for receiving and transmitting signals during the transmission and reception of information or during a call.
  • in particular, downlink information from a base station can be received and handed to the processor 1280 for processing.
  • uplink data can also be sent to the base station.
  • RF circuits include, but are not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a Low Noise Amplifier (LNA), a duplexer, and the like.
  • RF circuitry 1210 can also communicate with the network and other devices via wireless communication.
  • the above wireless communication may use any communication standard or protocol, including but not limited to Global System of Mobile communication (GSM), General Packet Radio Service (GPRS), Code Division Multiple Access (CDMA), Wideband Code Division Multiple Access (WCDMA), Long Term Evolution (LTE), e-mail, Short Messaging Service (SMS), and the like.
  • the memory 1220 can be used to store software programs and modules, and the processor 1280 executes various functional applications and data processing of the mobile phone by running software programs and modules stored in the memory 1220.
  • the memory 1220 may mainly include a program storage area and a data storage area, wherein the program storage area may store an operating system, an application required for at least one function (such as an application of a sound playing function, an application of an image playing function, etc.);
  • the data storage area can store data (such as audio data, address book, etc.) created according to the use of the mobile phone.
  • memory 1220 can include high speed random access memory, and can also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid state storage device.
  • the input unit 1230 can be configured to receive input numeric or character information and to generate key signal inputs related to user settings and function control of the handset 1200.
  • the input unit 1230 may include a touch panel 1231 and other input devices 1232.
  • the touch panel 1231, which may also be referred to as a touch screen, can collect touch operations by the user on or near it (such as operations performed by the user with a finger, a stylus, or the like on or near the touch panel 1231) and drive the corresponding connection device according to a preset program.
  • the touch panel 1231 may include two parts of a touch detection device and a touch controller.
  • the touch detection device detects the user's touch orientation, detects the signal produced by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into contact coordinates, and sends them to the processor 1280, and can also receive commands from the processor 1280 and execute them.
  • the touch panel 1231 can be implemented in various types such as resistive, capacitive, infrared, and surface acoustic waves.
  • the input unit 1230 may also include other input devices 1232.
  • other input devices 1232 may include, but are not limited to, one or more of a physical keyboard, function keys (such as volume control buttons, switch buttons, etc.).
  • the display unit 1240 can be used to display information input by the user or information provided to the user as well as various menus of the mobile phone.
  • the display unit 1240 may include a display panel 1241.
  • the display panel 1241 may be configured in the form of a liquid crystal display (LCD), an organic light-emitting diode (OLED), or the like.
  • the touch panel 1231 can cover the display panel 1241. When the touch panel 1231 detects a touch operation on or near it, it transmits the operation to the processor 1280 to determine the type of the touch event, and the processor 1280 then provides a corresponding visual output on the display panel 1241 according to the type of the touch event.
  • although in FIG. 12 the touch panel 1231 and the display panel 1241 are shown as two independent components implementing the input and output functions of the mobile phone, in some embodiments the touch panel 1231 and the display panel 1241 may be integrated to realize the input and output functions of the phone.
  • the handset 1200 can also include at least one type of sensor 1250, such as a light sensor, motion sensor, and other sensors.
  • the light sensor may include an ambient light sensor and a proximity sensor, wherein the ambient light sensor may adjust the brightness of the display panel 1241 according to the brightness of the ambient light, and the proximity sensor may turn off the display panel 1241 and/or the backlight when the mobile phone moves to the ear.
  • the motion sensor may include an acceleration sensor, which can detect the magnitude of acceleration in each direction and, at rest, the magnitude and direction of gravity; it can be used to identify mobile phone gestures (such as horizontal/vertical screen switching) and vibration-recognition related functions (such as pedometer and tapping). In addition, the phone can also be equipped with a gyroscope, barometer, hygrometer, thermometer, infrared sensor, and other sensors.
  • Audio circuitry 1260, speaker 1261, and microphone 1262 can provide an audio interface between the user and the handset.
  • on the one hand, the audio circuit 1260 can convert received audio data into an electrical signal and transmit it to the speaker 1261, which converts it into a sound signal for output; on the other hand, the microphone 1262 converts a collected sound signal into an electrical signal, which the audio circuit 1260 receives and converts into audio data; after being processed by the processor 1280, the audio data is transmitted via the RF circuit 1210 to another mobile phone, or output to the memory 1220 for further processing.
  • WiFi is a short-range wireless transmission technology.
  • through the WiFi module 1270, the mobile phone can help users send and receive e-mail, browse web pages, and access streaming media, providing users with wireless broadband Internet access.
  • although FIG. 12 shows the WiFi module 1270, it will be understood that it does not belong to the essential configuration of the handset 1200 and may be omitted as needed.
  • the processor 1280 is the control center of the handset; it connects the various parts of the entire handset using various interfaces and lines, and performs the handset's various functions and processes data by running or executing software programs and/or modules stored in the memory 1220 and invoking data stored in the memory 1220, thereby monitoring the mobile phone as a whole.
  • processor 1280 can include one or more processing units.
  • the processor 1280 can integrate an application processor and a modem, wherein the application processor primarily processes an operating system, a user interface, an application, etc.; the modem primarily processes wireless communications. It will be appreciated that the above described modem may also not be integrated into the processor 1280.
  • the processor 1280 can integrate an application processor and a baseband processor, and the baseband processor and other peripheral chips can form a modem.
  • the handset 1200 also includes a power source 1290 (such as a battery) that powers the various components.
  • the power source can be logically coupled to the processor 1280 via a power management system to manage functions such as charging, discharging, and power management through the power management system.
  • the handset 1200 can also include a camera, a Bluetooth module, and the like.
  • the processor included in the mobile phone implements the application processing method described above when executing a computer program stored in the memory.
  • Non-volatile memory can include read only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory.
  • Volatile memory can include random access memory (RAM), which acts as an external cache.
  • RAM is available in a variety of forms, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), dual data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchlink DRAM (SLDRAM), Rambus direct RAM (RDRAM), direct Rambus dynamic RAM (DRDRAM), and Rambus dynamic RAM (RDRAM).
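The exemption logic described above for the application processing module 908 (freeze the target application only when it is predicted not to be used within the preset duration and it does not belong to the whitelist) can be sketched as follows. This is a hypothetical illustration; the publication defines no concrete API, so the package names and the function itself are invented.

```python
# Hypothetical sketch of the freeze/exemption rule; all names are invented.
WHITELIST = {"com.example.dialer", "com.example.sms"}

def should_freeze(package: str, predicted_used_soon: bool, whitelist=WHITELIST) -> bool:
    """Freeze only when the model predicts the app will NOT be used within
    the preset duration and the app is not a whitelist application."""
    if package in whitelist:
        return False  # whitelist applications are exempt from freezing
    return not predicted_used_soon

print(should_freeze("com.example.game", predicted_used_soon=False))    # True: freeze
print(should_freeze("com.example.dialer", predicted_used_soon=False))  # False: whitelisted
```

A predicted-unused but whitelisted application is never frozen, matching the exemption branch of the module description.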

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Software Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Security & Cryptography (AREA)
  • Data Mining & Analysis (AREA)
  • Databases & Information Systems (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)
  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)

Abstract

The present application relates to an application processing method, comprising the steps of: acquiring first feature data of each feature, the first feature data being the feature data of the corresponding feature at the prediction moment; acquiring a decision tree model used to predict whether a user will use a target application within a preset duration; using the first feature data as the input of the decision tree model, and outputting the prediction result; and, when the prediction result indicates that the user will not use the target application within the preset duration, freezing the target application.
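The abstract above feeds feature data into a decision tree model; the classification terms of this publication also mention selecting split features by information gain over a sample set. A minimal pure-Python sketch of that criterion follows; the feature names and sample values are invented for illustration, not taken from the publication.

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a label sequence."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(samples, labels, feature_index):
    """Entropy of the sample set minus the weighted entropy of the subsets
    obtained by splitting on one feature (the classic ID3 criterion)."""
    n = len(samples)
    by_value = {}
    for sample, label in zip(samples, labels):
        by_value.setdefault(sample[feature_index], []).append(label)
    remainder = sum(len(sub) / n * entropy(sub) for sub in by_value.values())
    return entropy(labels) - remainder

# Invented sample set: features [screen_on, charging]; label 1 means the
# target application was used within the preset duration.
samples = [[1, 0], [1, 1], [0, 0], [0, 1], [1, 0], [0, 0]]
labels = [1, 1, 0, 0, 1, 0]

gains = [information_gain(samples, labels, i) for i in range(2)]
# screen_on separates the labels perfectly, so its gain equals H(labels) = 1.0,
# while charging carries no information about the label.
print([round(g, 6) for g in gains])  # [1.0, 0.0]
```

The feature with the largest information gain (here, screen_on) would be chosen as the split at the current node when constructing the decision tree.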
PCT/CN2018/117694 2017-12-29 2018-11-27 Procédé de traitement d'application, dispositif électronique et support de stockage lisible par ordinateur WO2019128598A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201711484440.9 2017-12-29
CN201711484440.9A CN109992367A (zh) 2017-12-29 2017-12-29 应用处理方法和装置、电子设备、计算机可读存储介质

Publications (1)

Publication Number Publication Date
WO2019128598A1 true WO2019128598A1 (fr) 2019-07-04

Family

ID=67066384

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2018/117694 WO2019128598A1 (fr) 2017-12-29 2018-11-27 Procédé de traitement d'application, dispositif électronique et support de stockage lisible par ordinateur

Country Status (2)

Country Link
CN (1) CN109992367A (fr)
WO (1) WO2019128598A1 (fr)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112491572B (zh) * 2019-09-12 2022-01-21 华为技术有限公司 终端之间连接状态的预测方法、装置和分析设备
CN111708427B (zh) * 2020-05-29 2022-07-22 广州三星通信技术研究有限公司 管理终端的方法和终端
CN112130991B (zh) * 2020-08-28 2024-07-23 北京思特奇信息技术股份有限公司 一种基于机器学习的应用程序控制方法和系统
CN112390388B (zh) * 2020-11-25 2022-06-14 创新奇智(青岛)科技有限公司 一种模型训练方法、曝气值预估方法、装置及电子设备
CN112256354B (zh) * 2020-11-25 2023-05-16 Oppo(重庆)智能科技有限公司 应用启动方法、装置、存储介质及电子设备
CN112330069A (zh) * 2020-11-27 2021-02-05 上海眼控科技股份有限公司 一种预警解除方法、装置、电子设备及存储介质
CN113627932B (zh) * 2021-08-11 2024-02-27 中国银行股份有限公司 无网络状态下控制终端应用账户等待时长的方法及装置
CN114416600B (zh) * 2022-03-29 2022-06-28 腾讯科技(深圳)有限公司 应用检测方法、装置、计算机设备及存储介质

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105868222A (zh) * 2015-09-17 2016-08-17 乐视网信息技术(北京)股份有限公司 一种任务调度方法及装置
CN106250532A (zh) * 2016-08-04 2016-12-21 广州优视网络科技有限公司 应用推荐方法、装置及服务器
CN106294743A (zh) * 2016-08-10 2017-01-04 北京奇虎科技有限公司 应用功能的推荐方法及装置

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7016887B2 (en) * 2001-01-03 2006-03-21 Accelrys Software Inc. Methods and systems of classifying multiple properties simultaneously using a decision tree
CN105389193B (zh) * 2015-12-25 2019-04-26 北京奇虎科技有限公司 应用的加速处理方法、装置和系统、服务器
CN106793031B (zh) * 2016-12-06 2020-11-10 常州大学 基于集合竞优算法的智能手机能耗优化方法
CN107133094B (zh) * 2017-06-05 2021-11-02 努比亚技术有限公司 应用管理方法、移动终端及计算机可读存储介质

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105868222A (zh) * 2015-09-17 2016-08-17 乐视网信息技术(北京)股份有限公司 一种任务调度方法及装置
CN106250532A (zh) * 2016-08-04 2016-12-21 广州优视网络科技有限公司 应用推荐方法、装置及服务器
CN106294743A (zh) * 2016-08-10 2017-01-04 北京奇虎科技有限公司 应用功能的推荐方法及装置

Also Published As

Publication number Publication date
CN109992367A (zh) 2019-07-09

Similar Documents

Publication Publication Date Title
WO2019128598A1 (fr) Procédé de traitement d'application, dispositif électronique et support de stockage lisible par ordinateur
US11244672B2 (en) Speech recognition method and apparatus, and storage medium
CN107679559B (zh) 图像处理方法、装置、计算机可读存储介质和移动终端
CN109992398B (zh) 资源管理方法、装置、移动终端及计算机可读存储介质
CN107368400B (zh) Cpu监测方法、装置、计算机可读存储介质和移动终端
CN107704070B (zh) 应用清理方法、装置、存储介质及电子设备
WO2019128546A1 (fr) Procédé de traitement de programme d'application, dispositif électronique et support de stockage lisible par ordinateur
CN112703714B (zh) 应用程序处理方法和装置、计算机设备、计算机可读存储介质
CN109144232B (zh) 进程处理方法和装置、电子设备、计算机可读存储介质
CN109992364B (zh) 应用冻结方法、装置、计算机设备和计算机可读存储介质
CN108108455B (zh) 目的地的推送方法、装置、存储介质及电子设备
WO2019137252A1 (fr) Procédé de traitement de mémoire, dispositif électronique et support de stockage lisible par ordinateur
CN110032266B (zh) 信息处理方法、装置、计算机设备和计算机可读存储介质
WO2019062416A1 (fr) Procédé et appareil de nettoyage d'application, support de stockage et dispositif électronique
CN111222563A (zh) 一种模型训练方法、数据获取方法以及相关装置
WO2019128574A1 (fr) Procédé et dispositif de traitement d'informations, dispositif informatique et support d'informations lisible par ordinateur
CN109992425B (zh) 信息处理方法、装置、计算机设备和计算机可读存储介质
CN109726726B (zh) 视频中的事件检测方法及装置
CN110018886B (zh) 应用状态切换方法和装置、电子设备、可读存储介质
WO2019128569A1 (fr) Procédé et appareil pour geler une application et support de stockage et terminal
CN109375995B (zh) 应用冻结方法和装置、存储介质、电子设备
WO2019128570A1 (fr) Appareil et procédé de gel d'une application et support de stockage et terminal
CN109388487B (zh) 应用程序处理方法和装置、电子设备、计算机可读存储介质
WO2019137187A1 (fr) Procédé et appareil de gestion de ressources, terminal mobile et support de stockage lisible par ordinateur
CN110007968B (zh) 信息处理方法、装置、计算机设备和计算机可读存储介质

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18897662

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18897662

Country of ref document: EP

Kind code of ref document: A1