CN107885544B - Application program control method, device, medium and electronic equipment - Google Patents


Info

Publication number
CN107885544B
Authority
CN
China
Prior art keywords
application program
layer
calculation
characteristic information
application
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201711044959.5A
Other languages
Chinese (zh)
Other versions
CN107885544A (en)
Inventor
梁昆
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd filed Critical Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN201711044959.5A priority Critical patent/CN107885544B/en
Publication of CN107885544A publication Critical patent/CN107885544A/en
Priority to PCT/CN2018/110518 priority patent/WO2019085749A1/en
Application granted granted Critical
Publication of CN107885544B publication Critical patent/CN107885544B/en

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00: Arrangements for program control, e.g. control units
    • G06F 9/06: Arrangements for program control using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44: Arrangements for executing specific programs
    • G06F 9/445: Program loading or initiating
    • G06F 9/44594: Unloading
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06N: COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00: Computing arrangements based on biological models
    • G06N 3/02: Neural networks
    • G06N 3/08: Learning methods
    • G06N 3/084: Backpropagation, e.g. using gradient descent

Abstract

The present application provides an application program control method, device, medium and electronic equipment. Historical characteristic information x_i is acquired, and a training model is generated by adopting a back-propagation (BP) neural network algorithm; when the application program is detected to have entered the background, the current characteristic information s of the application program is brought into the training model to judge whether the application program needs to be closed, so that the application program can be closed intelligently.

Description

Application program control method, device, medium and electronic equipment
Technical Field
The application relates to the field of electronic equipment terminals, in particular to an application program control method, device, medium and electronic equipment.
Background
End users use a large number of applications every day. After an application is pushed to the background, it continues to occupy precious system memory resources and to affect system power consumption if it is not cleared in time. It is therefore desirable to provide an application management and control method, apparatus, medium, and electronic device.
Disclosure of Invention
The embodiment of the application provides an application program control method, an application program control device, an application program control medium and electronic equipment, so that an application program can be closed intelligently.
The embodiment of the application provides an application program control method, which is applied to electronic equipment and comprises the following steps:
obtaining a sample vector set of the application program, wherein the sample vectors in the sample vector set comprise historical feature information x_i of multiple dimensions of the application program;
Calculating a sample vector set by adopting a Back Propagation (BP) neural network algorithm to generate a training model;
when an application program enters a background, inputting the current characteristic information s of the application program into the training model for calculation; and
judging whether the application program needs to be closed or not.
The embodiment of the present application further provides an application management and control device, where the device includes:
an obtaining module, configured to obtain a sample vector set of the application program, where a sample vector in the sample vector set includes historical feature information x_i of multiple dimensions of the application program;
The generating module is used for calculating the sample vector set by adopting a BP neural network algorithm to generate a training model;
the calculation module is used for inputting the current characteristic information s of the application program into the training model for calculation when the application program enters a background; and
the judging module is used for judging whether the application program needs to be closed or not.
The embodiment of the application also provides a medium, wherein a plurality of instructions are stored in the medium, and the instructions are suitable for being loaded by a processor to execute the application management and control method.
An embodiment of the present application further provides an electronic device, where the electronic device includes a processor and a memory, the processor is electrically connected to the memory, the memory is used to store instructions and data, and the processor is used to execute the following steps:
obtaining a sample vector set of the application program, wherein the sample vectors in the sample vector set comprise historical feature information x_i of multiple dimensions of the application program;
Calculating a sample vector set by adopting a BP neural network algorithm to generate a training model;
when an application program enters a background, inputting the current characteristic information s of the application program into the training model for calculation; and
judging whether the application program needs to be closed or not.
With the application program control method, device, medium and electronic equipment provided by the present application, historical characteristic information x_i is acquired, and a training model is generated by adopting a BP neural network algorithm; when the application program is detected to have entered the background, the current characteristic information s of the application program is brought into the training model to judge whether the application program needs to be closed, so that the application program can be closed intelligently.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present application, and those skilled in the art can obtain other drawings from them without creative effort.
Fig. 1 is a system diagram of an application management and control apparatus according to an embodiment of the present disclosure.
Fig. 2 is a schematic view of an application scenario of an application management and control apparatus according to an embodiment of the present application.
Fig. 3 is a flowchart illustrating an application management and control method according to an embodiment of the present disclosure.
Fig. 4 is another schematic flowchart of an application management and control method according to an embodiment of the present disclosure.
Fig. 5 is a schematic structural diagram of an apparatus according to an embodiment of the present disclosure.
Fig. 6 is another schematic structural diagram of an apparatus according to an embodiment of the present disclosure.
Fig. 7 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Fig. 8 is another schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description of Embodiments
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application. It is to be understood that the embodiments described are only a few embodiments of the present application and not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
In the description of the present application, it is to be understood that the terms "center," "longitudinal," "lateral," "length," "width," "thickness," "upper," "lower," "front," "rear," "left," "right," "vertical," "horizontal," "top," "bottom," "inner," "outer," "clockwise," "counterclockwise," and the like are used in the orientations and positional relationships indicated in the drawings for convenience and simplicity of description, and are not intended to indicate or imply that the referenced devices or elements must have a particular orientation, be constructed in a particular orientation, or be operated in a particular manner, and are not to be construed as limiting the present application. Furthermore, the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, features defined as "first" or "second" may explicitly or implicitly include one or more of the described features. In the description of the present application, "a plurality" means two or more unless specifically limited otherwise.
In the description of the present application, it is to be noted that, unless otherwise explicitly specified or limited, the terms "mounted," "connected," and "connected" are to be construed broadly, e.g., as meaning either a fixed connection, a removable connection, or an integral connection; may be mechanically connected, may be electrically connected or may be in communication with each other; either directly or indirectly through intervening media, either internally or in any other relationship. The specific meaning of the above terms in the present application can be understood by those of ordinary skill in the art as appropriate.
In this application, unless expressly stated or limited otherwise, the first feature "on" or "under" the second feature may comprise direct contact of the first and second features, or may comprise contact of the first and second features not directly but through another feature in between. Also, the first feature being "on," "above" and "over" the second feature includes the first feature being directly on and obliquely above the second feature, or merely indicating that the first feature is at a higher level than the second feature. A first feature being "under," "below," and "beneath" a second feature includes the first feature being directly under and obliquely below the second feature, or simply meaning that the first feature is at a lesser elevation than the second feature.
The following disclosure provides many different embodiments, or examples, for implementing different features of the application. In order to simplify the disclosure of the present application, specific example components and arrangements are described below. Of course, they are merely examples and are not intended to limit the present application. Moreover, the present application may repeat reference numerals and/or letters in the various examples, which have been repeated for purposes of simplicity and clarity and do not in themselves dictate a relationship between the various embodiments and/or arrangements discussed. In addition, examples of various specific processes and materials are provided herein, but one of ordinary skill in the art may recognize applications of other processes and/or use of other materials.
Referring to the drawings, wherein like reference numbers refer to like elements throughout, the principles of the present application are illustrated as being implemented in a suitable computing environment. The following description is based on illustrated embodiments of the application and should not be taken as limiting the application with respect to other embodiments that are not detailed herein.
While the principles of the application have been described in the foregoing text, it is not meant to be limiting and those of skill in the art will appreciate that various steps and operations described below may be implemented in hardware. The principles of the present application may be employed in numerous other general-purpose or special-purpose computing, communication environments or configurations.
The application program management and control method provided by this application is mainly applied to electronic equipment, such as smart mobile electronic devices: a smart band, a smart phone, a tablet computer based on an Apple or Android system, or a notebook computer based on a Windows or Linux system. It should be noted that the application program may be a chat application, a video application, a music application, a shopping application, a bicycle-sharing application, or a mobile banking application.
Referring to fig. 1, fig. 1 is a system schematic diagram of an application management and control apparatus according to an embodiment of the present disclosure. The application program management and control device is mainly used for: obtaining historical characteristic information x_i of an application program from a database, calculating a training model from the historical characteristic information x_i through an algorithm, then inputting the current characteristic information s of the application program into the training model for calculation, and judging from the calculation result whether the application program can be closed, so as to manage a preset application program, for example by closing or freezing it.
Specifically, please refer to fig. 2, which is a schematic view of an application scenario of the application management and control method according to an embodiment of the present application. In one embodiment, historical feature information x_i of an application program is obtained from a database, and a training model is calculated from the historical feature information x_i through an algorithm. When the application program control device detects that an application program enters the background of the electronic equipment, the current characteristic information s of the application program is input into the training model for calculation, and whether the application program can be closed is judged from the calculation result. For example, historical characteristic information x_i of application program a is obtained from the database and a training model is calculated from it through an algorithm. When the control device detects that application program a enters the background of the electronic equipment, the current characteristic information s of application program a is input into the training model for calculation; the calculation result indicates that application program a can be closed, and application program a is closed. When the control device detects that application program b enters the background of the electronic equipment, the current characteristic information s of application program b is input into the training model for calculation; the calculation result indicates that application program b needs to be retained, and application program b is retained.
The execution subject of the application management and control method may be the application management and control apparatus provided in the embodiments of the present application, or an electronic device integrating the apparatus; the apparatus may be implemented in hardware or software.
Referring to fig. 3, fig. 3 is a flowchart illustrating an application management and control method according to an embodiment of the present disclosure. The application program management and control method provided by the embodiment of the application program is applied to the electronic equipment, and the specific flow can be as follows:
step S101, obtaining the sample vector set of the application program, wherein the sample vectors in the sample vector set include the historical feature information x of multiple dimensions of the application programi
Acquiring a sample vector set of the application program from a sample database, wherein the sample vectors in the sample vector set comprise historical feature information x of multiple dimensions of the application programi
Wherein, the characteristic information of the plurality of dimensions can refer to table 1.
[Table 1: the 10 dimensions of characteristic information (reproduced as an image in the original document; contents not recoverable here)]
It should be noted that the 10-dimensional characteristic information shown in Table 1 is only an example in the embodiment of the present application; the present application is not limited to it. The characteristic information may be one of the 10 dimensions, at least two of them, or may also include characteristic information of other dimensions, for example, whether the device is currently charging, the current battery level, or whether WiFi is currently connected.
In one embodiment, historical feature information may be selected for 6 dimensions:
A. the time that the application resides in the background;
B. whether the screen is bright, for example, the screen is bright and is marked as 1, and the screen is off and is marked as 0;
C. the total number of uses this week;
D. the total usage time this week;
E. whether WiFi is on or not, for example, WiFi is on and is recorded as 1, WiFi is off and is recorded as 0; and
F. whether charging is currently in progress, e.g., currently charging, noted as 1, and not currently charging, noted as 0.
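As an illustration only, the six dimensions A to F above can be packed into a numeric sample vector; the function name and encoding below are our own sketch and do not come from the patent:

```python
def make_sample_vector(bg_seconds, screen_on, uses_this_week,
                       minutes_this_week, wifi_on, charging):
    """Pack the six feature dimensions A-F into one sample vector x_i.
    Boolean dimensions are encoded as 1/0, as described in the text."""
    return [
        float(bg_seconds),          # A: time resident in the background
        1.0 if screen_on else 0.0,  # B: screen bright = 1, screen off = 0
        float(uses_this_week),      # C: total number of uses this week
        float(minutes_this_week),   # D: total usage time this week
        1.0 if wifi_on else 0.0,    # E: WiFi on = 1, WiFi off = 0
        1.0 if charging else 0.0,   # F: charging = 1, not charging = 0
    ]

x = make_sample_vector(300, True, 42, 95, True, False)
```

Each such vector is one sample in the sample vector set of step S101.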
Step S102: calculating the sample vector set by adopting a BP neural network algorithm to generate a training model.
Referring to fig. 4, fig. 4 is a schematic flowchart illustrating an application management and control method according to an embodiment of the present disclosure. In one embodiment, the step S102 may include:
step S1021: defining a network structure; and
step S1022: and bringing the sample vector set into a network structure for calculation to obtain a training model.
In step S1021, the defining the network structure comprises:
step S1021a, an input layer is set, the input layer comprises N nodes, the number of the nodes of the input layer and the historical characteristic information xiAre the same in dimension.
Wherein the historical feature information xiIs less than 10, and the number of nodes of the input layer is less than 10, so as to simplify the operation process.
In one embodiment, the historical feature information xiIs 6 dimensions, the input layer comprises 6 nodes.
Step S1021b, setting a hidden layer, where the hidden layer includes M nodes.
Wherein the hidden layer may comprise a plurality of hidden sub-layers. The number of nodes in each hidden sub-layer is kept at 10 or fewer, so as to simplify the computation.
In one embodiment, the hidden layer includes a first hidden sub-layer with 10 nodes, a second hidden sub-layer with 5 nodes, and a third hidden sub-layer with 5 nodes.
Step S1021c: setting a classification layer, wherein the classification layer adopts a softmax function:

p_k = exp(Z_k) / Σ_{j=1..C} exp(Z_j)

where p_k is the prediction probability value of the kth class, Z_k is the kth intermediate value, C is the number of predicted classes, and exp(Z_j) is computed from the jth intermediate value Z_j.
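For concreteness, the softmax function above can be sketched in plain Python (an illustration, not the patent's implementation; subtracting the maximum is a standard numerical-stability trick we add):

```python
import math

def softmax(z):
    """p_k = exp(Z_k) / sum_j exp(Z_j), for C = len(z) classes."""
    m = max(z)                           # subtract max for numerical stability
    exps = [math.exp(v - m) for v in z]
    total = sum(exps)
    return [e / total for e in exps]

p = softmax([2.0, 1.0])                  # two classes, as in this embodiment
```

The probabilities always sum to 1, and the larger intermediate value Z_k yields the larger p_k.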
Step S1021d, setting an output layer, the output layer comprising 2 nodes.
Step S1021e: setting an activation function, wherein the activation function adopts the sigmoid function:

f(x) = 1 / (1 + exp(-x))

where f(x) ranges from 0 to 1.
Step S1021f: setting a batch size A.
Wherein the batch size can be flexibly adjusted according to actual conditions, and may be 50 to 200.
In one embodiment, the batch size is 128.
In step S1021g, a learning rate is set, where the learning rate is B.
Wherein, the learning rate can be flexibly adjusted according to the actual situation. The learning rate may be 0.1-1.5.
In one embodiment, the learning rate is 0.9.
It should be noted that the sequence of the steps S1021a, S1021b, S1021c, S1021d, S1021e, S1021f, and S1021g can be flexibly adjusted.
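Using the embodiment's numbers (6-node input, three hidden sub-layers of 10, 5 and 5 nodes, 2-node output, batch size 128, learning rate 0.9), the defined structure of steps S1021a to S1021g can be sketched as follows; all constant and function names are our own illustration:

```python
import math

# Structure and hyperparameters from steps S1021a-S1021g (embodiment values).
INPUT_NODES = 6              # equals the dimension of x_i (step S1021a)
HIDDEN_NODES = [10, 5, 5]    # three hidden sub-layers (step S1021b)
OUTPUT_NODES = 2             # "close" vs. "keep" (step S1021d)
BATCH_SIZE = 128             # within the stated 50-200 range (step S1021f)
LEARNING_RATE = 0.9          # within the stated 0.1-1.5 range (step S1021g)

def sigmoid(x):
    """Activation from step S1021e: f(x) = 1 / (1 + exp(-x)), range (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))
```

The classification layer (softmax, step S1021c) sits between the last hidden sub-layer and the 2-node output layer.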
In step S1022, the step of bringing the sample vector set into the network structure for calculation to obtain the training model may include:
step S1022a, the sample vector set is input to the input layer for calculation, so as to obtain an output value of the input layer.
Step S1022b, inputting the output value of the input layer into the hidden layer, and obtaining the output value of the hidden layer.
Wherein the output value of the input layer is the input value of the hidden layer.
In one embodiment, the hidden layer may include a plurality of hidden sub-layers. The output value of the input layer is the input value of the first hidden sub-layer, the output value of the first hidden sub-layer is the input value of the second hidden sub-layer, the output value of the second hidden sub-layer is the input value of the third hidden sub-layer, and so on.
Step S1022c: the output value of the hidden layer is input into the classification layer for calculation to obtain the prediction probability value [p1 p2]^T.
Wherein the output value of the hidden layer is the input value of the classification layer.
In one embodiment, the hidden layer may include a plurality of hidden sub-layers; the output value of the last hidden sub-layer is the input value of the classification layer.
Step S1022d: the prediction probability value is brought into the output layer for calculation to obtain the prediction result value y: when p1 > p2, y = [1 0]^T; when p1 <= p2, y = [0 1]^T.
Wherein the output value of the classification layer is the input value of the output layer.
Step S1022e: the network structure is corrected according to the prediction result value y to obtain the training model.
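Steps S1022a to S1022d describe a forward pass through the defined structure. The sketch below illustrates only that data flow with random placeholder weights; correcting the network by back-propagation (step S1022e) is omitted, and all names are our own:

```python
import math
import random

def sigmoid(x):
    """Numerically safe sigmoid: f(x) = 1 / (1 + exp(-x))."""
    if x >= 0:
        return 1.0 / (1.0 + math.exp(-x))
    z = math.exp(x)
    return z / (1.0 + z)

def dense(inputs, weights, biases):
    """One fully connected layer with sigmoid activation."""
    return [sigmoid(sum(w * v for w, v in zip(row, inputs)) + b)
            for row, b in zip(weights, biases)]

def forward(x, layers):
    """Steps S1022a-S1022c: each layer's output is the next layer's
    input; the final two values go through softmax -> [p1, p2]."""
    for weights, biases in layers:
        x = dense(x, weights, biases)
    m = max(x)
    exps = [math.exp(v - m) for v in x]
    total = sum(exps)
    return [e / total for e in exps]

def predict(p):
    """Step S1022d: y = [1 0]^T if p1 > p2, else y = [0 1]^T."""
    return [1, 0] if p[0] > p[1] else [0, 1]

# Random placeholder weights for the 6-10-5-5-2 structure.
random.seed(0)
sizes = [6, 10, 5, 5, 2]
layers = [([[random.uniform(-1, 1) for _ in range(n_in)] for _ in range(n_out)],
           [0.0] * n_out)
          for n_in, n_out in zip(sizes, sizes[1:])]

p = forward([300.0, 1.0, 42.0, 95.0, 1.0, 0.0], layers)
y = predict(p)
```

With trained (rather than random) weights, the same forward pass yields the prediction probability value [p1 p2]^T used below.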
Step S103: when the application program enters the background, inputting the current characteristic information s of the application program into the training model for calculation.
Referring to fig. 4, in an embodiment, the step S103 may include:
step S1031: and collecting the current characteristic information s of the application program.
Wherein the dimensions of the collected current characteristic information s of the application program are the same as those of the acquired historical characteristic information x_i.
Step S1032: and substituting the current characteristic information s into the training model for calculation.
The current characteristic information s is input into the training model for calculation to obtain the prediction probability value [p1' p2']^T of the classification layer: when p1' > p2', y = [1 0]^T; when p1' <= p2', y = [0 1]^T.
Step S104: judging whether the application program needs to be closed.
When y = [1 0]^T, it is determined that the application program needs to be closed; when y = [0 1]^T, it is determined that the application program needs to be retained.
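The decision rule of step S104 can be written as a one-line check (the function name is ours, for illustration):

```python
def should_close(y):
    """Step S104: y = [1 0]^T means close the application;
    y = [0 1]^T means keep (retain) it."""
    return y == [1, 0]

close_a = should_close([1, 0])   # application judged closable
keep_b = should_close([0, 1])    # application judged to be retained
```

A device applying this rule would then close or freeze the application when `should_close` returns True.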
With the application program control method provided above, historical characteristic information x_i is acquired, a training model is generated by adopting a BP neural network algorithm, and when the application program is detected to have entered the background, the current characteristic information s of the application program is brought into the training model to judge whether the application program needs to be closed, thereby closing the application program intelligently.
Referring to fig. 5, fig. 5 is a schematic structural diagram of an application management and control apparatus according to an embodiment of the present disclosure. The apparatus 30 includes an obtaining module 31, a generating module 32, a calculating module 33 and a judging module 34.
It should be noted that the application program may be a chat application program, a video application program, a music application program, a shopping application program, a shared bicycle application program, or a mobile banking application program.
The obtaining module 31 is configured to obtain a sample vector set of the application program, where the sample vectors in the sample vector set include historical feature information x_i of multiple dimensions of the application program.
The sample vector set of the application program is acquired from a sample database.
Referring to fig. 6, fig. 6 is a schematic structural diagram of an application management and control apparatus according to an embodiment of the present disclosure. The apparatus 30 further comprises a detection module 35 for detecting that the application enters the background.
The device 30 may also include a storage module 36. The storage module 36 is used for storing the historical feature information x_i of the application program.
Wherein, the characteristic information of the plurality of dimensions can refer to table 2.
[Table 2: the 10 dimensions of characteristic information (reproduced as an image in the original document; contents not recoverable here)]
It should be noted that the 10-dimensional characteristic information shown in Table 2 is only an example in the embodiment of the present application; the present application is not limited to it. The characteristic information may be one of the 10 dimensions, at least two of them, or may also include characteristic information of other dimensions, for example, whether the device is currently charging, the current battery level, or whether WiFi is currently connected.
In one embodiment, historical feature information may be selected for 6 dimensions:
A. the time that the application resides in the background;
B. whether the screen is bright, for example, the screen is bright and is marked as 1, and the screen is off and is marked as 0;
C. the total number of uses this week;
D. the total usage time this week;
E. whether WiFi is on or not, for example, WiFi is on and is recorded as 1, WiFi is off and is recorded as 0; and
F. whether charging is currently in progress, e.g., currently charging, noted as 1, and not currently charging, noted as 0.
The generating module 32 is configured to calculate a sample vector set by using a BP neural network algorithm, and generate a training model.
The generating module 32 inputs the historical feature information x_i acquired by the obtaining module 31 into the BP neural network algorithm for training.
Referring to fig. 6, the generating module 32 includes a defining module 321 and a solving module 322.
The defining module 321 is configured to define a network structure.
The defining module 321 may include an input layer defining module 3211, a hidden layer defining module 3212, a classification layer defining module 3213, an output layer defining module 3214, an activation function defining module 3215, a batch size defining module 3216, and a learning rate defining module 3217.
The input layer definition module 3211 is configured to set an input layer comprising N nodes, where the number of nodes of the input layer is the same as the dimension of the historical feature information x_i.
Wherein the dimension of the historical feature information x_i is less than 10, and the number of nodes of the input layer is less than 10, so as to simplify the computation.
In one embodiment, the historical feature information x_i has 6 dimensions, and the input layer comprises 6 nodes.
The hidden layer defining module 3212 is configured to set a hidden layer, which includes M nodes.
Wherein the hidden layer may comprise a plurality of hidden sub-layers. The number of nodes in each hidden sub-layer is kept at 10 or fewer, so as to simplify the computation.
In one embodiment, the hidden layer includes a first hidden sub-layer with 10 nodes, a second hidden sub-layer with 5 nodes, and a third hidden sub-layer with 5 nodes.
The classification layer definition module 3213 is configured to set a classification layer, where the classification layer adopts a softmax function:

p_k = exp(Z_k) / Σ_{j=1..C} exp(Z_j)

where p_k is the prediction probability value of the kth class, Z_k is the kth intermediate value, and C is the number of predicted classes.
The output layer defining module 3214 is configured to set an output layer, where the output layer includes 2 nodes.
The activation function definition module 3215 is configured to set an activation function, where the activation function adopts the sigmoid function:

f(x) = 1 / (1 + exp(-x))

where f(x) ranges from 0 to 1.
The batch size definition module 3216 is configured to set a batch size A.
Wherein the batch size can be flexibly adjusted according to actual conditions, and may be 50 to 200.
In one embodiment, the batch size is 128.
The learning rate defining module 3217 is configured to set a learning rate, where the learning rate is B.
Wherein, the learning rate can be flexibly adjusted according to the actual situation. The learning rate may be 0.1-1.5.
In one embodiment, the learning rate is 0.9.
It should be noted that the sequence of the input layer defining module 3211 setting an input layer, the hidden layer defining module 3212 setting a hidden layer, the classification layer defining module 3213 setting a classification layer, the output layer defining module 3214 setting an output layer, the activation function defining module 3215 setting an activation function, the batch size defining module 3216 setting a batch size, and the learning rate defining module 3217 setting a learning rate may be flexibly adjusted.
The solving module 322 is configured to substitute the sample vector set into the network structure for calculation, so as to obtain a training model.
The solving module 322 may include a first solving module 3221, a second solving module 3222, a third solving module 3223, a fourth solving module 3224 and a correcting module 3225.
The first solving module 3221 is configured to input the sample vector set into the input layer for calculation, so as to obtain an output value of the input layer.
The second solving module 3222 is configured to input the output value of the input layer into the hidden layer, so as to obtain the output value of the hidden layer.
Wherein the output value of the input layer is the input value of the hidden layer.
In one embodiment, the hidden layer may include a plurality of hidden sub-layers. The output value of the input layer is the input value of the first hidden sub-layer, the output value of the first hidden sub-layer is the input value of the second hidden sub-layer, the output value of the second hidden sub-layer is the input value of the third hidden sub-layer, and so on.
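The chaining just described — each sub-layer's output feeding the next sub-layer's input — can be sketched as follows. The sigmoid activation matches the network structure defined earlier; the weight layout itself is an assumption for illustration, not the patent's implementation.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def layer_forward(inputs, weights, biases):
    # One fully connected sub-layer: each node applies the sigmoid
    # activation to a weighted sum of the previous layer's outputs.
    return [sigmoid(sum(w * v for w, v in zip(row, inputs)) + b)
            for row, b in zip(weights, biases)]

def hidden_forward(x, sublayers):
    # sublayers: list of (weights, biases) pairs; the output of each
    # sub-layer becomes the input of the next, as the text describes.
    for weights, biases in sublayers:
        x = layer_forward(x, weights, biases)
    return x
```

With zero weights and biases every node outputs sigmoid(0) = 0.5, which makes the chaining easy to verify by hand.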
The third solving module 3223 is configured to input the output value of the hidden layer into the classification layer for calculation, so as to obtain the prediction probability value $[p_1\ p_2]^T$.
Wherein the output value of the hidden layer is the input value of the classification layer.
The fourth solving module 3224 is configured to substitute the prediction probability value into the output layer for calculation, so as to obtain a prediction result value y: when $p_1 > p_2$, $y = [1\ 0]^T$; when $p_1 \le p_2$, $y = [0\ 1]^T$.
Wherein the output value of the classification layer is the input value of the output layer.
The correcting module 3225 is configured to correct the network structure according to the prediction result value y, so as to obtain a training model.
The calculation module 33 is configured to input the current feature information s of the application program into the training model for calculation when the application program enters the background.
Referring to fig. 6, in an embodiment, the calculation module 33 may include an acquisition module 331 and an operation module 332.
The collecting module 331 is configured to collect current feature information s of the application program.
The dimension of the collected current feature information s of the application program is the same as that of the collected historical feature information $x_i$ of the application program.
The operation module 332 is used for substituting the current feature information s into the training model for calculation.
The current feature information s is input into the training model for calculation to obtain the prediction probability value $[p_1'\ p_2']^T$ of the classification layer: when $p_1' > p_2'$, $y = [1\ 0]^T$; when $p_1' \le p_2'$, $y = [0\ 1]^T$.
In an embodiment, the acquiring module 331 is configured to collect the current feature information s at regular intervals according to a predetermined acquisition schedule and store it in the storage module 36. The acquiring module 331 is further configured to retrieve the current feature information s corresponding to the time point at which the application program enters the background and pass it to the operation module 332, which substitutes it into the training model for calculation.
The determining module 34 is configured to determine whether the application needs to be closed.
When $y = [1\ 0]^T$, it is determined that the application needs to be closed; when $y = [0\ 1]^T$, it is determined that the application needs to be retained.
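The decision rule above reduces to a single comparison. A minimal sketch, with names chosen for illustration:

```python
def should_close(p):
    # p: [p1, p2] from the classification layer.
    # y = [1, 0]^T (close) when p1 > p2; y = [0, 1]^T (keep) otherwise.
    y = [1, 0] if p[0] > p[1] else [0, 1]
    return y == [1, 0]
```

Note that the tie case $p_1 = p_2$ falls on the "keep" side, matching the $p_1 \le p_2$ condition in the text.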
The apparatus 30 may further include a closing module 37 configured to close the application program when it is determined that the application program needs to be closed.
The apparatus for the application program management and control method provided by the present application obtains the historical feature information $x_i$, generates a training model using a BP neural network algorithm, and, upon detecting that the application program has entered the background, substitutes the current feature information s of the application program into the training model to determine whether the application program needs to be closed, thereby closing applications intelligently.
Referring to fig. 7, fig. 7 is a schematic structural diagram of an electronic device according to an embodiment of the present disclosure. The electronic device 500 includes: a processor 501 and a memory 502. The processor 501 is electrically connected to the memory 502.
The processor 501 is a control center of the electronic device 500, connects various parts of the whole electronic device 500 by using various interfaces and lines, executes various functions of the electronic device and processes data by running or loading an application program stored in the memory 502 and calling the data stored in the memory 502, thereby monitoring the whole electronic device 500.
In this embodiment, the processor 501 in the electronic device 500 loads instructions corresponding to processes of one or more application programs into the memory 502 according to the following steps, and the processor 501 runs the application programs stored in the memory 502, so as to implement various functions:
obtaining the sample vector set of the application program, wherein the sample vectors in the sample vector set include historical feature information $x_i$ of multiple dimensions of the application program;
Calculating a sample vector set by adopting a neural network algorithm to generate a training model;
when an application program enters a background, inputting the current characteristic information s of the application program into the training model for calculation; and
and judging whether the application program needs to be closed or not.
It should be noted that the application program may be a chat application program, a video application program, a music application program, a shopping application program, a shared bicycle application program, or a mobile banking application program.
Acquiring the sample vector set of the application program from a sample database, wherein the sample vectors in the sample vector set include historical feature information $x_i$ of multiple dimensions of the application program.
Wherein, the characteristic information of the plurality of dimensions can refer to table 3.
[Table 3: historical feature information of multiple dimensions — rendered as images in the original document and not reproduced here]
It should be noted that the 10-dimensional feature information shown in Table 3 is only one example in the embodiments of the present application; the present application is not limited to it. The feature information may be any one of the 10 dimensions, at least two of them, or may include feature information of other dimensions, for example, whether the device is currently charging, the current battery level, or whether WiFi is currently connected.
In one embodiment, historical feature information of 6 dimensions may be selected:
A. the time the application has resided in the background;
B. whether the screen is on, for example, screen on recorded as 1, screen off recorded as 0;
C. the total number of uses in the current week;
D. the total usage time in the current week;
E. whether WiFi is on, for example, WiFi on recorded as 1, WiFi off recorded as 0; and
F. whether the device is currently charging, for example, currently charging recorded as 1, not charging recorded as 0.
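The six dimensions A–F above map directly onto a 6-element input vector. A hedged sketch; the function name and parameter names are illustrative, not from the patent:

```python
def build_feature_vector(bg_seconds, screen_on, week_uses,
                         week_seconds, wifi_on, charging):
    # Maps the six dimensions A-F onto a numeric vector; the three
    # boolean dimensions are encoded as 1/0 as the text specifies.
    return [
        float(bg_seconds),           # A: time resident in background
        1.0 if screen_on else 0.0,   # B: screen on?
        float(week_uses),            # C: total uses this week
        float(week_seconds),         # D: total usage time this week
        1.0 if wifi_on else 0.0,     # E: WiFi on?
        1.0 if charging else 0.0,    # F: currently charging?
    ]
```

The resulting length-6 vector matches the 6-node input layer described in the embodiment below.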
In one embodiment, the processor 501 calculates the sample vector set by using a BP neural network algorithm, and generating the training model includes:
defining a network structure; and
and substituting the sample vector set into the network structure for calculation to obtain a training model.
Wherein the defining a network structure comprises:
setting an input layer, wherein the input layer includes N nodes, and the number of nodes of the input layer is the same as the dimension of the historical feature information $x_i$;
wherein the dimension of the historical feature information $x_i$ is less than 10, and the number of nodes of the input layer is correspondingly less than 10, so as to simplify the calculation.
In one embodiment, the historical feature information $x_i$ has 6 dimensions, and the input layer includes 6 nodes.
Setting a hidden layer, wherein the hidden layer comprises M nodes.
The hidden layer may include a plurality of hidden sub-layers. The number of nodes in each hidden sub-layer is less than 10, so as to simplify the calculation.
In one embodiment, the hidden layer may include a first hidden sub-layer, a second hidden sub-layer, and a third hidden sub-layer. The first hidden sub-layer includes 10 nodes, the second hidden sub-layer includes 5 nodes, and the third hidden sub-layer includes 5 nodes.
Setting a classification layer, wherein the classification layer uses the softmax function

$p_k = \frac{e^{Z_k}}{\sum_{j=1}^{C} e^{Z_j}}$

where $p_k$ is the prediction probability value of the k-th class, $Z_k$ is the k-th intermediate value, $C$ is the number of predicted classes, and $e^{Z_j}$ is the exponential of the j-th intermediate value.
And setting an output layer, wherein the output layer comprises 2 nodes.
Setting an activation function, wherein the activation function is the sigmoid function

$f(x) = \frac{1}{1 + e^{-x}}$

where $f(x)$ ranges from 0 to 1.
Setting a batch size A.
The batch size can be flexibly adjusted according to actual conditions, and may range from 50 to 200.
In one embodiment, the batch size is 128.
Setting a learning rate B.
The learning rate can be flexibly adjusted according to the actual situation, and may range from 0.1 to 1.5.
In one embodiment, the learning rate is 0.9.
It should be noted that the order of setting the input layer, the hidden layer, the classification layer, the output layer, the activation function, the batch size, and the learning rate can be flexibly adjusted.
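Putting the definitions above together, the embodiment's structure is 6 input nodes, three hidden sub-layers of 10, 5, and 5 nodes, and 2 output nodes. The following sketch only allocates the weight matrices of such a network; the uniform initialization range and seed are arbitrary assumptions for illustration, not values from the patent.

```python
import random

def init_network(layer_sizes=(6, 10, 5, 5, 2), seed=0):
    # One (weights, biases) pair per connection between consecutive layers.
    # Weights are drawn uniformly from [-0.5, 0.5] as an arbitrary start.
    rng = random.Random(seed)
    params = []
    for n_in, n_out in zip(layer_sizes, layer_sizes[1:]):
        weights = [[rng.uniform(-0.5, 0.5) for _ in range(n_in)]
                   for _ in range(n_out)]
        biases = [0.0] * n_out
        params.append((weights, biases))
    return params

net = init_network()
```

Four weight matrices result, one per gap between the five layers, with shapes 10×6, 5×10, 5×5, and 2×5.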
The step of substituting the sample vector set into the network structure for calculation to obtain a training model may include:
and inputting the sample vector set in the input layer for calculation to obtain an output value of the input layer.
And inputting the output value of the input layer into the hidden layer to obtain the output value of the hidden layer.
Wherein the output value of the input layer is the input value of the hidden layer.
In one embodiment, the hidden layer may include a plurality of hidden sub-layers. The output value of the input layer is the input value of the first hidden sub-layer, the output value of the first hidden sub-layer is the input value of the second hidden sub-layer, the output value of the second hidden sub-layer is the input value of the third hidden sub-layer, and so on.
Inputting the output value of the hidden layer into the classification layer for calculation to obtain the prediction probability value $[p_1\ p_2]^T$.
Wherein the output value of the hidden layer is the input value of the classification layer.
In one embodiment, the hidden layer may include a plurality of hidden sub-layers, and the output value of the last hidden sub-layer is the input value of the classification layer.
Substituting the prediction probability value into the output layer for calculation to obtain a prediction result value y: when $p_1 > p_2$, $y = [1\ 0]^T$; when $p_1 \le p_2$, $y = [0\ 1]^T$.
Wherein the output value of the classification layer is the input value of the output layer.
And correcting the network structure according to the predicted result value y to obtain a training model.
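The correction step is the backward pass of the BP (backpropagation) algorithm: weights move against the error gradient, scaled by the learning rate B (0.9 in the embodiment). A minimal sketch of one such weight update, with the gradients assumed to have already been computed by backpropagation — this is an illustration, not the patent's implementation:

```python
def update_weights(weights, gradients, learning_rate=0.9):
    # One gradient-descent correction: each weight moves against its
    # gradient, scaled by the learning rate B (0.9 in the embodiment).
    return [[w - learning_rate * g for w, g in zip(w_row, g_row)]
            for w_row, g_row in zip(weights, gradients)]
```

Repeating this update over batches of the sample vector set (batch size A, 128 in the embodiment) is what "correcting the network structure" amounts to in BP training.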
When the application program enters the background, the step of inputting the current characteristic information s of the application program into the training model for calculation comprises the following steps:
and collecting the current characteristic information s of the application program.
The dimension of the collected current feature information s of the application program is the same as that of the collected historical feature information $x_i$ of the application program.
And substituting the current characteristic information s into the training model for calculation.
The current feature information s is input into the training model for calculation to obtain the prediction probability value $[p_1'\ p_2']^T$ of the classification layer: when $p_1' > p_2'$, $y = [1\ 0]^T$; when $p_1' \le p_2'$, $y = [0\ 1]^T$.
In the step of judging whether the application program needs to be closed, when $y = [1\ 0]^T$, it is determined that the application program needs to be closed; when $y = [0\ 1]^T$, it is determined that the application program needs to be retained.
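The inference-time steps above can be sketched end to end. Here `model_forward` stands in for the trained model (an assumption for illustration) and returns the classification-layer pair $[p_1'\ p_2']^T$ for a feature vector s:

```python
def decide(model_forward, current_features):
    # model_forward: trained model returning (p1', p2') for feature vector s.
    p1, p2 = model_forward(current_features)
    y = [1, 0] if p1 > p2 else [0, 1]
    return "close" if y == [1, 0] else "keep"
```

A caller would collect s when the application enters the background and act on the returned decision, closing the application only in the "close" case.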
The memory 502 may be used to store applications and data. The memory 502 stores programs containing instructions executable in the processor. The programs may constitute various functional modules. The processor 501 executes various functional applications and data processing by executing programs stored in the memory 502.
In some embodiments, as shown in fig. 8, which is a schematic structural diagram of an electronic device provided in an embodiment of the present application, the electronic device 500 further comprises: radio frequency circuit 503, display 504, control circuit 505, input unit 506, audio circuit 507, sensor 508, and power supply 509. The processor 501 is electrically connected to the radio frequency circuit 503, the display 504, the control circuit 505, the input unit 506, the audio circuit 507, the sensor 508, and the power supply 509.
The radio frequency circuit 503 is used for transceiving radio frequency signals to communicate with a server or other electronic devices through a wireless communication network.
The display 504 may be used to display information entered by or provided to the user as well as various graphical user interfaces of the terminal, which may be comprised of images, text, icons, video, and any combination thereof.
The control circuit 505 is electrically connected to the display 504 and is configured to control the display 504 to display information.
The input unit 506 may be used to receive input numbers, character information, or user characteristic information (e.g., fingerprint), and generate keyboard, mouse, joystick, optical, or trackball signal inputs related to user settings and function control.
The audio circuit 507 may provide an audio interface between the user and the terminal through a speaker, microphone.
The sensor 508 is used to collect external environmental information. The sensors 508 may include one or more of ambient light sensors, acceleration sensors, gyroscopes, and the like.
The power supply 509 is used to power the various components of the electronic device 500. In some embodiments, power supply 509 may be logically coupled to processor 501 through a power management system to manage charging, discharging, and power consumption management functions through the power management system.
Although not shown in fig. 8, the electronic device 500 may further include a camera, a bluetooth module, and the like, which are not described in detail herein.
The electronic device provided by the present application obtains the historical feature information $x_i$, generates a training model using a BP neural network algorithm, and, upon detecting that the application program has entered the background, substitutes the current feature information s of the application program into the training model to determine whether the application program needs to be closed, thereby closing applications intelligently.
The embodiment of the present invention further provides a medium, where multiple instructions are stored, and the instructions are suitable for being loaded by a processor to execute the application management and control method according to any one of the above embodiments.
The application program control method, device, medium and electronic device provided by the embodiment of the invention belong to the same concept, and the specific implementation process is detailed in the whole specification and is not described herein again.
Those skilled in the art will appreciate that all or part of the steps in the methods of the above embodiments may be implemented by associated hardware instructed by a program, which may be stored in a computer-readable storage medium, and the storage medium may include: read Only Memory (ROM), Random Access Memory (RAM), magnetic or optical disks, and the like.
The application management and control method, device, medium and electronic device provided by the embodiments of the present application are described in detail above, and specific examples are applied in the present application to explain the principles and embodiments of the present application, and the descriptions of the above embodiments are only used to help understanding the present application. Meanwhile, for those skilled in the art, according to the idea of the present application, there may be variations in the specific embodiments and the application scope, and in summary, the content of the present specification should not be construed as a limitation to the present application.

Claims (10)

1. An application program control method is applied to electronic equipment, and is characterized by comprising the following steps:
obtaining a sample vector set of the application program, wherein the sample vectors in the sample vector set comprise historical feature information $x_i$ of multiple dimensions of the application program;
Defining a network structure, including setting an input layer, a hidden layer, a classification layer, an output layer, an activation function, a batch size and a learning rate, wherein the input layer comprises N nodes, the number of the nodes of the input layer and the historical characteristic information xiThe hidden layer comprises M nodes, the classification layer adopts a Softmax function, the output layer comprises 2 nodes, the activation function adopts a sigmoid function, and the sigmoid function is
Figure FDA0002320346460000011
Wherein f (x) ranges from 0 to 1, the batch size is A, and the learning rate is B;
inputting the sample vector set into an input layer for calculation to obtain an output value of the input layer, inputting the output value of the input layer into the hidden layer to obtain the output value of the hidden layer, inputting the output value of the hidden layer into the classification layer for calculation to obtain a prediction probability value, substituting the prediction probability value into the output layer for calculation to obtain a prediction result value y, and correcting the network structure according to the prediction result value y to obtain a training model;
when an application program enters a background, inputting the current characteristic information s of the application program into the training model for calculation to obtain a calculation result; and
and judging whether the application program needs to be closed or not according to the calculation result.
2. The application management and control method according to claim 1, wherein in the step of inputting the current feature information s of the application program into the training model for calculation, the current feature information s is input into the training model for calculation to obtain the prediction probability value of the classification layer.
3. The application management and control method according to claim 1, characterized in that: the hidden layers comprise a first hidden layer, a second hidden layer and a third hidden layer, and the number of nodes of each of the first hidden layer, the second hidden layer and the third hidden layer is less than 10.
4. An application management and control apparatus, comprising:
an obtaining module, configured to obtain a sample vector set of the application program, where a sample vector in the sample vector set includes historical feature information x of multiple dimensions of the application programi
a generating module, configured to define a network structure, including setting an input layer, a hidden layer, a classification layer, an output layer, an activation function, a batch size and a learning rate, wherein the input layer comprises N nodes, the number of nodes of the input layer is the same as the dimension of the historical feature information $x_i$, the hidden layer comprises M nodes, the classification layer adopts a softmax function, the output layer comprises 2 nodes, and the activation function adopts the sigmoid function

$f(x) = \frac{1}{1 + e^{-x}}$

wherein $f(x)$ ranges from 0 to 1, the batch size is A, and the learning rate is B; the generating module is further configured to input the sample vector set into the input layer for calculation to obtain an output value of the input layer, input the output value of the input layer into the hidden layer to obtain an output value of the hidden layer, input the output value of the hidden layer into the classification layer for calculation to obtain a prediction probability value, substitute the prediction probability value into the output layer for calculation to obtain a prediction result value y, and correct the network structure according to the prediction result value y to obtain a training model;
a calculation module, configured to input the current feature information s of the application program into the training model for calculation when the application program enters the background, so as to obtain a calculation result; and
and the judging module is used for judging whether the application program needs to be closed or not according to the calculation result.
5. The application management and control apparatus according to claim 4, wherein: and the computing module is used for inputting the current characteristic information s of the application program into the training model for computing when the application program enters a background to obtain the prediction probability value of the classification layer.
6. The application management and control apparatus according to claim 4, wherein: the apparatus further comprises a detection module configured to detect that the application program enters the background.
7. The application management and control apparatus according to claim 4, wherein: the apparatus further comprises a storage module configured to store the feature information of the application program.
8. The application management and control apparatus according to claim 4, wherein: the apparatus further comprises a closing module configured to close the application program when it is determined that the application program needs to be closed.
9. A medium, characterized by: the medium has stored therein a plurality of instructions adapted to be loaded by a processor to perform the application hosting method of any one of claims 1 to 3.
10. An electronic device, characterized in that: the electronic device comprises a processor and a memory, wherein the electronic device is electrically connected with the memory, the memory is used for storing instructions and data, and the processor is used for executing the application program management and control method according to any one of claims 1 to 3.
CN201711044959.5A 2017-10-31 2017-10-31 Application program control method, device, medium and electronic equipment Active CN107885544B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201711044959.5A CN107885544B (en) 2017-10-31 2017-10-31 Application program control method, device, medium and electronic equipment
PCT/CN2018/110518 WO2019085749A1 (en) 2017-10-31 2018-10-16 Application program control method and apparatus, medium, and electronic device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201711044959.5A CN107885544B (en) 2017-10-31 2017-10-31 Application program control method, device, medium and electronic equipment

Publications (2)

Publication Number Publication Date
CN107885544A CN107885544A (en) 2018-04-06
CN107885544B true CN107885544B (en) 2020-04-10

Family

ID=61783058

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201711044959.5A Active CN107885544B (en) 2017-10-31 2017-10-31 Application program control method, device, medium and electronic equipment

Country Status (2)

Country Link
CN (1) CN107885544B (en)
WO (1) WO2019085749A1 (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107885544B (en) * 2017-10-31 2020-04-10 Oppo广东移动通信有限公司 Application program control method, device, medium and electronic equipment
CN109101326A (en) * 2018-06-06 2018-12-28 三星电子(中国)研发中心 A kind of background process management method and device
CN110286949A (en) * 2019-06-27 2019-09-27 深圳市网心科技有限公司 Process based on the read-write of physical host storage device hangs up method and relevant device
CN110286961A (en) * 2019-06-27 2019-09-27 深圳市网心科技有限公司 Process based on physical host processor hangs up method and relevant device
CN110275760A (en) * 2019-06-27 2019-09-24 深圳市网心科技有限公司 Process based on fictitious host computer processor hangs up method and its relevant device

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105389193A (en) * 2015-12-25 2016-03-09 北京奇虎科技有限公司 Accelerating processing method, device and system for application, and server
CN106648023A (en) * 2016-10-02 2017-05-10 上海青橙实业有限公司 Mobile terminal and power-saving method of mobile terminal based on neural network
CN107133094A (en) * 2017-06-05 2017-09-05 努比亚技术有限公司 Application management method, mobile terminal and computer-readable recording medium
CN107145215A (en) * 2017-05-06 2017-09-08 维沃移动通信有限公司 A kind of background application method for cleaning and mobile terminal

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102306095B (en) * 2011-07-21 2017-04-05 宇龙计算机通信科技(深圳)有限公司 Application management method and terminal
KR20160091786A (en) * 2015-01-26 2016-08-03 삼성전자주식회사 Method and apparatus for managing user
US10572797B2 (en) * 2015-10-27 2020-02-25 Pusan National University Industry—University Cooperation Foundation Apparatus and method for classifying home appliances based on power consumption using deep learning
CN106909447B (en) * 2015-12-23 2019-11-15 北京金山安全软件有限公司 Background application processing method and device and terminal
CN105718027B (en) * 2016-01-20 2019-05-31 努比亚技术有限公司 The management method and mobile terminal of background application
CN105808410B (en) * 2016-03-29 2019-05-31 联想(北京)有限公司 A kind of information processing method and electronic equipment
CN106354836A (en) * 2016-08-31 2017-01-25 南威软件股份有限公司 Advertisement page prediction method and device
CN107643948B (en) * 2017-09-30 2020-06-02 Oppo广东移动通信有限公司 Application program control method, device, medium and electronic equipment
CN107608748B (en) * 2017-09-30 2019-09-13 Oppo广东移动通信有限公司 Application program management-control method, device, storage medium and terminal device
CN107885544B (en) * 2017-10-31 2020-04-10 Oppo广东移动通信有限公司 Application program control method, device, medium and electronic equipment

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105389193A (en) * 2015-12-25 2016-03-09 北京奇虎科技有限公司 Accelerating processing method, device and system for application, and server
CN106648023A (en) * 2016-10-02 2017-05-10 上海青橙实业有限公司 Mobile terminal and power-saving method of mobile terminal based on neural network
CN107145215A (en) * 2017-05-06 2017-09-08 维沃移动通信有限公司 A kind of background application method for cleaning and mobile terminal
CN107133094A (en) * 2017-06-05 2017-09-05 努比亚技术有限公司 Application management method, mobile terminal and computer-readable recording medium

Also Published As

Publication number Publication date
WO2019085749A1 (en) 2019-05-09
CN107885544A (en) 2018-04-06

Similar Documents

Publication Publication Date Title
CN107885544B (en) Application program control method, device, medium and electronic equipment
CN107643948B (en) Application program control method, device, medium and electronic equipment
CN107463403B (en) Process control method, device, storage medium and electronic equipment
US11249645B2 (en) Application management method, storage medium, and electronic apparatus
CN107608748B (en) Application program management-control method, device, storage medium and terminal device
US20200241483A1 (en) Method and Device for Managing and Controlling Application, Medium, and Electronic Device
CN113284142B (en) Image detection method, image detection device, computer-readable storage medium and computer equipment
CN107765853A (en) Using method for closing, device, storage medium and electronic equipment
CN111797288A (en) Data screening method and device, storage medium and electronic equipment
CN111797861A (en) Information processing method, information processing apparatus, storage medium, and electronic device
KR20180010243A (en) Method and apparatus for human face model matrix training and storage medium
CN110838306B (en) Voice signal detection method, computer storage medium and related equipment
CN111738365A (en) Image classification model training method and device, computer equipment and storage medium
CN112036492B (en) Sample set processing method, device, equipment and storage medium
CN107861770B (en) Application program management-control method, device, storage medium and terminal device
CN108875901A (en) Neural network training method and generic object detection method, device and system
CN107766892B (en) Application program control method and device, storage medium and terminal equipment
CN111984803B (en) Multimedia resource processing method and device, computer equipment and storage medium
CN112948763B (en) Piece quantity prediction method and device, electronic equipment and storage medium
CN109066870B (en) Charging management method, device, medium and electronic equipment applying method
CN110809083B (en) Mobile terminal information reminding method, mobile terminal and storage medium
CN107844375B (en) Using method for closing, device, storage medium and electronic equipment
CN111599417A (en) Method and device for acquiring training data of solubility prediction model
CN111242322A (en) Detection method and device for rear door sample and electronic equipment
CN110556884A (en) charging management method, device, medium and electronic equipment applying method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: 523860 No. 18, Wu Sha Beach Road, Changan Town, Dongguan, Guangdong

Applicant after: OPPO Guangdong Mobile Communications Co., Ltd.

Address before: 523860 No. 18, Wu Sha Beach Road, Changan Town, Dongguan, Guangdong

Applicant before: Guangdong OPPO Mobile Communications Co., Ltd.

GR01 Patent grant