CN108762493A - Method for controlling an application program, and mobile terminal - Google Patents

Method for controlling an application program, and mobile terminal

Info

Publication number
CN108762493A
CN108762493A
Authority
CN
China
Prior art keywords
operator
facial features
mobile terminal
application program
control instruction
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201810462421.4A
Other languages
Chinese (zh)
Other versions
CN108762493B (en)
Inventor
胡青松
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd filed Critical Vivo Mobile Communication Co Ltd
Priority to CN201810462421.4A priority Critical patent/CN108762493B/en
Publication of CN108762493A publication Critical patent/CN108762493A/en
Application granted granted Critical
Publication of CN108762493B publication Critical patent/CN108762493B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00: Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/01: Indexing scheme relating to G06F3/01
    • G06F2203/011: Emotion or mood input determined on the basis of sensed human body parameters such as pulse, heart rate or beat, temperature of skin, facial expressions, iris, voice pitch, brain activity patterns

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Telephone Function (AREA)

Abstract

An embodiment of the present invention provides a method for controlling an application program, and a mobile terminal. The method includes: during the start-up of an application program, acquiring facial features of the operator, the facial features including facial expressions; determining the control instruction corresponding to the operator's facial features according to a correspondence between control instructions and facial features; and controlling the application program accordingly according to the control instruction. When an application program is being started on the mobile terminal, the mobile terminal can automatically judge from the facial features whether the application program needs corresponding processing and, when it does, automatically process the application program according to the control instruction corresponding to the facial features. This reduces the operator's manual operations on the application program, allows the problem of an application program started by misoperation to be dealt with quickly, prevents mistakenly started application programs from occupying system resources, and reduces the energy consumption of the mobile terminal.

Description

Method for controlling application program and mobile terminal
Technical Field
The present invention relates to the field of mobile terminal technologies, and in particular, to a method for controlling an application and a mobile terminal.
Background
Currently, when an operator starts an application program, the application program may in fact have been started by the operator's misoperation. When the operator notices that the started application was started by mistake, he or she must manually press the Home key (start key) to send the application to the background, or manually press the back key to end it.
Therefore, with the existing approach to the problem of applications started by misoperation, the operator must manually input an operation instruction after noticing the mistake, so a mistakenly started application cannot be ended or moved to the background in a timely manner.
Disclosure of Invention
The embodiment of the invention provides a method for controlling an application program, and a mobile terminal, and aims to solve the problem that an application program started by misoperation cannot be processed correspondingly in a timely manner.
According to an aspect of an embodiment of the present invention, there is provided a method of controlling an application program, including:
acquiring facial features of an operator in the starting process of an application program, wherein the facial features comprise facial expressions;
determining a control instruction corresponding to the facial features of the operator according to the corresponding relation between the control instruction and the facial features;
and correspondingly controlling the application program according to the control instruction.
According to another aspect of the embodiments of the present invention, there is also provided a mobile terminal, including:
a first acquisition module, configured to acquire facial features of an operator during the starting of an application program, wherein the facial features comprise facial expressions;
the determining module is used for determining a control instruction corresponding to the facial feature of the operator according to the corresponding relation between the control instruction and the facial feature;
and the control module is used for correspondingly controlling the application program according to the control instruction.
According to yet another aspect of the embodiments of the present invention, there is also provided a mobile terminal including a processor, a memory, and a computer program stored on the memory and executable on the processor, the computer program implementing the steps of the method of controlling an application program as described above when executed by the processor.
The embodiment of the invention has the following beneficial effects:
when detecting that an application program is being started, the mobile terminal can automatically judge from the acquired facial features of the operator whether the application program needs corresponding processing. When it does, the mobile terminal automatically processes the application program according to the control instruction corresponding to the facial features. This reduces the operator's manual operations on the starting application program, allows the problem of an application program started by misoperation to be solved quickly, prevents mistakenly started application programs from occupying system resources, and reduces the energy consumption of the mobile terminal.
Drawings
Fig. 1 is a flowchart of a method for controlling an application program according to an embodiment of the present invention;
Fig. 2 is a flowchart of another method for controlling an application program according to an embodiment of the present invention;
Fig. 3 is a schematic diagram illustrating the positions of an operator and a mobile terminal according to an embodiment of the present invention;
Fig. 4 is a schematic diagram illustrating the positions of an operator and a mobile terminal according to another embodiment of the present invention;
Fig. 5 is a flowchart of another method for controlling an application program according to an embodiment of the present invention;
Fig. 6 is a schematic structural diagram of a mobile terminal according to an embodiment of the present invention;
Fig. 7 is a schematic diagram of the hardware structure of a mobile terminal according to an embodiment of the present invention.
Detailed Description
To facilitate an understanding of the invention, the invention will now be described more fully with reference to the accompanying drawings. Preferred embodiments of the present invention are shown in the drawings. This invention may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. The terminology used in the description of the invention herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used herein, the term "and/or" includes any and all combinations of one or more of the associated listed items.
The terms first, second and the like in the description and in the claims of the present invention are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the invention described herein are, for example, capable of operation in sequences other than those illustrated or otherwise described herein.
The embodiment of the invention provides a method for controlling an application program. The execution subject of the method may be a mobile terminal, which may be a smart device such as a smartphone or a computer.
Fig. 1 is a flowchart of a method for controlling an application program according to an embodiment of the present invention, which includes the following specific steps:
step 101: in the starting process of the application program, acquiring facial features of an operator;
an application is a computer program, which is a visual user interface, that is designed to perform one or more specific tasks and that can interact with an operator. For example, applications include: WeChat, Paibao, Taobao, etc.
In the embodiment of the present invention, the facial features are used to indicate whether the operator has started the application program to run by mistake, and the facial features include: facial expression (facial expression) or, further, facial expression-combined gestures, wherein facial expression means various emotional states expressed by changes in eye muscles, face muscles, and mouth muscles, etc.
In an embodiment of the present invention, the facial features of the operator include, but are not limited to: head shaking expression, glazer expression, frown expression, etc., facial expression includes in combination with the gesture: a nasal expression, a chin expression, etc.
Step 102: and determining a control instruction corresponding to the facial features of the operator according to the corresponding relation between the control instruction and the facial features.
In the embodiment of the present invention, the correspondence between control instructions and facial features may be configured in advance. The specific process of establishing this correspondence includes:
collecting at least one kind of feature information, configuring a control instruction corresponding to the at least one kind of feature information, and establishing the correspondence between the control instruction and the feature information; see Table 1.
Table 1: correspondence of control instructions to facial features
It should be noted that the one-to-one correspondence between feature information and control instructions is only an example; for instance, the control instruction corresponding to the head-shaking expression could instead instruct the mobile terminal to return to the previous interface. The embodiment of the present invention does not specifically limit the correspondence between control instructions and feature information.
Step 103: and correspondingly controlling the application program according to the control instruction.
When the control instruction corresponds to the head-shaking expression, the starting of the application program is stopped.
When the control instruction corresponds to the glaring expression, the starting application program is placed in the background of the mobile terminal to run.
When the control instruction corresponds to the nose-touching expression, the desktop of the mobile terminal is displayed directly. For example, if the application being started is detected to be a first APP (Application) and the operator's nose-touching expression is detected, the first APP is exited and the desktop of the mobile terminal is displayed.
When the control instruction corresponds to the chin-touching expression, the previous interface is returned to. For example, if the first APP is running and a second APP is detected to be starting, and the operator's chin-touching expression is detected at that moment, the application interface of the first APP is returned to.
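The feature-to-instruction dispatch of steps 102 and 103 amounts to a table lookup. The sketch below illustrates it in Python; the feature names, instruction names, and the function itself are invented for illustration and do not appear in the patent.

```python
# Hypothetical sketch of the correspondence lookup in steps 102-103.
# The expression/action pairs follow the examples in the text; all
# identifiers are illustrative, not from the patent.

# Correspondence table (cf. Table 1): facial feature -> control instruction.
FEATURE_TO_INSTRUCTION = {
    "head_shake": "stop_launch",          # abort the starting application
    "glare": "move_to_background",        # keep it running in the background
    "nose_touch": "show_desktop",         # exit to the home screen
    "chin_touch": "previous_interface",   # return to the prior interface
}

def control_application(facial_feature: str) -> str:
    """Map a recognized facial feature to the control instruction to execute."""
    instruction = FEATURE_TO_INSTRUCTION.get(facial_feature)
    if instruction is None:
        return "no_op"  # unrecognized feature: leave the launch untouched (assumption)
    return instruction

print(control_application("head_shake"))  # stop_launch
print(control_application("smile"))       # no_op
```

An unmatched feature falling through to a no-op is an assumption; the patent only defines behavior for features present in the table.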
In the embodiment of the invention, when an application program is being started on the mobile terminal, the mobile terminal can automatically judge from the acquired facial features of the operator (such as the operator's facial expression) whether the application program needs corresponding processing. When it does, the mobile terminal automatically processes the application program according to the control instruction corresponding to the facial features. This reduces the operator's manual operations on the starting application program, allows the problem of an application program started by misoperation to be solved quickly, prevents mistakenly started application programs from occupying system resources, and reduces the energy consumption of the mobile terminal.
The embodiment of the invention also provides another method for controlling an application program. The execution subject of the method may be a mobile terminal, which may be a smart device such as a smartphone or a computer.
Fig. 2 is a flowchart of another method for controlling an application program according to an embodiment of the present invention, which includes the following specific steps:
Step 201: detect whether an application program is being started on the mobile terminal.
An application program is a computer program with a visual user interface that is designed to perform one or more specific tasks and can interact with the operator. Examples of application programs include a first application, a second application, a third application, and so on.
Step 202: when the application program which is being started is detected, a distance value between the face of the operator and the display screen of the mobile terminal and/or an angle value between the face of the operator and the display screen of the mobile terminal are/is acquired. When the distance value is less than or equal to the distance preset value and/or the angle value is less than or equal to the angle preset value, performing the step of acquiring the facial features of the operator, namely performing step 203; otherwise, the flow ends.
for example, referring to fig. 3, taking the display screen of the mobile terminal as a reference plane and the center of the display screen of the mobile terminal as a reference point, a distance value L between the face of the operator and the display screen of the mobile terminal and/or an angle value α between the face of the operator and the display screen of the mobile terminal are obtained, for example, the distance preset value may be 1m, the angle preset value may be 30 °, that is, if L ≦ 1m, and α ≦ 30 °, step 203 is performed.
In the embodiment of the invention, the distance preset value and the angle preset value can be adjusted by monitoring the operator's usage habits, so that they come closer to each operator's habits; the mobile terminal thereby becomes increasingly intelligent, and misoperations are reduced.
It should be noted that the above description of the distance preset value and the angle preset value is only an example and is not limiting, and it should be understood that the distance preset value and the angle preset value are not specifically limited in the embodiment of the present invention.
Step 203: and when the distance value is smaller than or equal to the distance preset value and/or the angle value is smaller than or equal to the angle preset value, acquiring the facial features of the operator.
In the embodiment of the present invention, the facial feature of the operator may be a facial expression or gesture of the operator, but is not limited thereto.
In steps 202 and 203, one of three checks is performed: the distance value between the operator's face and the display screen of the mobile terminal is acquired, and the operator's facial features are acquired if the distance value is less than or equal to the preset distance value; or the angle value between the operator's face and the display screen is acquired, and the facial features are acquired if the angle value is less than or equal to the preset angle value; or both the distance value and the angle value are acquired, and the facial features are acquired only if the distance value is less than or equal to the preset distance value and the angle value is less than or equal to the preset angle value.
Step 204: detecting whether the facial features of the operator are matched with the set standard facial features; when it is detected that the facial features of the operator match the set standard facial features, executing a step of determining control instructions corresponding to the facial features of the operator according to the corresponding relationship between the control instructions and the facial features, namely executing step 205; otherwise, the flow ends.
In this embodiment of the present invention, the facial feature may be a facial expression or a facial expression combined gesture, and if the facial expression is a facial expression, the facial expression includes: head shaking expression, glazer expression, frown expression, etc., facial expression includes in combination with the gesture: and (3) touching a nose expression, a chin expression and the like, and then taking a shaking head expression and a glaring expression as examples to explain the matching process of the facial features and the set standard facial features.
1) If the facial features are oscillating expressions, whether the facial features of an operator are matched with the set standard facial features can be detected by judging whether the angular speed of the face oscillation is within a preset oscillation speed range; and if the angular speed of the human face swing is within the preset swing speed range, the facial features of the operator are determined to be successfully matched with the set standard facial features.
For example, if the human face continuously swings for 2 times from left to right and then from right to left, or continuously swings for 2 times from left to right and then from right to left, and the swing angular velocity is within 3.7 to 5.7rad/s, it is determined that the facial features corresponding to the image information of the detected object are successfully matched with the pre-stored facial features.
Referring to Fig. 4, if the swing angle of the face relative to its vertical center line is 45°, the total swing angle of the face is
45° + 45° × 2 + 45° × 2 + 45° = 270°,
and the swing angular velocity of the face is
270° × (2π / 360°) ≈ 4.7 rad/s.
The swing angular velocity of the face is therefore within the preset range of 3.7 to 5.7 rad/s, and the operator's facial features match the set standard facial features.
It should be noted that the radian and speed of each person's face swing differ, as does the vertical offset of the bridge of the nose relative to the mobile terminal. The preset swing-speed range can be adjusted by monitoring the operator's usage habits, so that it comes closer to each operator's habits; the mobile terminal thereby becomes increasingly intelligent, and misoperations are reduced.
It should be noted that, in the embodiment of the present invention, the value range of the preset swing speed range is not specifically limited.
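The head-shake match above reduces to converting the total swing angle to radians and testing it against the 3.7 to 5.7 rad/s window. A sketch, assuming (as the worked example implicitly does) that the whole swing takes one second; the function names are illustrative.

```python
import math

# Sketch of the head-shake check. The 45-degree swing pattern totaling
# 270 degrees and the 3.7-5.7 rad/s window come from the text; the
# one-second default duration is an assumption.

SPEED_RANGE_RAD_S = (3.7, 5.7)

def swing_angular_velocity(total_angle_deg: float,
                           duration_s: float = 1.0) -> float:
    """Convert the total swing angle to an angular velocity in rad/s."""
    return math.radians(total_angle_deg) / duration_s

def matches_head_shake(total_angle_deg: float,
                       duration_s: float = 1.0) -> bool:
    """True when the swing speed falls inside the preset window."""
    lo, hi = SPEED_RANGE_RAD_S
    return lo <= swing_angular_velocity(total_angle_deg, duration_s) <= hi

total = 45 + 45 * 2 + 45 * 2 + 45               # = 270 degrees
print(round(swing_angular_velocity(total), 1))  # 4.7 (rad/s)
print(matches_head_shake(total))                # True
```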
2) If the facial feature is the glaring expression, whether the operator's facial features match the set standard facial features can be detected from the dwell time of the operator's eyes on the display screen; if the dwell time is within a preset time range, the facial features corresponding to the image information of the detected object are determined to match the pre-stored facial features.
For example, when the operator's eyeball moves to the upper right corner or the upper left corner and the dwell time of the operator's eyes on the display screen is 0.5 to 1 second, the facial features corresponding to the image information of the detected object are determined to match the pre-stored facial features.
It should be noted that each person's glaring expression differs, and the preset time range can be adjusted by monitoring the operator's usage habits, so that it comes closer to each operator's habits; the mobile terminal thereby becomes increasingly intelligent, and misoperations are reduced.
It should be noted that the above description of the matching process of the facial features and the set standard facial features is only an example and is not limiting.
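The glaring-expression match can be sketched the same way; the upper-corner positions and the 0.5 to 1 second dwell window come from the example above, while the function and parameter names are illustrative assumptions.

```python
# Sketch of the glaring-expression match: gaze position plus dwell time.

DWELL_RANGE_S = (0.5, 1.0)                       # preset dwell window
CORNER_POSITIONS = {"upper_left", "upper_right"}  # positions from the example

def matches_glare(eyeball_position: str, dwell_s: float) -> bool:
    """True when the gaze sits in an upper corner for the preset dwell time."""
    lo, hi = DWELL_RANGE_S
    return eyeball_position in CORNER_POSITIONS and lo <= dwell_s <= hi

print(matches_glare("upper_right", 0.7))  # True
print(matches_glare("center", 0.7))       # False
```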
Step 205: and when the facial features of the operator are detected to be matched with the set standard facial features, determining the control instruction corresponding to the facial features of the operator according to the corresponding relation between the control instruction and the facial features.
In the embodiment of the present invention, the correspondence between the control instruction and the facial feature needs to be configured in advance, and the specific process of establishing the correspondence between the control instruction and the facial feature includes:
at least one facial feature is collected, a control instruction corresponding to the at least one facial feature is configured, and the corresponding relation between the control instruction and the facial feature is established, which is shown in table 2.
Table 2: correspondence of control instructions to facial features
Referring to fig. 5, the process of establishing the correspondence between the control instruction and the facial feature specifically includes:
step 501: at least one facial feature is collected by a collection module.
Step 502: judging whether the acquisition module is started for the first time; if the acquisition module is started for the first time, executing step 503; otherwise, go to step 504.
Step 503: setting a password; after the password is set, step 506 is performed.
Step 504: judging whether the password verification exceeds the preset times within 1 hour, wherein the preset times can be 3 times; if the password verification exceeds the preset times within 1 hour, locking the acquisition function; otherwise, step 505 is performed.
Step 505: judging whether the password is successfully verified, if so, executing step 506 and starting to collect the facial features of the user; otherwise, proceed to step 504.
Step 506: collecting the shaking expression of the operator, and then executing step 507;
step 507: acquiring facial features corresponding to the shaking expression, and then executing step 508;
step 508: control instructions corresponding to facial features corresponding to the moving head expressions are configured, and then step 509 is executed. For example, the operation instruction corresponding to the shaking expression is set to return to the desktop.
Step 509: judging whether to continue to collect facial features; if the facial features continue to be collected, go to step 510; otherwise, the corresponding relation building process of the control instruction and the facial features is completed, and the flow is ended.
Step 510: collecting the glazeous expression of the operator, and then executing step 511;
step 511: collecting facial features corresponding to the glazeous expression of the operator, and then executing step 512;
step 512: and configuring an operation instruction corresponding to the facial feature corresponding to the glaring expression of the operator, and then executing step 513.
For example, the operation instruction corresponding to the facial feature corresponding to the glaring expression is set to place the application program being started in the background of the mobile terminal for running.
Step 513: judging whether to continue to collect the facial features of the operator or not, and if so, continuing to collect the expression of the operator; otherwise, the corresponding relation building process of the control instruction and the facial features is completed, and the flow is ended.
Step 206: and correspondingly controlling the application program according to the control instruction.
For example, the facial features are facial expressions, including the head-shaking expression and the glaring expression, and the starting application program is controlled according to the control instruction corresponding to the head-shaking expression or the glaring expression.
Specifically, stopping the starting of the application program, or placing the starting application program in the background of the mobile terminal to run, according to the control instruction includes:
when the facial feature corresponding to the control instruction is the head-shaking expression, stopping the starting of the application program; or, when the facial feature corresponding to the control instruction is the glaring expression, placing the starting application program in the background of the mobile terminal to run.
In the embodiment of the invention, when the mobile terminal detects that an application program is being started, it judges whether the distance value between the operator's face and the display screen is less than or equal to the preset distance value and/or whether the angle value between the operator's face and the display screen is less than or equal to the preset angle value, and detects whether the operator's facial features match the set standard facial features. The mobile terminal can then automatically judge from the acquired facial features of the operator (such as the operator's facial expression) whether the application program needs corresponding processing and, when it does, automatically process the application program according to the control instruction corresponding to the facial features. This reduces the operator's manual operations on the starting application program, allows the problem of an application program started by misoperation to be solved quickly, prevents mistakenly started application programs from occupying system resources, and reduces the energy consumption of the mobile terminal.
Furthermore, the distance preset value, the angle preset value, and the standard facial features can be adjusted by monitoring the operator's usage habits, so that they come closer to each operator's habits; the method thereby becomes increasingly intelligent, and misoperations are reduced.
Referring to fig. 6, an embodiment of the present invention provides a mobile terminal, where the mobile terminal 600 includes:
a first obtaining module 601, configured to obtain facial features of an operator during starting of an application program, where the facial features include facial expressions;
a determining module 602, configured to determine, according to a correspondence between the control instruction and the facial feature, a control instruction corresponding to the facial feature of the operator;
and the control module 603 is configured to perform corresponding control on the application program according to the control instruction.
On the basis of the above embodiment, the mobile terminal further includes: a second obtaining module, wherein,
the second obtaining module is further configured to: acquiring a distance value between the face of the operator and a display screen of the mobile terminal; if the distance value is smaller than or equal to a preset distance value, triggering the first acquisition module 601 to acquire the facial features of the operator;
or,
the second obtaining module is further configured to: acquiring an angle value between the face of the operator and a display screen of the mobile terminal; if the angle value is smaller than or equal to the preset angle value, triggering the first obtaining module 601 to obtain the facial features of the operator;
or,
the second obtaining module is further configured to: acquiring a distance value between the face of the operator and a display screen of the mobile terminal and an angle value between the face of the operator and the display screen of the mobile terminal; if the distance value is less than or equal to the preset distance value and the angle value is less than or equal to the preset angle value, the first obtaining module 601 is triggered to obtain the facial features of the operator.
In the embodiment of the present invention, optionally, the control module 603 includes:
a first control unit, configured to stop the start-up of the application program when the facial feature corresponding to the control instruction is a head-shaking expression; or
a second control unit, configured to place the application program being started in the background of the mobile terminal to run when the facial feature corresponding to the control instruction is a glazed expression.
On the basis of the above embodiment, the mobile terminal further includes:
an acquisition module for acquiring at least one facial feature;
and the configuration module is used for configuring a control instruction corresponding to at least one facial feature and establishing a corresponding relation between the control instruction and the facial feature.
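The acquisition and configuration modules — collect at least one facial feature, then record a control instruction for each — amount to building the correspondence table used later by the determining module. A minimal sketch, with all names assumed for illustration:

```python
# Hedged sketch of the acquisition + configuration modules:
# pair each collected facial feature with a configured instruction,
# producing the feature -> instruction correspondence.

def configure_correspondence(collected_features, instructions):
    """Build the feature -> instruction correspondence table."""
    if len(collected_features) != len(instructions):
        raise ValueError("one control instruction must be configured per feature")
    return dict(zip(collected_features, instructions))

# Example configuration matching the expressions named in the text.
table = configure_correspondence(
    ["head_shake", "glazed"],
    ["stop_startup", "move_to_background"],
)
```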
The mobile terminal provided in the embodiment of the present invention can implement each process implemented by the mobile terminal in the method embodiments of fig. 1, fig. 2, and fig. 4; to avoid repetition, details are not described here again.
In the embodiment of the present invention, when it is detected that an application program is being started in the mobile terminal, the mobile terminal can automatically determine, through the determining module 602 and the control module 603, whether corresponding processing needs to be performed on the application program according to the obtained facial features of the operator (for example, the operator's facial expressions). When processing is needed, the mobile terminal automatically processes the application program according to the control instruction corresponding to the facial features. This reduces the operator's manual operations on the application program being started, quickly resolves the case where an application program is started by misoperation, prevents such an application program from occupying system resources, and reduces the energy consumption of the mobile terminal.
Furthermore, the preset distance value, the preset angle value, and the standard facial features can be adjusted by monitoring the operator's usage habits, so that the mobile terminal conforms ever more closely to each operator's habits, becomes increasingly intelligent, and produces fewer misoperations.
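The patent does not specify how the preset values are adjusted to the operator's habits. One plausible sketch (an assumption, not the patented method) nudges a preset toward an exponential moving average of the distances observed when the operator deliberately uses the feature:

```python
# Illustrative, assumed adaptation scheme: blend each observed value
# into the preset with a fixed weight (exponential moving average).
# The weight and starting preset are hypothetical.

def adapt_preset(preset, observed_values, weight=0.2):
    """Blend each observation into the preset with the given weight."""
    for value in observed_values:
        preset = (1.0 - weight) * preset + weight * value
    return preset

# An operator who habitually holds the phone ~30 cm away pulls a
# 40 cm preset downward over repeated observations.
new_preset = adapt_preset(40.0, [30.0, 30.0, 30.0])
```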
Fig. 7 is a schematic diagram of a hardware structure of a mobile terminal for implementing various embodiments of the present invention. The mobile terminal 700 includes, but is not limited to: a radio frequency unit 701, a network module 702, an audio output unit 703, an input unit 704, a sensor 705, a display unit 706, a user input unit 707, an interface unit 708, a memory 709, a processor 710, a power supply 711, and the like. Those skilled in the art will appreciate that the mobile terminal structure shown in fig. 7 does not limit the mobile terminal; a mobile terminal may include more or fewer components than shown, combine some components, or arrange the components differently. In the embodiment of the present invention, the mobile terminal includes, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a palmtop computer, a vehicle-mounted terminal, a wearable device, a pedometer, and the like.
A computer program is stored in the memory 709 and executable on the processor 710; when executed by the processor 710, it performs the following steps: during the start-up of an application program, acquiring facial features of an operator; determining the control instruction corresponding to the facial features of the operator according to the correspondence between control instructions and facial features; and performing corresponding control on the application program according to the control instruction.
In the embodiment of the present invention, when it is detected that an application program is being started in the mobile terminal, the mobile terminal can automatically determine, through the processor 710, whether corresponding processing needs to be performed on the application program according to the obtained facial features of the operator (for example, the operator's facial expressions). When processing is needed, the mobile terminal automatically processes the application program according to the control instruction corresponding to the facial features. This reduces the operator's manual operations on the application program being started, quickly resolves the case where an application program is started by misoperation, prevents such an application program from occupying system resources, and reduces the energy consumption of the mobile terminal.
Further, the processor 710 may adjust the preset distance value, the preset angle value, or the standard facial features by monitoring the operator's usage habits, so that the mobile terminal conforms more closely to each operator's habits, becomes increasingly intelligent, produces fewer misoperations, and reduces unnecessary triggering.
It should be understood that, in the embodiment of the present invention, the radio frequency unit 701 may be used to receive and send signals during message transmission/reception or a call; specifically, it receives downlink data from a base station and forwards it to the processor 710 for processing, and it transmits uplink data to the base station. In general, the radio frequency unit 701 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low-noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 701 may also communicate with a network and other devices through a wireless communication system.
The mobile terminal provides the user with wireless broadband internet access via the network module 702, such as helping the user send and receive e-mails, browse web pages, and access streaming media.
The audio output unit 703 may convert audio data received by the radio frequency unit 701 or the network module 702, or stored in the memory 709, into an audio signal and output it as sound. The audio output unit 703 may also provide audio output related to a specific function performed by the mobile terminal 700 (e.g., a call signal reception sound or a message reception sound). The audio output unit 703 includes a speaker, a buzzer, a receiver, and the like.
The input unit 704 is used to receive audio or video signals. The input unit 704 may include a Graphics Processing Unit (GPU) 7041 and a microphone 7042; the graphics processor 7041 processes image data of a still picture or video obtained by an image capturing device (e.g., a camera) in a video capturing mode or an image capturing mode. The processed image frames may be displayed on the display unit 706. The image frames processed by the graphics processor 7041 may be stored in the memory 709 (or another storage medium) or transmitted via the radio frequency unit 701 or the network module 702. The microphone 7042 may receive sounds and process them into audio data. In a phone call mode, the processed audio data may be converted into a format transmittable to a mobile communication base station and output via the radio frequency unit 701.
The mobile terminal 700 also includes at least one sensor 705, such as a light sensor, motion sensor, and other sensors. Specifically, the light sensor includes an ambient light sensor that can adjust the brightness of the display panel 7061 according to the brightness of ambient light, and a proximity sensor that can turn off the display panel 7061 and/or a backlight when the mobile terminal 700 is moved to the ear. As one of the motion sensors, the accelerometer sensor can detect the magnitude of acceleration in each direction (generally three axes), detect the magnitude and direction of gravity when stationary, and can be used to identify the posture of the mobile terminal (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), and vibration identification related functions (such as pedometer, tapping); the sensors 705 may also include fingerprint sensors, pressure sensors, iris sensors, molecular sensors, gyroscopes, barometers, hygrometers, thermometers, infrared sensors, etc., which are not described in detail herein.
The display unit 706 is used to display information input by the user or information provided to the user. The Display unit 706 may include a Display panel 7061, and the Display panel 7061 may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like.
The user input unit 707 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the mobile terminal. Specifically, the user input unit 707 includes a touch panel 7071 and other input devices 7072. The touch panel 7071, also referred to as a touch screen, may collect touch operations performed by a user on or near it (e.g., operations using a finger, a stylus, or any other suitable object or attachment). The touch panel 7071 may include two parts: a touch detection device and a touch controller. The touch detection device detects the position touched by the user and the signal generated by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into touch point coordinates, sends the coordinates to the processor 710, and receives and executes commands from the processor 710. In addition, the touch panel 7071 may be implemented in various types, such as resistive, capacitive, infrared, and surface acoustic wave. Besides the touch panel 7071, the user input unit 707 may include other input devices 7072. Specifically, the other input devices 7072 may include, but are not limited to, a physical keyboard, function keys (such as volume control keys and switch keys), a trackball, a mouse, and a joystick, which are not described here again.
Further, the touch panel 7071 may be overlaid on the display panel 7061, and when the touch panel 7071 detects a touch operation on or near the touch panel 7071, the touch operation is transmitted to the processor 710 to determine the type of the touch event, and then the processor 710 provides a corresponding visual output on the display panel 7061 according to the type of the touch event. Although the touch panel 7071 and the display panel 7061 are shown in fig. 7 as two separate components to implement the input and output functions of the mobile terminal, in some embodiments, the touch panel 7071 and the display panel 7061 may be integrated to implement the input and output functions of the mobile terminal, which is not limited herein.
The interface unit 708 is an interface through which an external device is connected to the mobile terminal 700. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 708 may be used to receive input (e.g., data information, power, etc.) from external devices and transmit the received input to one or more elements within the mobile terminal 700 or may be used to transmit data between the mobile terminal 700 and external devices.
The memory 709 may be used to store software programs as well as various data. The memory 709 may mainly include a program storage area and a data storage area, wherein the program storage area may store an operating system, an application program required by at least one function (such as a sound playing function or an image playing function), and the like; the data storage area may store data created according to the use of the mobile phone (such as audio data and a phonebook), and the like. Further, the memory 709 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
The processor 710 is a control center of the mobile terminal, connects various parts of the entire mobile terminal using various interfaces and lines, and performs various functions of the mobile terminal and processes data by operating or executing software programs and/or modules stored in the memory 709 and calling data stored in the memory 709, thereby integrally monitoring the mobile terminal. Processor 710 may include one or more processing units; preferably, the processor 710 may integrate an application processor, which mainly handles operating systems, user interfaces, application programs, etc., and a modem processor, which mainly handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into processor 710.
The mobile terminal 700 may also include a power supply 711 (e.g., a battery) for powering the various components. The power supply 711 may be logically coupled to the processor 710 via a power management system, so as to manage charging, discharging, and power consumption through the power management system.
In addition, the mobile terminal 700 includes some functional modules that are not shown, and thus will not be described in detail herein.
The embodiment of the present invention further provides a computer-readable storage medium, on which a computer program is stored; when executed by a processor, the computer program implements each process of the above method for controlling an application program and can achieve the same technical effect, which is not repeated here to avoid repetition. The computer-readable storage medium may be a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a/an …" does not exclude the presence of other like elements in the process, method, article, or apparatus that comprises the element.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present invention.
While the present invention has been described with reference to the embodiments shown in the drawings, the present invention is not limited to the embodiments, which are illustrative and not restrictive, and it will be apparent to those skilled in the art that various changes and modifications can be made therein without departing from the spirit and scope of the invention as defined in the appended claims.

Claims (10)

1. A method for controlling an application program, applied to a mobile terminal, is characterized in that the method comprises the following steps:
acquiring facial features of an operator in the starting process of an application program, wherein the facial features comprise facial expressions;
determining a control instruction corresponding to the facial features of the operator according to the corresponding relation between the control instruction and the facial features;
and correspondingly controlling the application program according to the control instruction.
2. The method of claim 1, wherein prior to said obtaining facial features of an operator, the method further comprises:
acquiring a distance value between the face of the operator and a display screen of the mobile terminal;
the acquiring of the facial features of the operator comprises:
if the distance value is less than or equal to a preset distance value, acquiring the facial features of the operator;
or,
prior to the acquiring facial features of the operator, the method further comprises:
acquiring an angle value between the face of the operator and a display screen of the mobile terminal;
the acquiring of the facial features of the operator comprises:
if the angle value is less than or equal to a preset angle value, acquiring the facial features of the operator;
or,
prior to the acquiring facial features of the operator, the method further comprises:
acquiring a distance value between the face of the operator and a display screen of the mobile terminal and an angle value between the face of the operator and the display screen of the mobile terminal;
the acquiring of the facial features of the operator comprises:
and if the distance value is smaller than or equal to a preset distance value and the angle value is smaller than or equal to a preset angle value, acquiring the facial features of the operator.
3. The method according to claim 1, wherein before determining the control instruction corresponding to the facial feature of the operator according to the correspondence between the control instruction and the facial feature, the method further comprises:
detecting whether the facial features of the operator are matched with the set standard facial features;
the determining the control instruction corresponding to the facial feature of the operator according to the corresponding relationship between the control instruction and the facial feature includes:
and when the facial features of the operator are detected to be matched with the set standard facial features, determining the control instruction corresponding to the facial features of the operator according to the corresponding relation between the control instruction and the facial features.
4. The method according to claim 1, wherein the performing corresponding control on the application program according to the control instruction comprises:
when the facial feature corresponding to the control instruction is a head-shaking expression, stopping the start-up of the application program; or
when the facial feature corresponding to the control instruction is a glazed expression, placing the application program being started in the background of the mobile terminal to run.
5. The method according to any one of claims 1 to 4, further comprising:
collecting at least one facial feature;
and configuring a control instruction corresponding to the at least one facial feature, and establishing a corresponding relation between the control instruction and the facial feature.
6. A mobile terminal, comprising:
the system comprises a first acquisition module, a second acquisition module and a third acquisition module, wherein the first acquisition module is used for acquiring facial features of an operator in the starting process of an application program, and the facial features comprise facial expressions;
the determining module is used for determining a control instruction corresponding to the facial feature of the operator according to the corresponding relation between the control instruction and the facial feature;
and the control module is used for correspondingly controlling the application program according to the control instruction.
7. The mobile terminal of claim 6, wherein the mobile terminal further comprises: a second obtaining module, wherein,
the second obtaining module is further configured to: acquire a distance value between the face of the operator and a display screen of the mobile terminal, and, if the distance value is less than or equal to a preset distance value, trigger the first acquisition module to acquire the facial features of the operator;
or,
the second obtaining module is further configured to: acquire an angle value between the face of the operator and the display screen of the mobile terminal, and, if the angle value is less than or equal to a preset angle value, trigger the first acquisition module to acquire the facial features of the operator;
or,
the second obtaining module is further configured to: acquire both a distance value and an angle value between the face of the operator and the display screen of the mobile terminal, and, if the distance value is less than or equal to the preset distance value and the angle value is less than or equal to the preset angle value, trigger the first acquisition module to acquire the facial features of the operator.
8. The mobile terminal of claim 6, wherein the mobile terminal further comprises:
the second detection module is used for detecting whether the facial features of the operator are matched with the set standard facial features;
and when the facial features of the operator are detected to be matched with the set standard facial features, triggering the determining module to execute the step of determining the control instruction corresponding to the facial features of the operator according to the corresponding relation between the control instruction and the facial features.
9. The mobile terminal of claim 6, wherein the control module comprises:
a first control unit, configured to stop the start-up of the application program when the facial feature corresponding to the control instruction is a head-shaking expression; or
a second control unit, configured to place the application program being started in the background of the mobile terminal to run when the facial feature corresponding to the control instruction is a glazed expression.
10. A mobile terminal, characterized in that it comprises a processor, a memory and a computer program stored on the memory and executable on the processor, which computer program, when executed by the processor, implements the steps of the method of controlling an application according to any one of claims 1 to 5.
CN201810462421.4A 2018-05-15 2018-05-15 Method for controlling application program and mobile terminal Active CN108762493B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810462421.4A CN108762493B (en) 2018-05-15 2018-05-15 Method for controlling application program and mobile terminal

Publications (2)

Publication Number Publication Date
CN108762493A true CN108762493A (en) 2018-11-06
CN108762493B CN108762493B (en) 2022-01-25

Family

ID=64007692

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810462421.4A Active CN108762493B (en) 2018-05-15 2018-05-15 Method for controlling application program and mobile terminal

Country Status (1)

Country Link
CN (1) CN108762493B (en)


Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20130113580A (en) * 2012-04-06 2013-10-16 최승철 Facial expression and voice recognizing method for mobile application software
CN104063041A (en) * 2013-03-21 2014-09-24 联想(北京)有限公司 Information processing method and electronic equipment
US20150169051A1 (en) * 2013-12-13 2015-06-18 Sony Corporation Information processing device and information processing method
CN204440060U (en) * 2015-03-16 2015-07-01 佛山市顺德区美的电热电器制造有限公司 Household electrical appliance and operation interface control device thereof
CN106407796A (en) * 2016-08-25 2017-02-15 北京小米移动软件有限公司 Method and device for controlling flow, and terminal
CN106325524A (en) * 2016-09-14 2017-01-11 珠海市魅族科技有限公司 Method and device for acquiring instruction
CN107862059A (en) * 2017-11-14 2018-03-30 维沃移动通信有限公司 A kind of song recommendations method and mobile terminal

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109902606A (en) * 2019-02-21 2019-06-18 维沃移动通信有限公司 A kind of operating method and terminal device
CN109902606B (en) * 2019-02-21 2021-03-12 维沃移动通信有限公司 Operation method and terminal equipment


Similar Documents

Publication Publication Date Title
CN108459797B (en) Control method of folding screen and mobile terminal
CN107734175B (en) Notification message prompting method and mobile terminal
CN108182019A (en) A kind of suspension control display processing method and mobile terminal
CN109343788B (en) Operation control method of mobile terminal and mobile terminal
CN108650408B (en) Screen unlocking method and mobile terminal
CN109523253B (en) Payment method and device
CN109710130B (en) Display method and terminal
CN109814799A (en) Screen response control mehtod and terminal device
CN110866465A (en) Control method of electronic equipment and electronic equipment
CN110825223A (en) Control method and intelligent glasses
CN109164908B (en) Interface control method and mobile terminal
CN109002245B (en) Application interface operation method and mobile terminal
CN108540642B (en) Mobile terminal and operation method thereof
CN108089935B (en) Application program management method and mobile terminal
CN111352566B (en) Parameter adjusting method and electronic equipment
CN111261128B (en) Screen brightness adjusting method and electronic equipment
CN109739430B (en) Display method and mobile terminal
CN109194793B (en) Screen adjusting method and mobile terminal
CN108762493B (en) Method for controlling application program and mobile terminal
CN108108608B (en) Control method of mobile terminal and mobile terminal
CN108089799B (en) Control method of screen edge control and mobile terminal
CN109696201B (en) Terminal state determination method and terminal
CN110769153B (en) Image processing method and electronic equipment
CN110472520B (en) Identity recognition method and mobile terminal
CN108762547B (en) Operation method of touch terminal and mobile terminal

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant