CN111142666A - Terminal control method, device, storage medium and mobile terminal


Info

Publication number
CN111142666A
CN111142666A
Authority
CN
China
Prior art keywords
expression
mobile terminal
user
control method
terminal control
Legal status
Pending
Application number
CN201911379315.0A
Other languages
Chinese (zh)
Inventor
张华
温鼎宁
贾宇
陶龙西
曾燕云
Current Assignee
Huizhou TCL Mobile Communication Co Ltd
Original Assignee
Huizhou TCL Mobile Communication Co Ltd
Application filed by Huizhou TCL Mobile Communication Co Ltd


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/012 Head tracking input arrangements
    • G06F3/013 Eye tracking input arrangements

Abstract

The invention relates to the field of communication technology and discloses a terminal control method, a device, a storage medium and a mobile terminal. The terminal control method is applied to the mobile terminal and comprises the following steps: acquiring a head image of a user; generating an AR expression according to the feature information in the head image; and executing a preset operation corresponding to the AR expression. The terminal control method enables the user to control the mobile terminal without touching its touch screen, making the mobile terminal more convenient to use and solving the problem that the mobile terminal is inconvenient to operate by hand in certain scenes.

Description

Terminal control method, device, storage medium and mobile terminal
Technical Field
The invention relates to the technical field of communication, in particular to a terminal control method, a terminal control device, a storage medium and a mobile terminal.
Background
With the development of information technology, terminal devices offer increasingly rich functions and are widely used in people's work and life. People can handle complex and diverse tasks on a terminal device, such as reading electronic books, browsing multimedia pictures, playing music and videos, and browsing web pages. This requires frequent interaction between the user and the terminal device, including basic operations such as switching pictures, pausing and playing music or video, adjusting the volume, and scrolling web pages.
The traditional interaction process requires the user to operate the touch screen by hand. However, operating the terminal device by hand is inconvenient in some situations, for example, for a patient whose hands are disabled, or during outdoor activities in winter. It is therefore necessary to provide a convenient terminal control method so that the user can operate the terminal device more easily.
Disclosure of Invention
The invention provides a terminal control method, a terminal control device, a storage medium and a mobile terminal, solving the problem that the mobile terminal is inconvenient to operate by hand in certain scenes.
The invention provides a terminal control method, which comprises the following steps:
acquiring a head image of a user;
generating an AR expression according to the feature information in the head image;
and executing preset operation corresponding to the AR expression according to the AR expression.
Further preferably, before the step of acquiring the head image of the user, the method further includes:
receiving a wake-up instruction from a user;
the wake-up instruction comprises a voice wake-up instruction.
Further preferably, in the step of generating an AR expression according to the feature information in the head image, the feature information includes a head motion and a facial expression motion.
It is further preferred that the facial expression actions include a single action and a combination of multiple actions.
Further preferably, the executing, according to the AR expression, a preset operation corresponding to the AR expression includes:
and executing corresponding page operation according to the AR expression.
Further preferably, the executing, according to the AR expression, a corresponding page operation includes:
respectively executing page up-sliding operation, page down-sliding operation, page left-sliding operation and page right-sliding operation according to the head-up AR expression, the head-down AR expression, the left-viewing AR expression and the right-viewing AR expression;
and respectively executing a determining operation, a canceling operation, a homepage returning operation, a previous-level returning operation and a screen resting operation according to the smiling AR expression, the mouth opening AR expression, the left eye closing AR expression, the right eye closing AR expression and the double-eye closing AR expression.
The invention also provides a terminal control device, which is applied to a mobile terminal and comprises the following components:
a head image acquisition module: used for acquiring a head image of a user;
an AR expression generation module: used for generating an AR expression according to the feature information in the head image;
an execution module: used for executing a preset operation corresponding to the AR expression according to the AR expression.
Further preferably, the terminal control device further includes:
a receiving module: for receiving a wake-up instruction from a user.
The present invention also provides a computer-readable storage medium having stored therein a plurality of instructions adapted to be loaded by a processor to perform the above-described terminal control method.
The invention also provides a mobile terminal, which comprises a processor and a memory, wherein the processor is electrically connected with the memory, the memory is used for storing instructions and data, and the processor is used for executing the steps in the terminal control method.
According to the terminal control method, device, storage medium and mobile terminal of the invention, after a wake-up instruction from the user is received, a head image of the user is collected, an AR expression is generated according to the feature information in the head image, and a preset operation corresponding to the AR expression is executed. The user can thus control the mobile terminal without touching its touch screen, making the mobile terminal more convenient to use and solving the problem that the mobile terminal is inconvenient to operate by hand in certain scenes.
Drawings
The technical solution and other advantages of the present invention will become apparent from the following detailed description of specific embodiments of the present invention, which is to be read in connection with the accompanying drawings.
Fig. 1 is a schematic structural diagram of a mobile terminal according to a first embodiment of the present invention;
fig. 2 is another schematic structural diagram of a mobile terminal according to a first embodiment of the present invention;
fig. 3 is a flowchart illustrating a terminal control method according to a second embodiment of the present invention;
fig. 4 is a block diagram of a terminal control device according to a third embodiment of the present invention.
Detailed Description
The technical solution in the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present invention. It is to be understood that the described embodiments are merely exemplary of the invention, and not restrictive of the full scope of the invention. In the drawings, elements having similar structures are denoted by the same reference numerals. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
In the description of the present invention, it is to be understood that the terms "first", "second" and the like are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implying any number of technical features indicated. Thus, features defined as "first", "second", may explicitly or implicitly include one or more of the described features. In the description of the present invention, "a plurality" means two or more unless specifically defined otherwise.
The embodiment of the invention aims at solving the problem that the mobile terminal is inconvenient to operate by hands in certain scenes.
A first embodiment of the present invention provides a mobile terminal, which may be a smart phone, a tablet computer, a personal computer, or the like. As shown in fig. 1, the mobile terminal 100 includes a processor 101 and a memory 102. The processor 101 is electrically connected to the memory 102.
The processor 101 is a control center of the mobile terminal 100, connects various parts of the entire mobile terminal using various interfaces and lines, and performs various functions of the mobile terminal and processes data by running or loading an application program stored in the memory 102 and calling data stored in the memory 102, thereby performing overall monitoring of the mobile terminal.
In this embodiment, the processor 101 in the mobile terminal 100 loads instructions corresponding to processes of one or more application programs into the memory 102 according to the following steps, and the processor 101 runs the application programs stored in the memory 102, thereby implementing various functions, such as a terminal control method:
acquiring a head image of a user;
generating an AR expression according to the feature information in the head image;
and executing preset operation corresponding to the AR expression according to the AR expression.
The mobile terminal 100 may implement any of the steps of the terminal control method.
Fig. 2 is a block diagram illustrating a specific structure of the mobile terminal 100 according to an embodiment of the present invention. As shown in fig. 2, the mobile terminal 100 may include radio frequency (RF) circuitry 110, a memory 120 including one or more computer-readable storage media, an input unit 130, a display unit 140, a sensor 150, audio circuitry 160, a transmission module 170 (e.g., a wireless fidelity (Wi-Fi) module), a processor 180 including one or more processing cores, and a power supply 190. Those skilled in the art will appreciate that the mobile terminal structure shown in fig. 2 does not limit the mobile terminal, which may include more or fewer components than those shown, combine some components, or arrange the components differently.
The RF circuit 110 is used for receiving and transmitting electromagnetic waves and converting between electromagnetic waves and electrical signals, so as to communicate with a communication network or other devices. The RF circuitry 110 may include various existing circuit components for performing these functions, such as an antenna, a radio frequency transceiver, a digital signal processor, an encryption/decryption chip, a subscriber identity module (SIM) card, memory, and so forth. The RF circuitry 110 may communicate with various networks, such as the Internet, an intranet, or a wireless network, or with other devices over a wireless network. The wireless network may comprise a cellular telephone network, a wireless local area network, or a metropolitan area network. The wireless network may use various communication standards, protocols and technologies, including but not limited to Global System for Mobile Communication (GSM), Enhanced Data rates for GSM Evolution (EDGE), Wideband Code Division Multiple Access (WCDMA), Code Division Multiple Access (CDMA), Time Division Multiple Access (TDMA), Wireless Fidelity (Wi-Fi) (e.g., IEEE 802.11a, IEEE 802.11b, IEEE 802.11g and/or IEEE 802.11n), Voice over Internet Protocol (VoIP), Worldwide Interoperability for Microwave Access (WiMAX), and any other suitable communication protocols, even including those that have not yet been developed.
The memory 120 may be used for storing software programs and modules, such as the program instructions corresponding to the terminal control method described above; the processor 180 executes various functional applications and data processing by running the software programs and modules stored in the memory 120, thereby implementing the terminal control method described above. The memory 120 may include high-speed random access memory and may also include non-volatile memory, such as one or more magnetic storage devices, flash memory, or other non-volatile solid-state memory. In some examples, the memory 120 may further include memory located remotely from the processor 180, which may be connected to the mobile terminal 100 through a network. Examples of such networks include, but are not limited to, the Internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The input unit 130 may be used to receive input numeric or character information and generate keyboard, mouse, joystick, optical or trackball signal inputs related to user settings and function control. In particular, the input unit 130 may include a touch-sensitive surface 131 as well as other input devices 132. The touch-sensitive surface 131, also referred to as a touch display screen or a touch pad, may collect touch operations by a user on or near the touch-sensitive surface 131 (e.g., operations by a user on or near the touch-sensitive surface 131 using a finger, a stylus, or any other suitable object or attachment), and drive the corresponding connection device according to a predetermined program. Alternatively, the touch sensitive surface 131 may comprise two parts, a touch detection means and a touch controller. The touch detection device detects the touch direction of a user, detects a signal brought by touch operation and transmits the signal to the touch controller; the touch controller receives touch information from the touch sensing device, converts the touch information into touch point coordinates, sends the touch point coordinates to the processor 180, and can receive and execute commands sent by the processor 180. Additionally, the touch-sensitive surface 131 may be implemented using various types of resistive, capacitive, infrared, and surface acoustic waves. In addition to the touch-sensitive surface 131, the input unit 130 may also include other input devices 132. In particular, other input devices 132 may include, but are not limited to, one or more of a physical keyboard, function keys (such as volume control keys, switch keys, etc.), a trackball, a mouse, a joystick, and the like.
The display unit 140 may be used to display information input by or provided to the user and various graphical user interfaces of the mobile terminal 100, which may be composed of graphics, text, icons, video, and any combination thereof. The display unit 140 may include a display panel 141; optionally, the display panel 141 may be configured in the form of an LCD (Liquid Crystal Display), an OLED (Organic Light-Emitting Diode), or the like. Further, the touch-sensitive surface 131 may overlay the display panel 141; when a touch operation is detected on or near the touch-sensitive surface 131, it is transmitted to the processor 180 to determine the type of the touch event, and the processor 180 then provides a corresponding visual output on the display panel 141 according to the type of the touch event. Although in the figures the touch-sensitive surface 131 and the display panel 141 are shown as two separate components to implement input and output functions, in some embodiments the touch-sensitive surface 131 may be integrated with the display panel 141 to implement input and output functions.
The mobile terminal 100 may also include at least one sensor 150, such as a light sensor, a motion sensor, and other sensors. Specifically, the light sensor may include an ambient light sensor, which may adjust the brightness of the display panel 141 according to the brightness of ambient light, and a proximity sensor, which may turn off the display panel 141 and/or its backlight when the mobile terminal 100 is moved close to the ear. As one kind of motion sensor, a gravity acceleration sensor can detect the magnitude of acceleration in each direction (generally three axes) and can detect the magnitude and direction of gravity when the device is stationary; it can be used in applications for recognizing the posture of the mobile phone (such as switching between landscape and portrait, related games, and magnetometer posture calibration) and in vibration-recognition functions (such as a pedometer or tap detection). Other sensors, such as a gyroscope, a barometer, a hygrometer, a thermometer, and an infrared sensor, may also be configured in the mobile terminal 100; detailed descriptions thereof are omitted here.
The audio circuitry 160, speaker 161, and microphone 162 may provide an audio interface between the user and the mobile terminal 100. The audio circuit 160 may transmit the electrical signal converted from received audio data to the speaker 161, which converts it into a sound signal for output. Conversely, the microphone 162 converts a collected sound signal into an electrical signal, which the audio circuit 160 receives and converts into audio data; the audio data is then output to the processor 180 for processing and sent via the RF circuit 110 to, for example, another terminal, or output to the memory 120 for further processing. The audio circuit 160 may also include an earbud jack to allow a peripheral headset to communicate with the mobile terminal 100.
Through the transmission module 170 (e.g., a Wi-Fi module), the mobile terminal 100 can assist the user in receiving and sending requests, information, and the like, providing the user with wireless broadband Internet access. Although the transmission module 170 is shown in the drawings, it is understood that it is not an essential part of the mobile terminal 100 and may be omitted as needed without changing the essence of the invention.
The processor 180 is a control center of the mobile terminal 100, connects various parts of the entire mobile phone using various interfaces and lines, and performs various functions of the mobile terminal 100 and processes data by operating or executing software programs and/or modules stored in the memory 120 and calling data stored in the memory 120, thereby integrally monitoring the mobile terminal. Optionally, processor 180 may include one or more processing cores; in some embodiments, the processor 180 may integrate an application processor, which primarily handles operating systems, user interfaces, applications, etc., and a modem processor, which primarily handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into the processor 180.
The mobile terminal 100 may also include a power supply 190 (e.g., a battery) for powering the various components, which may be logically coupled to the processor 180 via a power management system that may be used to manage charging, discharging, and power consumption management functions in some embodiments. The power supply 190 may also include any component including one or more of a dc or ac power source, a recharging system, a power failure detection circuit, a power converter or inverter, a power status indicator, and the like.
Although not shown, the mobile terminal 100 further includes a camera (e.g., a front camera, a rear camera), a bluetooth module, and the like, which will not be described herein. Specifically, in this embodiment, the display unit of the mobile terminal 100 is a touch screen display, and the mobile terminal further includes a memory, and one or more programs, where the one or more programs are stored in the memory and configured to be executed by the one or more processors, and the one or more programs include instructions for:
acquiring a head image of a user;
generating an AR expression according to the feature information in the head image;
and executing preset operation corresponding to the AR expression according to the AR expression.
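The three instructions above can be sketched as a minimal capture-recognize-dispatch pipeline. This is an illustrative Python sketch, not the patented implementation: `acquire_head_image` is a stub standing in for the front camera, and the expression labels and operation names are assumed for demonstration.

```python
def acquire_head_image():
    """Stub for step 1: in a real terminal this would invoke the front
    camera; here it returns a frame described by its detected actions."""
    return {"actions": ["smile"]}

def generate_ar_expression(head_image):
    """Step 2: derive an AR expression label from the feature information
    (head motions and facial expression motions) in the head image."""
    actions = head_image.get("actions", [])
    return "+".join(sorted(actions)) if actions else None

# Step 3: preset expression-to-operation table (names are assumptions).
OPERATIONS = {
    "smile": "confirm",
    "mouth_open": "cancel",
}

def execute_preset_operation(expression):
    """Look up and return the operation bound to the AR expression;
    None means no preset operation exists for this expression."""
    return OPERATIONS.get(expression)

frame = acquire_head_image()
expr = generate_ar_expression(frame)
print(expr, execute_preset_operation(expr))  # prints: smile confirm
```

A real implementation would run this loop per camera frame; the sketch only shows how the three claimed steps compose.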
In order to better operate the mobile terminal 100 provided in the above embodiment when it is inconvenient to operate it by hand, the mobile terminal 100 may be controlled through AR expressions, which makes operation more convenient while adding interest. Therefore, a second embodiment of the present invention provides a terminal control method applied to the mobile terminal 100. A flowchart of the terminal control method is shown in fig. 3, and the specific steps are as follows:
step S101, receiving a wake-up instruction from a user.
Further, the wake-up instruction comprises a voice wake-up instruction.
A mobile terminal 100 such as a traditional mobile phone or tablet is usually woken up by pressing a power key or double-clicking the touch screen; after wake-up, the user can handle complex and varied tasks on the mobile terminal 100. However, in some usage scenarios or for some special groups of people, the traditional way of waking up may be inconvenient: for example, when cooking in the kitchen, exercising outdoors in winter, or for patients whose hands are disabled, pressing a power key or double-clicking a touch screen may be difficult. Waking up the mobile terminal 100 through a voice wake-up instruction brings convenience to the user and can expand the usage scenarios and applicable user groups of the mobile terminal 100.
Further, after receiving the voice wake-up instruction from the user, the mobile terminal 100 identifies the instruction: it determines whether the instruction is a wake-up instruction, that is, whether to unlock the mobile terminal, and also identifies whether the user who issued the instruction is the owner of the mobile terminal 100 by analyzing the sound characteristics of the instruction, such as its voiceprint and tone. When the sound characteristics of the instruction match the characteristics pre-stored in the mobile terminal 100, the mobile terminal 100 is woken up and unlocked; otherwise, it remains locked. In this way, the mobile terminal 100 cannot be woken up and unlocked by someone else who has overheard the voice wake-up instruction, improving security.
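The owner check described above can be sketched as a voiceprint gate. The patent does not specify a matching algorithm, so the sketch below uses cosine similarity over feature vectors; the enrolled vector, the `MATCH_THRESHOLD` value, and all function names are illustrative assumptions.

```python
import math

# Assumed voiceprint stored at enrollment and an assumed match threshold.
ENROLLED_VOICEPRINT = [0.9, 0.1, 0.4]
MATCH_THRESHOLD = 0.95

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def handle_wake_instruction(is_wake_word, voiceprint):
    """Unlock only when the wake word is recognized AND the speaker's
    voiceprint matches the one pre-stored in the terminal."""
    if not is_wake_word:
        return "locked"
    if cosine_similarity(voiceprint, ENROLLED_VOICEPRINT) >= MATCH_THRESHOLD:
        return "unlocked"
    return "locked"
```

With this gate, replaying the wake word in a different voice yields a low similarity score and the terminal stays locked, matching the security property claimed in the text.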
And step S102, acquiring a head image of the user.
After the mobile terminal 100 is woken up, an image of the user's head may be acquired. Acquiring the head image of the user may specifically include invoking an image acquisition device and then capturing the head image. The head image may be static or dynamic. The image acquisition device may include a front camera of the mobile terminal 100.
And S103, generating an AR expression according to the feature information in the head image.
An AR expression differs from a traditional expression and may be of various types: for example, a cartoon character head image generated from the user's head image with high similarity to the real person, or a virtual character head image generated from the user's head image, such as a cartoon animal head image or an ancient-costume cartoon head image. Since the feature information in the head image collected by the mobile terminal 100 may include head motions and facial expression motions, the generated AR expression may carry a head motion or a facial expression motion. Specifically, the facial expression motions may include single motions of the eyes or mouth, head shaking, head nodding, and combinations of multiple motions. For example, the generated AR expression may be a head-up AR expression, a smiling AR expression, or an eyes-closed, tongue-out AR expression.
The operation corresponding to an AR expression can be customized by the user. That is, the user makes a certain head motion or facial expression motion, the mobile terminal 100 collects it to generate an AR expression, and the user binds that AR expression to a specific operation.
Some head motions or facial expression motions may be made unconsciously and frequently. When such a motion is used to generate an AR expression bound to an operation, the binding may require the motion to be held unchanged for a certain period of time. For example, an AR expression is generated from the eyes-closed motion collected in step S102, and the corresponding operation is turning off the screen. However, the eyes also close during blinking, and blinking is usually unconscious; if eye closure were directly bound to the screen-off operation, misoperation could occur when the user does not wish to turn off the screen. Therefore, the AR expression can be generated, and the bound operation executed, only when the head motion or facial expression motion is held unchanged for a certain period of time. For example, the user may specify that the AR expression is generated, and the mobile terminal 100 turns off the screen, only after both eyes have been closed for a certain time, which may be 5 seconds.
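The hold-for-a-period safeguard can be sketched as a dwell-time filter: an expression triggers only after being observed continuously for the required duration, so a blink never fires the screen-off operation. This is a hypothetical sketch; the class name, the explicit timestamps, and the 5-second default are illustrative assumptions.

```python
HOLD_SECONDS = 5.0  # the "certain time period" mentioned in the text

class DwellFilter:
    """Pass an expression through only once it has been held
    continuously for at least hold_seconds."""

    def __init__(self, hold_seconds=HOLD_SECONDS):
        self.hold_seconds = hold_seconds
        self.current = None   # expression currently being held
        self.since = None     # timestamp when it first appeared

    def observe(self, expression, timestamp):
        """Record one observation; return the expression when it has
        dwelled long enough, otherwise None."""
        if expression != self.current:
            # Expression changed: restart the dwell timer.
            self.current, self.since = expression, timestamp
            return None
        if expression is not None and timestamp - self.since >= self.hold_seconds:
            return expression
        return None
```

Feeding the filter a half-second eye closure (a blink) returns None, while a closure held past 5 seconds returns the expression and may then trigger the screen-off operation.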
Those skilled in the art will understand that this is only one implementation for preventing misoperation. In other embodiments of the present invention, for example, the curvature of the rising mouth corners during smiling may be bound to a certain operation to achieve precise control and avoid misoperation.
Further, step S103 further includes: identifying the feature information in the head image to confirm whether the actions in the feature information include actions in the custom settings; and generating an AR expression when the actions in the feature information are included in the custom settings.
By identifying the feature information and generating AR expressions only for the actions included in the custom settings, the head motions and facial expression motions made by the user are screened. This avoids frequently generating AR expressions, which would increase the power consumption of the mobile terminal 100 and degrade the user experience. Controlling the mobile terminal 100 through AR expressions also enriches its control modes while adding interest.
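The screening step can be sketched as a simple membership check performed before any AR rendering; the action names and the label format below are assumptions for illustration, not the patent's data model.

```python
# Assumed set of actions the user has bound to operations in custom settings.
CUSTOM_ACTIONS = {"head_up", "head_down", "smile", "eyes_closed"}

def maybe_generate_ar_expression(detected_action, custom_actions=CUSTOM_ACTIONS):
    """Generate an AR expression label only for configured actions.

    Returning None for unconfigured actions is what saves the power cost
    of rendering AR expressions that map to no operation.
    """
    if detected_action in custom_actions:
        return f"ar_{detected_action}"  # stand-in for real AR rendering
    return None
```

For example, a detected yawn that the user never configured produces no AR expression at all, so the expensive generation step is skipped entirely.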
And step S104, executing preset operation corresponding to the AR expression according to the AR expression.
Specifically, step S104 may include executing a corresponding page operation according to the AR expression. When reading electronic books, browsing multimedia pictures, playing music and videos, or browsing web pages on the mobile terminal 100, the user may need to operate the touch screen frequently to act on different pages. By presetting a one-to-one correspondence between AR expressions and page operations, the corresponding page operation can be executed according to the AR expression, as in the following application scenarios:
when some application windows can be closed by sliding the page upwards or the long images are continuously browsed by sliding the page upwards, the user performs the head-up action, the mobile terminal 100 collects the head-up action to generate the head-up AR expression, and the page sliding operation is executed.
When the user needs to slide down the operation page of the mobile terminal 100, the user performs a head lowering action, the mobile terminal 100 collects the head lowering action to generate a head lowering AR expression, and executes a page sliding operation.
When the user needs to slide the operation page of the mobile terminal 100 to the left, the user performs a left-looking action; the mobile terminal 100 collects the left-looking action, generates a left-looking AR expression, and executes a page left-slide operation.
When the user needs to slide the operation page of the mobile terminal 100 to the right, the user performs a right-looking action; the mobile terminal 100 collects the right-looking action, generates a right-looking AR expression, and executes a page right-slide operation.
When a user clicks a dialog box on an operation page in the process of using the mobile terminal 100 and needs to click for determination, the user performs a smiling action, the mobile terminal 100 collects the smiling action to generate a smiling AR expression, and determination operation is executed.
When the user needs to click to cancel in the process of using the mobile terminal 100, the user makes a mouth opening action, the mobile terminal 100 collects the mouth opening action to generate a mouth opening AR expression, and cancel operation is executed.
When a user needs to return to a homepage in the process of using the mobile terminal 100, the homepage is usually returned in the existing mode by pressing a Home key, in this embodiment, the user only needs to do a left eye closing action, and the mobile terminal 100 collects the left eye closing action to generate a left eye closing AR expression and executes a homepage returning operation.
When a user needs to return to a previous operation in the process of using the mobile terminal 100, the existing method usually returns to the previous operation by clicking a return key, in this embodiment, the user only needs to do a right eye closing action, and the mobile terminal 100 collects the right eye closing action to generate an AR expression for closing the right eye, and executes the operation for returning to the previous operation.
After the mobile terminal 100 is used, when it is necessary to turn off the screen, the existing method generally includes that the user presses a power key to turn off the screen or the mobile terminal 100 automatically turns off the screen after waiting for a period of time, in this embodiment, the user only needs to do the closing movement and keep for a period of time, and the mobile terminal 100 collects the closing movement to generate the closing AR expression and executes the screen turning operation. Wherein, the time for closing and holding the eyes of the user can be 5 seconds.
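The one-to-one correspondence between AR expressions and page operations described in the scenarios above can be sketched as a simple dispatch table. This is a minimal illustration; the expression labels and operation names below are hypothetical, not identifiers from the patent's implementation:

```python
# Minimal sketch of the preset expression-to-operation mapping described above.
# Expression labels and operation names are illustrative only.

PAGE_OPERATIONS = {
    "head_up": "slide_page_up",
    "head_down": "slide_page_down",
    "look_left": "slide_page_left",
    "look_right": "slide_page_right",
    "smile": "confirm",
    "open_mouth": "cancel",
    "close_left_eye": "return_home",
    "close_right_eye": "go_back",
    "close_both_eyes": "screen_off",  # eyes held closed for e.g. 5 seconds
}

def dispatch(ar_expression: str) -> str:
    """Look up the preset page operation for a recognized AR expression."""
    try:
        return PAGE_OPERATIONS[ar_expression]
    except KeyError:
        raise ValueError(f"No preset operation for expression: {ar_expression}")
```

Because the mapping is one-to-one, an unrecognized or unconfigured expression simply has no entry and triggers no operation.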
In other embodiments of the present invention, the same AR expression may correspond to other operations and is not limited to page operations; likewise, the same operation may be triggered by a different AR expression.
To better implement the terminal control method described in the second embodiment, the third embodiment of the present invention is described from the perspective of a terminal control device. The terminal control device may be implemented as an independent entity, or integrated into the mobile terminal 100 provided in the first embodiment of the present invention; the mobile terminal 100 may include a mobile phone, a tablet computer, and the like.
As shown in Fig. 4, which is a block diagram of a terminal control device according to the third embodiment of the present invention, applied to the mobile terminal 100, the terminal control device may include:
head image acquisition module 202: configured to acquire a head image of a user;
AR expression generation module 203: configured to generate an AR expression according to the feature information in the head image;
execution module 204: configured to execute, according to the AR expression, a preset operation corresponding to the AR expression.
Further, the terminal control device may also include:
receiving module 201: configured to receive a wake-up instruction from a user.
Specifically, the receiving module 201 may further include an instruction receiving module 2011 and an instruction identifying module 2012. When the instruction receiving module 2011 receives a wake-up instruction from the user in the form of a voice instruction, the instruction identifying module 2012 recognizes both the content of the voice instruction and its acoustic features, for example its pitch and timbre. If the recognized content matches a pre-stored wake-up command and the recognized acoustic features are consistent with those of the pre-stored command, the terminal wakes and unlocks; if the content is not the pre-stored wake-up command, or the acoustic features are inconsistent with those of the pre-stored command, no wake-up or unlock operation is performed.
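The two-factor check performed by the instruction identifying module 2012 (command content plus voice characteristics must both match) can be sketched as follows. The feature representation, similarity test, and tolerance threshold are illustrative assumptions, not details given in the patent:

```python
# Sketch of the voice wake-up check: both the spoken content and the
# acoustic features must match the pre-stored command. The stored values
# and the 10% tolerance are hypothetical.

STORED_COMMAND = "wake up"
STORED_FEATURES = {"pitch": 220.0, "timbre": 0.73}  # hypothetical features

def features_match(features: dict, stored: dict, tolerance: float = 0.1) -> bool:
    """True if every recognized feature is within a relative tolerance
    of the corresponding stored feature."""
    return all(
        abs(features[key] - stored[key]) <= tolerance * abs(stored[key])
        for key in stored
    )

def should_wake(recognized_text: str, voice_features: dict) -> bool:
    """Wake and unlock only if the content AND the voice features both match."""
    return (recognized_text == STORED_COMMAND
            and features_match(voice_features, STORED_FEATURES))
```

Requiring both checks is what distinguishes this wake-up from plain keyword spotting: a different speaker saying the correct phrase fails the acoustic-feature comparison.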
The head image acquisition module 202 may include a calling module 2021 and an acquisition module 2022. When the receiving module 201 receives a wake-up instruction from the user, the calling module 2021 invokes the acquisition module 2022, which immediately starts to acquire a head image of the user and extract the feature information in it, so that the AR expression generation module 203 can subsequently generate an AR expression carrying the corresponding feature information.
The AR expression generation module 203 may include a feature information recognition module 2031 and an expression generation module 2032. The feature information recognition module 2031 is configured to determine whether the action in the feature information matches an action in the custom settings; when it does, the expression generation module 2032 generates the AR expression.
The execution module 204 may include a recognition module 2041 and an operation execution module 2042. When the AR expression generation module 203 generates an AR expression carrying certain feature information, the recognition module 2041 identifies the AR expression to find the operation corresponding to it. When a corresponding operation is found, the operation execution module 2042 executes it.
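The cooperation between modules 203 and 204 described above (generate an AR expression only for actions in the custom settings, then look up and execute the corresponding operation) can be sketched as below. The function names, the custom-action set, and the mapping contents are illustrative, not taken from the patent:

```python
# Illustrative sketch of the module 203/204 pipeline: an AR expression is
# generated only when the detected action is among the custom-set actions,
# and an operation runs only when a mapping for that expression exists.

CUSTOM_ACTIONS = {"head_up", "smile", "close_both_eyes"}   # hypothetical set
EXPRESSION_TO_OPERATION = {                                # hypothetical mapping
    "AR:head_up": "slide_page_up",
    "AR:smile": "confirm",
    "AR:close_both_eyes": "screen_off",
}

def generate_expression(action: str):
    """Module 203 role: emit an AR expression only for custom-set actions."""
    return f"AR:{action}" if action in CUSTOM_ACTIONS else None

def recognize_and_execute(action: str):
    """Module 204 role: find the operation for the generated expression
    and return it; return None (do nothing) if no expression or mapping."""
    expression = generate_expression(action)
    if expression is None:
        return None                      # action not configured; ignore it
    return EXPRESSION_TO_OPERATION.get(expression)
```

The early `None` return mirrors the described behavior: an action outside the custom settings never produces an AR expression, so no operation lookup or execution takes place.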
In specific implementations, the above modules may be implemented as independent entities, or combined arbitrarily into one or more entities. For the specific implementation of the above modules, refer to the foregoing method embodiment; the specific beneficial effects that can be achieved are also described there and are not repeated here.
Those skilled in the art will understand that all or part of the steps of the terminal control method provided in the second embodiment of the present invention may be implemented by instructions, or by instructions controlling the associated hardware; the instructions may be stored in a computer-readable storage medium and loaded and executed by a processor. To this end, the fourth embodiment of the present invention provides a storage medium storing a plurality of instructions that can be loaded by a processor to execute any step of the terminal control method provided in the second embodiment.
The storage medium may include: read-only memory (ROM), random access memory (RAM), a magnetic disk, an optical disk, and the like.
Since the instructions stored in the storage medium can execute any step of the terminal control method provided in the second embodiment, they can achieve the beneficial effects of that method, which are detailed in the foregoing embodiments and are not repeated here.
The terminal control method, terminal control device, storage medium, and mobile terminal provided by the embodiments of the present invention have been described in detail above. Specific examples are used herein to explain the principles and implementation of the present invention; the description of the above embodiments is intended only to help in understanding the technical solution and core idea of the present invention. Those of ordinary skill in the art will understand that the technical solutions described in the foregoing embodiments may still be modified, or some of their technical features may be equivalently replaced, and such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions of the embodiments of the present invention.

Claims (10)

1. A terminal control method is applied to a mobile terminal, and is characterized by comprising the following steps:
acquiring a head image of a user;
generating an AR expression according to the feature information in the head image;
and executing a preset operation corresponding to the AR expression according to the AR expression.
2. The terminal control method according to claim 1, further comprising, before the step of acquiring the head image of the user:
receiving a wake-up instruction from a user;
the wake-up instruction comprises a voice wake-up instruction.
3. The terminal control method according to claim 1, wherein the AR expression is generated based on feature information in the head image, the feature information including a head action and a facial expression action.
4. The terminal control method according to claim 3, wherein the facial expression action includes a single action or a combination of multiple actions.
5. The terminal control method according to claim 1, wherein the performing, according to the AR expression, a preset operation corresponding to the AR expression includes:
and executing corresponding page operation according to the AR expression.
6. The terminal control method according to claim 5, wherein the executing the corresponding page operation according to the AR expression includes:
respectively executing page up-sliding operation, page down-sliding operation, page left-sliding operation and page right-sliding operation according to the head-up AR expression, the head-down AR expression, the left-viewing AR expression and the right-viewing AR expression;
and respectively executing a determining operation, a canceling operation, a homepage returning operation, a previous-level returning operation and a screen resting operation according to the smiling AR expression, the mouth opening AR expression, the left eye closing AR expression, the right eye closing AR expression and the double-eye closing AR expression.
7. A terminal control device applied to a mobile terminal, comprising:
a head image acquisition module, configured to acquire a head image of a user;
an AR expression generation module, configured to generate an AR expression according to feature information in the head image; and
an execution module, configured to execute, according to the AR expression, a preset operation corresponding to the AR expression.
8. The terminal control device according to claim 7, further comprising:
a receiving module, configured to receive a wake-up instruction from a user.
9. A computer-readable storage medium having stored thereon a plurality of instructions adapted to be loaded by a processor to perform the terminal control method of any of claims 1 to 6.
10. A mobile terminal comprising a processor and a memory, the processor being electrically connected to the memory, the memory being configured to store instructions and data, the processor being configured to perform the steps of the terminal control method according to any one of claims 1 to 6.
CN201911379315.0A 2019-12-27 2019-12-27 Terminal control method, device, storage medium and mobile terminal Pending CN111142666A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911379315.0A CN111142666A (en) 2019-12-27 2019-12-27 Terminal control method, device, storage medium and mobile terminal


Publications (1)

Publication Number Publication Date
CN111142666A true CN111142666A (en) 2020-05-12

Family

ID=70521100

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911379315.0A Pending CN111142666A (en) 2019-12-27 2019-12-27 Terminal control method, device, storage medium and mobile terminal

Country Status (1)

Country Link
CN (1) CN111142666A (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103576839A (en) * 2012-07-24 2014-02-12 广州三星通信技术研究有限公司 Facial recognition based terminal operation control device and method
CN104317398A (en) * 2014-10-15 2015-01-28 天津三星电子有限公司 Gesture control method, wearable equipment and electronic equipment
CN104685445A (en) * 2012-09-28 2015-06-03 Lg电子株式会社 Portable device and control method thereof
CN104793744A (en) * 2015-04-16 2015-07-22 天脉聚源(北京)传媒科技有限公司 Gesture operation method and device
CN105022480A (en) * 2015-07-02 2015-11-04 深圳市金立通信设备有限公司 Input method and terminal
CN110321009A (en) * 2019-07-04 2019-10-11 北京百度网讯科技有限公司 AR expression processing method, device, equipment and storage medium
CN110503724A (en) * 2019-08-19 2019-11-26 北京猫眼视觉科技有限公司 A kind of AR expression resource construction management system and method based on human face characteristic point


Similar Documents

Publication Publication Date Title
JP7391102B2 (en) Gesture processing methods and devices
CN107613131B (en) Application program disturbance-free method, mobile terminal and computer-readable storage medium
CN110543289B (en) Method for controlling volume and electronic equipment
CN108459797B (en) Control method of folding screen and mobile terminal
JP7081048B2 (en) System navigation bar display method, system navigation bar control method, graphical user interface, and electronic devices
WO2020042785A1 (en) Application display method and mobile terminal
EP3719612A1 (en) Processing method for reducing power consumption and mobile terminal
JP7403648B2 (en) Synchronization method and electronic equipment
CN107734170B (en) Notification message processing method, mobile terminal and wearable device
CN111324235A (en) Screen refreshing frequency adjusting method and electronic equipment
CN108958587B (en) Split screen processing method and device, storage medium and electronic equipment
WO2020062310A1 (en) Stylus detection method, system, and related device
JP2023503281A (en) Energy efficient display processing method and device
CN111182236A (en) Image synthesis method and device, storage medium and terminal equipment
CN112698756A (en) Display method of user interface and electronic equipment
CN111443815A (en) Vibration reminding method and electronic equipment
CN108108079A (en) A kind of icon display processing method and mobile terminal
CN111158815B (en) Dynamic wallpaper blurring method, terminal and computer readable storage medium
CN111405180A (en) Photographing method, photographing device, storage medium and mobile terminal
CN109683768A (en) A kind of operating method and mobile terminal of application
CN109862172A (en) A kind of adjusting method and terminal of screen parameter
CN111966436A (en) Screen display control method and device, terminal equipment and storage medium
CN109672845B (en) Video call method and device and mobile terminal
CN111522613B (en) Screen capturing method and electronic equipment
WO2024032124A1 (en) Method for folding and unfolding scroll screen and related product

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20200512