
Terminal control method and device, terminal and readable storage medium

Info

Publication number
CN111324199A
CN111324199A
Authority
CN
China
Prior art keywords
user
button
terminal
gesture operation
target gesture
Prior art date
Legal status
Granted
Application number
CN201811528321.3A
Other languages
Chinese (zh)
Other versions
CN111324199B (en)
Inventor
黄翊凇
杨疆
梁耿
陈宣励
唐伟帼
黄坤碧
Current Assignee
China Mobile Communications Group Co Ltd
China Mobile Group Guangxi Co Ltd
Original Assignee
China Mobile Communications Group Co Ltd
China Mobile Group Guangxi Co Ltd
Priority date
Filing date
Publication date
Application filed by China Mobile Communications Group Co Ltd and China Mobile Group Guangxi Co Ltd
Priority to CN201811528321.3A
Publication of CN111324199A
Application granted
Publication of CN111324199B
Status: Active

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/017: Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00: Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/20: Movements or behaviour, e.g. gesture recognition
    • G06V 40/28: Recognition of hand or arm movements, e.g. recognition of deaf sign language


Abstract

The invention discloses a terminal control method, a device, a terminal and a readable storage medium. The method comprises: acquiring at least two images, wherein each of the at least two images contains information of at least one button on the user's clothing; identifying the user's hand information on the at least one button in each image and determining a target gesture operation of the user; judging whether the target gesture operation exists in a pre-stored gesture operation set; and if so, executing a pre-stored terminal control instruction corresponding to the target gesture operation. In the invention, each of the at least two images contains information of at least one button on the user's clothing, and through the at least one button the terminal can quickly identify the user's hand information on that button in the image, which reduces the difficulty and complexity of recognizing the user's hand information in the image and improves the accuracy of controlling the terminal.

Description

Terminal control method and device, terminal and readable storage medium
Technical Field
The present invention relates to the field of terminal technologies, and in particular, to a terminal control method and apparatus, a terminal, and a readable storage medium.
Background
Most terminals on the market are equipped with a touch screen. Taking a mobile phone as an example, the user controls the terminal by operating on the touch screen. However, when the user's hand is sweaty or wet, the terminal cannot accurately sense the user's operations on the touch screen when control is performed by touch. Therefore, to facilitate operation, non-touch schemes for controlling the terminal are also provided for the user.
In the prior art, when the terminal implements non-touch control, it collects a gesture operation performed by the user at a preset distance from the terminal, recognizes the collected gesture operation, converts it into a command the system can recognize, and executes the control corresponding to the gesture operation. However, with this method the terminal must recognize the user's hand information by examining each pixel in every image corresponding to the collected gesture operation. Since the arrangement of pixels in an image is irregular, recognizing the user's hand information in the image is difficult and complex, which easily leads to inaccurate control of the terminal.
Disclosure of Invention
The invention provides a terminal control method, a terminal control device, a terminal and a readable storage medium, which are used to solve the prior-art problems that recognizing the user's hand information is difficult and complex and that control of the terminal is inaccurate.
The invention provides a terminal control method, which is applied to a terminal and comprises the following steps:
acquiring at least two images, wherein each image of the at least two images comprises information of at least one button on clothes of a user;
identifying hand information of the user on the at least one button in each image, and determining a target gesture operation of the user;
judging whether the target gesture operation exists in a pre-stored gesture operation set or not;
and if so, executing a terminal control instruction corresponding to the target gesture operation, which is stored in advance.
Further, the acquiring at least two images includes:
if the image is acquired for the first time, judging whether the image acquired for the first time contains information of at least one button on clothes of the user;
if so, storing the information of the at least one button, and acquiring subsequent images.
Further, the acquiring at least two images includes:
at least two images are continuously acquired according to a set time interval.
Further, if each image contains information of two buttons on the user's clothing, the target gesture operation of the user comprises:
a rotation operation of the user's hand on a first button of the two buttons; or
a rotation operation of the user's hand on a second button of the two buttons; or
a movement operation of the user's hand from the first button to the second button of the two buttons; or
a movement operation of the user's hand from the second button to the first button of the two buttons.
Further, if the target gesture operation of the user comprises a rotation operation of a hand of the user on a first button of the two buttons, a terminal control instruction corresponding to the target gesture operation comprises a starting instruction of a first function in a terminal;
if the target gesture operation of the user comprises the rotation operation of the hand of the user on the second button of the two buttons, the terminal control instruction corresponding to the target gesture operation comprises a starting instruction of a second function in the terminal;
if the target gesture operation of the user comprises the movement operation of the hand of the user from a first button to a second button of the two buttons, the terminal control instruction corresponding to the target gesture operation comprises the step of updating a parameter value of a first function parameter to a first set value;
and if the target gesture operation of the user comprises the movement operation of the hand of the user from the second button to the first button of the two buttons, the terminal control instruction corresponding to the target gesture operation comprises updating the parameter value of the second function parameter to a second set value.
Further, before the acquiring at least two images, the method further comprises:
judging whether the user authorizes image acquisition;
if yes, the subsequent steps are carried out.
The invention provides a terminal control device, which is applied to a terminal and comprises:
the acquisition module is used for acquiring at least two images, wherein each of the at least two images contains information of at least one button on the user's clothing;
the determining module is used for identifying hand information of the user on the at least one button in each image and determining target gesture operation of the user;
the judging module is used for judging whether the target gesture operation exists in a pre-stored gesture operation set or not; if yes, triggering an execution module;
and the execution module is used for executing the terminal control instruction corresponding to the target gesture operation which is stored in advance.
The invention provides a terminal, which comprises a memory and a processor;
the processor is used for reading the program in the memory and executing the following processes: acquiring at least two images, wherein each image of the at least two images comprises information of at least one button on clothes of a user; identifying hand information of the user on the at least one button in each image, and determining a target gesture operation of the user; judging whether the target gesture operation exists in a pre-stored gesture operation set or not; and if so, executing a terminal control instruction corresponding to the target gesture operation, which is stored in advance.
Further, the processor is specifically configured to, if an image is acquired for the first time, determine whether the image acquired for the first time contains information of at least one button on the user's clothing; if so, store the information of the at least one button and acquire subsequent images.
Further, the processor is specifically configured to continuously acquire at least two images at a set time interval.
Further, if each image contains information of two buttons on the user's clothing, the target gesture operation of the user comprises: a rotation operation of the user's hand on a first button of the two buttons; or
a rotation operation of the user's hand on a second button of the two buttons; or
a movement operation of the user's hand from the first button to the second button of the two buttons; or
a movement operation of the user's hand from the second button to the first button of the two buttons.
Further, if the target gesture operation of the user comprises a rotation operation of a hand of the user on a first button of the two buttons, a terminal control instruction corresponding to the target gesture operation comprises a starting instruction of a first function in a terminal;
if the target gesture operation of the user comprises the rotation operation of the hand of the user on the second button of the two buttons, the terminal control instruction corresponding to the target gesture operation comprises a starting instruction of a second function in the terminal;
if the target gesture operation of the user comprises the movement operation of the hand of the user from a first button to a second button of the two buttons, the terminal control instruction corresponding to the target gesture operation comprises the step of updating a parameter value of a first function parameter to a first set value;
and if the target gesture operation of the user comprises the movement operation of the hand of the user from the second button to the first button of the two buttons, the terminal control instruction corresponding to the target gesture operation comprises updating the parameter value of the second function parameter to a second set value.
Further, the processor is further configured to determine whether the user authorizes image acquisition; if so, at least two images are acquired.
The present invention provides a terminal, comprising a processor, a communication interface, a memory and a communication bus, wherein the processor, the communication interface and the memory communicate with each other through the communication bus;
the memory has stored therein a computer program which, when executed by the processor, causes the processor to perform the steps of any of the methods described above.
The present invention provides a computer readable storage medium storing a computer program executable by a terminal, the program, when run on the terminal, causing the terminal to perform the steps of any of the methods described above.
The invention provides a terminal control method, a device, a terminal and a readable storage medium. The method comprises: acquiring at least two images, wherein each of the at least two images contains information of at least one button on the user's clothing; identifying the user's hand information on the at least one button in each image and determining a target gesture operation of the user; judging whether the target gesture operation exists in a pre-stored gesture operation set; and if so, executing a pre-stored terminal control instruction corresponding to the target gesture operation. In the invention, each of the at least two images contains information of at least one button on the user's clothing, and through the at least one button the terminal can quickly identify the user's hand information on that button in the image, which reduces the difficulty and complexity of recognizing the user's hand information in the image and improves the accuracy of controlling the terminal.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, it is obvious that the drawings in the following description are only some embodiments of the present invention, and for those skilled in the art, other drawings can be obtained according to the drawings without creative efforts.
Fig. 1 is a schematic diagram of a terminal control process according to embodiment 1 of the present invention;
fig. 2 is a schematic flowchart of terminal control according to embodiment 5 of the present invention;
fig. 3 is a schematic structural diagram of a terminal according to embodiment 6 of the present invention;
fig. 4 is a schematic structural diagram of a terminal according to embodiment 7 of the present invention;
fig. 5 is a schematic diagram of a terminal control device according to an embodiment of the present invention.
Detailed Description
In order to reduce the recognition difficulty and complexity when recognizing the hand information of the user in the image and improve the accuracy of terminal control, the embodiment of the invention provides a terminal control method, a terminal control device, a terminal and a readable storage medium.
The terminal comprises a hardware layer, an operating system layer running on the hardware layer, and an application layer running on the operating system.
The hardware layer includes hardware such as a Central Processing Unit (CPU), a Memory Management Unit (MMU), and a Memory.
The operating system may be any one or more computer operating systems that implement terminal control through a Process (Process), such as a Linux operating system, a Unix operating system, an Android operating system, an iOS operating system, or a windows operating system.
In the embodiment of the present invention, the terminal may be a handheld device such as a smart phone and a tablet computer, or a terminal device such as a desktop computer and a portable computer, which is not particularly limited in the embodiment of the present invention, as long as the terminal control can be realized by running a program recorded with codes of the terminal control method in the embodiment of the present invention.
The execution main body of the terminal control in the embodiment of the present invention may be a terminal, or a functional module capable of calling a program and executing the program in the terminal.
In order to make the objects, technical solutions and advantages of the present invention clearer, the present invention will be described in further detail with reference to the accompanying drawings, and it is apparent that the described embodiments are only a part of the embodiments of the present invention, not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Example 1:
fig. 1 is a schematic diagram of a terminal control process provided in an embodiment of the present invention, where the process includes the following steps:
s101: the method comprises the steps of collecting at least two images, wherein each image of the at least two images comprises information of at least one button on clothes of a user.
The terminal control method provided by the embodiment of the invention is applied to the terminal.
The terminal can collect at least two images, and specifically, the terminal comprises an external image capturing component, such as a front or rear camera, an infrared detector and the like, which is used for capturing external images, capturing the motion of the hand of the user and the like.
Each of the at least two images includes information of at least one button on the user's clothing, if the external image capturing component includes a camera, the information of the at least one button on the user's clothing included in the image may be image characteristic information of the at least one button, and if the external image capturing component includes an infrared detector, the information of the at least one button on the user's clothing included in the image may be identification information of the marked at least one button, such as position information or label information.
Preferably, the external image capturing component comprises a camera.
The acquiring of the at least two images comprises:
at least two images are continuously acquired according to a set time interval.
The set time interval may be any value; of course, to ensure the continuity of the user's gesture operation, the set time interval may be kept small.
Specifically, the number of the at least two images may also be any value, and to ensure the consistency of the acquired gesture operation of the user, the number of images may be kept relatively large.
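As a purely illustrative sketch of this acquisition step (not part of the patent), the following shows one way a capture loop at a set time interval could look; the interval value, the frame count and the capture_frame helper are hypothetical placeholders.

```python
import time

CAPTURE_INTERVAL_S = 0.2   # hypothetical "set time interval"
FRAME_COUNT = 10           # hypothetical number of images per gesture window


def capture_frame():
    """Placeholder for reading one frame from the front/rear camera or infrared detector."""
    raise NotImplementedError  # hardware-specific


def acquire_images(interval_s=CAPTURE_INTERVAL_S, count=FRAME_COUNT):
    """S101: continuously acquire `count` images at a set time interval."""
    frames = []
    for _ in range(count):
        frames.append(capture_frame())
        time.sleep(interval_s)  # a small interval keeps the captured gesture continuous
    return frames
```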
S102: identifying hand information of the user on the at least one button in each image, and determining a target gesture operation of the user.
The terminal is capable of recognizing the hand information of the user on at least one button in each image, and specifically, the recognition of the hand information of the user may be performed by a GPU (Graphics Processing Unit) included in the terminal.
Each of the acquired at least two images contains information of at least one button on the user's clothing, so the terminal determines the user's hand information on the at least one button according to the information of the at least one button in each image; for example, for each button, the user's hand information is identified within a set area around that button. Using the buttons of the clothing worn by the user as image elements that assist gesture recognition simplifies the pattern background from the perspective of digital image analysis, reduces the difficulty and complexity of recognizing the user's hand information, and enriches the trigger events the mobile phone can recognize from gestures. Taking the clothing worn by the user and its buttons as auxiliary gesture recognition elements also improves the user experience and reduces gesture actions made by the user without any reference.
After the terminal determines the hand information of the user in each of the at least two images, the terminal may determine the target gesture operation of the user according to the hand information of the user in the at least two images.
The process of determining the target gesture operation of the user according to the hand information of the user of the at least two images can be realized by adopting the prior art, and details are not repeated in the embodiment of the invention.
S103: judging whether the target gesture operation exists in a pre-stored gesture operation set or not; if so, S104; if not, proceed to S105.
The terminal may be configured to store a gesture operation set in advance, and specifically, the gesture operation set may be stored in advance in a storage component for storing data in the terminal.
The pre-saved set of gesture operations in the terminal may be understood as a set of gesture operations for triggering an event controlling the terminal.
The terminal can judge whether a target gesture operation exists in the gesture operation set, namely whether the target gesture operation can be found in the gesture operation set.
And executing different steps according to different judgment results.
S104: and executing a terminal control instruction corresponding to the target gesture operation which is stored in advance.
And a terminal control instruction corresponding to the gesture operation in the gesture operation set is also prestored in the terminal, and the terminal control instruction is used for triggering the corresponding event for controlling the terminal.
Therefore, after the terminal determines that the target gesture operation exists in the gesture operation set, the terminal control instruction corresponding to the target gesture operation can be determined, and the terminal control instruction corresponding to the target gesture operation is executed.
The process of executing the terminal control command by the terminal belongs to the prior art, and is not described in detail in the embodiment of the present invention.
After the terminal executes the terminal control instruction corresponding to the target gesture operation, the result of execution can be displayed. The terminal includes a display component, which includes but is not limited to an LCD screen, an AMOLED screen or a mobile phone screen manufactured with other technologies, and which is used for displaying the execution result, the screen content of the terminal, and the like.
S105: the terminal control instruction is not executed.
And if the terminal does not have the target gesture operation in the gesture operation set, the terminal does not execute the terminal control instruction.
Of course, in order to realize the control of the terminal, the terminal can also prompt the user to re-input the gesture operation.
In addition, the terminal can also comprise an internet module which is used for connecting the internet and calling data materials of the server side and the like.
In the embodiment of the invention, each of the at least two images contains information of at least one button on the user's clothing, and through the at least one button the terminal can quickly identify the user's hand information on that button in the image, which reduces the difficulty and complexity of recognizing the user's hand information in the image and improves the accuracy of controlling the terminal.
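For illustration only, a minimal sketch of the S101-S105 flow described in this embodiment is given below, assuming the acquisition, recognition and instruction-execution helpers exist elsewhere; the dictionary-based gesture operation set and all identifiers are assumptions of this sketch, not the patent's implementation.

```python
# Hypothetical pre-stored gesture operation set, mapped to terminal control instructions.
GESTURE_INSTRUCTIONS = {
    "rotate_button_1": "start_first_function",
    "rotate_button_2": "start_second_function",
    "move_button_1_to_2": "update_first_parameter",
    "move_button_2_to_1": "update_second_parameter",
}


def control_terminal(frames, recognize_gesture, execute_instruction, prompt_retry):
    """S102-S105: determine the target gesture, look it up and act on it."""
    target_gesture = recognize_gesture(frames)       # S102: hand info on the button(s)
    instruction = GESTURE_INSTRUCTIONS.get(target_gesture)
    if instruction is not None:                      # S103: gesture exists in the pre-stored set
        execute_instruction(instruction)             # S104: execute the stored control instruction
    else:
        prompt_retry()                               # S105: no instruction; prompt the user to retry
```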
Example 2:
on the basis of the above embodiment, in the embodiment of the present invention, the acquiring at least two images includes:
if the image is acquired for the first time, judging whether the image acquired for the first time contains information of at least one button on clothes of the user;
if so, storing the information of the at least one button, and acquiring subsequent images.
In order to realize control of the terminal, when the terminal acquires an image for the first time, it acquires subsequent images only after information of at least one button has been identified in that first image.
When the terminal collects the image for the first time, whether the image collected for the first time contains information of at least one button on clothes of the user is judged. The process of identifying the information of the button in the image by the terminal can be realized by adopting the prior art, and is not described in detail in the embodiment of the invention.
After the terminal recognizes the information of the at least one button on the user's clothing in the image, the information of the at least one button may be stored, and the information of the at least one button stored in the terminal may include identification information and/or position information of the at least one button, and the like.
If the at least one button is in fact at least two buttons, the at least two buttons may be arranged regularly, for example buttons running from top to bottom or from bottom to top (relative to the captured image), or from left to right or from right to left (relative to the captured image), and so on.
When the terminal determines that the image acquired for the first time contains the information of at least one button on the clothes of the user and stores the information of the at least one button, the terminal can acquire the subsequent image, and the acquisition of the subsequent image can be performed according to a set time interval.
The gesture operation of the user can be acquired by acquiring the subsequent images, so that the terminal is controlled subsequently.
The following specific embodiment illustrates the embodiment of the present invention. The external image capturing component of the terminal captures an image of the user's real-time clothing and extracts image features, which are transmitted to an arithmetic component in the terminal (responsible for the overall computation of the present invention and including a central processing unit CPU and the like for data calculation). If the image processor GPU judges that the captured image contains an image element meeting the threshold of the top-to-bottom button features, that element is saved as an image UI cache; the first and second buttons of the user from top to bottom are identified, their rough pixel ranges, i.e. the position information, are marked in the image UI cache and stored in the storage component, the first button is marked as image UI element Clothes_buttons(1) and the second button as image UI element Clothes_buttons(2), i.e. the identification information. If the image processor GPU determines that no image element within the captured image meets the top-to-bottom button feature threshold, a prompt indicating recognition failure is returned.
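Read as code, the first-image processing above could be sketched as follows; the detect_buttons routine, the cache layout and the two-button assumption are illustrative only and are not defined by the patent.

```python
def build_button_cache(first_frame, detect_buttons):
    """Embodiment 2: on the first acquired image, look for top-to-bottom button features.

    `detect_buttons` is a hypothetical detector returning a list of (x, y, w, h)
    pixel ranges ordered from top to bottom in the image.
    """
    boxes = detect_buttons(first_frame)
    if len(boxes) < 2:
        return None  # recognition failure: a prompt is returned instead of continuing
    # Image UI cache: identification information (labels) plus rough position information.
    return {
        "Clothes_buttons(1)": boxes[0],
        "Clothes_buttons(2)": boxes[1],
    }
```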
In the embodiment of the invention, the terminal acquires the subsequent images after determining that the first acquired image contains the information of at least one button on the clothes of the user, so that the control on the terminal is realized.
Example 3:
on the basis of the above embodiments, in the embodiment of the present invention, if each image includes information of two buttons on the user's clothing, the target gesture operation of the user includes:
a rotation operation of the user's hand on a first button of the two buttons; or
a rotation operation of the user's hand on a second button of the two buttons; or
a movement operation of the user's hand from the first button to the second button of the two buttons; or
a movement operation of the user's hand from the second button to the first button of the two buttons.
In order to improve the flexibility of terminal control, if each image contains information of two buttons on the clothes of the user, the target gesture of the user can be the operation on a single button or a plurality of buttons.
The target gesture operation of the user comprising a rotation operation of the user's hand on the first button of the two buttons can be understood as the user grasping Clothes_buttons(1) with one hand and rotating it by several angles.
The target gesture operation of the user comprising a rotation operation of the user's hand on the second button of the two buttons can be understood as the user grasping Clothes_buttons(2) with one hand and rotating it by several angles.
The target gesture operation of the user comprising a movement operation of the user's hand from the first button to the second button of the two buttons can be understood, taking the above-mentioned embodiment as an example, as the user grasping Clothes_buttons(1) with one hand and then sliding the fingers from Clothes_buttons(1) down to Clothes_buttons(2).
The target gesture operation of the user comprising a movement operation of the user's hand from the second button to the first button of the two buttons can be understood, taking the above-mentioned embodiment as an example, as the user grasping Clothes_buttons(2) with one hand and then sliding the fingers from Clothes_buttons(2) up to Clothes_buttons(1).
If each image contains information of one button on the clothes of the user, the target gesture of the user may include a rotation operation of the hand of the user on the one button, and specifically, may further include a clockwise rotation operation of the hand of the user on the one button, and/or a counterclockwise rotation operation of the hand of the user on the one button.
If each image contains information of more than two buttons on the user's clothing, the user's target gesture may include a rotation operation of the user's hand on the first button, the second button, ..., or the Nth button (N being the number of buttons), a movement operation of the user's hand across two or more of the buttons, and so on; this is similar to the user's target gestures when each image contains information of two buttons on the user's clothing and is not repeated here.
By recognizing gestures performed on the buttons of the user's clothing, operations such as a rotary knob, a push button, and a progress bar moving forward or backward can be simulated, and motion capture can be realized with low-precision, low-frequency-scanning hardware, thereby reducing hardware cost.
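One possible (and deliberately simplified) way to classify the four two-button gestures from per-frame hand positions relative to the cached button regions is sketched below; the distance rule, the threshold and the treatment of rotation are assumptions made for illustration, not the patent's recognition method.

```python
import math


def classify_two_button_gesture(hand_track, btn1_center, btn2_center, near_px=40):
    """hand_track: list of (x, y) hand positions, one per acquired frame."""
    def near(point, center):
        return math.dist(point, center) < near_px

    start, end = hand_track[0], hand_track[-1]
    if near(start, btn1_center) and near(end, btn2_center):
        return "move_button_1_to_2"
    if near(start, btn2_center) and near(end, btn1_center):
        return "move_button_2_to_1"
    if all(near(p, btn1_center) for p in hand_track):
        return "rotate_button_1"   # hand stays on the first button; rotation assumed
    if all(near(p, btn2_center) for p in hand_track):
        return "rotate_button_2"   # hand stays on the second button; rotation assumed
    return None                    # not a recognized target gesture operation
```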
Example 4:
on the basis of the above embodiments, in the embodiments of the present invention, if the target gesture operation of the user includes a rotation operation of a first button of the two buttons by a hand of the user, the terminal control instruction corresponding to the target gesture operation includes a start instruction of a first function in a terminal;
if the target gesture operation of the user comprises the rotation operation of the hand of the user on the second button of the two buttons, the terminal control instruction corresponding to the target gesture operation comprises a starting instruction of a second function in the terminal;
if the target gesture operation of the user comprises the movement operation of the hand of the user from a first button to a second button of the two buttons, the terminal control instruction corresponding to the target gesture operation comprises the step of updating a parameter value of a first function parameter to a first set value;
and if the target gesture operation of the user comprises the movement operation of the hand of the user from the second button to the first button of the two buttons, the terminal control instruction corresponding to the target gesture operation comprises updating the parameter value of the second function parameter to a second set value.
In order to improve the flexibility of terminal control, if each image contains information of two buttons on the user's clothing, the terminal executes the corresponding terminal control instruction according to the user's target gesture.
When each image contains information of two buttons on the user's clothing, if the target gesture operation of the user comprises a rotation operation of the user's hand on the first button of the two buttons, the terminal control instruction corresponding to the target gesture operation comprises a starting instruction of a first function in the terminal. The starting instruction of the first function can be regarded as triggering event 1 for controlling the terminal, i.e. the user needs to start a certain button or continuous action, for example pressing the 'photograph key', or starting an application and selecting a certain function in that application.
When each image contains information of two buttons on the user's clothing, if the target gesture operation of the user comprises a rotation operation of the user's hand on the second button of the two buttons, the terminal control instruction corresponding to the target gesture operation comprises a starting instruction of a second function in the terminal, which can be regarded as triggering event 2 for controlling the terminal, i.e. the user needs to start a certain button or continuous action. The second function may be the same as or different from the first function; if different, the second function may be the opposite of the first function, for example clicking an 'exit/back' button.
When each image contains information of two buttons on the user's clothing, if the target gesture operation of the user comprises a movement operation of the user's hand from the first button to the second button of the two buttons, the terminal control instruction corresponding to the target gesture operation comprises updating the parameter value of a first function parameter to a first set value. This update can be regarded as changing the value of a progress bar in the currently started application, or the value of a certain option in the application; for example, when the user is currently using a music APP, updating the parameter value of the first function parameter to the first set value lowers the music volume, and when the user is currently using a text-reading APP, it scrolls the text on the page from top to bottom, and so on.
When each image contains information of two buttons on the user's clothing, if the target gesture operation of the user comprises a movement operation of the user's hand from the second button to the first button of the two buttons, the terminal control instruction corresponding to the target gesture operation comprises updating the parameter value of a second function parameter to a second set value, which can likewise be regarded as changing the value of a progress bar in the currently started application or the value of a certain option in the application. The first function parameter and the second function parameter may be the same or different, and the first set value and the second set value may be the same or different. For example, when the second function parameter is the same as the first function parameter and the user is currently using a music APP, the update raises the music volume, and when the user is currently using a text-reading APP, it scrolls the text on the page from bottom to top, and so on.
When each image contains information of one button, or of more than two buttons, on the user's clothing, the terminal control instructions corresponding to the target gesture operations may be defined similarly to the two-button case and may be set according to the actual usage requirements of the user, which is not described further here.
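To make the gesture-to-instruction mapping of this embodiment concrete, a hedged example is given below; the bound functions and parameter steps (camera shutter, exit/back, music volume) follow the illustrations in the text but are hypothetical, user-configurable choices rather than bindings fixed by the patent.

```python
# Illustrative bindings; a real terminal would let a preset scheme or the user choose these.
TWO_BUTTON_BINDINGS = {
    "rotate_button_1": ("start_function", "camera_shutter"),          # event 1: first function
    "rotate_button_2": ("start_function", "exit_back"),               # event 2: second function
    "move_button_1_to_2": ("set_parameter", ("music_volume", -1)),    # event 3: decrease a value
    "move_button_2_to_1": ("set_parameter", ("music_volume", +1)),    # event 4: increase a value
}


def apply_binding(gesture, start_function, set_parameter):
    """Dispatch a recognized gesture to the corresponding terminal control instruction."""
    action = TWO_BUTTON_BINDINGS.get(gesture)
    if action is None:
        return
    kind, arg = action
    if kind == "start_function":
        start_function(arg)
    else:
        name, step = arg
        set_parameter(name, step)  # e.g. lower or raise the music volume by one step
```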
Example 5:
on the basis of the foregoing embodiments, in an embodiment of the present invention, before the acquiring at least two images, the method further includes:
judging whether the user authorizes image acquisition;
if yes, the subsequent steps are carried out.
In order to realize the flexibility of terminal control and meet different implementation requirements of a user, when the user authorizes image acquisition, the terminal acquires at least two images.
The terminal may store a flag indicating whether image acquisition is authorized; specifically, authorization of image acquisition may be understood as whether the user permits images of his or her real-time clothing to be captured.
The following description is made with reference to a specific embodiment, as shown in fig. 2. 1. When a user uses a mobile phone device having the features of the present invention and authorizes image acquisition, the external image capturing component captures an image of the user's real-time clothing and extracts image features, which are transmitted to the arithmetic component;
2. The arithmetic component invokes the image processor GPU to identify whether the user's clothing has top-to-bottom button features. If the image processor GPU determines that no image element within the captured image meets the top-to-bottom button feature threshold, a prompt indicating recognition failure is returned.
3. If the image processor GPU judges that the captured image contains an image element meeting the top-to-bottom button feature threshold, the element is saved as an image UI cache; the first and second buttons of the user from top to bottom are identified, their approximate pixel ranges are marked in the image UI cache and stored in the storage component, the first button is marked as image UI element Clothes_buttons(1) and the second button as image UI element Clothes_buttons(2).
4. According to a preset scheme or a scheme selected by the user, the arithmetic component grants the following user actions the authority to trigger certain operations of the mobile phone:
4-1: the user grasping Clothes_buttons(1) with one hand and rotating it by several angles is regarded as starting trigger event 1;
4-2: the user grasping Clothes_buttons(2) with one hand and rotating it by several angles is regarded as starting trigger event 2;
4-3: the user grasping Clothes_buttons(1) with one hand and then sliding the fingers from Clothes_buttons(1) down to Clothes_buttons(2) is regarded as trigger event 3;
4-4: the user grasping Clothes_buttons(2) with one hand and then sliding the fingers from Clothes_buttons(2) up to Clothes_buttons(1) is regarded as trigger event 4;
5. The external image capturing component continuously captures the user's actions for gesture recognition at certain periodic intervals and sends the image stream to the image processing chip GPU for recognition;
6. The image processing chip GPU calls the image UI cache in the storage component, compares it with the captured image stream data and, if the images captured within a certain period of time are judged to meet any event threshold, notifies the arithmetic component to perform the trigger-event operation of the preset scheme.
7. For a confirmed trigger event, the following processing is carried out:
7-1. Trigger event 1: the invention regards the user as requiring a certain button or continuous action to be started; for example, grasping Clothes_buttons(1) and rotating it by several angles is treated as pressing the 'photograph key', or as starting an application and selecting a certain function in that application.
7-2. Trigger event 2: the invention regards the user as requiring a certain button or continuous action to be started, which may be the same as or different from event 1; for example, grasping Clothes_buttons(2) and rotating it by several angles is treated as clicking the 'exit/back' button.
7-3. Trigger event 3: the invention regards the user as operating a progress-bar control of the currently used APP, decreasing the value of the progress bar or of a corresponding settable option of the application; for example, if the user is currently using a music APP, event 3 causes the invention to lower the music volume, and if the user is currently using a text-reading APP, event 3 causes the invention to scroll the text on the page from top to bottom.
7-4. Trigger event 4: the invention regards the user as operating a progress-bar control of the currently used APP, increasing the value of the progress bar or of a corresponding settable option of the application; for example, if the user is currently using a music APP, event 4 causes the invention to raise the music volume, and if the user is currently using a text-reading APP, event 4 causes the invention to scroll the text on the page from bottom to top, and so on.
8. If the GPU of the image processing chip judges that a plurality of continuous frames of images captured within a certain period of time do not reach any event threshold value, the next step of processing is not carried out;
9. Step 5 is repeated as long as the user has not chosen to exit the service of the invention.
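Putting steps 1-9 together, an end-to-end sketch might look as follows; it reuses the hypothetical helpers from the earlier sketches (acquire_images, build_button_cache, the gesture classifier) and is an assumption-laden illustration, not the patent's implementation.

```python
def run_service(is_authorized, acquire_images, detect_buttons,
                track_hand, classify_gesture, dispatch_event, user_wants_exit):
    """Steps 1-9: authorize, cache the button features, then loop capture and dispatch."""
    if not is_authorized():                                 # step 1: user authorization required
        return
    frames = acquire_images()
    cache = build_button_cache(frames[0], detect_buttons)   # steps 2-3: image UI cache
    if cache is None:
        return                                              # recognition failure prompt elsewhere
    while not user_wants_exit():                            # steps 5-9
        frames = acquire_images()                           # step 5: periodic gesture capture
        gesture = classify_gesture(track_hand(frames, cache), cache)
        if gesture is not None:                             # step 6: an event threshold is met
            dispatch_event(gesture)                         # step 7: preset trigger-event operation
        # step 8: otherwise do nothing this round; step 9: repeat until the user exits
```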
Example 6:
on the basis of the foregoing embodiments, an embodiment of the present invention further provides a terminal, as shown in fig. 3, including: a processor 301 and a memory 302;
the processor 301 is configured to read the program in the memory 302 and execute the following process:
acquiring at least two images, wherein each image of the at least two images comprises information of at least one button on clothes of a user; identifying hand information of the user on the at least one button in each image, and determining a target gesture operation of the user; judging whether the target gesture operation exists in a pre-stored gesture operation set or not; and if so, executing a terminal control instruction corresponding to the target gesture operation, which is stored in advance.
Based on the same inventive concept, the embodiment of the present invention further provides a terminal, and as the principle of solving the problem of the terminal is similar to the terminal control method, the implementation of the terminal may refer to the implementation of the method, and repeated details are not repeated.
In fig. 3, the bus architecture may include any number of interconnected buses and bridges, with one or more processors represented by processor 301 and various circuits of memory represented by memory 302 being linked together. The bus architecture may also link together various other circuits such as peripherals, voltage regulators, power management circuits, and the like, which are well known in the art, and therefore, will not be described any further herein. The transceiver 303 may be a number of elements including a transmitter and a receiver providing a means for communicating with various other apparatus over a transmission medium. The processor 301 is responsible for managing the bus architecture and general processing, and the memory 302 may store data used by the processor 301 in performing operations.
Alternatively, the processor 301 may be a CPU (Central Processing Unit), an ASIC (Application Specific Integrated Circuit), an FPGA (Field Programmable Gate Array), or a CPLD (Complex Programmable Logic Device).
The processor 301 is specifically configured to, if an image is acquired for the first time, determine whether the image acquired for the first time contains information of at least one button on the user's clothing; if so, store the information of the at least one button and acquire subsequent images.
The processor 301 is specifically configured to continuously acquire at least two images at a set time interval.
If each image contains information of two buttons on the user's clothing, the target gesture operation of the user comprises: a rotation operation of the user's hand on a first button of the two buttons; or
a rotation operation of the user's hand on a second button of the two buttons; or
a movement operation of the user's hand from the first button to the second button of the two buttons; or
a movement operation of the user's hand from the second button to the first button of the two buttons.
If the target gesture operation of the user comprises the rotation operation of the hand of the user on the first button of the two buttons, the terminal control instruction corresponding to the target gesture operation comprises a starting instruction of a first function in the terminal;
if the target gesture operation of the user comprises the rotation operation of the hand of the user on the second button of the two buttons, the terminal control instruction corresponding to the target gesture operation comprises a starting instruction of a second function in the terminal;
if the target gesture operation of the user comprises the movement operation of the hand of the user from a first button to a second button of the two buttons, the terminal control instruction corresponding to the target gesture operation comprises the step of updating a parameter value of a first function parameter to a first set value;
and if the target gesture operation of the user comprises the movement operation of the hand of the user from the second button to the first button of the two buttons, the terminal control instruction corresponding to the target gesture operation comprises updating the parameter value of the second function parameter to a second set value.
The processor 301 is further configured to determine whether the user authorizes image acquisition; if so, at least two images are acquired.
Example 7:
on the basis of the foregoing embodiments, an embodiment of the present invention further provides a terminal, as shown in fig. 4, comprising a processor 401, a communication interface 402, a memory 403 and a communication bus 404, wherein the processor 401, the communication interface 402 and the memory 403 communicate with each other through the communication bus 404;
the memory 403 has stored therein a computer program which, when executed by the processor 401, causes the processor 401 to perform any of the embodiments described above.
The communication bus mentioned in the above terminal may be a Peripheral Component Interconnect (PCI) bus, an Extended Industry Standard Architecture (EISA) bus, or the like. The communication bus may be divided into an address bus, a data bus, a control bus, etc. For ease of illustration, only one thick line is shown, but this does not mean that there is only one bus or one type of bus.
The communication interface 402 is used for communication between the above-described terminal and other devices.
The Memory may include a Random Access Memory (RAM) or a Non-Volatile Memory (NVM), such as at least one disk Memory. Alternatively, the memory may be at least one memory device located remotely from the processor.
The processor may be a general-purpose processor, including a Central Processing Unit, a Network Processor (NP), and the like; it may also be a Digital Signal Processor (DSP), an application specific integrated circuit, a field programmable gate array or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or the like.
Example 8:
on the basis of the foregoing embodiments, an embodiment of the present invention further provides a computer-readable storage medium, which stores a computer program executable by a terminal, and when the program runs on the terminal, the terminal is caused to execute the method of any of the above embodiments.
The above-mentioned computer-readable storage medium may be any available medium or data storage device that can be accessed by a processor in the terminal, including but not limited to magnetic memory such as a floppy disk, a hard disk, magnetic tape or magneto-optical disk (MO), optical memory such as CD, DVD, BD or HVD, and semiconductor memory such as ROM, EPROM, EEPROM, non-volatile memory (NAND FLASH) or Solid State Disk (SSD).
Fig. 5 is a schematic diagram of a terminal control device according to an embodiment of the present invention, which is applied to a terminal, and the device includes:
the acquisition module 501 is configured to acquire at least two images, where each of the at least two images includes information of at least one button on a user's clothing;
a determining module 502, configured to identify hand information of the user on the at least one button in each image, and determine a target gesture operation of the user;
a judging module 503, configured to judge whether the target gesture operation exists in a pre-stored gesture operation set; if so, the execution module 504 is triggered;
and the execution module 504 is configured to execute a terminal control instruction corresponding to the target gesture operation, which is stored in advance.
The acquisition module 501 is specifically configured to, if an image is acquired for the first time, determine whether the image acquired for the first time contains information of at least one button on the user's clothing; if so, store the information of the at least one button and acquire subsequent images.
The acquisition module 501 is specifically configured to continuously acquire at least two images at a set time interval.
If each image contains information of two buttons on the user's clothing, the target gesture operation of the user comprises:
a rotation operation of the user's hand on a first button of the two buttons; or
a rotation operation of the user's hand on a second button of the two buttons; or
a movement operation of the user's hand from the first button to the second button of the two buttons; or
a movement operation of the user's hand from the second button to the first button of the two buttons.
If the target gesture operation of the user comprises the rotation operation of the hand of the user on the first button of the two buttons, the terminal control instruction corresponding to the target gesture operation comprises a starting instruction of a first function in the terminal;
if the target gesture operation of the user comprises the rotation operation of the hand of the user on the second button of the two buttons, the terminal control instruction corresponding to the target gesture operation comprises a starting instruction of a second function in the terminal;
if the target gesture operation of the user comprises the movement operation of the hand of the user from a first button to a second button of the two buttons, the terminal control instruction corresponding to the target gesture operation comprises the step of updating a parameter value of a first function parameter to a first set value;
and if the target gesture operation of the user comprises the movement operation of the hand of the user from the second button to the first button of the two buttons, the terminal control instruction corresponding to the target gesture operation comprises updating the parameter value of the second function parameter to a second set value.
The acquisition module is also used for judging whether the user authorizes the image acquisition; and if so, acquiring at least two images.
In the embodiment of the invention, each of the at least two images contains information of at least one button on the user's clothing, and through the at least one button the terminal can quickly identify the user's hand information on that button in the image, which reduces the difficulty and complexity of recognizing the user's hand information in the image and improves the accuracy of controlling the terminal.
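As a minimal, non-authoritative object sketch of the module split shown in fig. 5 (names and wiring are assumed, not taken from the patent):

```python
class TerminalControlDevice:
    """Acquisition, determining, judging and execution modules of fig. 5, wired together."""

    def __init__(self, acquisition, determining, gesture_instructions, executor):
        self.acquisition = acquisition                    # acquisition module 501
        self.determining = determining                    # determining module 502
        self.gesture_instructions = gesture_instructions  # mapping used by judging module 503
        self.executor = executor                          # execution module 504

    def handle_once(self):
        frames = self.acquisition()                 # acquire at least two images
        gesture = self.determining(frames)          # identify hand info, get the target gesture
        if gesture in self.gesture_instructions:    # judging module: is it in the pre-stored set?
            self.executor(self.gesture_instructions[gesture])  # run the stored control instruction
```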
For the system/apparatus embodiments, since they are substantially similar to the method embodiments, the description is relatively simple, and reference may be made to some descriptions of the method embodiments for relevant points.
It is to be noted that, in this document, relational terms such as first and second, and the like are used solely to distinguish one entity or operation from another entity or operation without necessarily requiring or implying any actual such relationship or order between such entities or operations.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
While the preferred embodiments of the present application have been described, additional variations and modifications in those embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. Therefore, it is intended that the appended claims be interpreted as including preferred embodiments and all alterations and modifications as fall within the scope of the application.
It will be apparent to those skilled in the art that various changes and modifications may be made in the present invention without departing from the spirit and scope of the invention. Thus, if such modifications and variations of the present invention fall within the scope of the claims of the present invention and their equivalents, the present invention is also intended to include such modifications and variations.

Claims (15)

1. A terminal control method is applied to a terminal, and the method comprises the following steps:
acquiring at least two images, wherein each image of the at least two images comprises information of at least one button on clothes of a user;
identifying hand information of the user on the at least one button in each image, and determining a target gesture operation of the user;
judging whether the target gesture operation exists in a pre-stored gesture operation set or not;
and if so, executing a pre-stored terminal control instruction corresponding to the target gesture operation.
2. The method of claim 1, wherein said acquiring at least two images comprises:
if an image is acquired for the first time, judging whether the image acquired for the first time contains information of at least one button on the clothes of the user;
if so, storing the information of the at least one button, and acquiring subsequent images.
3. The method of claim 1 or 2, wherein said acquiring at least two images comprises:
continuously acquiring at least two images at a set time interval.
4. The method as claimed in claim 1 or 2, wherein, if each image contains information of two buttons on the clothes of the user, the target gesture operation of the user comprises:
a rotation operation of the hand of the user on a first button of the two buttons; or
a rotation operation of the hand of the user on a second button of the two buttons; or
a movement operation of the hand of the user from the first button to the second button of the two buttons; or
a movement operation of the hand of the user from the second button to the first button of the two buttons.
5. The method of claim 4, wherein, if the target gesture operation of the user comprises the rotation operation of the hand of the user on the first button of the two buttons, the terminal control instruction corresponding to the target gesture operation comprises a starting instruction of a first function in the terminal;
if the target gesture operation of the user comprises the rotation operation of the hand of the user on the second button of the two buttons, the terminal control instruction corresponding to the target gesture operation comprises a starting instruction of a second function in the terminal;
if the target gesture operation of the user comprises the movement operation of the hand of the user from the first button to the second button of the two buttons, the terminal control instruction corresponding to the target gesture operation comprises updating a parameter value of a first function parameter to a first set value;
and if the target gesture operation of the user comprises the movement operation of the hand of the user from the second button to the first button of the two buttons, the terminal control instruction corresponding to the target gesture operation comprises updating a parameter value of a second function parameter to a second set value.
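Purely as a non-authoritative illustration of the binding recited in claims 4 and 5, the sketch below maps each of the four two-button gesture operations to a placeholder terminal control instruction; the enum names, the demonstration terminal and its methods are assumptions introduced only for this example.

```python
from enum import Enum, auto

class Gesture(Enum):
    """The four target gesture operations of the two-button case."""
    ROTATE_FIRST_BUTTON = auto()
    ROTATE_SECOND_BUTTON = auto()
    MOVE_FIRST_TO_SECOND = auto()
    MOVE_SECOND_TO_FIRST = auto()

class DemoTerminal:
    """Minimal stand-in for the controlled terminal (methods are assumed)."""
    def start_function(self, name: str) -> None:
        print(f"starting the {name} function")

    def set_parameter(self, name: str, set_value: int) -> None:
        print(f"updating the {name} function parameter to set value {set_value}")

# Each gesture operation is bound to one terminal control instruction,
# mirroring the structure of claim 5 (the concrete values are placeholders).
DISPATCH = {
    Gesture.ROTATE_FIRST_BUTTON:  lambda t: t.start_function("first"),
    Gesture.ROTATE_SECOND_BUTTON: lambda t: t.start_function("second"),
    Gesture.MOVE_FIRST_TO_SECOND: lambda t: t.set_parameter("first", set_value=1),
    Gesture.MOVE_SECOND_TO_FIRST: lambda t: t.set_parameter("second", set_value=2),
}

if __name__ == "__main__":
    terminal = DemoTerminal()
    DISPATCH[Gesture.MOVE_FIRST_TO_SECOND](terminal)  # updates the first parameter
```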
6. The method of claim 1 or 2, wherein prior to acquiring the at least two images, the method further comprises:
judging whether the user authorizes image acquisition;
and if so, performing the subsequent steps.
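The acquisition behaviour of claims 2, 3 and 6 (checking authorization, verifying that the first captured image contains at least one button, and capturing the remaining images at a set time interval) could look roughly like the following sketch; capture_frame, detect_buttons and the default interval are placeholders rather than elements of the claims.

```python
import time
from typing import Callable, List, Optional

def acquire_images(
    user_authorized: bool,
    capture_frame: Callable[[], bytes],
    detect_buttons: Callable[[bytes], list],
    count: int = 2,
    interval_s: float = 0.2,
) -> Optional[List[bytes]]:
    """Acquire at least `count` images, gated on user authorization."""
    if not user_authorized:
        # No authorization, no capture (claim 6).
        return None

    first = capture_frame()
    if not detect_buttons(first):
        # The first image must contain at least one button (claim 2).
        return None

    images = [first]
    while len(images) < count:
        # Remaining images are captured at a set time interval (claim 3).
        time.sleep(interval_s)
        images.append(capture_frame())
    return images

if __name__ == "__main__":
    # Toy stand-ins: a "frame" is a byte string and every frame contains one button.
    frames = acquire_images(
        user_authorized=True,
        capture_frame=lambda: b"frame",
        detect_buttons=lambda frame: [{"center": (10, 20)}],
    )
    print(len(frames) if frames else "capture refused")
```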
7. A terminal control apparatus, applied to a terminal, the apparatus comprising:
the device comprises an acquisition module, a display module and a display module, wherein the acquisition module is used for acquiring at least two images, and each image in the at least two images comprises information of at least one button on clothes of a user;
the determining module is used for identifying hand information of the user on the at least one button in each image and determining target gesture operation of the user;
the judging module is used for judging whether the target gesture operation exists in a pre-stored gesture operation set or not; if yes, triggering an execution module;
and the execution module is used for executing the pre-stored terminal control instruction corresponding to the target gesture operation.
8. A terminal comprising a memory and a processor;
the processor is used for reading the program in the memory and executing the following processes: acquiring at least two images, wherein each image of the at least two images comprises information of at least one button on clothes of a user; identifying hand information of the user on the at least one button in each image, and determining a target gesture operation of the user; judging whether the target gesture operation exists in a pre-stored gesture operation set or not; and if so, executing a pre-stored terminal control instruction corresponding to the target gesture operation.
9. The terminal of claim 8, wherein the processor is specifically configured to: if an image is captured for the first time, judge whether the image captured for the first time contains information of at least one button on the clothes of the user; and if so, store the information of the at least one button and capture subsequent images.
10. The terminal as claimed in claim 8 or 9, wherein the processor is specifically configured to continuously acquire at least two images at a set time interval.
11. The terminal according to claim 8 or 9, wherein, if each image contains information of two buttons on the clothes of the user, the target gesture operation of the user comprises: a rotation operation of the hand of the user on a first button of the two buttons; or
a rotation operation of the hand of the user on a second button of the two buttons; or
a movement operation of the hand of the user from the first button to the second button of the two buttons; or
a movement operation of the hand of the user from the second button to the first button of the two buttons.
12. The terminal of claim 11, wherein, if the target gesture operation of the user comprises the rotation operation of the hand of the user on the first button of the two buttons, the terminal control instruction corresponding to the target gesture operation comprises a starting instruction of a first function in the terminal;
if the target gesture operation of the user comprises the rotation operation of the hand of the user on the second button of the two buttons, the terminal control instruction corresponding to the target gesture operation comprises a starting instruction of a second function in the terminal;
if the target gesture operation of the user comprises the movement operation of the hand of the user from the first button to the second button of the two buttons, the terminal control instruction corresponding to the target gesture operation comprises updating a parameter value of a first function parameter to a first set value;
and if the target gesture operation of the user comprises the movement operation of the hand of the user from the second button to the first button of the two buttons, the terminal control instruction corresponding to the target gesture operation comprises updating a parameter value of a second function parameter to a second set value.
13. The terminal of claim 8 or 9, wherein the processor is further configured to determine whether the user authorizes image capture and, if so, to acquire the at least two images.
14. A terminal, comprising: the system comprises a processor, a communication interface, a memory and a communication bus, wherein the processor, the communication interface and the memory complete mutual communication through the communication bus;
the memory has stored therein a computer program which, when executed by the processor, causes the processor to carry out the steps of the method of any one of claims 1 to 6.
15. A computer-readable storage medium, characterized in that it stores a computer program executable by a terminal, which program, when run on the terminal, causes the terminal to carry out the steps of the method according to any one of claims 1 to 6.
CN201811528321.3A 2018-12-13 2018-12-13 Terminal control method and device, terminal and readable storage medium Active CN111324199B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811528321.3A CN111324199B (en) 2018-12-13 2018-12-13 Terminal control method and device, terminal and readable storage medium

Publications (2)

Publication Number Publication Date
CN111324199A true CN111324199A (en) 2020-06-23
CN111324199B CN111324199B (en) 2023-04-28

Family

ID=71170350

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811528321.3A Active CN111324199B (en) 2018-12-13 2018-12-13 Terminal control method and device, terminal and readable storage medium

Country Status (1)

Country Link
CN (1) CN111324199B (en)

Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2013109795A1 (en) * 2012-01-17 2013-07-25 Blast Motion Inc. Intelligent motion capture element
US20150153926A1 (en) * 2012-06-22 2015-06-04 Crambo Sa Activation of an application on a programmable device using gestures on an image
CN104978144A (en) * 2015-06-26 2015-10-14 中国工商银行股份有限公司 Gesture password input device and system and method for transaction based on system
US20150346820A1 (en) * 2014-06-03 2015-12-03 Google Inc. Radar-Based Gesture-Recognition through a Wearable Device
CN105717900A (en) * 2016-04-26 2016-06-29 华南理工大学 Smart home control gloves and home control, custom control gesture method thereof
CN106648347A (en) * 2016-09-14 2017-05-10 上海石肤网络科技有限公司深圳分公司 Method for operating and moving target objects on touch terminal in touch mode
CN106886275A (en) * 2015-12-15 2017-06-23 比亚迪股份有限公司 The control method of car-mounted terminal, device and vehicle
CN107450839A (en) * 2017-07-28 2017-12-08 广东欧珀移动通信有限公司 Control method, device, storage medium and mobile terminal based on blank screen gesture
CN107577411A (en) * 2017-07-19 2018-01-12 陈宗林 Touch gestures identify motor speed regulating method
CN107831987A (en) * 2017-11-22 2018-03-23 出门问问信息科技有限公司 The error touch control method and device of anti-gesture operation
US20180136903A1 (en) * 2015-06-18 2018-05-17 Googale (2009) Ltd. A computerized system including rules for a rendering system accessible to non-literate users via a touch screen
CN108139855A (en) * 2015-08-19 2018-06-08 Lg电子株式会社 Watch style mobile terminal
CN108298078A (en) * 2013-07-31 2018-07-20 深圳市大疆创新科技有限公司 Long-range control method and terminal
CN108921101A (en) * 2018-07-04 2018-11-30 百度在线网络技术(北京)有限公司 Processing method, equipment and readable storage medium storing program for executing based on gesture identification control instruction

Also Published As

Publication number Publication date
CN111324199B (en) 2023-04-28

Similar Documents

Publication Publication Date Title
CN105824559B (en) False touch recognition and processing method and electronic equipment
CN106547420B (en) Page processing method and device
CN108960163B (en) Gesture recognition method, device, equipment and storage medium
WO2017133498A1 (en) Intelligent device and intelligent device control method
CN107463331B (en) Gesture track simulation method and device and electronic equipment
CN107229911B (en) Fingerprint identification method and mobile terminal
US8149281B2 (en) Electronic device and method for operating a presentation application file
CN107203313B (en) Method for adjusting desktop display object, mobile terminal and computer readable storage medium
CN107748641B (en) Numerical value adjustment control method and device, electronic equipment and storage medium
EP3215900A1 (en) Robotic process automation
CN107765985B (en) Application program control method, user terminal and medium product
CN107506130B (en) Character deleting method and mobile terminal
WO2019085598A1 (en) Method and apparatus for calculating above-the-fold rendering duration of page, and electronic device
CN106445328B (en) Unlocking method of mobile terminal screen and mobile terminal
US9684445B2 (en) Mobile gesture reporting and replay with unresponsive gestures identification and analysis
WO2017001560A1 (en) Robotic process automation
CN105808129B (en) Method and device for quickly starting software function by using gesture
CN115445212A (en) Game gift bag pushing method and device, computer equipment and storage medium
CN108846339B (en) Character recognition method and device, electronic equipment and storage medium
CN107315529B (en) Photographing method and mobile terminal
CN111324199B (en) Terminal control method and device, terminal and readable storage medium
CN109002293B (en) UI element display method and device, electronic equipment and storage medium
US20130250133A1 (en) Electronic devices with motion response and related methods
US9235694B2 (en) Recording medium, authentication device, and authentication method
CN111694498B (en) Interface display method and device and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: No. 117, Minzu Avenue, Qingxiu District, Nanning City, Guangxi Zhuang Autonomous Region, 530028

Applicant after: GUANGXI BRANCH, CHINA MOBILE COMMUNICATION Group

Applicant after: CHINA MOBILE COMMUNICATIONS GROUP Co.,Ltd.

Address before: 530000 No. 55, Jinhu Road, Nanning, the Guangxi Zhuang Autonomous Region

Applicant before: GUANGXI BRANCH, CHINA MOBILE COMMUNICATION Group

Applicant before: CHINA MOBILE COMMUNICATIONS GROUP Co.,Ltd.

GR01 Patent grant